EP2747414B1 - Monitoring method and camera - Google Patents

Monitoring method and camera

Info

Publication number
EP2747414B1
Authority
EP
European Patent Office
Prior art keywords
camera
camera settings
settings
overlapping region
control area
Prior art date
Legal status
Active
Application number
EP12197677.3A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP2747414A1 (en)
Inventor
Johan NYSTRÖM
Current Assignee
Axis AB
Original Assignee
Axis AB
Priority date
Filing date
Publication date
Application filed by Axis AB filed Critical Axis AB
Priority to EP12197677.3A (EP2747414B1)
Priority to TW102143895A (TWI539227B)
Priority to CN201310659742.0A (CN103873765B)
Priority to JP2013254373A (JP5546676B2)
Priority to KR1020130158699A (KR101578499B1)
Priority to US14/133,105 (US9270893B2)
Publication of EP2747414A1
Application granted
Publication of EP2747414B1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Definitions

  • the present invention relates to a monitoring camera and a method for controlling camera settings.
  • Monitoring cameras are used in many different applications, both indoors and outdoors, to monitor a variety of environments. In order to receive useful images of good quality from such a camera, it is important to use appropriate settings that are correctly adapted to the present conditions of the monitored scene and which allow an operator of the camera to see any events of interest within the monitored scene. Some cameras are able to monitor different parts of a scene by changing or moving a field of view e.g. by panning, tilting or zooming the camera. For such cameras it may be even more challenging to find the correct settings since e.g. the lighting conditions of the monitored environment may change as the camera moves.
  • a method of controlling camera settings for a monitoring camera arranged to monitor a scene comprises the steps of accessing data defining a camera settings control area within the scene, determining if there is an overlapping region between a current field of view of the camera and the camera settings control area, and if there is an overlapping region, controlling camera settings based on image data captured by the monitoring camera in the overlapping region.
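  • As an illustration, the core of this flow can be sketched in a few lines of Python, assuming the current field of view and the camera settings control area are axis-aligned rectangles in a common scene coordinate system; the Rect type and helper names are illustrative only:
```python
# Minimal sketch of the control flow described above; all names and the rectangle
# representation are assumptions made for the illustration, not taken from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def overlap(a: Rect, b: Rect) -> Optional[Rect]:
    """Return the overlapping region of two rectangles, or None if they are disjoint."""
    left, top = max(a.left, b.left), max(a.top, b.top)
    right, bottom = min(a.right, b.right), min(a.bottom, b.bottom)
    return Rect(left, top, right, bottom) if left < right and top < bottom else None

def metering_region(current_fov: Rect, control_area: Rect) -> Rect:
    """Region used to control camera settings: the overlap with the camera settings
    control area if there is one, otherwise the whole current field of view."""
    return overlap(current_fov, control_area) or current_fov

# Example: a field of view that partly covers a control area around an entrance.
fov = Rect(10, 5, 50, 35)
entrance_area = Rect(40, 0, 80, 30)
print(metering_region(fov, entrance_area))  # Rect(left=40, top=5, right=50, bottom=30)
```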
  • the camera settings may include at least one of: focus settings, exposure settings, white balance settings, IR cut filter settings, iris control settings and settings for an illumination unit.
  • the brightness of the image may also be influenced by an IR cut filter mode, e.g. whether it is on or off, and by the iris control, e.g. in the form of an f-number. Additionally or as an alternative, the brightness may be improved by increasing or decreasing illumination from an illumination unit. Controlling white balance and gain settings based on the overlapping region will also improve the image quality of that region.
  • the step of controlling camera settings may comprise automatically controlling camera settings based on image data captured by the monitoring camera in the overlapping region. This would e.g. mean that an autofocus algorithm is instructed to only use image data from the overlapping region to control the autofocus setting. In other words, image data from the overlapping region is used as input to automatic procedures for determining camera settings which are used for the whole image within the current field of view.
  • the step of controlling camera settings may additionally or alternatively comprise accessing data relating to predefined camera settings values related to the camera settings control area, and controlling camera settings according to the predefined camera settings values. This may be useful when a user would like to specifically set e.g. a focus distance to a certain value once a camera settings control area is within the current field of view. It could e.g. be used when there are trees standing in front of an entrance, and the focus distance may then be set to the distance to the entrance, making sure that an autofocus algorithm does not use the trunks of the trees to set the focus distance.
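  • A minimal sketch of this variant, continuing the Python illustration above; the preset table, the focus distance value and the camera interface used here are assumptions made for the example:
```python
# Hypothetical sketch only: predefined settings values associated with a control area
# take precedence over the automatic algorithm once that area is in the field of view.
PRESET_VALUES = {
    "entrance": {"focus_distance_m": 25.0},   # illustrative fixed distance to the entrance
}

def control_focus(area_name, region_image, camera):
    preset = PRESET_VALUES.get(area_name, {})
    if "focus_distance_m" in preset:
        camera.set_focus_distance(preset["focus_distance_m"])  # user-defined fixed value
    else:
        camera.autofocus_on(region_image)  # automatic, driven by the overlapping region
```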
  • camera settings which influence the entire image may in this way be set so as to ensure that the overlapping region is depicted with good image quality.
  • the step of accessing data may comprise accessing data defining a selection of camera settings to be controlled, and the step of controlling camera settings may comprise controlling the defined selection of camera settings.
  • the step of accessing data defining a camera settings control area may comprise accessing data defining the camera settings control area in a coordinate system for the scene, and the step of determining if there is an overlapping region may comprise comparing coordinates of the camera settings control area to coordinates of the current field of view in the coordinate system for the scene.
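  • Once an overlap has been determined in the scene coordinate system, the corresponding part of the captured image is needed as input to the settings algorithms. The sketch below, continuing the earlier Python illustration, assumes a simple linear mapping between the current field of view and the image and ignores lens distortion:
```python
def scene_rect_to_pixels(rect: Rect, fov: Rect, image_width: int, image_height: int):
    """Map a rectangle in scene coordinates to pixel coordinates of the current image,
    assuming the field of view maps linearly onto the full image (an approximation)."""
    sx = image_width / (fov.right - fov.left)
    sy = image_height / (fov.bottom - fov.top)
    x0 = int(round((rect.left - fov.left) * sx))
    y0 = int(round((rect.top - fov.top) * sy))
    x1 = int(round((rect.right - fov.left) * sx))
    y1 = int(round((rect.bottom - fov.top) * sy))
    return x0, y0, x1, y1

# The overlapping region could then be cut out of the captured frame, e.g. with NumPy:
# x0, y0, x1, y1 = scene_rect_to_pixels(metering_region(fov, entrance_area), fov, w, h)
# roi = frame[y0:y1, x0:x1]
```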
  • the step of defining a camera settings control area may comprise defining a first and a second camera settings control area, the step of determining an overlapping region may comprise determining if there is a first and a second overlapping region corresponding to the first and the second camera settings control area, respectively, and, if there is a first and a second overlapping region, the step of controlling camera settings may comprise selecting one of the first and the second overlapping region based on region properties for the first and the second overlapping region, and controlling camera settings based on the selected overlapping region, wherein the region properties include at least one of: size of region, amount of detected motion in region, and a priority setting for the camera settings control area corresponding to the respective region.
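  • One possible way to implement such a selection, continuing the Python sketch; ranking the three region properties into a single ordering is an assumption made for the illustration:
```python
def select_overlap(overlaps):
    """overlaps: list of dicts with keys 'rect' (Rect), 'motion' (amount of detected
    motion in the region) and 'priority' (priority of the corresponding control area)."""
    def size(r: Rect) -> float:
        return (r.right - r.left) * (r.bottom - r.top)
    # Rank by priority first, then by amount of detected motion, then by region size.
    return max(overlaps, key=lambda o: (o["priority"], o["motion"], size(o["rect"])))
```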
  • a monitoring camera arranged to monitor a scene which comprises a field of view altering unit arranged to alter a field of view of the camera such that the camera is able to capture images of different parts of a scene, a data input arranged to receive data defining a camera settings control area within the scene, an overlap determining unit arranged to determine if there is an overlapping region between a current field of view of the camera and the camera settings control area, and a camera settings control unit arranged to control camera settings based on the overlapping region.
  • the field of view altering unit may comprise a motor arranged to alter the field of view of the camera in at least one of a pan, tilt or zoom direction.
  • the pan, tilt or zoom may be controlled in any suitable manner, e.g. by receiving input from a user via a joystick or by receiving input relating to a field of view altering scheme, such as a guard tour.
  • the data input may be arranged to receive data defining a selection of camera settings to be controlled, and the camera settings control unit may be arranged to control the defined selection of camera settings.
  • the data defining a camera settings control area and/or the data defining a selection of camera settings to be controlled may be based on user input.
  • the user input may e.g. be given via a graphical user interface where the user draws shapes around any area in the current view of the scene which is to be used as a camera settings control area.
  • the user may also be allowed to move the current field of view of the camera during set-up of the camera settings control areas to be able to define another camera settings control area for another part of the scene.
  • the selection of camera settings to be controlled based on one or more of the camera settings control areas may e.g. be made by ticking a box in a user interface or by selecting from a drop-down list. It would also be possible to select some manual settings.
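  • Such a selection could, for example, be represented by a plain configuration structure like the one below, reusing the Rect type from the earlier sketch; the names, coordinates and priorities are invented for the illustration:
```python
# Illustrative configuration only: which camera settings each camera settings control
# area should govern, as might be collected from check boxes or a drop-down list.
CONTROL_AREAS = [
    {"name": "entrance", "rect": Rect(40, 0, 80, 30),
     "settings": ["focus", "exposure"], "priority": 2},
    {"name": "car_park", "rect": Rect(100, 20, 160, 60),
     "settings": ["exposure", "white_balance"], "priority": 1},
]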
  • the data input may be arranged to receive data defining the camera settings control area in a coordinate system for the scene, and the overlap determining unit may be arranged to compare coordinates of the camera settings control area to coordinates of the current field of view.
  • the data input may be arranged to receive data defining a first and a second camera settings control area, and the overlap determining unit may be arranged to determine if there is a first and a second overlapping region corresponding to the first and the second camera settings control area, respectively. If there is a first and a second overlapping region, the camera settings control unit may be arranged to select one of the first and the second overlapping region based on region properties for the first and the second overlapping region, and to control camera settings based on the selected overlapping region, wherein the region properties include at least one of: size of region, amount of detected motion in region, and a priority setting for the camera settings control area corresponding to the respective region.
  • a computer-readable recording medium having recorded thereon a program for implementing the herein described method when executed on a device having processing capabilities.
  • Fig. 1 illustrates a scene 1 with an office building 3, a number of cars 5 driving by, trees 7 standing next to the building 3 and a car park 9 where some cars are parked and which is lit by a lamp post 11.
  • the scene is monitored by a monitoring camera 13, details of which are shown in Fig. 2.
  • the camera 13 is able to move or shift, or in other words alter, change or adjust, its field of view by panning or tilting, or by zooming using some type of zoom mechanism such as an adjustable zoom lens.
  • a current field of view of the camera 13 is illustrated by the rectangle 15.
  • By altering its field of view 15, the camera 13 covers varying parts of the scene 1.
  • the camera 13 may change how much of the scene 1 is covered; i.e. when zooming in using a telephoto setting, a smaller part of the scene 1 will be covered, and when zooming out to a wide-angle setting, a larger part of the scene 1 will be covered.
  • the camera may be a so-called PTZ (pan-tilt-zoom) camera, but it may also be capable of altering its field of view in only one of the pan, tilt or zoom "dimensions", or in any two of those.
  • the camera 13 is usually mounted in a fixed position and has a movable field of view, and it is therefore able to determine the position of its current field of view within a coordinate system for the scene 1.
  • such a coordinate system may sometimes be denoted a common coordinate system or an absolute coordinate system.
  • the camera settings used when capturing images are dynamically adapted based on the captured image in order to achieve a good image quality. Different algorithms may be used for this. If motion is detected in part of the current field of view, the camera settings may be adjusted to give a good image quality in that part of the field of view. Another common algorithm is based on detecting contrast in the image and then choosing the focus distance which gives the largest contrast for the overall field of view.
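  • A contrast-based ("hill climbing") focus search of the kind mentioned above can be sketched as follows; the sharpness measure and the exhaustive search over focus positions are simplifications chosen for the example:
```python
import numpy as np

def contrast_score(gray: np.ndarray) -> float:
    """Simple sharpness measure: mean squared intensity difference between
    neighbouring pixels; higher means more high-frequency content, i.e. sharper."""
    g = gray.astype(np.float64)
    return float((np.diff(g, axis=1) ** 2).mean() + (np.diff(g, axis=0) ** 2).mean())

def best_focus(capture_region, focus_positions):
    """capture_region(pos) is assumed to return the (grayscale) metering region
    captured with the lens at focus position pos; keep the sharpest position found."""
    return max(focus_positions, key=lambda pos: contrast_score(capture_region(pos)))
```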
  • the current field of view may cover a part of the scene where some portion of the field of view is given inappropriate attention when the settings are determined. This may e.g. happen if motion is detected in an "uninteresting" part of the scene covered by the current field of view, and may cause more important parts of the covered scene to e.g. become blurry and less useful for surveillance purposes.
  • Such a situation may occur when the current field of view 15 is positioned as shown in Fig. 1 .
  • moving cars are present in the lower part of the field of view, but the most important part to watch is actually around the entrance 17 to the office building 3.
  • If the camera were to use the fact that motion is detected in the lower part of the currently captured image, which is also closer to the camera than the office building 3, and set e.g. focus and exposure based on that part, the area around the entrance 17 may be captured with less sharpness and may additionally be too dark or too bright.
  • If the camera were to use a contrast-aided method for setting autofocus, it may be the case that the trunks of the trees 7 standing next to the building 3 are given too much attention. As the trees 7 in this case are closer to the camera than the building, the office building 3 may again become slightly out of focus, meaning that the real area of interest - the entrance 17 - will not be as sharp as desired.
  • a number of camera settings control areas 19 have been defined within the scene 1. These areas may be set up at installation of the camera, or later on based on the needs of a user, to cover places of specific interest in the scene, such as, in this case, the entrance 17 to the office building 3.
  • the camera settings control areas 19 are defined in a manner which allows the camera to determine the relation and a possible overlap between its current field of view and the camera settings control areas.
  • the camera settings control areas may be defined in the coordinate system for the scene. In other words, the camera settings control areas are defined in what could be denoted a PTZ-aware manner.
  • When at least part of a camera settings control area 19 is covered by the current field of view of the camera, the camera will control its camera settings based on this part, illustrated by the shadowed region 21 in Fig. 1.
  • the control of the camera settings could be done by using image data captured by the monitoring camera in the overlapping region as input to one or more automatic algorithms for determining camera settings.
  • Another option is to use predefined values for one or more camera settings as soon as a camera settings control area associated with such predefined camera settings values is overlapped by the current field of view.
  • In the scene of Fig. 1, two camera settings control areas 19 have been defined: a first one for the entrance to the office building 3 and another one for the car park 9. It may be noted that the camera settings control area 19 which covers part of the car park 9 has been set up not to cover the lamp on the lamp post 11, thereby making sure that an automatic exposure setting is not adjusted based on image data around the possibly very bright lamp.
  • the camera will check whether any camera settings control areas are encountered by comparing the position or coordinates of the current field of view with the position or coordinates of any such areas.
  • the coordinates of the field of view and the camera settings control areas may be defined in the coordinate system for the scene to start with, or they may be translated into a common coordinate system for the purpose of the comparison.
  • the camera settings will be controlled based on the part of the image which corresponds to the overlapping region 21 of the camera settings control area 19 and the current field of view 15.
  • If the current field of view overlaps more than one camera settings control area, the larger of the overlapping regions may be used to control the camera settings.
  • Another option is to detect if motion is present within any of the overlapping regions, and select the region where motion is detected. If motion is detected in more than one overlapping region, it would be possible either to quantify the amount of motion detected and choose the region with the most motion, or to simply go back to selecting the larger overlapping region.
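  • The amount of motion used in such a comparison could, for instance, be quantified by simple frame differencing within each overlapping region; the threshold below is an arbitrary illustrative value:
```python
import numpy as np

def motion_amount(prev_gray: np.ndarray, curr_gray: np.ndarray, threshold: float = 15.0) -> float:
    """Fraction of pixels in the region whose intensity changed by more than the
    threshold between two consecutive frames (an illustrative motion measure)."""
    diff = np.abs(curr_gray.astype(np.float64) - prev_gray.astype(np.float64))
    # The overlapping region with the larger value would be chosen; ties could
    # fall back to selecting the larger region, as described above.
    return float((diff > threshold).mean())
```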
  • Yet another option is to allow priorities to be set for the different camera settings control areas and select the overlapping region which is associated with the camera settings control area with the highest priority. It could also be possible in some instances to combine the image data captured in both the covered camera settings control areas to control some camera settings.
  • the set-up of the camera settings control areas may be defined by a user, e.g. by input to a graphical user interface or by selecting coordinates. It would also be possible to perform some type of intelligent image analysis assisted set-up, e.g. where image elements indicating doors are detected and suggested to the user as possible areas of interest.
  • the camera settings to be controlled may be at least one of focus settings, exposure settings, white balance settings, IR cut filter settings, iris control settings and settings for an illumination unit, and which of these are to be controlled may be chosen beforehand. It may also be possible to select some of the camera settings to be controlled automatically based on image data captured in the overlapping region and some to be set to a respective predefined value, which typically is determined to be appropriate for that camera setting and for the area or fixed object in the scene which is covered by that camera settings control area.
  • an illumination unit may either be integrated in the camera or provided as a separate unit, mounted next to or at a distance from the camera.
  • Such an illumination unit may e.g. comprise a number of LEDs and may e.g. provide visible or infra-red illumination.
  • each camera settings control area may be associated with one or more camera settings to be controlled, so that one camera settings control area would be used only for focus purposes and another only for exposure settings.
  • If the camera settings to control are selected by a user, one convenient option would be to use a graphical user interface and let the user check different boxes in a list of available camera settings to control, either applying the selection to all defined camera settings control areas, or making different choices for the different camera settings control areas.
  • Fig. 2 illustrates in more detail the monitoring camera 13 arranged to monitor the scene 1.
  • the camera 13 may e.g. be a digital network camera adapted for video surveillance of the scene 1.
  • the camera 13 captures images of the part of the scene 1 covered by the current field of view 15 of the camera 13.
  • the camera 13 comprises a field of view altering unit 23 arranged to alter the field of view 15 of the camera 13.
  • the field of view altering unit 23 may e.g. comprise one or more devices such as motors which can change one or more of the pan, tilt or zoom settings of the camera, by moving the viewing direction of the camera or changing the setting of a zoom lens.
  • the camera may also be capable of digitally panning, tilting or zooming to alter the field of view 15.
  • a data input 25 is arranged to receive data defining a camera settings control area 19 within the scene 1. This data may e.g. be in the form of coordinates in a coordinate system for the scene 1.
  • An overlap determining unit 27 is provided to determine if there is an overlapping region 21 between a current field of view 15 of the camera 13 and a camera settings control area 19, e.g. by comparing the position of the current field of view and the position of the camera settings control area 19.
  • the positions may be expressed in coordinates in a coordinate system for the scene 1.
  • the camera 13 further comprises a camera settings control unit 29 arranged to control camera settings based on the overlapping region 21.
  • the camera settings may be set solely based on this overlapping region, ignoring the remaining image. It could also be possible that other parts of the image are used to some extent, e.g. by applying appropriate weighting.
  • the automatic exposure settings may include gain and exposure time.
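  • As an illustration of the two preceding points, exposure time and gain could be driven by a brightness measure that puts most, or all, of the weight on the overlapping region; the weight, target brightness and control law below are assumptions chosen for the sketch:
```python
import numpy as np

def weighted_brightness(full_gray: np.ndarray, roi_gray: np.ndarray, roi_weight: float = 0.8) -> float:
    """Mean brightness with most of the weight on the overlapping region;
    roi_weight = 1.0 ignores the rest of the image entirely."""
    return roi_weight * float(roi_gray.mean()) + (1.0 - roi_weight) * float(full_gray.mean())

def adjust_exposure(exposure_time: float, gain: float, brightness: float,
                    target: float = 110.0, max_exposure_time: float = 1 / 25):
    """Scale exposure toward the target brightness; raise gain only once the
    exposure time has reached its limit (a simplistic control law)."""
    ratio = target / max(brightness, 1e-3)
    desired = exposure_time * ratio
    if desired <= max_exposure_time:
        return desired, gain
    return max_exposure_time, gain * desired / max_exposure_time
```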
  • the brightness may also be influenced by letting the automatic iris control set an f-number for an iris for the camera lens based on image data from the overlapping region. In case the camera is equipped with a filter for infra-red light, i.e. an IR cut filter, the position or state (usually on or off) of such a filter may also be set based on the image data in the overlapping region.
  • the white balance may also be controlled based on the overlapping region, more or less ignoring any negative effects this could have on the image quality, typically the colors, of the remaining image captured in the current field of view.
  • Another camera setting which could be controlled is a setting for an illumination unit. This unit may then be controlled to make sure that the overlapping region is properly lit. Depending on the properties of the illumination unit it may e.g. be possible to control the direction of the illumination or the strength of the illumination, e.g. by switching on or off part of the illumination or by controlling the strength of illumination from a dimmable light source. The direction of the illumination could also be controlled by mechanically turning the unit to steer the illumination in the desired direction. It may also be possible to control the type of illumination, e.g. IR illumination or visible light.
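  • A dimmable illumination unit of this kind could, for example, be stepped up or down based on the brightness of the overlapping region; the target level, dead band and step size below are illustrative values only:
```python
def illumination_level(region_brightness: float, current_level: float,
                       target: float = 110.0, step: float = 0.05) -> float:
    """Step a dimmable light source (level in 0..1) so that the overlapping
    region approaches the target brightness; values are purely illustrative."""
    if region_brightness < target - 10:
        return min(1.0, current_level + step)   # region too dark: increase illumination
    if region_brightness > target + 10:
        return max(0.0, current_level - step)   # region too bright: decrease illumination
    return current_level
```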
  • Fig. 3 illustrates a method 300 for monitoring a scene by a monitoring camera.
  • the camera accesses data defining a camera settings control area within the scene, which area may have been previously defined e.g. by a user.
  • the camera determines if an overlapping region exists between the current field of view, FOV, and the camera settings control area. This may e.g. be done by comparing coordinates for the current field of view with coordinates for the camera settings control area, where the coordinates are expressed in a coordinate system for the scene. The determining may also be performed by an external processing unit connected to the camera.
  • camera settings for the monitoring camera are controlled in step 307 based on the overlapping region. If it is determined that no overlap exists, camera settings may instead be controlled in a conventional manner. It would also be possible to determine if the current field of view overlaps more than one camera settings control area. This could be done either in sequence with the first determination or in parallel. Data regarding the further camera settings control areas would then be accessed by the camera prior to the respective determination.
EP12197677.3A 2012-12-18 2012-12-18 Monitoring method and camera Active EP2747414B1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP12197677.3A EP2747414B1 (en) 2012-12-18 2012-12-18 Monitoring method and camera
TW102143895A TWI539227B (zh) 2012-12-18 2013-11-29 監測方法及攝影機
CN201310659742.0A CN103873765B (zh) 2012-12-18 2013-12-09 监控方法和监控相机
JP2013254373A JP5546676B2 (ja) 2012-12-18 2013-12-09 監視方法およびカメラ
KR1020130158699A KR101578499B1 (ko) 2012-12-18 2013-12-18 감시 방법 및 카메라
US14/133,105 US9270893B2 (en) 2012-12-18 2013-12-18 Monitoring method and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP12197677.3A EP2747414B1 (en) 2012-12-18 2012-12-18 Monitoring method and camera

Publications (2)

Publication Number Publication Date
EP2747414A1 (en) 2014-06-25
EP2747414B1 (en) 2015-04-08

Family

ID=47427249

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12197677.3A Active EP2747414B1 (en) 2012-12-18 2012-12-18 Monitoring method and camera

Country Status (6)

Country Link
US (1) US9270893B2 (en)
EP (1) EP2747414B1 (en)
JP (1) JP5546676B2 (zh)
KR (1) KR101578499B1 (zh)
CN (1) CN103873765B (zh)
TW (1) TWI539227B (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6431257B2 (ja) 2013-10-21 2018-11-28 キヤノン株式会社 ネットワークシステム及びネットワークデバイスの管理方法、ネットワークデバイスおよびその制御方法とプログラム、管理システム
KR102360453B1 (ko) * 2015-04-10 2022-02-09 삼성전자 주식회사 카메라 설정 방법 및 장치
TWI601423B (zh) 2016-04-08 2017-10-01 晶睿通訊股份有限公司 攝影系統及其同步方法
US10447910B2 (en) * 2016-08-04 2019-10-15 International Business Machines Corporation Camera notification and filtering of content for restricted sites
US10970915B2 (en) * 2017-01-06 2021-04-06 Canon Kabushiki Kaisha Virtual viewpoint setting apparatus that sets a virtual viewpoint according to a determined common image capturing area of a plurality of image capturing apparatuses, and related setting method and storage medium
EP3610469A4 (en) 2017-05-10 2021-04-28 Grabango Co. SERIES CONFIGURED CAMERA ARRANGEMENT FOR EFFICIENT USE
EP3422068B1 (en) * 2017-06-26 2019-05-01 Axis AB A method of focusing a camera
US10506217B2 (en) * 2017-10-09 2019-12-10 Facebook Technologies, Llc Head-mounted display tracking system
EP3550823A1 (en) * 2018-04-05 2019-10-09 EVS Broadcast Equipment SA Automatic control of robotic camera for capturing a portion of a playing field
JP7146444B2 (ja) * 2018-05-11 2022-10-04 キヤノン株式会社 制御装置、制御方法及びプログラム
US20210201431A1 (en) * 2019-12-31 2021-07-01 Grabango Co. Dynamically controlled cameras for computer vision monitoring
JP2023048309A (ja) * 2021-09-28 2023-04-07 オムロン株式会社 設定装置、設定方法および設定プログラム

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517236A (en) * 1994-06-22 1996-05-14 Philips Electronics North America Corporation Video surveillance system
JP2006005624A (ja) * 2004-06-17 2006-01-05 Olympus Corp 撮像装置
US7474848B2 (en) * 2005-05-05 2009-01-06 Hewlett-Packard Development Company, L.P. Method for achieving correct exposure of a panoramic photograph
CN101404725B (zh) * 2008-11-24 2010-07-21 华为终端有限公司 摄像机、摄像机组、摄像机组的控制方法、装置及系统
JP5209587B2 (ja) * 2009-10-22 2013-06-12 大成建設株式会社 監視カメラシステム、監視カメラ、および、光環境調節方法
JP5566145B2 (ja) * 2010-03-18 2014-08-06 キヤノン株式会社 マスキング機能を有する撮像装置およびその制御方法
AU2010201740B2 (en) 2010-04-30 2013-03-07 Canon Kabushiki Kaisha Method, apparatus and system for performing a zoom operation
JP5637633B2 (ja) 2010-07-06 2014-12-10 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America 画像配信装置
US20130021433A1 (en) * 2011-07-21 2013-01-24 Robert Bosch Gmbh Overview configuration and control method for ptz cameras
JP2013030921A (ja) * 2011-07-27 2013-02-07 Canon Inc 撮像装置、その制御方法及びプログラム

Also Published As

Publication number Publication date
TW201428411A (zh) 2014-07-16
KR101578499B1 (ko) 2015-12-18
KR20140079332A (ko) 2014-06-26
TWI539227B (zh) 2016-06-21
JP5546676B2 (ja) 2014-07-09
EP2747414A1 (en) 2014-06-25
JP2014121092A (ja) 2014-06-30
US9270893B2 (en) 2016-02-23
CN103873765B (zh) 2018-09-25
US20140168432A1 (en) 2014-06-19
CN103873765A (zh) 2014-06-18

Similar Documents

Publication Publication Date Title
EP2747414B1 (en) Monitoring method and camera
US8899849B2 (en) Camera apparatus and method of controlling camera apparatus
US9781334B2 (en) Control method, camera device and electronic equipment
JP5060767B2 (ja) レンズ装置
TWI704808B (zh) 用於監控場景之攝影機之聚焦
KR101658212B1 (ko) 카메라에 대한 최적의 시야각 위치를 선택하는 방법
US20090237554A1 (en) Autofocus system
JP2012119875A5 (ja) 撮像装置、その制御方法及びプログラム
JP2015103852A (ja) 画像処理装置、撮像装置、画像処理装置の制御方法、画像処理装置の制御プログラム及び記憶媒体
JP2010004465A (ja) 立体画像撮影システム
JP2021071732A (ja) 制御装置、制御方法、コンピュータプログラム及び電子機器
KR20220058593A (ko) 스마트한 파노라마 이미지를 획득하기 위한 시스템 및 방법
US20210297573A1 (en) Imaging device, control method, and storage medium
JP5451333B2 (ja) 監視用テレビカメラ装置
US10652460B2 (en) Image-capturing apparatus and image-capturing control method
JP5217843B2 (ja) 構図選択装置、構図選択方法及びプログラム
JP4438065B2 (ja) 撮影制御システム
JP5384173B2 (ja) オートフォーカスシステム
US20210061229A1 (en) Imaging device, method of controlling imaging device, and storage medium
JP2017216599A (ja) 撮像装置及びその制御方法
KR101514684B1 (ko) 폐쇄회로 텔레비전의 제어방법 및 그 제어장치
JP5387048B2 (ja) テレビジョンカメラシステム、テレビジョンカメラの制御方法、及びプログラム
JP6418828B2 (ja) 撮像装置
KR20220072450A (ko) 카메라 시스템 및 그의 영상 밝기 제어 방법
JP2019193196A (ja) 撮像装置およびその制御方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130610

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 5/232 20060101AFI20140917BHEP

Ipc: H04N 5/235 20060101ALI20140917BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20141125

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 721317

Country of ref document: AT

Kind code of ref document: T

Effective date: 20150515

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602012006462

Country of ref document: DE

Effective date: 20150521

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 721317

Country of ref document: AT

Kind code of ref document: T

Effective date: 20150408

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20150408

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150708

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150810

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 4

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150808

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150709

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602012006462

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: RO

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150408

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

26N No opposition filed

Effective date: 20160111

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151231

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: LU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20151218

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151218

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151231

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20151231

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20121218

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150408

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602012006462

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04N0005232000

Ipc: H04N0023600000

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230505

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231121

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20231121

Year of fee payment: 12

Ref country code: FR

Payment date: 20231122

Year of fee payment: 12

Ref country code: DE

Payment date: 20231121

Year of fee payment: 12