TW201129084A - Controlling system and method for camera, adjusting apparatus for camera including the same - Google Patents

Controlling system and method for camera, adjusting apparatus for camera including the same

Info

Publication number
TW201129084A
TW201129084A TW099102938A TW99102938A
Authority
TW
Taiwan
Prior art keywords
camera
user
head
control
image
Prior art date
Application number
TW099102938A
Other languages
Chinese (zh)
Inventor
Hou-Hsien Lee
Chang-Jung Lee
Chih-Ping Lo
Original Assignee
Hon Hai Prec Ind Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hon Hai Prec Ind Co Ltd filed Critical Hon Hai Prec Ind Co Ltd
Priority to TW099102938A priority Critical patent/TW201129084A/en
Priority to US12/786,289 priority patent/US20110187866A1/en
Publication of TW201129084A publication Critical patent/TW201129084A/en

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

An adjusting apparatus for a camera includes a camera, a track, and a controlling system. The camera captures an image of a user. The controlling system receives the image and detects it to obtain a face portion of the image. The controlling system further processes the face portion of the image to obtain characteristics of the face portion, and controls the camera correspondingly. The invention further provides a controlling system and a controlling method for the camera.

Description

VI. Description of the Invention

[Technical Field]

[0001] The present invention relates to a camera control system and method, and to a camera adjusting apparatus.

[Prior Art]

[0002] A conventional camera can change its lens viewing angle only through operation of a dedicated controller. In the medical field, a doctor performing surgery on a patient often needs to observe the affected area from multiple angles, which is highly inconvenient for the doctor.

[Summary of the Invention]

[0003] In view of the above, it is necessary to provide a camera control system and method that allow a user to control a camera conveniently. It is also necessary to provide a camera adjusting apparatus that includes such a camera control system.

[0004] A camera control system for controlling a first camera includes:

[0005] a face detection module, for receiving an image of a user captured by a second camera and detecting the image to obtain a face region in the image;

[0006] a first calculation module, for performing an operation on the obtained face region to determine a head tilt angle of the user; and

[0007] a control module, for sending a corresponding control signal according to the obtained head tilt angle, so as to control the first camera to move to a corresponding position on a track.

[0008] A camera control method for controlling a first camera includes:

[0009] a detection step: receiving an image of a user captured by a second camera and detecting the image to obtain a face region in the image;

[0010] a first calculation step: performing an operation on the obtained face region to determine a head tilt angle of the user; and

[0011] a first control step: sending a corresponding control signal according to the obtained head tilt angle, so as to control the first camera to move to a corresponding position on a track.

[0012] A camera adjusting apparatus for adjusting a first camera includes:

[0013] a second camera, for capturing images of a user;

[0014] a track, on which the first camera is disposed and along which the first camera can move; and

[0015] a camera control system, for receiving the image of the user captured by the second camera, detecting the image to obtain a face region in the image, performing operations on the obtained face region to determine feature data of the face region, and sending corresponding control signals according to the obtained feature data to control the first camera.

[0016] The camera adjusting apparatus, camera control system, and camera control method described above detect the image of the user captured by the second camera to obtain the face region in the image, compute the user's head tilt angle from that face region, and issue corresponding control signals so that the first camera performs the corresponding movement. The first camera therefore does not have to be controlled through a dedicated controller. When the camera control system and camera control method are applied in fields such as medicine, they bring great convenience to medical personnel.
[Embodiments]

[0017] Referring to FIG. 1, the camera adjusting apparatus of the present invention is used to adjust a camera 10, for example to change the lens viewing angle of the camera 10. The preferred embodiment of the camera adjusting apparatus includes a camera control system 20, another camera 30, and a track 40.

[0018] The camera 30 captures images of a user 50 and transmits the images to the camera control system 20. After processing an image, the camera control system 20 controls the lens viewing angle, lens position, and zoom of the camera 10 according to the head rotation angle of the user 50, the head tilt angle, and the distance between the face and a reference object. The lens position of the camera 10 is adjusted by controlling the camera 10 to move along the track 40.

[0019] Referring also to FIG. 2, the preferred embodiment of the camera control system 20 includes a face detection module 200, a first calculation module 210, a second calculation module 220, a third calculation module 230, a fourth calculation module 250, and a control module 260.
[0020] The face detection module 200 receives the image of the user 50 captured by the camera 30 and detects the image to obtain the face region in the image. The face detection module 200 may apply a face detection algorithm to the image to locate the face region.

[0021] The first calculation module 210 performs an operation on the obtained face region to determine the current head tilt angle of the user 50. In this embodiment, the pose in which the face of the user 50 directly faces the camera 30 is taken as the reference; that is, when the face of the user 50 directly faces the camera 30, the head tilt angle is 0 degrees. The first calculation module 210 may obtain the tilt angle by calculating the angle between the detected face region and the face region observed when the user 50 directly faces the camera 30. In other embodiments, the first calculation module 210 may use more elaborate computations, for example deriving the gaze direction of the user 50, to obtain the head tilt angle more precisely. As shown in FIGS. 3A-3C, the head tilt angle of the user 50 is 0 degrees in FIG. 3A, X degrees to the left in FIG. 3B, and X degrees to the right in FIG. 3C.
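As an editorial illustration of paragraph [0021], the sketch below shows one way the head tilt (roll) could be derived, assuming the two eye centres inside the detected face region are available. The patent does not prescribe a specific formula; the function and variable names, and the sign convention, are illustrative only.

```python
# Hypothetical sketch of the first calculation module (210): the head tilt (roll)
# is approximated from the angle of the line joining the two eye centres, which is
# 0 degrees in the reference pose where the user faces the camera 30 directly.
import math

def head_tilt_deg(left_eye, right_eye):
    """Signed roll angle in degrees; 0 when the eyes are level (reference pose)."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

# Eyes level -> 0 degrees; right eye 20 px lower than the left -> about 18.4 degrees.
print(head_tilt_deg((100, 120), (160, 120)))
print(head_tilt_deg((100, 120), (160, 140)))
```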
[0022] The second calculation module 220 performs an operation on the obtained face region to determine the current head rotation angle of the user 50. In this embodiment, the pose in which the face of the user 50 directly faces the camera 30 is again the reference: when the face of the user 50 directly faces the camera 30, the head rotation angle is 0 degrees. The second calculation module 220 may obtain the rotation angle by calculating the angle between the line connecting the eyes in the detected face region and the camera 30 and the corresponding line when the face of the user 50 directly faces the camera 30. In other embodiments, the second calculation module 220 may use more elaborate computations to obtain the head rotation angle more precisely. As shown in FIGS. 4A-4C, the head rotation angle of the user 50 is 0 degrees in FIG. 4A, X degrees to the left in FIG. 4B, and X degrees to the right in FIG. 4C.

[0023] The third calculation module 230 performs an operation on the obtained face region to determine whether the head of the user 50 is currently raised or lowered. In this embodiment, when the face of the user 50 directly faces the camera 30, the user 50 is neither looking up nor looking down. The third calculation module 230 may determine whether the head is raised or lowered by calculating the position of the eyes within the face region. In other embodiments, the third calculation module 230 may use more elaborate computations to obtain the angle at which the user 50 looks up or down. As shown in FIGS. 5A-5C, the user 50 is neither looking up nor looking down in FIG. 5A, is looking up in FIG. 5B, and is looking down in FIG. 5C.

[0024] The fourth calculation module 250 performs an operation on the obtained face region to determine the current distance between the face of the user 50 and the camera 30. In this embodiment, a distance of 50 centimeters between the face of the user 50 and the camera 30 is taken as the reference; when that distance is 50 centimeters, the fourth calculation module 250 records it as 0 centimeters. The fourth calculation module 250 may obtain the distance by calculating the ratio between the size of the detected face region and the size of the face region when the face of the user 50 is 50 centimeters from the camera 30. In other embodiments, the fourth calculation module 250 may use more elaborate computations to obtain the distance more precisely; other objects may also serve as the reference object, in which case the fourth calculation module 250 determines the distance between the face of the user 50 and that reference object.

[0025] The feature processing modules, namely the first to fourth calculation modules 210, 220, 230, and 250, may also derive other data from the face region, even including the number of times the user 50 blinks, so as to determine the current action of the user 50. In this case, the user 50 may define a particular feature of the face region to represent a particular action.
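Paragraph [0024] only states that a size ratio against the 50-centimeter reference is used. The following sketch is one plausible reading under a simple pinhole-camera assumption (apparent face size inversely proportional to distance); all names are illustrative and not part of the patent.

```python
# Hypothetical sketch of the fourth calculation module (250): the offset of the
# user's face from the 50 cm reference distance, estimated from the ratio between
# the detected face-region width and the width measured at the reference distance.
REF_DISTANCE_CM = 50.0

def face_distance_offset_cm(face_width_px, ref_width_px):
    """Positive = the face moved away from the camera, negative = it moved closer."""
    estimated_distance_cm = REF_DISTANCE_CM * ref_width_px / face_width_px
    return estimated_distance_cm - REF_DISTANCE_CM

print(face_distance_offset_cm(200, 200))   # 0.0   (at the reference distance)
print(face_distance_offset_cm(250, 200))   # -10.0 (larger face -> 10 cm closer)
```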
[0026] The control module 260 issues corresponding control signals according to the obtained head tilt angle, head rotation angle, angle at which the user 50 looks up or down, and distance between the face and the camera 30. The control signals can be configured as desired. For example, when the head tilt angle is 10 degrees to the left, the control module 260 sends a first control signal to control the camera 10 to move 10 degrees counterclockwise along the track 40; when the head rotation angle is 10 degrees to the left, the control module 260 sends a second control signal to rotate the lens of the camera 10 ten degrees to the left; when the user 50 looks up by 10 degrees, the control module 260 sends a third control signal to tilt the lens of the camera 10 up by 10 degrees; and when the distance between the face and the reference object decreases by 10 centimeters (the face moves forward), the control module 260 sends a fourth control signal to double the focal length of the lens of the camera 10.

[0027] In this embodiment, the camera 10 includes a driving unit (not shown) that controls the movement of the camera 10 according to the control signals, such as movement along the track, rotation of the lens to the left or right, tilting of the lens up or down, and zooming of the lens. In other embodiments, the camera control system 20 further includes a network module 270, which transmits the control signals produced by the control module 260 to the driving unit.

[0028] The principle of the first calculation module 210 of the camera control system 20 is described below by way of an example. The other calculation modules operate on similar principles and are not described again here.

[0029] Referring to FIG. 6A, when the face of the user 50 directly faces the camera 30, the camera 30 captures an image of the user 50. After detection by the face detection module 200, the face region of the user is obtained, shown as image 510 in FIG. 6A. The image 510 may be regarded as the reference image, and the head tilt angle corresponding to the reference image is 0 degrees. At this time, the camera 10 is at its origin, point A of the track 40.

[0030] Referring also to FIG. 6B, when the head of the user 50 tilts 45 degrees to the left, the camera 30 captures an image of the user 50. After detection by the face detection module 200, the face region of the user 50 is obtained, shown as image 520 in FIG. 6B. The first calculation module 210 then compares the image 520 with the reference image and determines that the head of the user 50 is tilted 45 degrees to the left. According to the obtained tilt angle of 45 degrees to the left, the control module 260 sends a control signal to the driving unit of the camera 10, and the driving unit controls the camera 10 to move 45 degrees clockwise along the track 40, to point B of the track 40 shown in FIG. 6B.
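For illustration, the mapping described in paragraph [0026] might be expressed as follows. The command names, sign conventions, and the exponential generalisation of the "10 centimeters closer doubles the focal length" example are assumptions made for this sketch, not part of the patent.

```python
# Hypothetical sketch of the control module (260): translating the measured head
# pose into commands for the driving unit of the camera 10.
def build_commands(tilt_deg, rotation_deg, nod_deg, distance_offset_cm):
    """Signed inputs: positive tilt/rotation = right, positive nod = looking up,
    positive distance_offset_cm = the face moved away from the reference camera."""
    commands = []
    if tilt_deg:
        commands.append(("track_move_deg", tilt_deg))    # move along the track 40
    if rotation_deg:
        commands.append(("lens_pan_deg", rotation_deg))  # pan the lens left/right
    if nod_deg:
        commands.append(("lens_tilt_deg", nod_deg))      # tilt the lens up/down
    if distance_offset_cm:
        # 10 cm closer -> 2x focal length, 10 cm farther -> 0.5x (assumed generalisation).
        commands.append(("zoom_factor", 2.0 ** (-distance_offset_cm / 10.0)))
    return commands

# Head tilted 10 degrees left and 10 cm closer than the reference distance:
print(build_commands(tilt_deg=-10, rotation_deg=0, nod_deg=0, distance_offset_cm=-10))
```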

[0031] Referring also to FIG. 6C, when the head of the user 50 tilts 45 degrees to the right, the camera 30 captures an image of the user 50. After detection by the face detection module 200, the face region of the user 50 is obtained, shown as image 530 in FIG. 6C. The first calculation module 210 then compares the image 530 with the reference image and determines that the head of the user 50 is tilted 45 degrees to the right. According to the obtained tilt angle of 45 degrees to the right, the control module 260 sends a control signal to the driving unit of the camera 10, and the driving unit controls the camera 10 to move 45 degrees counterclockwise along the track 40, to point C of the track 40 shown in FIG. 6C.

[0032] Referring to FIG. 7, the preferred embodiment of the camera control method of the present invention includes the following steps.

[0033] Step S71: the face detection module 200 receives the image of the user 50 captured by the camera 30 and detects the image to obtain the face region in the image. The face detection module 200 may apply a face detection algorithm to the image to locate the face region.

[0034] Step S72: the first calculation module 210 performs an operation on the obtained face region to determine the current head tilt angle of the user 50. As above, the pose in which the face of the user 50 directly faces the camera 30 serves as the reference, with a head tilt angle of 0 degrees. The first calculation module 210 may obtain the tilt angle by calculating the angle between the detected face region and the face region observed when the user 50 directly faces the camera 30, or may use more elaborate computations to obtain the tilt angle more precisely.
[0035] Step S73: the second calculation module 220 performs an operation on the obtained face region to determine the current head rotation angle of the user 50, the rotation angle being 0 degrees when the face of the user 50 directly faces the camera 30. The second calculation module 220 may obtain the rotation angle by calculating the angle between the line connecting the eyes in the detected face region and the camera 30 and the corresponding line when the face of the user 50 directly faces the camera 30, or may use more elaborate computations for greater precision.

[0036] Step S74: the third calculation module 230 performs an operation on the obtained face region to determine whether the head of the user 50 is currently raised or lowered; when the face of the user 50 directly faces the camera 30, the user 50 is neither looking up nor looking down. The third calculation module 230 may determine this by calculating the position of the eyes within the face region, or may use more elaborate computations to obtain the angle at which the user 50 looks up or down. As shown in FIGS. 5A-5C, the user 50 is neither looking up nor looking down in FIG. 5A, is looking up in FIG. 5B, and is looking down in FIG. 5C.

[0037] Step S75: the fourth calculation module 250 performs an operation on the obtained face region to determine the current distance between the face of the user 50 and the camera 30. With the 50-centimeter reference distance recorded as 0, the fourth calculation module 250 may obtain the distance by calculating the ratio between the size of the detected face region and the size of the face region at the reference distance, or may use more elaborate computations to obtain the distance between the face of the user 50 and the reference object more precisely.
[0038] Steps S72, S73, S74, and S75 are performed simultaneously: once the face detection module 200 has obtained the face region in the image, the first to fourth calculation modules 210, 220, 230, and 250 operate on the face region to determine the current head tilt angle, head rotation angle, angle at which the user 50 looks up or down, and distance between the face and the reference object. The first to fourth calculation modules may also derive other data from the face region, even including the number of times the user 50 blinks, so as to determine the current action of the user 50; a particular feature of the face region may be defined to represent a particular action of the user 50. Step S76 is performed after the action of the user 50 has been determined.

[0039] Step S76: the control module 260 issues corresponding control signals according to the obtained head tilt angle, head rotation angle, angle at which the user 50 looks up or down, and distance between the face and the reference object. The control signals can be configured as desired. For example, when the head tilt angle is 45 degrees to the right, the control module 260 sends a control signal to control the camera 10 to move 45 degrees counterclockwise along the track 40; when the head rotation angle is 45 degrees to the right, the control module 260 sends a control signal to rotate the lens of the camera 10 forty-five degrees to the right; when the user 50 looks down by 45 degrees, the control module 260 sends a control signal to tilt the lens of the camera 10 down by 45 degrees; and when the distance between the face and the reference object increases by 10 centimeters (the face moves back), the control module 260 sends a control signal to halve the focal length of the lens of the camera 10.
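Steps S71 to S76 can be pictured as a per-frame loop such as the sketch below. The detector, feature extractor, command mapping, and transport to the driving unit are placeholders supplied by the caller; the patent leaves their concrete implementations open, noting only that a face-detection algorithm is used and that a network module 270 may carry the signals.

```python
# Hypothetical end-to-end sketch of the method of FIG. 7 (steps S71-S76).
def control_loop(frames, detect_face, extract_features, map_to_commands, send_to_driving_unit):
    for frame in frames:                                    # images from the camera 30
        face_region = detect_face(frame)                    # S71: face detection
        if face_region is None:
            continue                                        # no face in this frame
        # S72-S75: tilt, rotation, up/down angle, distance (conceptually in parallel)
        tilt, rotation, nod, distance = extract_features(face_region)
        for command in map_to_commands(tilt, rotation, nod, distance):   # S76
            send_to_driving_unit(command)                   # e.g. via the network module 270
```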

[0040] The camera adjusting apparatus, camera control system, and camera control method described above detect the image of the user 50 captured by the camera 30 to obtain the face region in the image, and operate on the face region to determine the head tilt angle, head rotation angle, angle at which the user 50 looks up or down, and distance between the face and the reference object, so that corresponding control signals are issued to make the camera 10 perform the corresponding movements. The camera 10 therefore does not have to be controlled through a dedicated controller. When the camera control system and camera control method are applied in fields such as medicine, they bring great convenience to medical personnel.

[0041] In summary, the present invention satisfies the requirements for an invention patent, and a patent application is filed accordingly. The above are merely preferred embodiments of the present invention; equivalent modifications or variations made by persons skilled in the art within the spirit of the present invention shall fall within the scope of the following claims.

[Brief Description of the Drawings]

[0042] FIG. 1 is a schematic view of a preferred embodiment of a camera adjusting apparatus of the present invention.

[0043] FIG. 2 is a schematic view of a preferred embodiment of the camera control system of FIG. 1.

[0044] FIGS. 3A-3C are schematic views of the user's head at different tilt angles.

[0045] FIGS. 4A-4C are schematic views of the user's head at different rotation angles.

[0046] FIGS. 5A-5C are schematic views of the user looking up or down.

[0047] FIG. 6A is a first schematic view of adjusting the camera through the camera adjusting apparatus of FIG. 1.
[0048] FIG. 6B is a second schematic view of adjusting the camera through the camera adjusting apparatus of FIG. 1.

[0049] FIG. 6C is a third schematic view of adjusting the camera through the camera adjusting apparatus of FIG. 1.

[0050] FIG. 7 is a schematic view of a preferred embodiment of a camera control method of the present invention.

[Description of Main Component Symbols]

[0051] Cameras: 10, 30

[0052] Camera control system: 20, 22

[0053] Face detection module: 200

[0054] Stereoscopic model establishing module: 210

[0055] First calculation module: 220

[0056] Second calculation module: 230

[0057] Third calculation module: 250

[0058] Control module: 260

[0059] Track: 40

[0060] User: 50

[0061] Images: 510, 520, 530

Claims (13)

VII. Claims:

1. A camera control system for controlling a first camera, comprising:
a face detection module, for receiving an image of a user captured by a second camera and detecting the image to obtain a face region in the image;
a first calculation module, for performing an operation on the obtained face region to determine a head tilt angle of the user; and
a control module, for sending a corresponding control signal according to the obtained head tilt angle, to control the first camera to move to a corresponding position on a track.

2. The camera control system of claim 1, further comprising a second calculation module for performing an operation on the obtained face region to determine a head rotation angle of the user; the control module further sends a corresponding control signal according to the obtained head rotation angle, to control a lens of the first camera to rotate leftward or rightward to a corresponding position.

3. The camera control system of claim 1, further comprising a third calculation module for performing an operation on the obtained face region to determine an angle at which the user looks up or down; the control module further sends a corresponding control signal according to the obtained angle, to control the lens of the first camera to tilt up or down to a corresponding position.

4. The camera control system of claim 1, further comprising a fourth calculation module for performing an operation on the obtained face region to determine a distance between the face of the user and a reference object; the control module further sends a corresponding control signal according to the obtained distance, to control the lens of the first camera to a corresponding magnification.
5. A camera control method for controlling a first camera, the camera control method comprising:
a detection step: receiving an image of a user captured by a second camera and detecting the image to obtain a face region in the image;
a first calculation step: performing an operation on the obtained face region to determine a head tilt angle of the user; and
a first control step: sending a corresponding control signal according to the obtained head tilt angle, to control the first camera to move to a corresponding position on a track.

6. The camera control method of claim 5, further comprising, after the detection step:
performing an operation on the obtained face region to determine a head rotation angle of the user; and
sending a corresponding control signal according to the obtained head rotation angle, to control a lens of the first camera to rotate leftward or rightward to a corresponding position.

7. The camera control method of claim 5, further comprising, after the detection step:
performing an operation on the obtained face region to determine an angle at which the user looks up or down; and
sending a corresponding control signal according to the obtained angle, to control the lens of the first camera to tilt up or down to a corresponding position.

8. The camera control method of claim 5, further comprising, after the detection step:
performing an operation on the obtained face region to determine a distance between the face of the user and a reference object; and
sending a corresponding control signal according to the obtained distance, to control the lens of the first camera to a corresponding magnification.

9. A camera adjusting apparatus for adjusting a first camera, the camera adjusting apparatus comprising:
a second camera, for capturing images of a user;
a track, on which the first camera is disposed and along which the first camera can move; and
a camera control system, for receiving the image of the user captured by the second camera, detecting the image to obtain a face region in the image, performing operations on the obtained face region to determine feature data of the face region, and sending corresponding control signals according to the feature data to control the first camera.
10. The camera adjusting apparatus of claim 9, wherein the camera control system comprises:
a face detection module, for receiving the image of the user captured by the second camera and detecting the image to obtain the face region in the image;
a first calculation module, for performing an operation on the obtained face region to determine a head tilt angle of the user; and
a control module, for sending a corresponding control signal according to the obtained head tilt angle, to control the first camera to move to a corresponding position on the track.

11. The camera adjusting apparatus of claim 10, wherein the camera control system further comprises a second calculation module for performing an operation on the obtained face region to determine a head rotation angle of the user; the control module further sends a corresponding control signal according to the obtained head rotation angle, to control a lens of the first camera to rotate leftward or rightward to a corresponding position.

12. The camera adjusting apparatus of claim 10, wherein the camera control system further comprises a third calculation module for performing an operation on the obtained face region to determine an angle at which the user looks up or down; the control module further sends a corresponding control signal according to the obtained angle, to adjust the lens of the first camera to tilt up or down to a corresponding position.

13. The camera adjusting apparatus of claim 10, wherein the camera control system further comprises a fourth calculation module for performing an operation on the obtained face region to determine a distance between the face of the user and a reference object; the control module further sends a corresponding control signal according to the obtained distance, to adjust the lens of the first camera to a corresponding magnification.
TW099102938A 2010-02-02 2010-02-02 Controlling system and method for camera, adjusting apparatus for camera including the same TW201129084A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TW099102938A TW201129084A (en) 2010-02-02 2010-02-02 Controlling system and method for camera, adjusting apparatus for camera including the same
US12/786,289 US20110187866A1 (en) 2010-02-02 2010-05-24 Camera adjusting system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW099102938A TW201129084A (en) 2010-02-02 2010-02-02 Controlling system and method for camera, adjusting apparatus for camera including the same

Publications (1)

Publication Number Publication Date
TW201129084A true TW201129084A (en) 2011-08-16

Family

ID=44341305

Family Applications (1)

Application Number Title Priority Date Filing Date
TW099102938A TW201129084A (en) 2010-02-02 2010-02-02 Controlling system and method for camera, adjusting apparatus for camera including the same

Country Status (2)

Country Link
US (1) US20110187866A1 (en)
TW (1) TW201129084A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI693533B (en) * 2018-08-09 2020-05-11 黃鈞鼎 Image processing method and head-mounted dispaly system
CN111314667A (en) * 2020-03-13 2020-06-19 合肥科塑信息科技有限公司 Security monitoring equipment and security monitoring method
US10878781B2 (en) 2018-08-09 2020-12-29 Chun-Ding HUANG Image processing method and head-mounted display system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9686452B2 (en) * 2011-02-16 2017-06-20 Robert Bosch Gmbh Surveillance camera with integral large-domain sensor
JP2012244196A (en) * 2011-05-13 2012-12-10 Sony Corp Image processing apparatus and method
US9667854B2 (en) * 2014-12-31 2017-05-30 Beijing Lenovo Software Ltd. Electornic device and information processing unit
CN105812835B (en) * 2014-12-31 2019-01-15 联想(北京)有限公司 A kind of information processing method and electronic equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7570301B2 (en) * 2004-06-03 2009-08-04 Electronic Security Products, Inc. Device, system and method of mounting audio/video capturing equipment
JP3926837B2 (en) * 2004-06-04 2007-06-06 松下電器産業株式会社 Display control method and apparatus, program, and portable device
WO2006030607A1 (en) * 2004-09-14 2006-03-23 Mitsubishi Denki Kabushiki Kaisha Mobile device
AU2006352758A1 (en) * 2006-04-10 2008-12-24 Avaworks Incorporated Talking Head Creation System and Method
US20090156955A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for comparing media content
US20100118112A1 (en) * 2008-11-13 2010-05-13 Polycom, Inc. Group table top videoconferencing device
US8320617B2 (en) * 2009-03-27 2012-11-27 Utc Fire & Security Americas Corporation, Inc. System, method and program product for camera-based discovery of social networks


Also Published As

Publication number Publication date
US20110187866A1 (en) 2011-08-04

Similar Documents

Publication Publication Date Title
TW201129084A (en) Controlling system and method for camera, adjusting apparatus for camera including the same
WO2018014730A1 (en) Method for adjusting parameters of camera, broadcast-directing camera, and broadcast-directing filming system
JP6683306B2 (en) Image acquisition system and image acquisition method
TWI311286B (en)
JP5005080B2 (en) Panorama image generation method
WO2017049816A1 (en) Method and device for controlling unmanned aerial vehicle to rotate along with face
WO2013015147A1 (en) Image processing system, information processing device, program, and image processing method
US20220122279A1 (en) Imaging method and imaging control apparatus
TWI517828B (en) Image tracking system and image tracking method thereof
US11184539B2 (en) Intelligent dual-lens photographing device and photographing method therefor
WO2017215351A1 (en) Method and apparatus for adjusting recognition range of photographing apparatus
WO2006054598A1 (en) Face feature collator, face feature collating method, and program
WO2017146202A1 (en) Three-dimensional shape data and texture information generation system, photographing control program, and three-dimensional shape data and texture information generation method
WO2017126222A1 (en) Display control device, display control method, and computer program
US11026762B2 (en) Medical observation device, processing method, and medical observation system
KR101096157B1 (en) watching apparatus using dual camera
JP2020052979A (en) Information processing device and program
JP2005142683A (en) Apparatus and method for camera control
TW201128444A (en) Controlling system and method for camera, adjusting apparatus for camera including the same
JPWO2018150569A1 (en) Gesture recognition device, gesture recognition method, projector including gesture recognition device, and video signal supply device
WO2022061541A1 (en) Control method, handheld gimbal, system, and computer-readable storage medium
WO2018116582A1 (en) Control device, control method, and medical observation system
JP2018085579A (en) Imaging apparatus, control method, and information processing program
JP2014192745A (en) Imaging apparatus, information processing apparatus, control method and program thereof
JP2020025170A (en) Information processing unit, detection method, control program, and recording medium