TW201737199A - Multiple camera computing system having camera-to-camera communications link - Google Patents
Multiple camera computing system having camera-to-camera communications link
- Publication number
- TW201737199A (application number TW105143998A)
- Authority
- TW
- Taiwan
- Prior art keywords
- camera
- camera system
- image
- processor
- image processing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5854—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using shape and object relationship
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
- H04N5/2226—Determination of depth image, e.g. for foreground/background separation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
Description
A problem with conventional computing systems that have one or more integrated cameras is that excessive amounts of image data are streamed to the processing cores of the computing system (e.g., one or more application processors of a handheld device) so that the cores can process the image data and make intelligent decisions based on its content. Unfortunately, much of the data streamed to the processor is irrelevant or of no interest. As a consequence, significant amounts of power and resources are consumed essentially transporting meaningless data through the system.
An apparatus is described. The apparatus includes a first camera system having a processor and a memory. The first camera system includes an interface for receiving images from a second camera system. The processor and memory execute image processing program code that processes first images captured by the first camera system and second images captured by the second camera system and received at the interface.

An apparatus is also described. The apparatus includes means for processing, at a first camera system, images received by the first camera system. The apparatus also includes means for processing, at the first camera system, images received by a second camera system and sent to the first camera system over a communication link that couples the first camera system and the second camera system. The apparatus also includes means for notifying, from the first camera system, an application processor of events pertaining to either or both of the first camera system and the second camera system.
FIG. 1 shows a first prior-art computing system having a dual camera configuration in which two different cameras 101, 102 have separate respective hardware channels 105, 106 to an application processor 103. In the operation of the system of FIG. 1, the two cameras 101, 102 essentially direct their own dedicated image streams and other forms of communication to the processor independently, through their respective channels 105, 106 of the system's hardware platform 104.

One problem with the approach of FIG. 1 is that twice the overhead and wiring of a single camera solution resides within the computer system. For example, if the first camera 101 wishes to communicate with the processor 103, one or more signals are sent along channel 105; whereas if the second camera 102 wishes to communicate with the processor 103, one or more signals are sent along channel 106.

The processor 103 therefore needs to be able to service two different communications at two different processor inputs 107, 108 (e.g., processor interrupt inputs). The consumption of two different processor inputs 107, 108 is inefficient insofar as the processor 103 has only a limited number of inputs and two such inputs 107, 108 are consumed by the dual camera system. It may therefore be difficult to feed direct channels from other components in the system (of which there may be many), which can be especially troublesome if a component that cannot be designed to reach the processor directly is relatively important.

Another problem with the approach of FIG. 1 is the complex wiring density and its associated power consumption. Consider a situation in which both cameras stream to the processor 103 along their respective channels 105, 106 simultaneously. Two data streams are then separately transported through the hardware platform 104 to the processor.

Apart from the inherent wiring complexity that naturally follows from having two separate dedicated hardware channels 105, 106 designed into the hardware platform 104, there is also the problem of inefficient power consumption, particularly when raw or lightly processed image data is directed to the processor 103 (i.e., when the processor performs fairly complex functions on the data streamed from the cameras 101, 102). In this case, two separate streams of large amounts of data may need to be transported within the platform 104 over a potentially large distance, which requires a significant amount of power.

Another problem with the approach of FIG. 1 is that the interfaces 109, 110 to the dual camera system are relatively inflexible. The two cameras must connect to the pair of physical interfaces 109, 110 provided for them. That is, a designer of the hardware platform 104 forgoes the opportunity to integrate cameras that do not support interfaces 109 and 110, and likewise a camera supplier forgoes the opportunity to integrate its cameras into the designer's platform 104.

An improved approach that is known in the art is observed in FIG. 2. In the approach of FIG. 2, a bridge function 212 is placed between the dual camera systems 201, 202 and the processor 203. The bridge function 212 essentially consolidates and/or multiplexes the traffic from the two cameras 201, 202 (e.g., dual image streams, etc.) onto a single channel 213 that feeds the processor 203.

The introduction of the bridge function 212 helps alleviate some of the inefficiencies discussed above with respect to FIG. 1. In particular, only one input 207 is consumed at the processor, which "frees up" an input 208 (as compared with the approach of FIG. 1) so that, for example, some other system component besides a camera can communicate directly with the processor 203.

Power consumption, however, remains a concern. Here, the bridge function 212 is limited to multiplexing and/or interleaving and does not perform any substantial data reduction process (such as data compression). As such, if large amounts of data are streamed to the processor 203, the hardware platform 204 will still consume significant amounts of power transporting that data over long distances within the platform 204.

Additionally, the bridge function 212 does not resolve any mismatch that may exist between the types of interfaces 209, 210 that the platform 204 provides for connecting to a camera and the types of interfaces that available cameras, which might otherwise be options for integration into the system, are designed to include.

Referring to FIG. 3, the power consumption problem can be mitigated, at least to some extent, by introducing processing intelligence into one of the cameras. Here, FIG. 3 shows another prior-art approach in which one of the cameras of a dual camera system (the "primary" camera 301) has a local processor 314 and local memory 315. The processor 314 executes program code out of the memory 315 and can perform specific data size reduction functions (such as data compression) to effectively reduce the amount of data that needs to be transported to the main processor 303.

With much less data being sent to the main processor 303 (e.g., ideally, only the information that the main processor 303 needs in order to execute image-related applications is sent from the primary camera 301 to the main processor 303), the hardware platform 304 consumes less power without any loss of the functionality that the main processor 303 is expected to provide.

Note, however, that the approach of FIG. 3 includes a processor solution 314 in only one of the cameras. Here, a dual camera system typically has a primary camera 301 and a secondary camera 302 (e.g., the secondary camera may be a "back side" camera facing away from the user of a handheld device while the primary camera is a "front side" camera facing the user, or vice versa). The secondary role of the secondary camera 302 typically does not justify the added cost of a processor 314 and memory 315 such as those residing in the primary camera 301. As such, the power consumption improvement of sending less data through the platform 304 to the main processor 303 is realized only for transfers from the primary camera 301 to the main processor 303, and not for transfers from the secondary camera 302 to the main processor 303.

Additionally, like the approaches of FIGS. 1 and 2, the hardware platform 304 of FIG. 3 provides a pair of fixed interfaces 309, 310 for the dual camera system. As such, the mismatch problem remains between the interfaces 309, 310 supported by the hardware platform 304 and the interfaces designed into cameras that might otherwise be viewed as candidates for integration into the platform 304. Additionally, the approach of FIG. 3 consumes two processor inputs 307, 308 which, as discussed with respect to FIG. 1, may preclude other important components within the computing system from communicating directly with the main processor 303.

FIG. 4 shows a novel approach that overcomes the aforementioned problems of each of the prior-art solutions discussed just above with respect to FIGS. 1 through 3. The approach of FIG. 4 includes a communication channel 416 between the secondary camera 402 and the primary camera 401. Additionally, a bridge function 417 is included in the primary camera 401 to, e.g., multiplex and/or combine the image streams from the two cameras 401, 402 over the single channel 405 that exists between the primary camera 401 and the main processor 403. As discussed in more detail further below, channel 405 may be a direct hardwired channel or a logical channel that physically passes through multiple components of the hardware platform 404.

In the approach of FIG. 4, image data from the second camera 402 is passed to the primary camera 401 over the communication channel 416 that exists between the two cameras 401, 402. The bridge function 417 embedded within the primary camera 401 (e.g., as an executable software program executed by the processor 414) enables the primary camera 401 to send the secondary camera's image data, as well as its own image data, to the main processor 403 along channel 405.

Thus, like the approach of FIG. 2, the improved approach of FIG. 4 consumes only one input 407 at the main processor 403, which "frees up" a processor input 408 so that it can be used for direct communication with some other component in the system.

Additionally, like the approach of FIG. 3, power savings are realized because data size reduction routines (such as data compression) can be executed by the primary camera 401, which reduces the total amount of data that needs to be transported through the platform 404 to the main processor 403. However, whereas the approach of FIG. 3 can only reduce the power consumption associated with the primary camera 301 (i.e., can only reduce the size of the primary camera's image data), the approach of FIG. 4 can reduce the power consumption associated with transporting information from both cameras 401, 402 to the main processor 403.

Here, the data reduction processes (e.g., data compression) that the primary camera 401 performs on its own image data can also be performed on the image data it receives from the secondary camera 402 over channel 416. As such, reduced-size data streams from both cameras 401, 402 can be sent to the main processor 403.
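For illustration only, and not as a description of the claimed implementation, the following minimal Python sketch shows the kind of behavior such a bridge function could have. The frame sources, the zlib-based compression, and the (camera_id, payload) channel format are all assumptions introduced for this example.

```python
import zlib
from typing import Iterable, Iterator, Tuple

def bridge(primary_frames: Iterable[bytes],
           secondary_frames: Iterable[bytes]) -> Iterator[Tuple[int, bytes]]:
    """Hypothetical bridge function running on the primary camera.

    Frames received from the secondary camera over the camera-to-camera
    channel are compressed alongside the primary camera's own frames and
    interleaved onto the single channel toward the main processor, so the
    host platform carries a reduced-size stream for both cameras."""
    for own, received in zip(primary_frames, secondary_frames):
        yield 1, zlib.compress(own)       # primary camera's own frame
        yield 2, zlib.compress(received)  # frame from the inter-camera link

# Example with synthetic 8x8 grayscale frames.
if __name__ == "__main__":
    cam1 = [bytes(64) for _ in range(2)]         # flat frames
    cam2 = [bytes(range(64)) for _ in range(2)]  # gradient frames
    for cam_id, payload in bridge(cam1, cam2):
        print(f"camera {cam_id}: {len(payload)} bytes after compression")
```

In a real device this role would be played by camera firmware or dedicated logic; the sketch only makes the data flow concrete: both streams are reduced in size before anything crosses the host platform.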
Further still, the secondary camera 402 is at least indifferent to the specific type of camera interface 409 that has been implemented on the host hardware platform 404. Only the primary camera 401 needs an interface that is compatible with an interface 409 of the platform 404. To implement the solution, the secondary camera's interface 419 need only be compatible with the primary camera's second interface 418. As such, the existence of channel 416 between the primary and secondary cameras 401, 402 provides the system designer with a potentially freer choice of cameras that can be integrated with its platform 404.

As just one example, the channel 416 residing between the cameras 401, 402 may be a proprietary channel of a camera manufacturer that manufactures both the primary and secondary cameras 401, 402. Even though the secondary camera 402 may not have an interface that is compatible with the host platform 404, the secondary camera 402 is nevertheless able to stream its data to the main processor 403 by way of the primary camera's camera-to-camera channel 416 and bridge function 417.

Additionally, the approach of FIG. 4 is inherently more efficient for applications in which the images from the two cameras 401, 402 are combined or otherwise processed together to realize a single cohesive set of information. One example is an implementation in which the two cameras 401, 402 act as a stereo pair and their respective images are combined to determine a three-dimensional depth profile ("depth map") of an object upon which both cameras 401, 402 are focused. The depth profile may be used by the main processor 403 to perform image depth functions (such as hand/finger motion detection, facial recognition, and the like).

Here, software executing on the primary camera 401 can process both its own image stream data and the image stream data from the secondary camera 402 to compute the depth map. The depth map may then be sent from the primary camera 401 to the main processor 403. Previously known solutions, by contrast, required both image streams to be sent to the main processor 403, which then performed the computations to determine the depth map.

In the improved approach just described, in which the depth map is computed within the primary camera 401, substantial power savings are realized because only a depth map is transported across the platform 404 to the main processor 403, while the (potentially voluminous) image stream data remains localized to the dual camera system 401, 402. Here, a depth map is understood to be a much smaller amount of data than the image streams from which it is computed.
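Purely as an illustration of the kind of computation that can stay local to the camera pair, the sketch below uses a textbook block-matching formulation; the patent does not specify a particular stereo algorithm, and the block size and search range here are arbitrary assumptions.

```python
def disparity_map(left, right, max_disp=16, half=1):
    """Naive block-matching stereo on two rectified grayscale images,
    given as lists of rows of pixel intensities. For each pixel in the
    left image, find the horizontal shift into the right image that
    minimizes the sum of absolute differences (SAD). Depth then follows
    as depth = focal_length * baseline / disparity."""
    h, w = len(left), len(left[0])
    disp = [[0] * w for _ in range(h)]
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            best_cost, best_d = float("inf"), 0
            for d in range(max_disp):
                cost = sum(
                    abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
                    for dy in range(-half, half + 1)
                    for dx in range(-half, half + 1))
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y][x] = best_d
    return disp
```

The returned disparity map is one low-bit-depth array per frame pair, and that map, rather than the two full image streams, is what would cross the platform to the main processor.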
Another example is autofocus. Here, the depth profile information computed from the image streams of the two cameras 401, 402 by software executing on the primary camera 401 can be used to control an autofocus function of one or both cameras 401, 402. For example, software executing on the primary camera 401 may process the image streams from the two cameras 401, 402 so as to provide control signals to voice coils, actuators, or other electromechanical devices within one or more of the cameras 401, 402 that adjust the focus position of the lens system(s) of the camera(s) 401, 402.

As a point of comparison, a traditional system streams the image data to the main processor, and the main processor determines the autofocus adjustments. In the improved approach that can be performed by the improved system of FIG. 4, the main processor 403 receives only focused image data (i.e., the main processor 403 does not have to perform the various autofocus tasks). The reduced amount of data sent to the main processor 403 again corresponds to a power consumption improvement.
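Continuing the illustrative sketch above, and again assuming details that the patent leaves open (a pinhole stereo model, a median statistic, and these parameter names), on-camera software might reduce the disparity map over a region of interest to a single focus-distance target for the lens actuator:

```python
def focus_distance(disp, roi, focal_px, baseline_m):
    """Reduce the disparity map over a region of interest to one target
    focus distance (in meters) that could drive a voice coil or other
    lens actuator. roi = (top, left, bottom, right); zero disparities
    (no match found) are skipped."""
    top, left, bottom, right = roi
    values = [d for row in disp[top:bottom] for d in row[left:right] if d > 0]
    if not values:
        return None  # nothing measurable in the ROI; leave focus unchanged
    median_d = sorted(values)[len(values) // 2]
    return focal_px * baseline_m / median_d  # pinhole stereo: Z = f * B / d
```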
Other functions may also be performed by software executing on the primary camera 401 in order to reduce the amount of information sent from the dual camera system 401, 402 to the main processor 403. Notably, in traditional systems, much of the information streamed to the main processor 403 is of little value.

For example, in the case of an image recognition function, large amounts of data in which the sought-for image never appears are expensively streamed to the main processor 403, only to be discarded once the main processor 403 realizes that the sought-for image is not present. A better approach is to perform the image recognition within the primary camera 401 and to notify the main processor 403 only when the desired or sought-for image has been recognized as being currently within the field of view of the camera(s).

After the sought-for item (or item of interest) has been recognized by the primary camera 401, image data may then be streamed to the main processor 403 so that the processor can perform whatever functions are to be performed once the desired image has been identified (e.g., tracking the object, recording features around the object, etc.). Thus, ideally, only relevant information or information of interest (or information having a high probability of containing relevant information or information of interest) is actually delivered across the platform 404 to the main processor 403. Ideally, other information that does not contain the sought-for item is discarded by the primary camera 401.

Here, note that the sought-for item of interest may be found in either the primary camera's image stream or the secondary camera's image stream, because the primary camera can process both streams. Depending on the implementation, the criteria that trigger notice to the main processor 403 that an item of interest has been found may be configured to recognize items in both streams or in only one of the streams.
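A sketch of this gating pattern follows; the detect and notify callables are placeholders assumed for this example, standing in for real detection code and for the interrupt path discussed further below:

```python
from typing import Callable, Iterable, Optional, Tuple

def watch_streams(primary: Iterable[bytes],
                  secondary: Iterable[bytes],
                  detect: Callable[[bytes], bool],
                  notify: Callable[[str], None]) -> Optional[Tuple[str, bytes]]:
    """Run a look-for function locally over both image streams. Frames in
    which nothing is found are simply dropped and never cross the host
    platform; on the first hit the main processor is notified and the
    matching frame is returned so streaming to the host can begin."""
    for own, received in zip(primary, secondary):
        for source, frame in (("primary", own), ("secondary", received)):
            if detect(frame):
                notify(f"item of interest found in {source} stream")
                return source, frame
    return None  # nothing of interest; the host was never interrupted
```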
The associated look-for-feature processes performed by the primary camera on the image stream(s) of either or both cameras 401, 402 may include, for example, face detection (detecting the appearance of any face), face recognition (detecting the appearance of a specific face), facial expression recognition (detecting a specific facial expression), object detection or recognition (detecting a generic or specific object), motion detection or recognition (detecting a generic or specific motion), event detection or recognition (detecting a generic or specific event), and image quality detection or recognition (detecting a generic or specific level of image quality).

After the primary camera has detected a sought-for item in an image stream, it may subsequently perform any of a number of related "follow-up" tasks to further limit the amount of information that is ultimately directed to the main processor 403. Some examples of the additional actions that may be performed by the primary camera include any one or more of the following (a minimal sketch of tasks 1) through 4) follows this list): 1) identifying a region of interest within an image (e.g., an intermediate region surrounding one or more sought-for features within the image); 2) cropping a region of interest out of an image and forwarding it to other (e.g., higher-performance) processing components within the system; 3) discarding regions of an image that are not of interest; 4) compressing an image, or a portion of an image, before forwarding it to other components within the system; 5) capturing a particular kind of image (e.g., a snapshot, a series of snapshots, a video stream); and 6) changing one or more camera settings (e.g., changing a setting on a servo motor coupled to the optics so as to zoom in, zoom out, or otherwise adjust the camera's focus/optics; changing an exposure setting; triggering a flash).
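As flagged in the list above, the sketch below condenses follow-up tasks 1) through 4) into a few lines, assuming images are simple lists of pixel rows and that a detector has already supplied a bounding box; it is an illustration, not the patent's implementation:

```python
import zlib

def crop_and_compress(image, box):
    """Follow-up tasks 1) through 4) in miniature: isolate the region of
    interest around a detected feature, discard the rest of the image,
    and compress what remains before forwarding it toward the main
    processor. image: list of rows of pixel values (0-255);
    box: (top, left, bottom, right) as supplied by the detector."""
    top, left, bottom, right = box
    roi = [row[left:right] for row in image[top:bottom]]
    payload = b"".join(bytes(row) for row in roi)  # pack rows into bytes
    return zlib.compress(payload)
```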
Note that although FIG. 4 shows a direct channel 405 between the primary camera 401 and the main processor 403, the complete end-to-end path between the primary camera 401 and the main processor 403 may be a direct hardware channel that terminates at the main processor 403, and/or may pass through a number of system functional blocks before reaching the processor. In one embodiment, a direct hardware path exists from the primary camera 401 to an interrupt input of the main processor 403 and is used to notify the main processor 403 of sudden events detected at the primary camera. Separately, the actual data may be delivered to the system memory of the platform 404 (not shown in the figure), where it is subsequently read by the main processor 403.

In one embodiment, the interface into which the primary camera actually plugs may be provided by a peripheral control hub (not shown in the figure). Data from the primary camera may then be directed from the peripheral control hub to the processor, or stored in memory.

The software/firmware executed by the primary camera 401 may be stored in non-volatile memory that resides within the camera 401 or elsewhere on the platform 404. In the latter case, the software/firmware is loaded from the platform into the primary camera 401 during system boot-up. Likewise, the camera processor 414 and/or memory 415 may be integrated as components of the primary camera 401, or may be physically located outside the camera 401 itself but placed, e.g., close enough to the camera 401 to effectively operate as a processing system that is local to the camera 401. As such, the present application pertains more generally to camera systems rather than strictly to cameras.

Note that either of the cameras 401, 402 may be a visible light camera, a depth information camera (such as a time-of-flight camera that radiates infrared light and effectively measures the time the radiated light takes to return to the camera after reflection), or a camera that integrates both visible light detection and depth information capture in the same camera solution.

Although the discussion above has focused on program code (software/firmware) executed by a camera system, some or all of the functions described above may instead be performed entirely in hardware (e.g., as an application-specific integrated circuit, or as a programmable logic device programmed to perform these functions), or by a combination of hardware and program code.

The interface between the primary camera 401 and the hardware platform 404 may be an industry standard interface such as a MIPI interface. The interface and/or channel between the two cameras may be an industry standard interface (such as a MIPI interface) or may be a proprietary interface.
FIG. 5 shows a methodology, described above, that can be performed by a system having multiple cameras with a communication link between them. According to FIG. 5, the method includes processing, at a first camera system, images received by the first camera system 501. The method also includes processing, at the first camera system, images received by a second camera system and sent to the first camera system over a communication link that couples the first camera system and the second camera system 502. The method also includes notifying, from the first camera system, an application processor of events pertaining to either or both of the first camera system and the second camera system 503.
FIG. 6 provides an exemplary depiction of a computing system. Many of the components of the computing system described below are applicable to a computing system having an integrated camera and an associated image processor (e.g., a handheld device such as a smartphone or tablet computer). Those of ordinary skill will be able to easily delineate between the two.

As observed in FIG. 6, the basic computing system may include a central processing unit 601 (which may include, e.g., a plurality of general-purpose processor cores 615_1 through 615_N and a main memory controller 617 disposed on a multi-core processor or application processor), system memory 602, a display 603 (e.g., a touchscreen or flat panel), a local wired point-to-point link (e.g., USB) interface 604, various network I/O functions 605 (such as an Ethernet interface and/or a cellular modem subsystem), a wireless local area network (e.g., WiFi) interface 606, a wireless point-to-point link (e.g., Bluetooth) interface 607, a Global Positioning System interface 608, various sensors 609_1 through 609_N, one or more cameras 610, a battery 611, a power management control unit 612, a speaker and microphone 613, and an audio coder/decoder 614.

An application processor or multi-core processor 650 may include, within its CPU 601, one or more general-purpose processor cores 615, one or more graphics processing units 616, a memory management function 617 (e.g., a memory controller), and an I/O control function (such as the aforementioned peripheral control hub) 618. The general-purpose processor cores 615 typically execute the operating system and the application software of the computing system. The graphics processing units 616 typically execute graphics-intensive functions to, e.g., generate graphics information that is presented on the display 603. The memory control function 617 interfaces with the system memory 602 to write data to, and read data from, the system memory 602. The power management control unit 612 generally controls the power consumption of the system 600.
Each of the touchscreen display 603, the communication interfaces 604 through 607, the GPS interface 608, the sensors 609, the camera(s) 610, and the speaker/microphone codec 613, 614 can be viewed as various forms of I/O (input and/or output) relative to the overall computing system, which, where appropriate, also includes an integrated peripheral device (e.g., the one or more cameras 610). Depending on the implementation, each of these I/O components may be integrated on the application processor/multi-core processor 650, or may be located off the die or outside the package of the application processor/multi-core processor 650.

In an embodiment, at least two of the cameras 610 have a communication channel between them, and one of these cameras has a processor and memory for implementing some or all of the features discussed above with respect to FIG. 4.

Embodiments of the invention may include various processes as set forth above. The processes may be embodied in machine-executable instructions. The instructions can be used to cause a general-purpose or special-purpose processor to perform certain processes. Alternatively, these processes may be performed by specific hardware components that contain hardwired logic for performing the processes, or by any combination of programmed computer components and custom hardware components.

Elements of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions.
The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs and magneto-optical disks, FLASH memory, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media, or other types of media/machine-readable media suitable for storing electronic instructions. For example, the present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).

In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
101‧‧‧Camera / first camera
102‧‧‧Camera / second camera
103‧‧‧Application processor
104‧‧‧Hardware platform
105‧‧‧Hardware / channel
106‧‧‧Hardware / channel
107‧‧‧Processor input
108‧‧‧Processor input
109‧‧‧Interface
110‧‧‧Interface
201‧‧‧Dual camera system
202‧‧‧Dual camera system
203‧‧‧Processor
204‧‧‧Hardware platform
207‧‧‧Input
208‧‧‧Input
209‧‧‧Interface
210‧‧‧Interface
212‧‧‧Bridge function
213‧‧‧Single channel
301‧‧‧"Primary" camera
302‧‧‧Secondary camera
303‧‧‧Main processor
304‧‧‧Hardware platform
307‧‧‧Processor input
308‧‧‧Processor input
309‧‧‧Fixed interface
310‧‧‧Fixed interface
314‧‧‧Local processor / processor solution
315‧‧‧Local memory
401‧‧‧Primary camera / dual camera system
402‧‧‧Secondary camera / second camera / dual camera system
403‧‧‧Main processor
404‧‧‧Hardware platform / host hardware platform / host platform
405‧‧‧Channel
407‧‧‧Input
408‧‧‧Processor input
409‧‧‧Camera interface
414‧‧‧Camera processor
415‧‧‧Memory
416‧‧‧Communication channel / camera-to-camera channel
417‧‧‧Bridge function
418‧‧‧Second interface of primary camera
419‧‧‧Secondary camera interface
601‧‧‧Central processing unit (CPU)
602‧‧‧System memory
603‧‧‧Display / touchscreen display
604‧‧‧Local wired point-to-point link interface / communication interface
610‧‧‧Camera
611‧‧‧Battery
612‧‧‧Power management control unit
615_1 to 615_N‧‧‧General-purpose processor cores
616‧‧‧Graphics processing unit (GPU)
617‧‧‧Main memory controller
618‧‧‧I/O control function
650‧‧‧Multi-core processor
The following description and accompanying drawings are used to illustrate embodiments of the invention. In the drawings: FIG. 1 shows a first prior-art dual camera configuration; FIG. 2 shows a second prior-art dual camera configuration; FIG. 3 shows a third prior-art dual camera configuration; FIG. 4 shows an improved dual camera configuration; FIG. 5 shows a method performed by a camera of the camera configuration of FIG. 4; and FIG. 6 shows a computing system.
401‧‧‧Primary camera / dual camera system
402‧‧‧Secondary camera / second camera / dual camera system
403‧‧‧Main processor
404‧‧‧Hardware platform / host hardware platform / host platform
405‧‧‧Channel
407‧‧‧Input
408‧‧‧Processor input
409‧‧‧Camera interface
414‧‧‧Camera processor
415‧‧‧Memory
416‧‧‧Communication channel / camera-to-camera channel
417‧‧‧Bridge function
418‧‧‧Second interface of primary camera
419‧‧‧Secondary camera interface
Claims (20)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/017,653 US20170230637A1 (en) | 2016-02-07 | 2016-02-07 | Multiple camera computing system having camera-to-camera communications link |
US15/017,653 | 2016-02-07 |
Publications (2)
Publication Number | Publication Date |
---|---|
TW201737199A true TW201737199A (en) | 2017-10-16 |
TWI623910B TWI623910B (en) | 2018-05-11 |
Family
ID=57799783
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW105143998A TWI623910B (en) | 2016-02-07 | 2016-12-29 | Multiple camera computing system having camera-to-camera communications link |
TW107110629A TW201822144A (en) | 2016-02-07 | 2016-12-29 | Multiple camera computing system having camera-to-camera communications link |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW107110629A TW201822144A (en) | 2016-02-07 | 2016-12-29 | Multiple camera computing system having camera-to-camera communications link |
Country Status (7)
Country | Link |
---|---|
US (1) | US20170230637A1 (en) |
EP (1) | EP3360061A1 (en) |
CN (1) | CN107046619A (en) |
DE (2) | DE202016107172U1 (en) |
GB (1) | GB2547320A (en) |
TW (2) | TWI623910B (en) |
WO (1) | WO2017136037A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10009933B2 (en) | 2016-09-02 | 2018-06-26 | Brent Foster Morgan | Systems and methods for a supplemental display screen |
US9720639B1 (en) | 2016-09-02 | 2017-08-01 | Brent Foster Morgan | Systems and methods for a supplemental display screen |
US10346122B1 (en) | 2018-10-18 | 2019-07-09 | Brent Foster Morgan | Systems and methods for a supplemental display screen |
CN110809152A (en) * | 2019-11-06 | 2020-02-18 | Oppo广东移动通信有限公司 | Information processing method, encoding device, decoding device, system, and storage medium |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6864911B1 (en) * | 2000-10-26 | 2005-03-08 | Hewlett-Packard Development Company, L.P. | Linkable digital cameras for an image capture system |
US7649938B2 (en) * | 2004-10-21 | 2010-01-19 | Cisco Technology, Inc. | Method and apparatus of controlling a plurality of video surveillance cameras |
US7969469B2 (en) * | 2007-11-30 | 2011-06-28 | Omnivision Technologies, Inc. | Multiple image sensor system with shared processing |
US8427552B2 (en) * | 2008-03-03 | 2013-04-23 | Videoiq, Inc. | Extending the operational lifetime of a hard-disk drive used in video data storage applications |
US8781152B2 (en) * | 2010-08-05 | 2014-07-15 | Brian Momeyer | Identifying visual media content captured by camera-enabled mobile device |
US20120250984A1 (en) * | 2010-12-01 | 2012-10-04 | The Trustees Of The University Of Pennsylvania | Image segmentation for distributed target tracking and scene analysis |
JP5784664B2 (en) * | 2013-03-21 | 2015-09-24 | 株式会社東芝 | Multi-eye imaging device |
US10863098B2 (en) * | 2013-06-20 | 2020-12-08 | Microsoft Technology Licensing, LLC | Multimodal image sensing for region of interest capture |
CN103607538A (en) * | 2013-11-07 | 2014-02-26 | 北京智谷睿拓技术服务有限公司 | Photographing method and photographing apparatus |
US20150248772A1 (en) * | 2014-02-28 | 2015-09-03 | Semiconductor Components Industries, Llc | Imaging systems and methods for monitoring user surroundings |
-
2016
- 2016-02-07 US US15/017,653 patent/US20170230637A1/en not_active Abandoned
- 2016-12-09 EP EP16826504.9A patent/EP3360061A1/en not_active Withdrawn
- 2016-12-09 WO PCT/US2016/065868 patent/WO2017136037A1/en active Search and Examination
- 2016-12-20 GB GB1621697.0A patent/GB2547320A/en not_active Withdrawn
- 2016-12-20 DE DE202016107172.0U patent/DE202016107172U1/en not_active Expired - Lifetime
- 2016-12-20 DE DE102016225600.9A patent/DE102016225600A1/en not_active Withdrawn
- 2016-12-29 TW TW105143998A patent/TWI623910B/en not_active IP Right Cessation
- 2016-12-29 TW TW107110629A patent/TW201822144A/en unknown
- 2016-12-29 CN CN201611249312.1A patent/CN107046619A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
GB2547320A (en) | 2017-08-16 |
EP3360061A1 (en) | 2018-08-15 |
DE102016225600A1 (en) | 2017-08-10 |
CN107046619A (en) | 2017-08-15 |
TW201822144A (en) | 2018-06-16 |
WO2017136037A1 (en) | 2017-08-10 |
GB201621697D0 (en) | 2017-02-01 |
DE202016107172U1 (en) | 2017-05-10 |
TWI623910B (en) | 2018-05-11 |
US20170230637A1 (en) | 2017-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12010423B2 (en) | Electronic device for recording image as per multiple frame rates using camera and method for operating same | |
TWI623910B (en) | Multiple camera computing system having camera-to-camera communications link | |
US10812768B2 (en) | Electronic device for recording image by using multiple cameras and operating method thereof | |
KR102524498B1 (en) | The Electronic Device including the Dual Camera and Method for controlling the Dual Camera | |
EP3641294B1 (en) | Electronic device and method for obtaining images | |
CN113647094B (en) | Electronic device, method and computer readable medium for providing out-of-focus imaging effects in video | |
US11025890B2 (en) | Electronic device and method for acquiring depth information by using at least one of cameras or depth sensor | |
CN109756763B (en) | Electronic device for processing image based on priority and operating method thereof | |
US11563887B2 (en) | Method for controlling synchronization of plurality of image sensors and electronic device for implementing same | |
KR102442921B1 (en) | Electronic device capable of increasing the task management efficiency of the digital signal processor | |
US20220268935A1 (en) | Electronic device comprising camera and method thereof | |
CN115883948A (en) | Image processing architecture, image processing method, device and storage medium | |
CN116841947A (en) | Multi-core communication method, device, electronic equipment and storage medium | |
CN115545371A (en) | Data distribution processing system, method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MM4A | Annulment or lapse of patent due to non-payment of fees |