CN102215374B - Switching cameras during a video conference of a multi-camera mobile device - Google Patents

Switching cameras during a video conference of a multi-camera mobile device Download PDF

Info

Publication number
CN102215374B
CN102215374B (application CN201010602687.8A)
Authority
CN
China
Prior art keywords
camera
equipment
image
video conference
mobile device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201010602687.8A
Other languages
Chinese (zh)
Other versions
CN102215374A (en)
Inventor
J. S. Abuan
D. A. Eldred
Hyeonkuk Jeong
R. Garcia Jr.
Hsi-Jung Wu
Xiaosong Zhou
E. C. Cranfill
J. O. Normile
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/794,775 external-priority patent/US8451994B2/en
Application filed by Apple Computer Inc
Publication of CN102215374A publication Critical patent/CN102215374A/en
Application granted granted Critical
Publication of CN102215374B publication Critical patent/CN102215374B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to switching cameras during a video conference of a multi-camera mobile device. Some embodiments provide a method of conducting a video conference between a first mobile device and a second device. The first mobile device includes first and second cameras. The method selects the first camera for capturing images and transmits the images captured by the first camera to the second device. During the video conference, the method receives a selection of the second camera for capturing images. In response, the method terminates the transmission of images captured by the first camera and transmits to the second device the images captured by the second camera of the first mobile device.

Description

Switching cameras during a video conference of a multi-camera mobile device
On March 25, 2010, when a prototype of Apple's iPhone 4 was stolen from an Apple engineer, the invention to be disclosed and claimed in this application was prematurely disclosed to the public, in advance and without Apple's authorization. Prior to this apparent theft, the U.S. priority application on which this application is based had not yet been filed.
Background
Many portable devices today, such as smartphones, provide video capture functionality. Through a camera on the phone, a user of the portable device can capture both still images and video. However, after the video capture is complete, in order to send the captured video to another party, the user typically must either send the video directly to the other party or upload it to another location (for example, an Internet video hosting site). Unfortunately, this does not allow the other party to view the live video stream while the portable device is capturing it.
Moreover, standard portable devices are equipped with only one camera, and processing the information from that single camera is difficult enough. An ideal device would have multiple cameras and could send out real-time video that is a composite of the video from at least two cameras. Given the limited resources available to portable devices, this is an especially difficult problem, both for the device that must process the multiple captured video streams and for the network it is connected to, which must handle the transmission of the live video streams.
Summary of the Invention
Some embodiments of the invention provide a mobile device with two cameras that can take pictures and video. The mobile device of some embodiments has a display screen for displaying the captured picture images and video images. It also includes storage for saving the captured images for later transmission to another device. The device further has a network interface that allows it to send the captured images to one or more devices during a real-time communication session between the users of multiple devices. The device also includes an encoder that it can use to encode the captured images for local storage or for transmission to another device. The mobile device further includes a decoder that allows it to decode images captured by another device during a real-time communication session, or to decode locally stored images.
One example of a real-time communication session that involves the transmission of captured video images is a video conference. In some embodiments, the mobile device can transmit the video images captured by only one camera at any given time during a video conference. In other embodiments, however, the mobile device can transmit video images captured by both of its cameras simultaneously during a video conference or other real-time communication session.
During a video conference with another device, the mobile device of some embodiments can transmit other types of content along with the video captured by one or both of its cameras. One example of such other content is a low- or high-resolution picture image captured by one of the device's cameras while its other camera is capturing the video used in the conference. Other examples of such other content include (1) files and other content stored on the device, (2) the device's screen display (that is, the content displayed on the device's screen), (3) content received from another device during the video conference or other real-time communication session, and so on.
The mobile devices of some embodiments employ novel in-conference adjustment techniques for making adjustments during a video conference. For example, while transmitting only one camera's captured video during a conference, the mobile device of some embodiments can dynamically switch to transmitting the video captured by its other camera. In this situation, the mobile device of some embodiments notifies any other device participating in the video conference of this switch so that the other device can provide, on its end, a smooth transition between the videos captured by the two cameras.
In some embodiments, the request to switch cameras can originate not only on the "local" device that switches between its cameras during the video conference, but also on the other, "remote" device that is receiving the video captured by the local device. Allowing one device to direct another device to switch cameras is one example of the remote-control capabilities of the devices of some embodiments. Other examples of operations that can be directed at a device remotely in some embodiments include exposure adjustment operations (for example, auto-exposure), focus adjustment operations (for example, auto-focus), and so on. Another example of a novel in-conference adjustment, which can be specified locally or remotely, is the identification of a region of interest (ROI) in a captured video; the identified ROI can be used to modify the behavior of the capturing camera, to modify the image processing operations of the device with the capturing camera, or to modify the encoding operations of the device with the capturing camera.
Yet another example of a novel in-conference adjustment of some embodiments involves real-time modification of a composite video display generated by a device. Specifically, in some embodiments, the mobile device generates a composite display that simultaneously shows multiple videos captured by multiple cameras of one or more devices. In some cases, the composite display places the videos in adjacent display areas (for example, in adjacent windows). In other cases, the composite display is a picture-in-picture (PIP) display that includes at least two display areas showing two different videos, where one display area is a background main display area and the other is a foreground inset display area that overlaps the background main display area.
The real-time modification of the composite video display in some embodiments involves moving one or more display areas within the composite display in response to a user's selection and movement of a display area. Some embodiments also rotate the composite display during a video conference when the screen of the device providing the composite display is rotated. In addition, the mobile device of some embodiments allows the user of the device to swap the videos in a PIP display (that is, to make the video in the foreground inset display appear in the background main display while the video in the background main display appears in the foreground inset display).
The preceding Summary is intended to serve as a brief introduction to some embodiments of the invention. It is not meant to be an introduction or overview of all inventive subject matter disclosed in this document. The Detailed Description that follows, and the Drawings referred to in the Detailed Description, further describe the embodiments described in this Summary as well as other embodiments. Accordingly, to understand all the embodiments described by this document, a full review of the Summary, the Detailed Description, and the Drawings is needed.
Brief Description of the Drawings
The novel features of the invention are set forth in the appended claims. For purposes of explanation, however, several embodiments of the invention are set forth in the following figures.
Figure 1 illustrates a composite display of some embodiments.
Figure 2 illustrates another composite display of some embodiments.
Figure 3 conceptually illustrates the software architecture of a video processing and encoding module of a dual-camera mobile device of some embodiments.
Figure 4 conceptually illustrates a captured image processing unit of some embodiments.
Figure 5 conceptually illustrates examples of different frame rates based on different vertical blanking intervals (VBIs).
Figure 6 conceptually illustrates the software architecture of a video conferencing and processing module of a dual-camera mobile device of some embodiments.
Figure 7 conceptually illustrates an example video conference request messaging sequence of some embodiments.
Figure 8 illustrates a user interface for a video conference setup operation of some embodiments.
Figure 9 illustrates a user interface for accepting an invitation to a video conference of some embodiments.
Figure 10 illustrates another user interface for accepting an invitation to a video conference of some embodiments.
Figure 11 illustrates another user interface for a video conference setup operation of some embodiments.
Figure 12 conceptually illustrates another software architecture of a video conferencing and processing module of a dual-camera mobile device of some embodiments.
Figure 13 conceptually illustrates another software architecture of a dual-camera mobile device of some embodiments.
Figure 14 conceptually illustrates a process performed by a video conference manager of some embodiments, such as the one illustrated in Figure 12.
Figure 15 conceptually illustrates a process performed by an image processing manager of some embodiments, such as the one illustrated in Figure 6.
Figure 16 illustrates a user interface for an exposure adjustment operation of some embodiments.
Figure 17 illustrates a user interface for a focus adjustment operation of some embodiments.
Figure 18 conceptually illustrates the software architecture of a networking manager of some embodiments, such as the one illustrated in Figure 12.
Figure 19 illustrates a user interface for a PIP display rotation operation of some embodiments.
Figure 20 illustrates another user interface for a PIP display rotation operation of some embodiments.
Figure 21 illustrates another user interface for a PIP display rotation operation of some embodiments.
Figure 22 illustrates another user interface for a PIP display rotation operation of some embodiments.
Figure 23 illustrates a user interface of some embodiments for identifying a region of interest in a display.
Figure 24 illustrates another user interface of some embodiments for identifying a region of interest in a display.
Figure 25 illustrates another user interface of some embodiments for identifying a region of interest in a display.
Figure 26 illustrates a process of some embodiments for performing a local switch camera operation on a dual-camera mobile device.
Figure 27 illustrates a user interface for a switch camera operation of some embodiments.
Figure 28 illustrates another user interface for a switch camera operation of some embodiments.
Figure 29 illustrates another user interface for a switch camera operation of some embodiments.
Figure 30 illustrates another user interface for a switch camera operation of some embodiments.
Figure 31 illustrates a process of some embodiments for performing a remote switch camera operation on a dual-camera mobile device.
Figure 32 illustrates a user interface for a remote-control switch camera operation of some embodiments.
Figure 33 illustrates another user interface for a remote-control switch camera operation of some embodiments.
Figure 34 illustrates another user interface for a remote-control switch camera operation of some embodiments.
Figure 35 illustrates another user interface for a remote-control switch camera operation of some embodiments.
Figure 36 conceptually illustrates a process of some embodiments for performing an exposure adjustment operation.
Figure 37 illustrates a user interface for performing an exposure adjustment operation of some embodiments.
Figure 38 illustrates another user interface for performing an exposure adjustment operation of some embodiments.
Figure 39 illustrates another user interface for performing an exposure adjustment operation of some embodiments.
Figure 40 conceptually illustrates an exposure adjustment process performed by an image processing manager of some embodiments, such as the one illustrated in Figure 12.
Figure 41 conceptually illustrates an exposure adjustment operation of some embodiments.
Figure 42 conceptually illustrates a process of some embodiments for performing a focus adjustment operation.
Figure 43 illustrates a user interface for a focus adjustment operation of some embodiments.
Figure 44 illustrates another user interface for a focus adjustment operation of some embodiments.
Figure 45 illustrates another user interface for a focus adjustment operation of some embodiments.
Figure 46 conceptually illustrates an application programming interface (API) architecture of some embodiments.
Figure 47 illustrates the architecture of a dual-camera mobile computing device of some embodiments.
Figure 48 conceptually illustrates a touch input/output (I/O) device of some embodiments.
Figure 49 conceptually illustrates an example communication system of some embodiments.
Figure 50 conceptually illustrates another example communication system of some embodiments.
Detailed Description
In the following description, numerous details are set forth for purposes of explanation. However, one of ordinary skill in the art will realize that the invention may be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail.
Some embodiments of the invention provide a mobile device with two cameras that can take pictures and video. Examples of mobile devices include mobile phones, smartphones, personal digital assistants (PDAs), laptops, tablet personal computers, or any other type of mobile computing device. As used in this document, "pictures" refers to picture images taken by the camera one at a time in a single-picture mode, or several at a time in a fast-action mode. "Video", on the other hand, refers to a sequence of video images that are captured by a camera at a particular rate, which is often referred to as a frame rate. Typical frame rates for capturing video are 25 frames per second (fps), 30 fps, and 60 fps. The cameras of the mobile device of some embodiments can capture video images (that is, video frames) at these and other frame rates.
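As a simple illustration of the frame rates mentioned above, the per-frame time budget follows directly from the rate. A minimal sketch in Python; the helper name is ours, not the patent's:

```python
# Hypothetical helper relating a frame rate to the capture pipeline's
# per-frame time budget. Illustrative only; not part of the patent.

def frame_interval_ms(fps: float) -> float:
    """Return the time budget per captured video frame, in milliseconds."""
    if fps <= 0:
        raise ValueError("frame rate must be positive")
    return 1000.0 / fps

# A pipeline running at 30 fps must produce a frame roughly every
# 33.3 ms; at 60 fps the budget halves to about 16.7 ms.
budgets = {fps: round(frame_interval_ms(fps), 1) for fps in (25, 30, 60)}
```

At the representative rates named above, this yields budgets of 40 ms, 33.3 ms, and 16.7 ms respectively.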
The mobile device of some embodiments (1) can display the captured picture images and video images, (2) can store the captured images for later transmission to another device, (3) can transmit the captured images to one or more devices during a real-time communication session between multiple users of the devices, and (4) can encode the captured images for local storage or for transmission to another device.
One example of a real-time communication session that involves the transmission of captured video images is a video conference. In some embodiments, the mobile device can transmit the video images captured by only one camera at any given time during a video conference. In other embodiments, however, the mobile device can transmit video images captured by both of its cameras simultaneously during a video conference or other real-time communication session.
The mobile device of some embodiments generates composite displays that simultaneously show multiple videos captured by multiple cameras of one or more devices. In some cases, the composite displays place the videos in adjacent display areas (for example, in adjacent windows). Figure 1 illustrates one such example of a composite display 100 that includes two adjacent display areas 105 and 110 that simultaneously show two videos captured by two cameras of one device, or captured by two cameras of two different devices that are participating in a video conference.
In other cases, the composite display is a PIP display that includes at least two display areas showing two different videos, where one display area is a background main display area and the other is a foreground inset display area that overlaps the background main display area. Figure 2 illustrates one such example of a composite PIP display 200. This composite PIP display 200 includes a background main display area 205 and a foreground inset display area 210 that overlaps the background main display area. The two display areas 205 and 210 simultaneously show two videos captured by two cameras of one device, or captured by two cameras of two different devices that are participating in a video conference. While the example composite PIP displays illustrated and discussed in this document are similar to the composite PIP display 200, which shows the entire foreground inset display area 210 within the background main display area 205, other composite PIP displays are possible in which the foreground inset display area 210 overlaps, but is not entirely within, the background main display area 205.
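The PIP geometry described above, a background main display area with an overlapping foreground inset, can be sketched as a small layout computation. This is only an illustration under assumed conventions (bottom-right placement, fixed scale and margin); the patent does not prescribe any particular placement:

```python
# Minimal sketch of PIP layout: place a scaled foreground inset over a
# background main display area. Names and the corner/margin convention
# are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def pip_inset(main: Rect, scale: float = 0.3, margin: int = 10) -> Rect:
    """Place a scaled inset in the bottom-right corner of the main area."""
    w = int(main.w * scale)
    h = int(main.h * scale)
    return Rect(main.x + main.w - w - margin,
                main.y + main.h - h - margin, w, h)

main = Rect(0, 0, 640, 480)
inset = pip_inset(main)   # Rect(x=438, y=326, w=192, h=144)
```

An inset that overlaps but is not entirely within the main area, as the last sentence above allows, would simply use coordinates outside the main rectangle.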
In addition to transmitting video content during a video conference with another device, the mobile device of some embodiments can transmit other types of content along with the conference's video content. One example of such other content is a low- or high-resolution picture image captured by one of the device's cameras while the device's other camera is capturing the video used in the video conference. Other examples of such other content include (1) files and other content stored on the device, (2) the device's screen display (that is, the content displayed on the device's screen), (3) content received from another device during the video conference or other real-time communication session, and so on.
The mobile devices of some embodiments employ novel in-conference adjustment techniques for making adjustments during a video conference. For example, while transmitting only one camera's captured video during a conference, the mobile device of some embodiments can dynamically switch to transmitting the video captured by its other camera. In this situation, the mobile device of some embodiments notifies any other device participating in the video conference of this switch so that the other device can provide, on its end, a smooth transition between the videos captured by the two cameras.
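The switch-and-notify behavior described above can be sketched as follows. The class and message names are hypothetical; the patent does not define a wire format:

```python
# Sketch of an in-conference camera switch: the local device flips its
# active camera and notifies the remote participant so it can provide
# a smooth transition. All names here are illustrative assumptions.

class LocalDevice:
    def __init__(self):
        self.active_camera = "front"
        self.sent_messages = []   # stand-in for the network channel

    def switch_camera(self):
        # Stop transmitting from the current camera and select the other.
        self.active_camera = "back" if self.active_camera == "front" else "front"
        # Notify every other conference participant of the switch.
        self.sent_messages.append(("camera_switched", self.active_camera))

dev = LocalDevice()
dev.switch_camera()
```

On receiving the notification, the remote device could, for example, cross-fade between the two incoming video streams rather than cutting abruptly.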
In some embodiments, the request to switch cameras can originate not only on the "local" device that switches between its cameras during the video conference, but also on the other, "remote" device that is receiving the video captured by the local device. Allowing one device to direct another device to switch cameras is one example of the remote-control capabilities of the devices of some embodiments. Other examples of operations that can be directed at a device remotely in some embodiments include exposure adjustment operations (for example, auto-exposure), focus adjustment operations (for example, auto-focus), and so on. Another example of a novel in-conference adjustment, which can be specified locally or remotely, is the identification of a region of interest (ROI) in a captured video; the identified ROI can be used to modify the behavior of the capturing camera, to modify the image processing operations of the device with the capturing camera, or to modify the encoding operations of the device with the capturing camera.
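The remote-control operations listed above (switching cameras, exposure adjustment, focus adjustment, ROI identification) amount to a small command set handled on the receiving device. A hedged sketch; the command names and handler behavior are illustrative assumptions, not the patent's protocol:

```python
# Sketch of dispatching remote-control commands on the device that
# receives them. Command names, state fields, and handlers are all
# hypothetical stand-ins for illustration.

def handle_remote_command(device: dict, command: str, payload=None) -> dict:
    handlers = {
        "switch_camera": lambda d, p: d.update(
            camera="back" if d["camera"] == "front" else "front"),
        "auto_expose":   lambda d, p: d.update(exposure="auto"),
        "auto_focus":    lambda d, p: d.update(focus="auto"),
        "set_roi":       lambda d, p: d.update(roi=p),  # p: (x, y, w, h)
    }
    if command not in handlers:
        raise ValueError(f"unknown remote command: {command}")
    handlers[command](device, payload)
    return device

state = {"camera": "front", "exposure": None, "focus": None, "roi": None}
handle_remote_command(state, "set_roi", (100, 80, 60, 60))
handle_remote_command(state, "switch_camera")
```

The ROI payload here is a plain rectangle; as the paragraph above notes, the device could equally use the identified region to steer its image processing or its encoder rather than the camera itself.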
Yet another example of a novel in-conference adjustment of some embodiments involves real-time modification of a composite video display generated by a device. Specifically, in some embodiments, the real-time modification of the composite video display involves moving one or more display areas within the composite display in response to a user's selection and movement of a display area. Some embodiments also rotate the composite display during a video conference when the screen of the device providing the composite display is rotated. In addition, the mobile device of some embodiments allows the user of the device to flip the order of the videos in a PIP display (that is, to make the video in the foreground inset display appear in the background main display while the video in the background main display appears in the foreground inset display).
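The flip described above, exchanging which video occupies the background main display and which occupies the foreground inset, reduces to a simple exchange of sources. A minimal sketch with assumed field names:

```python
# Sketch of the PIP flip: exchange the background and inset video
# sources. The dictionary keys are illustrative assumptions.

def swap_pip(display: dict) -> dict:
    """Exchange the background and inset video sources of a PIP display."""
    display["background"], display["inset"] = (
        display["inset"], display["background"])
    return display

pip = {"background": "remote_video", "inset": "local_video"}
swap_pip(pip)
```

Note that only the source assignment changes; the geometry of the two display areas stays as it was.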
Several more detailed embodiments are described below. Section I provides a description of the video processing architecture of some embodiments. Section II then describes the captured image processing unit of some embodiments. In some embodiments, the captured image processing unit is the component of the device that is responsible for processing raw images captured by the cameras of the device.
Next, Section III describes the video conferencing architecture of some embodiments. Section III also describes the video conference module of some embodiments, as well as several manners for setting up a single-camera video conference. Section IV then describes the in-conference adjustment and control operations of some embodiments. Section V next describes the hardware architecture of the dual-camera device of some embodiments. Lastly, U.S. Patent Application **, filed concurrently with this application, entitled "Establishing a Video Conference During a Phone Call" (attorney docket No. APLE.P0212), describes several additional embodiments relating to some of the features described above, such as some of the in-conference adjustments, etc. This U.S. Patent Application, entitled "Establishing a Video Conference During a Phone Call", is incorporated herein by reference.
I. Video Capture and Processing
Figure 3 conceptually illustrates a video processing and encoding module 300 of a dual-camera mobile device of some embodiments. In some embodiments, the module 300 processes images and encodes videos that are captured by the cameras of the dual-camera mobile device. As shown in Figure 3, the module 300 includes a captured image processing unit (CIPU) driver 305, a media exchange module 310, an encoder driver 320, and a video processing module 325.
In some embodiments, the media exchange module 310 allows programs on the device that are consumers and producers of media content to exchange media content and instructions regarding the processing of the media content. In the video processing and encoding module 300, the media exchange module 310 of some embodiments routes these instructions and media content between the video processing module 325 and the CIPU driver 305, and between the video processing module 325 and the encoder driver 320. To facilitate the routing of such instructions and media content, the media exchange module 310 of some embodiments provides a set of application programming interfaces (APIs) for the consumers and producers of media content to use. In some of such embodiments, the media exchange module 310 is a set of one or more frameworks that is part of an operating system running on the dual-camera mobile device. One example of such a media exchange module 310 is the Core Media framework provided by Apple.
The video processing module 325 performs image processing on the images and/or the videos captured by the cameras of the device. Examples of such operations include exposure adjustment operations, focus adjustment operations, perspective correction, dynamic range adjustment, image resizing, image compositing, and so on. In some embodiments, some image processing operations can also be performed by the media exchange module 310. For instance, as shown in Figure 3, the media exchange module 310 of some embodiments performs a temporal noise reduction (TNR) operation (for example, by TNR 315) that reduces noise in the video images captured by the cameras of the device. Further examples of such image processing operations of the video processing module 325 and the media exchange module 310 are provided below.
Through the media exchange module 310, the video processing module 325 interfaces with the CIPU driver 305 and the encoder driver 320, as mentioned above. The CIPU driver 305 serves as a communication interface between a captured image processing unit (CIPU) 330 and the media exchange module 310. As further described below, the CIPU 330 is the component of the dual-camera device that is responsible for processing images captured during image capture or video capture operations of the device's cameras. From the video processing module 325 through the media exchange module 310, the CIPU driver 305 receives requests for images and/or videos from one or both of the device's cameras. The CIPU driver 305 relays such requests to the CIPU 330, and in response receives the requested images and/or videos from the CIPU 330, which the CIPU driver 305 then sends to the video processing module 325 through the media exchange module 310. Through the CIPU driver 305 and the media exchange module 310, the video processing module 325 of some embodiments also sends instructions to the CIPU 330 in order to modify some of its operations (for example, to modify a camera's frame rate, exposure adjustment operation, focus adjustment operation, etc.).
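The request/response path just described, from the video processing module through the media exchange layer to the CIPU driver and CIPU and back, can be sketched as a chain of objects. All class names here are illustrative stand-ins, and the TNR stage (the TNR 315 of Figure 3) is modeled as a simple string tag:

```python
# Sketch of the layered request path: media exchange -> CIPU driver
# -> CIPU, with the frame returned back up the same path. Classes and
# return values are hypothetical; the real components are hardware
# and OS frameworks.

class FakeCIPU:
    def capture(self, camera_id: int) -> str:
        return f"frame-from-camera-{camera_id}"

class CIPUDriver:
    def __init__(self, cipu: FakeCIPU):
        self.cipu = cipu

    def request_frames(self, camera_id: int) -> str:
        # Relay the request to the CIPU and return its response.
        return self.cipu.capture(camera_id)

class MediaExchange:
    def __init__(self, driver: CIPUDriver):
        self.driver = driver

    def route_request(self, camera_id: int) -> str:
        frame = self.driver.request_frames(camera_id)
        # Temporal noise reduction would run here before hand-off;
        # modeled as a tag on the frame.
        return f"tnr({frame})"

exchange = MediaExchange(CIPUDriver(FakeCIPU()))
frame = exchange.route_request(0)   # "tnr(frame-from-camera-0)"
```

The same chain runs in the opposite direction for instructions, such as a frame-rate change sent down to the CIPU.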
The encoder driver 320 serves as a communication interface between the media exchange module 310 and encoder hardware 335 (for example, an encoder chip, an encoding component on a system on a chip, etc.). In some embodiments, the encoder driver 320 receives images and requests to encode the images from the video processing module 325 through the media exchange module 310. The encoder driver 320 sends the images to be encoded to the encoder 335, which then performs picture encoding or video encoding on the images. When the encoder driver 320 receives the encoded images from the encoder 335, the encoder driver 320 sends the encoded images back to the video processing module 325 through the media exchange module 310.
In some embodiments, the video processing module 325 can perform different operations on the encoded images that it receives from the encoder. Examples of such operations include storing the encoded images in a storage of the device, transmitting the encoded images in a video conference through a network interface of the device, and so on.
In some embodiments, some or all of the modules of the video processing and encoding module 300 are implemented as part of an operating system. For example, some embodiments implement all four components 305, 310, 320, and 325 of the video processing and encoding module 300 as part of the operating system of the device. Other embodiments implement the media exchange module 310, the CIPU driver 305, and the encoder driver 320 as part of the operating system of the device, while having the video processing module 325 as an application that runs on the operating system. Still other implementations of the module 300 are possible.
The operation of the video processing and encoding module 300 during a video capture session will now be described. To start a video capture session, the video processing module 325 initializes several components that are needed for the video capture session. In some embodiments, these components include (1) the CIPU 330, (2) a scaling and compositing module (not shown) of the video processing module 325, (3) an image processing module (not shown) of the video processing module 325, and (4) the encoder 335. In addition, the video processing module 325 of some embodiments initializes a network manager (not shown) when it is participating in a video conference.
Through the media exchange module 310 and the CIPU driver 305, the video processing module sends its initialization request to the CIPU 330 in order to have one or both of the cameras of the device start video capturing. In some embodiments, this request specifies a particular frame rate, exposure level, and scaling size for each camera that is to capture video. In response to this request, the CIPU 330 starts to return video images from the requested cameras at the specified frame rate, exposure level, and scaling size. These video images are returned to the video processing module 325 through the CIPU driver 305 and the media exchange module 310, which, as mentioned above, performs TNR operations on the video images before supplying them to the video processing module 325. At the video processing module 325, the video images are stored in a buffer (not shown) for additional image processing.
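The initialization request just described carries a frame rate, an exposure setting, and a scaling size per camera. A minimal sketch of how such a request might be assembled; all field names are assumptions, not the patent's data structures:

```python
# Sketch of a per-camera capture configuration request of the kind
# described above. Field names and the validation policy are
# illustrative assumptions.

from dataclasses import dataclass

@dataclass
class CaptureConfig:
    camera_id: int
    fps: int
    exposure: str          # e.g. "auto" or a fixed setting
    scale: tuple           # (width, height) after scaling

def build_capture_request(configs) -> dict:
    """Validate and package per-camera capture settings for the CIPU."""
    for c in configs:
        if c.fps <= 0 or c.scale[0] <= 0 or c.scale[1] <= 0:
            raise ValueError("invalid capture configuration")
    return {c.camera_id: c for c in configs}

request = build_capture_request([
    CaptureConfig(0, 30, "auto", (640, 480)),    # e.g. front camera
    CaptureConfig(1, 24, "auto", (1280, 720)),   # e.g. back camera
])
```

Keying the request by camera identifier mirrors the fact that the two cameras may be configured independently.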
The image processing module of the video processing module 325 retrieves the video images stored in the buffer for additional video processing. The scaling and compositing module then retrieves the processed video images in order to scale them, if necessary for real-time display on the display screen of the device. In some embodiments, this module creates composite images from the images captured by two cameras of the device, or from images captured by a camera of the device along with a camera of another device during a video conference, in order to provide a real-time display of the captured video images on the device, or to create a composite video image for encoding.
Process and/or the video image after synthesizing are provided to encoder 335 by encoder-driven device 320 and exchange of media module 310.Encoder 335 is subsequently to encoding video pictures.Coded image is returned to video processing module 325 (again by encoder-driven device 320 and exchange of media module 310) subsequently, to be kept on equipment, or transmits during video conference.When equipment participates in video conference, network manager (by video processing module 325 initialization) fetches these coded images subsequently, to coded image subpackage, and by the network interface (not shown) of equipment, coded image is sent to one or more miscellaneous equipment.
II. Captured Image Processing
The images captured by the cameras of the dual camera mobile device of some embodiments are raw, unprocessed images. These images need to be converted to a particular color space before they can be used for other operations, such as transmitting the images to another device (e.g., during a video conference), storing the images, or displaying the images. In addition, the images captured by the cameras may need to be processed to correct errors and/or distortions and to adjust the images' color, size, etc. Accordingly, some embodiments perform several processing operations on the images before storing, transmitting, and displaying such images. Some of this processing is performed by the CIPU 330.
One example of such a CIPU is illustrated in Figure 4. Specifically, Figure 4 conceptually illustrates a captured image processing unit (CIPU) 400 of some embodiments. This CIPU 400 includes a single processing pipeline 485 that either processes images from only one of the device's cameras at a time, or processes images from both of the device's cameras simultaneously in a time-division multiplexed manner (i.e., in a time-interleaved manner). The processing pipeline 485 of the CIPU 400 can be configured differently to address different characteristics and/or operational settings of the different cameras. Examples of different camera characteristics in some embodiments include different resolutions, noise sensors, lens types (fixed or zoom lens), etc. Also, examples of different operational settings under which the device can operate the cameras in some embodiments include image resolution size, frame rate, zoom level, exposure level, etc.
As shown in Figure 4, the CIPU 400 includes a sensor module 415, a line/frame buffer 417, a bad pixel correction (BPC) module 420, a lens shading (LS) module 425, a demosaicing module 430, a white balance (WB) module 435, a gamma module 440, a color space conversion (CSC) module 445, a hue, saturation, and contrast (HSC) module 450, a scaler module 455, a filter module 460, a statistics engine 465, two sets of registers 470, and a controller module 475. In some embodiments, all of the modules of the CIPU 400 are implemented in hardware (e.g., an ASIC, FPGA, SOC with a microcontroller, etc.), while in other embodiments, some or all of the modules of the CIPU 400 are implemented in software.
As shown in Figure 4, the sensor module 415 communicatively couples to two pixel arrays 410a and 410b and two sensors 405a and 405b of two cameras of the device. In some embodiments, this communicative coupling is facilitated through each camera sensor's mobile industry processor interface (MIPI).
Through this communicative coupling, the sensor module 415 can forward instructions to the cameras to control various aspects of each camera's operations, such as its power level, zoom level, focus, exposure level, etc. In some embodiments, each camera has four operational power modes. In the first operational power mode, the camera is powered off. In the second operational power mode, the camera is powered on, but it is not yet configured. In the third operational power mode, the camera is powered on, the camera's sensor is configured, and the camera sensor's pixels are collecting photons and converting the collected photons into digital values. However, the camera sensor is not yet sending images to the sensor module 415. Finally, in the fourth operational power mode, the camera is in the same operational power mode as the third power mode except the camera is now sending images to the sensor module 415.
During the operation of the device, the cameras may switch from one operational power mode to another any number of times. When switching operational power modes, some embodiments require the cameras to switch modes in the order described above. Therefore, in these embodiments, a camera in the first operational power mode can only switch to the second operational power mode. When the camera is in the second operational power mode, it can switch to the first operational power mode or to the third operational power mode. Similarly, the camera can switch from the third operational power mode to the second operational power mode or the fourth operational power mode. When the camera is in the fourth operational power mode, it can only switch back to the third operational power mode.
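The mode-switching order described above can be sketched as a small transition table. This is a minimal illustrative sketch, not the patent's implementation; the mode numbering (1 through 4) follows the text, but encoding the rules as a Python dictionary is an assumption made here for clarity.

```python
# Assumed encoding of the four operational power modes and their legal
# switches: 1 = powered off, 2 = powered on (unconfigured),
# 3 = configured/collecting photons, 4 = sending images.
ALLOWED_TRANSITIONS = {
    1: {2},      # powered off    -> unconfigured only
    2: {1, 3},   # unconfigured   -> off, or configured
    3: {2, 4},   # configured     -> unconfigured, or sending images
    4: {3},      # sending images -> back to configured only
}

def switch_power_mode(current, target):
    """Return the target mode if the switch is permitted, else raise."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"cannot switch from mode {current} to mode {target}")
    return target
```

Note that under this table a camera in mode 1 must pass through modes 2 and 3 before it can reach mode 4, which matches the later observation that a powered-off camera is slower to begin capturing than a camera held in standby.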
Moreover, switching from one operational power mode to the next or previous operational power mode takes a particular amount of time. Thus, switching between two or three operational power modes is slower than switching between one. The different operational power modes also consume different amounts of power. For instance, the fourth operational power mode consumes the most power, the third operational power mode consumes more power than the first and second, and the second operational power mode consumes more power than the first. In some embodiments, the first operational power mode does not consume any power.
When a camera is not in the fourth operational power mode capturing images, the camera can be left in one of the other operational power modes. The determination of which mode to leave the camera in depends on how much power the camera is allowed to consume and how fast the camera may need to respond to a request to start capturing images. For example, a camera configured to operate in the third operational power mode (e.g., a standby mode) consumes more power than a camera configured to be in the first operational power mode (i.e., powered off). However, when the camera is instructed to capture images, the camera operating in the third operational power mode can switch to the fourth operational power mode faster than a camera operating in the first operational power mode. As such, the cameras can be configured to operate in different operational power modes when not capturing images based on different requirements (e.g., response time to a request to capture images, power consumption).
As described further below, when the video processing module 325 requests one or both cameras to start capturing images and the sensor module 415 receives this request through the controller module 475, the sensor module 415 can instruct one or both sets of camera sensors to start capturing images through its communicative coupling with each camera. A Bayer filter is superimposed over each camera sensor, and thus each camera sensor outputs Bayer pattern images, which are stored in the pixel array associated with each camera sensor. A Bayer pattern image is an image where each pixel only stores one color value: red, blue, or green.
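The Bayer pattern can be illustrated with a short sketch: each sensor position records exactly one color sample. The specific RGGB tiling used here is an assumption; the text only states that each pixel stores one of red, green, or blue.

```python
# Illustrative sketch of a Bayer mosaic under an assumed RGGB tiling:
# even rows alternate red/green, odd rows alternate green/blue, so each
# pixel position samples exactly one color channel.
def bayer_channel(row, col):
    """Color channel sampled at (row, col) in an assumed RGGB mosaic."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'
```

In such a mosaic, half of all samples are green, which is why a later demosaicing stage is needed to reconstruct full RGB values at every pixel.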
Through its coupling with the pixel arrays 410a and 410b, the sensor module 415 retrieves the raw Bayer pattern images stored in the camera pixel arrays 410a and 410b. By controlling the rate at which the sensor module 415 retrieves images from a camera's pixel array, the sensor module 415 can control the frame rate of the video images that are being captured by a particular camera. By controlling the rate of its image retrieval, the sensor module 415 can also interleave the fetching of images captured by the different cameras in order to interleave the CIPU processing pipeline 485's image processing of the captured images from the different cameras. The sensor module 415's control of its image retrieval is further described below and in the above-incorporated U.S. Patent Application **, entitled "Establishing a Video Conference During a Phone Call" (attorney docket No. APLE.P0212).
The sensor module 415 stores the image lines (i.e., rows of pixels of an image) that it retrieves from the pixel arrays 410a and 410b in the line/frame buffer 417. Each image line in the line/frame buffer 417 is processed through the CIPU processing pipeline 485. As shown in Figure 4, the CIPU processing pipeline 485 is formed by the BPC module 420, the LS module 425, the demosaicing module 430, the WB module 435, the gamma module 440, the CSC module 445, the HSC module 450, the scaler module 455, and the filter module 460. In some embodiments, the CIPU processing pipeline 485 processes images from the line/frame buffer 417 on a line-by-line (i.e., row-by-row) basis, while in other embodiments the CIPU processing pipeline 485 processes entire images from the line/frame buffer 417 on a frame-by-frame basis.
In the exemplary pipeline illustrated in Figure 4, the BPC module 420 is the module that retrieves the images from the line/frame buffer 417. This module performs a bad-pixel-removal operation that attempts to correct bad pixels in the retrieved images that might have resulted from one or more defective camera sensors (e.g., defective photon sensors that do not sense light, that sense light incorrectly, etc.). In some embodiments, the BPC module 420 detects bad pixels by comparing a particular pixel in an image with one or more neighboring pixels in the image. If the difference between the value of the particular pixel and the values of the neighboring pixels is greater than a threshold amount, the particular pixel's value is replaced by the average of the values of several neighboring pixels that are of the same color (i.e., red, green, or blue) as the particular pixel.
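The neighbor-comparison test can be sketched as follows. This is a simplified one-dimensional sketch over a single line of samples; real bad pixel correction works on the two-dimensional Bayer mosaic and compares same-color neighbors (which lie two samples apart), so the adjacent-neighbor comparison and integer averaging here are simplifying assumptions.

```python
# Simplified 1-D sketch of bad pixel correction: a sample that differs
# from both of its neighbors by more than `threshold` is treated as bad
# and replaced by the average of those neighbors.
def correct_bad_pixels(line, threshold):
    out = list(line)
    for i in range(1, len(line) - 1):
        left, right = line[i - 1], line[i + 1]
        if abs(line[i] - left) > threshold and abs(line[i] - right) > threshold:
            out[i] = (left + right) // 2   # replace with neighbor average
    return out
```

For example, a stuck-bright sample of 200 amid values near 10 would be replaced by the neighbor average, while normal pixel-to-pixel variation below the threshold is left untouched.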
The operation of the BPC module 420 is in part controlled by the values stored for this module in the two sets of registers 470 of the CIPU 400. Specifically, to process the images captured by the two different cameras of the device, some embodiments configure the CIPU processing pipeline 485 differently for each camera, as mentioned above. The CIPU processing pipeline 485 is configured for the two different cameras by storing two different sets of values in the two different sets of registers 470a (Ra) and 470b (Rb) of the CIPU 400. Each set of registers 470 includes one register (Ra or Rb) for each of the modules 420-460 within the CIPU processing pipeline 485. Each register in each register set stores a set of values that defines one processing pipeline module's operation. Accordingly, as shown in Figure 4, the register set 470a is for indicating the mode of operation of each processing pipeline module for one camera (camera A) of the dual camera mobile device, while the register set 470b is for indicating the mode of operation of each module for the other camera (camera B) of the dual camera mobile device.
One example of configuring the CIPU processing pipeline 485 differently for each camera is to configure the modules of the CIPU processing pipeline 485 to process different sized images. For instance, if the camera sensor 405a is 640x480 pixels and the camera sensor 405b is 2048x1536 pixels, the set of registers 470a is configured to store values that instruct the modules of the CIPU processing pipeline 485 to process 640x480 pixel images, and the set of registers 470b is configured to store values that instruct the modules of the CIPU processing pipeline 485 to process 2048x1536 pixel images.
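The two register sets in the example above can be sketched as two per-camera configuration records, where reconfiguring the pipeline for a camera amounts to selecting which set its modules read. The dictionary layout and field names here are illustrative assumptions; a hardware register set would hold one register per pipeline module.

```python
# Assumed sketch of the Ra/Rb register sets for the resolution example:
# each set holds the configuration the pipeline modules read when that
# set is selected.
REGISTER_SETS = {
    'Ra': {'width': 640,  'height': 480},    # camera sensor 405a
    'Rb': {'width': 2048, 'height': 1536},   # camera sensor 405b
}

def active_config(selected_set):
    """Configuration the pipeline modules would read for the given set."""
    return REGISTER_SETS[selected_set]
```

Switching cameras then requires no per-module reprogramming, only changing which set is active, which is consistent with the controller's register-switching behavior described later.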
In some embodiments, different processing pipeline configurations (i.e., register values) are stored in different profile settings. In some of such embodiments, a user of the mobile device is allowed to select one of the profile settings (e.g., through a user interface displayed on the mobile device) to set the operation of a camera or cameras. For example, the user may select a profile setting for configuring a camera to capture high resolution video, a profile setting for configuring the same camera to capture low resolution video, or a profile setting for configuring both cameras to capture high resolution still images. Many different configurations are possible, which can be stored in many different profile settings. In other of such embodiments, instead of allowing the user to select a profile setting, a profile setting is automatically selected based on the application or activity that the user selects. For instance, if the user selects a video conferencing application, a profile that configures both cameras to capture video is automatically selected; if the user selects a photo application, a profile that configures one of the cameras to capture still images is automatically selected; etc.
After the BPC module 420, the LS module 425 receives the bad-pixel-corrected images. The LS module 425 performs a lens shading correction operation to correct for image defects that are caused by camera lenses that produce light falloff effects (i.e., the light is reduced towards the edges of the camera sensor). Such effects cause images to be unevenly illuminated (e.g., darker at the corners and/or edges). To correct for these image defects, the LS module 425 of some embodiments estimates a mathematical model of a lens' illumination falloff. The estimated model is then used to compensate the lens falloff of the image to evenly illuminate the unevenly illuminated portions of the image. For example, if a corner of the image is half the brightness of the center of the image, the LS module 425 of some embodiments multiplies the corner pixels' values by two in order to produce an even image.
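The corner-brightness example above can be sketched as a per-pixel gain: each sample is divided by its modeled relative illumination (1.0 at the image center, 0.5 for a corner at half brightness). The falloff map itself is treated as a given input here; estimating that model from the lens is the LS module's job, and the clamping to 8-bit values is an assumption for illustration.

```python
# Sketch of lens shading compensation: dividing by a relative-illumination
# factor of 0.5 is equivalent to the "multiply corner pixels by two"
# example in the text.  Values are clamped to an assumed 8-bit range.
def correct_shading(samples, falloff):
    return [min(255, round(s / f)) for s, f in zip(samples, falloff)]
```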
The demosaicing module 430 performs a demosaicing operation to generate full color images from images of sampled colors. As noted above, the camera sensors output Bayer pattern images, which are incomplete because each pixel of a Bayer pattern image stores only one color value. The demosaicing module 430 reconstructs a red, green, blue (RGB) image from a Bayer pattern image by interpolating the color values for each set of colors in the Bayer pattern image.
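The interpolation idea can be shown with a minimal one-dimensional sketch: a missing color sample is filled in from the neighboring positions that did record that color. Real demosaicing interpolates over the two-dimensional Bayer mosaic with more sophisticated methods, so this linear averaging over a single row is an assumption made purely for illustration.

```python
# 1-D demosaicing sketch: None marks a position where this color channel
# was not sampled; it is filled with the average of the adjacent known
# samples of the same channel.
def interpolate_missing(samples):
    out = list(samples)
    for i, s in enumerate(samples):
        if s is None:
            out[i] = (samples[i - 1] + samples[i + 1]) // 2
    return out
```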
The WB module 435 performs a white balance operation on the RGB images received from the demosaicing module 430 so that the colors of the content of the images are similar to the colors of such content as perceived by the human eye in real life. The WB module 435 adjusts the white balance by adjusting the colors of the images to render neutral colors (e.g., gray, white, etc.) correctly. For example, an image of a piece of white paper under an incandescent light may appear yellow, whereas the human eye perceives the piece of paper as white. To account for the difference between the colors of the images that the sensor captures and what the human eye perceives, the WB module 435 adjusts the color values of the image so that the captured image properly reflects the colors perceived by the human eye.
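One common way to compute such an adjustment is the "gray world" assumption: scale the red and blue channels so that all three channel means match. The patent does not specify the white balance algorithm, so this particular method is an assumption introduced here only to make the channel-gain idea concrete.

```python
# Gray-world white balance sketch (assumed algorithm, not the patent's):
# compute per-channel gains that bring the red and blue channel means in
# line with the green channel mean, neutralizing a color cast.
def gray_world_gains(r_mean, g_mean, b_mean):
    """Gains (gr, gg, gb) to apply to the R, G, and B channels."""
    return (g_mean / r_mean, 1.0, g_mean / b_mean)
```

A yellow-tinted image of white paper, for instance, has an elevated red mean and a depressed blue mean; the resulting gains pull both back toward neutral.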
The statistics engine 465 collects image data at various stages of the CIPU processing pipeline 485. For example, Figure 4 shows that the statistics engine 465 collects image data after the LS module 425, the demosaicing module 430, and the WB module 435. Different embodiments collect data from any number of different stages of the CIPU processing pipeline 485. The statistics engine 465 processes the collected data and, based on the processed data, adjusts the operations of the camera sensors 405a and 405b through the controller module 475 and the sensor module 415. Examples of such operations include exposure and focus. Although Figure 4 shows the statistics engine 465 controlling the camera sensors 405a and 405b through the controller module 475, other embodiments of the statistics engine 465 control the camera sensors through just the sensor module 415.
The processed data can also be used to adjust the operations of various modules of the CIPU 400. For instance, the statistics engine 465 of some embodiments adjusts the operation of the WB module 435 based on data collected after the WB module 435. In some of such embodiments, the statistics engine 465 provides an automatic white balance (AWB) function by using the processed data to adjust the white balance operation of the WB module 435. Other embodiments can use processed data collected from any number of stages of the CIPU processing pipeline 485 to adjust the operations of any number of modules within the CIPU processing pipeline 485. Further, the statistics engine 465 can also receive instructions from the controller module 475 to adjust the operations of one or more modules of the CIPU processing pipeline 485.
After receiving the images from the WB module 435, the gamma module 440 performs a gamma correction operation on the images to code and decode the luminance or tristimulus values of the camera system. The gamma module 440 of some embodiments corrects the gamma of the images by converting a 10-12 bit linear signal into an 8 bit non-linear encoding. Some embodiments correct gamma by using a lookup table.
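The table-based approach can be sketched directly: a 10-bit linear sample indexes a precomputed table of 8-bit non-linear codes. The exponent 1/2.2 is an illustrative assumption; the text does not name a specific gamma curve, only the bit-width conversion.

```python
# Sketch of lookup-table gamma correction: map 10-bit linear input
# (0..1023) to an 8-bit non-linear code (0..255) using an assumed
# gamma of 1/2.2, precomputed once.
GAMMA_LUT = [round(255 * (v / 1023) ** (1 / 2.2)) for v in range(1024)]

def gamma_correct(linear_10bit):
    return GAMMA_LUT[linear_10bit]
```

Because the curve is concave, the table allocates more of the 8-bit codes to dark values, where the eye is more sensitive; a mid-scale linear input maps well above the 8-bit midpoint.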
The CSC module 445 converts the images received from the gamma module 440 from one color space to another color space. Specifically, the CSC module 445 converts the images from an RGB color space to a luminance and chrominance (YUV) color space. However, other embodiments of the CSC module 445 can convert images to and from any number of color spaces.
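The RGB-to-YUV conversion can be sketched with the standard BT.601 coefficients; the choice of this particular matrix is an assumption, since the text does not specify which conversion the CSC module uses.

```python
# RGB -> YUV sketch using BT.601 luma coefficients (an assumed choice):
# Y carries luminance, U and V carry blue- and red-difference chroma.
def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chroma
    v = 0.877 * (r - y)                     # red-difference chroma
    return y, u, v
```

A neutral gray or white input yields zero chroma, which is what makes YUV convenient for the later hue/saturation adjustments and for video encoding.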
The HSC module 450 may adjust the hue, saturation, contrast, or any combination thereof of the images received from the CSC module 445. The HSC module 450 may adjust these properties to reduce noise or to enhance the images, for example. For instance, the saturation of images captured by a low-noise camera sensor can be increased to make the images appear more vivid. In contrast, the saturation of images captured by a high-noise camera sensor can be decreased to reduce the color noise of such images.
After the HSC module 450, the scaler module 455 may scale the images to adjust the pixel resolution of the image or to adjust the data size of the image. The scaler module 455 may also reduce the size of the image in order to fit a smaller display, for example. The scaler module 455 can scale the image in a number of different ways. For example, the scaler module 455 can scale the images up (i.e., enlarge) and down (i.e., shrink). The scaler module 455 can also scale the images proportionally or scale the images anamorphically.
The filter module 460 applies one or more filter operations to the images received from the scaler module 455 to change one or more attributes of some or all of the pixels of an image. Examples of filters include a low-pass filter, a high-pass filter, a band-pass filter, a bilateral filter, a Gaussian filter, among others. As such, the filter module 460 can apply any number of different filters to the images.
The controller module 475 of some embodiments is a microcontroller that controls the operation of the CIPU 400. In some embodiments, the controller module 475 (1) controls the operation of the camera sensors (e.g., exposure level) through the sensor module 415, (2) controls the operation of the CIPU processing pipeline 485, (3) controls the timing of the CIPU processing pipeline 485 (e.g., when to switch camera sensors, when to switch registers, etc.), and (4) controls a flash/strobe (not shown), which is part of the dual camera mobile device of some embodiments.
Some embodiments of the controller module 475 process instructions received from the statistics engine 465 and the CIPU driver 480. In some embodiments, the instructions received from the CIPU driver 480 are instructions from the dual camera mobile device (i.e., received from the local device), while in other embodiments the instructions received from the CIPU driver 480 are instructions from another device (e.g., remote control during a video conference). Based on the processed instructions, the controller module 475 can adjust the operation of the CIPU 400 by programming the values of the registers 470. Moreover, the controller module 475 can dynamically reprogram the values of the registers 470 during the operation of the CIPU 400.
As shown in Figure 4, the CIPU 400 includes a number of modules in its CIPU processing pipeline 485. However, one of ordinary skill will realize that the CIPU 400 can be implemented with just a few of the illustrated modules or with additional and different modules. In addition, the processing performed by the different modules can be applied to images in sequences different from the sequence illustrated in Figure 4.
An example operation of the CIPU 400 will now be described by reference to Figure 4. For purposes of explanation, the set of registers Ra is used for processing images captured by the camera sensor 405a of the dual camera mobile device, and the set of registers Rb is used for processing images captured by the camera sensor 405b of the dual camera mobile device. The controller module 475 receives instructions from the CIPU driver 480 to produce images captured by one of the cameras of the dual camera mobile device.
The controller module 475 then initializes the various modules of the CIPU processing pipeline 485 to process images captured by one of the cameras of the dual camera mobile device. In some embodiments, this includes the controller module 475 checking that the correct set of registers of the registers 470 is being used. For example, if the CIPU driver 480 instructs the controller module 475 to produce images captured by the camera sensor 405a, the controller module 475 checks that the set of registers Ra is the set of registers from which the modules of the CIPU 400 read. If not, the controller module 475 switches between the sets of registers so that the set of registers Ra is the set that is read by the modules of the CIPU 400.
For each module in the CIPU processing pipeline 485, the mode of operation is indicated by the values stored in the set of registers Ra. As previously mentioned, the values in the set of registers 470 can be dynamically reprogrammed during the operation of the CIPU 400. Thus, the processing of one image can differ from the processing of the next image. While this discussion of the example operation of the CIPU 400 describes each module in the CIPU 400 reading values stored in registers to indicate the modules' modes of operation, in some software-implemented embodiments, parameters are instead passed to the various modules of the CIPU 400.
In some embodiments, the controller module 475 initializes the sensor module 415 by instructing the sensor module 415 to delay a particular amount of time after retrieving an image from the pixel array 410a. In other words, the controller module 475 instructs the sensor module 415 to retrieve the images from the pixel array 410a at a particular rate.
Next, the controller module 475 instructs the camera sensor 405a, through the sensor module 415, to capture images. In some embodiments, the controller module 475 also provides exposure and other camera operation parameters to the camera sensor 405a. In other embodiments, the camera sensor 405a uses default values for the camera sensor operation parameters. Based on these parameters, the camera sensor 405a captures a raw image, which is stored in the pixel array 410a. The sensor module 415 retrieves the raw image from the pixel array 410a and sends the image to the line/frame buffer 417 for storage before the CIPU processing pipeline 485 processes the image.
Under certain circumstances, images may be dropped by the line/frame buffer 417. When the camera sensors 405a and/or 405b are capturing images at a high rate, the sensor module 415 may receive and store images in the line/frame buffer 417 faster than the BPC module 420 can retrieve the images from the line/frame buffer 417 (e.g., when capturing high frame-rate video), and the line/frame buffer 417 can become completely full. When this happens, the line/frame buffer 417 of some embodiments drops images (i.e., frames) on a first in, first out basis. That is, when the line/frame buffer 417 drops a frame, it drops the frame that was received before all the other frames in the line/frame buffer 417.
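This first-in-first-out drop policy can be sketched with a bounded deque: appending a new frame to a full buffer discards the oldest buffered frame. The capacity value is an assumed parameter for illustration; the actual buffer depth would depend on the hardware.

```python
# Sketch of the line/frame buffer's FIFO drop behavior using Python's
# bounded deque: when the deque is at maxlen, appending a new frame
# silently discards the oldest one, exactly the drop policy described.
from collections import deque

def make_frame_buffer(capacity):
    """A bounded frame buffer that drops the oldest frame when full."""
    return deque(maxlen=capacity)
```

So a fast sensor feeding a slow pipeline loses the stalest frames first, which keeps the displayed or encoded video as current as possible.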
The image processing by the CIPU processing pipeline 485 starts with the BPC module 420 retrieving the image from the line/frame buffer 417 to correct any bad pixels in the image. The BPC module 420 then sends the image to the LS module 425 to correct for any uneven illumination in the image. After correcting the illumination of the image, the LS module 425 sends the image to the demosaicing module 430, which processes the raw image to generate an RGB image from the raw image. Next, the WB module 435 receives the RGB image from the demosaicing module 430 and adjusts the white balance of the RGB image.
As noted above, the statistics engine 465 may have collected some data at various points of the CIPU processing pipeline 485. For example, as illustrated in Figure 4, the statistics engine 465 collects data after the LS module 425, the demosaicing module 430, and the WB module 435. Based on the collected data, the statistics engine 465 may adjust the operation of the camera sensor 405a and/or the operation of one or more modules in the CIPU processing pipeline 485 in order to adjust the capturing of subsequent images from the camera sensor 405a. For instance, based on the collected data, the statistics engine 465 may determine that the exposure level of the current image is too low and thus instruct the camera sensor 405a, through the sensor module 415, to increase the exposure level for subsequently captured images. Thus, the statistics engine 465 of some embodiments operates as a feedback loop for some processing operations.
After the WB module 435 adjusts the white balance of the image, it sends the image to the gamma module 440 for gamma correction (e.g., adjusting the gamma curve of the image). The CSC module 445 receives the gamma-corrected image from the gamma module 440 and performs color space conversion. In this example, the CSC module 445 converts the RGB image to a YUV image. In other words, the CSC module 445 converts an image that is represented in an RGB color space to an image that is represented in a YUV color space. The HSC module 450 receives the YUV image from the CSC module 445 and adjusts the hue, saturation, and contrast attributes of various pixels in the image. After the HSC module 450, the scaler module 455 scales the image (e.g., enlarging or shrinking the image). After receiving the image from the scaler module 455, the filter module 460 applies one or more filters to the image. Finally, the filter module 460 sends the processed image to the CIPU driver 480.
In this example of the operation of the CIPU 400 described above, each module in the CIPU processing pipeline 485 processed the image in some manner. However, other images processed by the CIPU 400 may not require processing by all the modules of the CIPU processing pipeline 485. For example, an image may not require white balance adjustment, gamma correction, scaling, or filtering. As such, the CIPU 400 can process images any number of ways based on a variety of received inputs, such as instructions from the CIPU driver 480 or data collected by the statistics engine 465.
Different embodiments control the rate at which images are processed (i.e., the frame rate) differently. One manner of controlling the frame rate is through manipulation of vertical blanking intervals (VBI). For some embodiments that retrieve image lines for processing images on a line-by-line basis, a VBI is the time difference between retrieving the last line of an image of a video captured by a camera of the dual camera mobile device from a pixel array and retrieving the first line of the next image of the video from the pixel array. In other embodiments, a VBI is the time difference between retrieving one image of a video captured by a camera of the dual camera mobile device from a pixel array and retrieving the next image of the video from the pixel array.
One example where VBI can be used is between the sensor module 415 and the pixel arrays 410a and 410b. For example, some embodiments of the sensor module 415 retrieve images from the pixel arrays 410a and 410b on a line-by-line basis, and other embodiments of the sensor module 415 retrieve images from the pixel arrays 410a and 410b on an image-by-image basis. Thus, the frame rate can be controlled by adjusting the VBI of the sensor module 415: increasing the VBI reduces the frame rate, and decreasing the VBI increases the frame rate.
Figure 5 conceptually illustrates examples of different frame rates 505, 510, and 515 based on different VBIs. Each sequence shows an image of a person holding a guitar, captured by one of the cameras of the dual camera mobile device, at various time instances 525-555 along a timeline 520. In addition, the time between each time instance 525-555 is the same and will be referred to as one time unit. For purposes of explanation, Figure 5 will now be described by reference to the sensor module 415 and the pixel array 410a of Figure 4. As such, each image represents a time instance along the timeline 520 at which the sensor module 415 retrieves an image from the pixel array 410a.
In the example frame rate 505, the VBI of the sensor module 415 with respect to the pixel array 410a is set to three time units (e.g., by the controller module 475). That is, the sensor module 415 retrieves an image frame from the pixel array 410a every third time instance along the timeline 520. As shown in the example frame rate 505, the sensor module 415 retrieves an image at the time instances 525, 540, and 555. Thus, the example frame rate 505 has a frame rate of one image frame per three time units.
The example frame rate 510 is similar to the example frame rate 505 except the VBI is set to two time units. Thus, the sensor module 415 retrieves an image frame from the pixel array 410a every second time instance along the timeline 520. The example frame rate 510 shows the sensor module 415 retrieving an image from the pixel array 410a at the time instances 525, 535, 545, and 555. Since the VBI of the example frame rate 510 is less than the VBI of the example frame rate 505, the frame rate of the example frame rate 510 is higher than the frame rate of the example frame rate 505.
The example frame rate 515 is also similar to the example frame rate 505 except the VBI of the sensor module 415 with respect to the pixel array 410a is set to one time unit. Therefore, the sensor module 415 is instructed to retrieve an image frame from the pixel array 410a at every time instance along the timeline 520. As illustrated, the sensor module 415 retrieves an image from the pixel array 410a at the time instances 525-555. The VBI of the example frame rate 515 is less than the VBIs of the example frame rates 505 and 510. Therefore, the frame rate of the example frame rate 515 is higher than those of the example frame rates 505 and 510.
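The inverse relation between VBI and frame rate in these three examples can be reduced to a one-line sketch, under the stated assumption that one frame is retrieved per VBI period; the "time units per second" figure used below is arbitrary, chosen only to make the numbers concrete.

```python
# VBI-to-frame-rate sketch: if one frame is retrieved every
# `vbi_time_units`, the frame rate is the reciprocal of the VBI
# expressed in frames per second.
def frames_per_second(vbi_time_units, time_units_per_second):
    return time_units_per_second / vbi_time_units
```

With an assumed 30 time units per second, VBIs of three, two, and one time unit yield 10, 15, and 30 frames per second respectively, matching the ordering of the example frame rates 505, 510, and 515.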
III. Video Conferencing
A. Video Conference Architecture
Fig. 6 is the video conference of two camera movement equipment of some embodiments of graphic extension and the software architecture of processing module 600 conceptually.Video conference and processing module 600 comprise and the respective modules illustrated above with reference to Fig. 3 and driver 305,301 and 320 similar CIPU driver 605, exchange of media module 610 and encoder-driven devices 620.Video conference and processing module 600 also comprise video conference module 625, video conference client 645 and network interface 650 for realizing various video conference function.Be similar to Video processing and coding module 300, video conference and processing module 600 process and encode the image taken from the camera of two camera movement equipment.
As referring to Figure 3 as described above, the user of the media content in exchange of media module 610 permission equipment and producer's switched-media content, and the instruction relevant with the process of media content.CIPU driver 605 serves as the communication interface with photographic images processing unit (CIPU) 655, and encoder-driven device 620 serve as with encoder hardware 660 (such as, encoder chip, the encoding pack on chip system, etc.) communication interface.
The video conference module 625 of some embodiments handles various video conferencing functions such as image processing, video conference management, and networking. As shown, the video conference module 625 interacts with the media exchange module 610, the video conference client 645, and the network interface 650. In some embodiments, the video conference module 625 receives instructions from and sends instructions to the video conference client 645. The video conference module 625 of some embodiments also sends data to and receives data from networks (e.g., a local area network (LAN), a wireless local area network (WLAN), a wide area network (WAN), a network of networks, a code division multiple access (CDMA) network, a GSM network, etc.) through the network interface 650.
The video conference module 625 includes an image processing layer 630, a management layer 635, and a network layer 640. In some embodiments, the image processing layer 630 performs image processing operations on images for video conferencing. For example, the image processing layer 630 of some embodiments performs exposure adjustment, image resizing, perspective correction, and dynamic range adjustment, as described in further detail below. The image processing layer 630 of some embodiments sends requests through the media exchange module 610 for images from the CIPU 655.
The management layer 635 of some embodiments controls the operation of the video conference module 625. For instance, in some embodiments, the management layer 635 initializes one or more cameras of the dual camera mobile device, processes images and audio to transmit to a remote device, and processes images and audio received from the remote device. In some embodiments, the management layer 635 generates composite (e.g., PIP) displays for the device. Moreover, the management layer 635 may change the operation of the video conference module 625 based on networking reports received from the network layer 640.
In some embodiments, the network layer 640 performs some or all of the networking functions for video conferencing. For instance, as described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212), the network layer 640 of some embodiments establishes a network connection (not shown) between the dual camera mobile device of the video conference and a remote device, transmits images to the remote device, and receives images from the remote device, among other functionalities. In addition, the network layer 640 receives networking data such as packet loss, one-way latency, and round-trip delay time, among other types of data, processes such data, and reports the data to the management layer 635.
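The division of responsibilities among the three layers of the video conference module 625 can be sketched as follows. This is a minimal illustration under assumed names: the class and method names are inventions for this sketch, and only the layer names and their responsibilities come from the description above.

```python
# Sketch of the three-layer structure of the video conference module 625.
class ImageProcessingLayer:          # layer 630
    def process(self, image):
        # Stands in for exposure adjustment, resizing, perspective correction, etc.
        return {"processed": image}

class NetworkLayer:                  # layer 640
    def __init__(self):
        self.reports = []
    def transmit(self, image):
        # Send the image toward the remote device and collect networking data
        # (packet loss, one-way latency, ...) to report to the management layer.
        self.reports.append({"packet_loss": 0.0, "one_way_latency_ms": 20})
        return True

class ManagementLayer:               # layer 635: controls the module's operation
    def __init__(self, processing, network):
        self.processing, self.network = processing, network
    def send_frame(self, raw_image):
        return self.network.transmit(self.processing.process(raw_image))

module_625 = ManagementLayer(ImageProcessingLayer(), NetworkLayer())
print(module_625.send_frame("frame-0"))  # True
```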
The video conference client 645 of some embodiments is an application that may use the video conferencing functions of the video conference module 625, such as a video conferencing application, a voice-over-IP (VOIP) application (e.g., Skype), or an instant messaging application. In some embodiments, the video conference client 645 is a stand-alone application, while in other embodiments the video conference client 645 is integrated into another application.
In some embodiments, the network interface 650 is a communication interface that allows the video conference module 625 and the video conference client 645 to send data and receive data over a network (e.g., a cellular network, a local area network, a wireless network, a network of networks, the Internet, etc.). For instance, if the video conference module 625 wants to send data (e.g., images captured by the cameras of the dual camera mobile device) to another device on the Internet, the video conference module 625 sends the images to the other device through the network interface 650.
B. Video Conference Setup
Figure 7 conceptually illustrates an example video conference request messaging sequence 700 of some embodiments. Figure 7 shows the video conference request messaging sequence 700 among a video conference client 710 running on a device 705, a video conference server 715, and a video conference client 725 running on a device 720. In some embodiments, the video conference clients 710 and 725 are the same as the video conference client 645 shown in Figure 6. As shown in Figure 7, one device (i.e., the device 705) requests a video conference and another device (i.e., the device 720) responds to such a request. The dual camera mobile device described in the present application can perform both operations (i.e., make a request and respond to a request).
The video conference server 715 of some embodiments routes messages among video conference clients. While some embodiments implement the video conference server 715 on one computing device, other embodiments implement the video conference server 715 on multiple computing devices. In some embodiments, the video conference server is a publicly accessible server that can handle and route messages for numerous conferences at once. Each of the video conference clients 710 and 725 of some embodiments communicates with the video conference server 715 over a network (e.g., a cellular network, a local area network, a wireless network, a network of networks, the Internet, etc.) through a network interface such as the network interface 650 described above.
The video conference request messaging sequence 700 of some embodiments starts when the video conference client 710 receives (at operation 1) a request from a user of the device 705 to start a video conference with the device 720. The video conference client 710 of some embodiments receives the request to start the video conference when the user of the device 705 selects a user interface (UI) item of a user interface displayed on the device 705. Examples of such user interfaces are illustrated in Figure 8 and Figure 11, which are described below.
After the video conference client 710 receives the request, the video conference client 710 sends (at operation 2) a video conference request, which indicates the device 720 as the recipient based on input from the user, to the video conference server 715. The video conference server 715 forwards (at operation 3) the video conference request to the video conference client 725 of the device 720. In some embodiments, the video conference server 715 forwards the video conference request to the video conference client 725 using push technology. That is, the video conference server 715 initiates the transmission of the video conference request to the video conference client 725 upon receipt from the video conference client 710, rather than waiting for the client 725 to send a request for any messages.
When the video conference client 725 of some embodiments receives the video conference request, a user interface is displayed on the device 720 to indicate to the user of the device 720 that the user of the device 705 has sent a request to start a video conference, and to prompt the user of the device 720 to accept or reject the video conference request. An example of such a user interface is illustrated in Figure 9, which is described below. In some embodiments, when the video conference client 725 receives (at operation 4) a request to accept the video conference request from the user of the device 705, the video conference client 725 sends (at operation 5) a video conference acceptance to the video conference server 715. The video conference client 725 of some embodiments receives the request to accept the video conference request when the user of the device 720 selects a user interface item of a user interface as illustrated in Figure 9, for example.
After the video conference server 715 receives the video conference acceptance from the video conference client 725, the video conference server 715 forwards (at operation 6) the video conference acceptance to the video conference client 710. Some embodiments of the video conference server 715 forward the video conference acceptance to the video conference client 710 using the push technology described above.
Upon receiving the video conference acceptance, some embodiments establish (at operation 7) a video conference between the device 705 and the device 720. Different embodiments establish the video conference differently. For example, the video conference establishment of some embodiments includes negotiating a connection between the device 705 and the device 720, determining a bit rate at which to encode video, and exchanging video between the device 705 and the device 720.
In the above example, the user of the device 720 accepts the video conference request. In some embodiments, the device 720 can be configured (e.g., through the preference settings of the device) to automatically accept incoming video conference requests without displaying a UI. Moreover, the user of the device 720 can also reject (at operation 4) the video conference request (e.g., by selecting a user interface item of a user interface displayed on the device 720). Instead of sending a video conference acceptance, the video conference client 725 sends a video conference rejection to the video conference server 715, which forwards the video conference rejection to the video conference client 710. The video conference is then never established.
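The request/forward/accept-or-reject flow of the messaging sequence 700 can be sketched as follows. This is an illustrative model under assumed names: the classes, methods, and message tuples are inventions for this sketch; only the flow of operations 1-6 (request, push-forward, accept or reject, push-forward back) comes from the description above.

```python
# Sketch of the messaging sequence 700 among clients 710/725 and server 715.
class VideoConferenceServer:                      # server 715
    def __init__(self):
        self.clients = {}
    def register(self, name, client):
        self.clients[name] = client
    def forward(self, recipient, message):
        # Push technology: the server initiates delivery to the recipient
        # rather than waiting for the recipient to poll for messages.
        return self.clients[recipient].receive(message)

class VideoConferenceClient:                      # clients 710 and 725
    def __init__(self, name, server, auto_accept=False):
        self.name, self.server, self.auto_accept = name, server, auto_accept
        server.register(name, self)
        self.log = []
    def request_conference(self, recipient):      # operations 1-3
        self.log.append("request sent")
        return self.server.forward(recipient, ("request", self.name))
    def receive(self, message):                   # operations 4-6
        kind, sender = message
        if kind == "request":
            # A device configured to auto-accept skips the prompt UI.
            reply = "accept" if self.auto_accept else "reject"
            return self.server.forward(sender, (reply, self.name))
        self.log.append(kind)
        return kind == "accept"                   # operation 7 would follow

server = VideoConferenceServer()
client_710 = VideoConferenceClient("device_705", server)
client_725 = VideoConferenceClient("device_720", server, auto_accept=True)
print(client_710.request_conference("device_720"))  # True: acceptance pushed back
```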
In some embodiments, a video conference is initiated from an ongoing phone call. That is, while the user of a mobile device is engaged in a phone call with a second user, the user can, with the other party's permission, turn the phone call into a video conference. For some embodiments of the invention, Figure 8 illustrates the start of such a video conference by a dual camera handheld mobile device 800. Figure 8 illustrates the start of the video conference in terms of five operational stages 810, 815, 820, 825, and 830 of a user interface ("UI") 805 of the device 800.
As shown in Figure 8, the UI 805 includes a name field 835, a selection menu 840, and a selectable UI item 845. The name field 835 displays the name of the person on the other end of the phone call, with whom the user would like to have a video conference. In this example, the selectable UI item 845 (which can be implemented as a selectable button) provides a selectable End Call option for the user to end the phone call. The selection menu 840 displays a menu of selectable UI items, such as a speakerphone item 842, a mute item 844, a numeric keypad item 846, a phonebook item 848, a hold item 852, a video conference item 854, etc. Different embodiments display the selection menu differently. For the embodiment illustrated in Figure 8, the selection menu 840 includes several equally sized icons, each of which represents a different operation. Other embodiments provide a scrollable menu, or give priority to particular items (e.g., by making those items larger).
The operation of the UI 805 will now be described by reference to its states during the five stages 810, 815, 820, 825, and 830 illustrated in Figure 8. In the first stage 810, a phone call has been established between the handheld mobile device's user and Nancy Jones. The second stage 815 displays the UI 805 after the user selects the selectable video conference option 854 (e.g., through a single finger tap by a finger 850) to activate a video conference tool. In this example, the video conference option 854 (which can be implemented as a selectable icon) allows the user to start a video conference during the phone call. In the second stage, the video conference option 854 is highlighted to indicate that the video conference tool has been activated. Different embodiments may indicate such a selection in different ways (e.g., by highlighting the border or the text of the item).
The third stage 820 displays the UI 805 after the device 800 has started the video conference process upon the selection of the video conference option 854. The third stage is a transitional hold stage while the device waits for the video conference to be established (e.g., while the device on the other end of the call accepts or rejects the video conference). In the third stage 820, the user of the device 800 can still talk to the user of the other device (i.e., Nancy Jones) while the video conference connection is being established. In addition, some embodiments allow the user of the device 800 to cancel the video conference request in the third stage 820 by selecting a selectable UI item (not shown) displayed on the UI 805 for canceling the video conference request. During this hold stage, different embodiments use different displays in the UI 805 to indicate the pending state.
As shown in Figure 8, in some embodiments the wait state of the third stage is illustrated with a full-screen display of the video being captured by the device 800, along with a "Preview" notation at the bottom of this video. Specifically, in Figure 8, the third stage 820 illustrates the start of the video conference process by displaying, in a display area 860 of the UI 805, a full-screen presentation of the video being captured by the device's camera. In some embodiments, the front camera is the default camera selected by the device at the start of a video conference. Often, this front camera points at the user of the device at the start of the video conference. Accordingly, in the example illustrated in Figure 8, the third stage 820 shows the device 800 presenting a full-screen video of the user of the device 800. The wait state of the device is further highlighted by the "Preview" designation that appears below the video in the display area 860 during the third stage 820.
The transitional third hold stage 820 can be represented differently in some embodiments. For instance, some embodiments allow the user of the device 800 to select the back camera as the camera for starting the video conference. To allow this selection, some embodiments allow the user (e.g., through a menu preference setting) to specify the back camera as the default camera for the start of a video conference, and/or allow the user to select the back camera from a menu that displays the back and front cameras after the user selects the video conference option 854. In either of these situations, the UI 805 (e.g., the display area 860) displays a video captured by the back camera during the third hold stage 820.
In addition, other embodiments might indicate the activation of the video conference tool by displaying a smaller version of the video captured by the device 800, by displaying a still image that is stored on the device 800, by providing a message highlighting the wait state of the device (e.g., by showing "Conference Being Established"), by not displaying the "Preview" designation, etc. Also, in the third stage 820, the UI 805 of some embodiments provides an End button (not shown) that allows the user to cancel entering the video conference and revert back to the phone call if he decides at this stage not to enter the video conference (e.g., while the user is waiting for the remote user to respond to his request).
The fourth stage 825 illustrates the UI 805 in a transitional state after the remote user has accepted the video conference request and a video conference connection has been established. In this transitional state, the display area 860 that displays the video of the local user (which in this example is being captured by the front camera) gradually decreases in size (i.e., gradually shrinks), as indicated by arrows 875. The display area 860 (i.e., the local user's video) shrinks so that the UI 805 can display a display area 870 (e.g., a display window 870) that contains the video from a camera of the remote device behind the display area 860. In other words, the shrinking of the local user's video 860 creates a PIP display 880 that has a foreground inset display 860 of the local user's video and a background main display 870 of the remote user. In this example, the background main display 870 presents a video of a lady whose video is being captured by the remote device's front camera (e.g., Nancy Jones, the user of the remote device), or a lady whose video is being captured by the remote device's back camera (e.g., a lady whose video is being captured by Nancy Jones). One of ordinary skill will realize that the transitional fourth stage shown in Figure 8 is simply one exemplary approach used by some embodiments, and that other embodiments might animate the transitional fourth stage differently.
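The gradual shrinking of the display area 860 amounts to interpolating the area's size over several animation steps. A minimal sketch, in which the linear interpolation, the step count, and the start and end sizes are all assumptions made for illustration:

```python
def shrink_sizes(start, end, steps):
    """Linear interpolation of a display area's scale as it gradually shrinks
    from full screen (1.0) toward its inset size."""
    return [start + (end - start) * i / steps for i in range(steps + 1)]

# Full-screen local video shrinking to a quarter-size foreground inset.
print(shrink_sizes(1.0, 0.25, 3))  # [1.0, 0.75, 0.5, 0.25]
```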
The fourth stage 825 also illustrates a selectable UI item 832 in a lower display area 855. The selectable UI item 832 (which can be implemented as a selectable button) provides a selectable End Conference option 832 below the PIP display 880. The user may select this End Conference option 832 to end the video conference (e.g., through a single finger tap). Different embodiments may allow the user to end the conference in different ways, such as by toggling a switch on the mobile device, by giving voice commands, etc. Moreover, different embodiments may allow the End Conference option 832 to fade away during the video conference, thereby allowing the PIP display 880 to take up the entire display area 885. The End Conference option 832 may then reappear upon a single finger tap at the bottom of the display area 885, giving the user access to the End Conference option 832. In some embodiments, the layout of the display area 855 is the same as that of the display area 855 described in further detail below.
The fifth stage 830 illustrates the UI 805 after the animation of the fourth transitional state 825 has ended. Specifically, the fifth stage 830 illustrates the PIP display 880 that is presented by the UI 805 during the video conference. As mentioned above, this PIP display 880 includes two video displays: a larger background display 870 from the remote camera and a smaller foreground inset display 860 from the local camera.
This PIP display 880 is only one manner of presenting a composite view of the videos being captured by the remote and local devices. Besides this composite view, the devices of some embodiments provide other composite views. For example, instead of having a larger background display 870 of the remote user, the larger background display 870 can be of the local user and the smaller foreground inset display 860 of the remote user. As further described below, some embodiments allow a user to switch during a video conference between the local cameras and/or remote cameras as the cameras providing the inset and main views of the PIP display 880.
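The composite PIP view described above, including the ability to exchange which camera feeds the inset versus the main view, can be sketched as follows. The class and field names are illustrative assumptions; only the inset/main structure and the swap behavior come from the text.

```python
# Sketch of the PIP composite: a background main view plus a foreground inset.
class PIPDisplay:
    def __init__(self, main_source, inset_source):
        self.main_source = main_source    # e.g., remote camera (display 870)
        self.inset_source = inset_source  # e.g., local camera (display 860)
    def swap(self):
        """Exchange the inset and main views, as some embodiments allow
        the user to do during a video conference."""
        self.main_source, self.inset_source = self.inset_source, self.main_source

pip_880 = PIPDisplay(main_source="remote camera", inset_source="local camera")
pip_880.swap()
print(pip_880.main_source)   # local camera
print(pip_880.inset_source)  # remote camera
```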
In addition, some embodiments allow the local and remote videos to appear in the UI 805 in two side-by-side display areas (e.g., left and right display windows, or top and bottom display windows) or in two diagonally aligned display areas. The manner of the PIP display or a default display mode may be specified by the user in some embodiments through the preference settings of the device, or through controls that the user can select during a video conference, as further described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212).
When the user of the device 800 of Figure 8 invites the remote user to a video conference, the remote user may accept or reject the invitation. Figure 9 illustrates a UI 905 of the remote user's device 900 at six different stages 910, 915, 920, 925, 930, and 935 that show the sequence of operations for presenting and accepting a video conference invitation at the remote user's device. The description of the UI 905 below refers to the user of the device 900 (i.e., the device that receives the video conference request) as the invite recipient, and the user of the device 800 (i.e., the device that sends the video conference request) as the invite requestor. Also, in this example, it is assumed that the invite recipient's device 900 is a dual camera device, like the device of the invite requestor. However, in other examples, either or both of these devices are single camera devices.
The first stage 910 illustrates the UI 905 when the invite recipient receives an invitation to a video conference from the invite requestor, John Smith. As shown in Figure 9, the UI 905 in the first stage includes a name field 935, a message field 940, and two selectable UI items 945 and 950. The name field 935 displays the name of the person who is requesting the video conference. In some embodiments, the name field 935 displays the phone number of the person who is requesting the video conference instead of the person's name. The message field 940 displays the invite requestor's invitation to the invite recipient. In this example, the "Video Conference Invitation" in the message field 940 indicates that the invite requestor is requesting a video conference with the invite recipient. The selectable UI items 945 and 950 (which can be implemented as selectable buttons) provide selectable Deny Request and Accept Request options 945 and 950 for the invite recipient to reject or accept the invitation. Different embodiments may display these options differently and/or display other options.
When seeing " the Video Conference Invitation " symbol be presented in message hurdle 940, by selecting " refusal request " option 945 in UI or " accepting request " option 950 respectively, invitee can refuse or accept request.Second stage 915 is illustrated in the example shown in Fig. 9, and user selects " accepting request " option 950.In this example, click " accepting request " option 950 gently by the finger of user and realize described selection, and point out described selection by highlighting of option 950.There is provided other technology in certain embodiments to select accept or refusal request option 945 and 950 (such as, doublely to click gently, etc.) to point out described selection (such as, highlighting frame or the text of UI project).
The third stage 920 displays the UI 905 after the invite recipient has agreed to join the video conference. In this stage, the UI 905 enters a preview mode that shows a full-screen presentation of the video from the remote device's front camera in a display area 944. The front camera in this case points at the user of the remote device (i.e., Nancy Jones in this example). Accordingly, her image is shown in this preview mode. This preview mode allows the invite recipient to make sure that her video is displayed properly and that she is happy with her appearance before the video conference begins (e.g., before actual transmission of the video begins). In some embodiments, a notation such as a "Preview" notation may be displayed below the display area 944 to indicate that the invite recipient is in the preview mode.
Some embodiments allow the invite recipient to select the back camera as the default camera for the start of the video conference, or to select the front or back camera at the beginning of the video conference, as further described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212). Also, other embodiments display the invite recipient's preview display differently (e.g., as a smaller image placed in a corner of the display area 944). Yet other embodiments do not include this preview mode, but rather start the video conference immediately after the invite recipient accepts the request.
In the third stage, the UI 905 shows two selectable UI items 975 and 946, one of which overlaps the display area 944 while the other lies below it. The selectable UI item 975 is an Accept button 975 that the user may select to start video conferencing. The selectable UI item 946 is an End button 946 that the invite recipient can select if she decides not to join the video conference at this stage.
The fourth stage 925 displays the UI 905 after the invite recipient selects the Accept button 975. In this example, the Accept button 975 is highlighted to indicate that the invite recipient is ready to start the video conference. Such a selection may be indicated in different ways in other embodiments.
The fifth stage 930 illustrates the UI 905 in a transitional state after the invite recipient has accepted the video conference request. In this transitional stage, the display area 944 that displays the video of the invite recipient (which in this example is being captured by the front camera) gradually decreases in size (i.e., gradually shrinks), as indicated by arrows 960. The invite recipient's video shrinks so that the UI 905 can display a display area 965 (e.g., a display window 965) that contains the video from a camera of the invite requestor behind the display area 944. In other words, the shrinking of the invite recipient's video creates a PIP display 980 that has a foreground inset display area 944 of the invite recipient's video and a background main display 965 of the invite requestor.
In this example, the background main display 965 presents a video of a man whose video is being captured by the local device's front camera (i.e., John Smith, the user of the local device 800). In another example, this video could be that of a man whose video is being captured by the local device's back camera (e.g., a man whose video is being captured by John Smith). Different embodiments may animate this transitional fifth stage differently.
The UI at the fifth stage 930 also displays a display area 855 (e.g., a toolbar or a menu bar) that includes a selectable UI item 985 (e.g., a mute button 985) for muting the audio of the other user during the video conference, a selectable UI item 987 (e.g., an end conference button 987) for ending the video conference, and a selectable UI item 989 (e.g., a switch camera button 989) for switching cameras, which is described in further detail below. As such, the invite recipient may select any of the selectable UI items 985-989 (e.g., through a single finger tap) to perform the desired operation during the video conference. Different embodiments may allow the invite recipient to perform any of these operations in other ways, e.g., by toggling a switch on the mobile device, by giving voice commands, etc.
Although Figure 9 shows an example layout for the display area 855, some embodiments provide different layouts of the display area 855, such as the layout of the display area 855 of Figure 8, which includes just a selectable End Conference UI option 832 for ending the video conference. Other layouts of the display area 855 can include any number of different selectable UI items for performing different functions. Moreover, the fifth stage 930 shows the display area 855 displayed at the bottom of the UI 905. Different embodiments of the display area 855 can be displayed at different locations within the UI 905 and/or defined in different shapes.
Figure 9 shows the display area 855 as a static display area (i.e., the display area 855 is always displayed). However, in some embodiments, the display area 855 is a dynamic display area. In some such embodiments, the display area 855 is ordinarily not displayed. Rather, the display area 855 is displayed when a triggering event is received (e.g., a user selection such as tapping the display area 980 once, a voice command, etc.). The display area 855 disappears after a user selection is received (e.g., selecting the selectable mute UI item 985) or after a defined amount of time (e.g., 3 seconds), which can be specified by the user through the preference settings of the mobile device or the video conference application. In some such embodiments, the display area 855 is automatically displayed after the video conference starts and disappears in the same manner mentioned above.
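The dynamic display-area behavior just described (hidden by default, shown on a trigger event, hidden again after a user selection or a configurable timeout) can be sketched as follows. The class, method names, and the explicit-clock timeout mechanism are assumptions made for illustration.

```python
# Sketch of the dynamic display area 855: shown on trigger, hidden on
# selection or after a user-configurable amount of time.
class DynamicDisplayArea:
    def __init__(self, timeout_seconds=3):        # e.g., a preference setting
        self.timeout_seconds = timeout_seconds
        self.visible = False
        self.shown_at = None
    def on_trigger(self, now):
        """Triggering event: a tap on the display, a voice command, etc."""
        self.visible, self.shown_at = True, now
    def on_selection(self):
        """A UI item was selected (e.g., the mute item 985): hide again."""
        self.visible = False
    def tick(self, now):
        """Hide once the defined amount of time has elapsed."""
        if self.visible and now - self.shown_at >= self.timeout_seconds:
            self.visible = False

area_855 = DynamicDisplayArea(timeout_seconds=3)
area_855.on_trigger(now=0.0)
area_855.tick(now=2.0)
print(area_855.visible)  # True: still within the timeout
area_855.tick(now=3.5)
print(area_855.visible)  # False: timed out and hidden again
```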
The sixth stage 935 illustrates the UI 905 after the animation of the fifth transitional stage has ended. Specifically, the sixth stage illustrates the PIP display 980 that is presented by the UI 905 during the video conference. As mentioned above, this PIP display 980 includes two video displays: a larger background display 965 from the local camera and a smaller foreground inset display 944 from the remote camera. This PIP display 980 is only one manner of presenting a composite view of the videos being captured by the remote and local devices. Besides this composite view, the devices of some embodiments provide other composite views. For example, instead of having a larger background display of the invite recipient, the larger background display can be of the invite requestor and the smaller foreground inset display of the invite recipient. As further described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212), some embodiments allow a user to control the inset and main views in a PIP display to switchably display the local and remote cameras. Also, some embodiments allow the local and remote videos to appear in the UI 905 in two side-by-side display areas (e.g., left and right display windows, or top and bottom display windows) or in two diagonally aligned display areas. The manner of the PIP display or a default display mode may be specified by the user through the preference settings of the device, or through controls that the user can select during a video conference, as further described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212).
While Figure 9 shows the sequence of operations for presenting and accepting a video conference invitation in terms of six different operational stages, some embodiments may implement the operation in fewer stages. For instance, some such embodiments may omit presenting the third and fourth stages 920 and 925 and go from the second stage 915 to the fifth stage 930 after the user selects the Accept Request option 950. Other embodiments that implement the operation (i.e., presenting and accepting a video conference invitation) in fewer stages may omit the first and second stages 910 and 915 and present the user with the third stage 920 when the invite recipient receives the invitation to the video conference from the invite requestor.
Figure 10 illustrates an example of performing the operation illustrated in Figure 9 in fewer stages by combining the first and third stages into one stage and the second and fourth stages into one stage. In particular, Figure 10 illustrates the UI 905 of the remote user's device 900 at five different stages 1090, 1092, 1094, 930, and 935. The first stage 1090 is similar to the stage 810, except that the name field 995 displays the name "John Smith" to indicate the name of the person on the other end of the call. That is, a phone call has been established between the user of the remote mobile device and the user of the local device (i.e., John Smith in this example). The second and third stages 1092 and 1094 are similar to the first and second stages 910 and 915 of Figure 9, except that the second and third stages 1092 and 1094 also show a preview of the user of the remote mobile device (i.e., Nancy Jones in this example). The fourth and fifth stages 930 and 935 are the same as the fifth and sixth stages 930 and 935 of Figure 9.
In addition to activating the video conference tool through a selectable option during a phone call, some embodiments allow a user of a dual camera device to initiate a video conference directly without having to first make a phone call. Figure 11 illustrates another such alternative method of initiating a video conference. Figure 11 illustrates the UI 1105 at seven different stages 1110, 1115, 1120, 1125, 1130, 1135, and 1140 that show an alternative sequence of operations for starting a video conference.
In the first stage 1110, a user is looking through a contacts list on his mobile device for a person with whom he wishes to have a video conference, similar to how he would find a contact to call. In the second stage 1115, the user selects the person 1155 with whom he would like to have a video conference (e.g., through a single finger tap 1160 on the person's name 1155). This selection triggers the UI 1105 to display the contact's information and various user-selectable options. In this example, Jason's name 1155 is highlighted to indicate that this is the person with whom the user would like to have a video conference. Different embodiments may indicate such a selection in different ways. While the second stage 1115 allows the user of the device 1100 to select the person with whom he wants to have a video conference through a contacts list, some embodiments allow the user to select the person through a "Recents" call history that lists the particular numbers or names of persons with whom the user of the device 1100 recently had a video conference or a phone call.
In the third stage 1120, after the person's name 1155 has been selected, the UI 1105 displays the selected person's information 1162 and various selectable UI items 1168, 1172 and 1170. In this example, one of the selectable UI items 1172 (which can be implemented as a selectable icon or button) provides a video conference tool. The "Video Conference" option 1172 allows the user to invite the person identified by the contact 1166 to a video conference. Different embodiments display the information 1162 and the selectable UI items 1168, 1172 and 1170 differently (e.g., in different arrangements).
The fourth stage 1125 shows the user selecting the "Video Conference" option 1172 (e.g., through a single finger tap). In this example, the "Video Conference" option 1172 is highlighted to indicate that the video conference tool 1172 has been activated. Different embodiments may indicate this selection in different ways (e.g., by highlighting the text or the border of the selected icon).
The fifth, sixth and seventh stages 1130, 1135 and 1140 are similar to the third, fourth and fifth stages 820, 825 and 830 illustrated in Figure 8, and can be understood by reference to the discussion of those stages. Briefly, the fifth stage 1130 illustrates a transitional holding stage that waits for the remote user to respond to the video conference invitation. The sixth stage 1135 illustrates that, after the remote user has accepted the video conference request, the display area 1180 (which displays the video of the local user) gradually decreases in size so that the UI 1105 can show, behind the display area 1180, a display area 1192 that contains the video from a camera of the remote user. In the seventh stage 1140, the UI 1105 presents the PIP display 1147 during the video conference. In some embodiments, the layout of the display area 855 in the sixth stage 1135 and the seventh stage 1140 is similar to the layout of the display area 855 of Figure 9 described above.
Figures 7, 8, 9, 10 and 11 show several ways of establishing a video conference. In some embodiments, during a telephone call, audio data (e.g., voice) is transmitted through one communication channel (over a communication network such as a circuit-switched communication network or a packet-switched communication network), and during the video conference, audio data is transmitted through another communication channel. Thus, in such embodiments, audio data (e.g., voice) is transmitted through one communication channel before the video conference is established, and once the video conference is established, audio is transmitted through a different communication channel (instead of the communication channel used during the telephone call).
In order to provide a seamless transition (e.g., handoff) of the audio data from the telephone call to the video conference, some embodiments do not terminate the telephone call before establishing the video conference. For instance, some embodiments establish a peer-to-peer video conference connection (e.g., after completing the message sequence illustrated in Figure 7) before terminating the phone call and starting to transmit audio/video data through the peer-to-peer communication session. Alternatively, other embodiments establish a peer-to-peer video conference connection (e.g., after completing the message sequence illustrated in Figure 7) and start transmitting audio/video data through that peer-to-peer communication session before terminating the phone call and starting to present the received audio/video data.
A peer-to-peer video conference connection of some embodiments allows the mobile devices in the video conference to directly communicate with each other (instead of communicating through, e.g., a central server). Some embodiments of a peer-to-peer video conference allow the mobile devices in the video conference to share resources with each other. For instance, via a control communication channel of the video conference, one mobile device can remotely control operations of the other device in the video conference, such as the exposure adjustment operations, focus adjustment operations, and/or switch camera operations described in further detail below, by sending instructions from the one mobile device to the other mobile device that direct the other mobile device to process images differently (i.e., to share its image processing resources).
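The remote-control idea above can be sketched as a simple control-channel dispatcher: one device sends a named instruction over the control channel, and the receiving device maps it onto its local camera operations. This is a minimal illustrative sketch under stated assumptions; the message names, the dictionary message format, and the handler API are inventions for illustration, not the patent's implementation.

```python
# Hypothetical control-channel dispatcher for remotely controlled camera
# operations. Operation names ("adjust_exposure", etc.) are assumptions.

class RemoteControlledDevice:
    def __init__(self):
        self.exposure = 0.5
        self.focus = 0.5
        self.active_camera = "front"

    def handle_instruction(self, message):
        """Apply one instruction received over the control channel."""
        op, args = message["op"], message.get("args", {})
        if op == "adjust_exposure":
            self.exposure = args["value"]
        elif op == "adjust_focus":
            self.focus = args["value"]
        elif op == "switch_camera":
            # Toggle which camera of the dual camera device captures video.
            self.active_camera = "back" if self.active_camera == "front" else "front"
        else:
            raise ValueError(f"unknown control operation: {op}")
        return self

device = RemoteControlledDevice()
device.handle_instruction({"op": "adjust_exposure", "args": {"value": 0.8}})
device.handle_instruction({"op": "switch_camera"})
print(device.exposure, device.active_camera)  # 0.8 back
```

In a real conference the `handle_instruction` input would arrive from the networking layer rather than a local call.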
C. Video Conference Architecture
As mentioned above, Figure 12 conceptually illustrates a software architecture of a video conferencing and processing module 1200 of a dual camera mobile device of some embodiments. As shown, the video conferencing and processing module 1200 includes a client application 1265, a video conference module 1202, a media exchange module 1220, a buffer 1225, a captured image processing unit (CIPU) driver 1230, an encoder driver 1235, and a decoder driver 1240. In some embodiments, the buffer 1225 is a frame buffer that stores images of a video for display on a display 1245 of the dual camera mobile device.
In some embodiments, the client application 1265 is the same as the video conference client 645 of Figure 6. As mentioned above, the client application 1265 can be integrated into another application or implemented as a standalone application. The client application 1265 may be an application that uses the video conferencing functions of the video conference module 1202, such as a video conferencing application, a voice-over-IP (VOIP) application (e.g., Skype), or an instant messaging application.
The client application 1265 of some embodiments sends instructions to the video conference module 1202, such as instructions to start a conference and to end a conference, receives instructions from the video conference module 1202, routes instructions from a user of the dual camera mobile device to the video conference module 1202, and generates user interfaces that are displayed on the dual camera mobile device and allow the user to interact with the application.
D. Video Conference Manager
As shown in Figure 12, the video conference module 1202 includes a video conference manager 1204, an image processing manager 1208, a networking manager 1214, and buffers 1206, 1210, 1212, and 1218. In some embodiments, the video conference module 1202 is the same as the video conference module 625 illustrated in Figure 6 and thus performs some or all of the same functions described above for the video conference module 625.
In some embodiments, the video conference manager 1204 is responsible for initializing some or all of the other modules of the video conference module 1202 (e.g., the image processing manager 1208 and the networking manager 1214) when the video conference starts, controlling the operation of the video conference module 1202 during the video conference, and ceasing the operation of some or all of the other modules of the video conference module 1202 when the video conference ends.
The video conference manager 1204 of some embodiments also processes images received from one or more devices in the video conference and images captured by one or both cameras of the dual camera mobile device, for display on the dual camera mobile device. For instance, the video conference manager 1204 of some embodiments retrieves, from the buffer 1218, decoded images that were received from the other device participating in the video conference, and retrieves, from the buffer 1206, images processed by the CIPU 1250 (i.e., images captured by the dual camera mobile device). In some embodiments, the video conference manager 1204 also scales and composites the images before displaying them on the dual camera mobile device. That is, in some embodiments, the video conference manager 1204 generates the PIP or other composite views for display on the mobile device. Some embodiments scale the images retrieved from both buffers 1206 and 1218, while other embodiments scale only the images retrieved from one of the buffers 1206 and 1218.
Although Figure 12 illustrates the video conference manager 1204 as part of the video conference module 1202, some embodiments of the video conference manager 1204 are implemented as a component separate from the video conference module 1202. As such, a single video conference manager 1204 can be used to manage and control several video conference modules 1202. For instance, some embodiments run a separate video conference module on the local device to interact with each party in a multi-party conference, and each of these video conference modules on the local device is managed and controlled by the one video conference manager.
The image processing manager 1208 of some embodiments processes images captured by the cameras of the dual camera mobile device before the images are encoded by the encoder 1255. For example, some embodiments of the image processing manager 1208 perform one or more of exposure adjustment, focus adjustment, perspective correction, dynamic range adjustment, and image resizing on images processed by the CIPU 1250. In some embodiments, the image processing manager 1208 controls the frame rate of the encoded images that are transmitted to the other device in the video conference.
The networking manager 1214 of some embodiments manages one or more connections between the dual camera mobile device and the other device participating in the video conference. For example, the networking manager 1214 of some embodiments establishes the connections between the dual camera mobile device and the other device of the video conference at the start of the video conference and tears down these connections at the end of the video conference.
During the video conference, the networking manager 1214 transmits images encoded by the encoder 1255 to the other device of the video conference and routes images received from the other device of the video conference to the decoder 1260 for decoding. In some embodiments, the networking manager 1214, rather than the image processing manager 1208, controls the frame rate of the images that are transmitted to the other device of the video conference. For example, some such embodiments of the networking manager 1214 control the frame rate by dropping (i.e., not transmitting) some of the encoded frames that are supposed to be transmitted to the other device of the video conference.
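The frame-dropping rate control described above can be sketched with a simple accumulator that decides which encoded frames survive when a capture rate is throttled to a lower transmission rate. This is a simplified sketch under stated assumptions; a real implementation would also have to avoid dropping frames that later frames reference.

```python
def throttle_frames(frames, capture_fps, target_fps):
    """Drop encoded frames so that roughly target_fps out of every
    capture_fps frames are transmitted.

    A minimal sketch of frame-rate control by frame dropping; it treats
    every frame as independently droppable, which real codecs do not.
    """
    if target_fps >= capture_fps:
        return list(frames)
    kept, budget = [], 0.0
    step = target_fps / capture_fps
    for frame in frames:
        budget += step
        if budget >= 1.0:      # enough transmit budget accumulated: keep it
            kept.append(frame)
            budget -= 1.0
    return kept                # the rest are dropped (never transmitted)

# Throttling a 30 fps capture to 15 fps keeps every other frame.
print(len(throttle_frames(range(30), 30, 15)))  # 15
```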
As shown, the media exchange module 1220 of some embodiments includes a camera source module 1222, a video compressor module 1224, and a video decompressor module 1226. The media exchange module 1220 is the same as the media exchange module 310 shown in Figure 3, with more detail provided. The camera source module 1222 routes messages and media content between the video conference module 1202 and the CIPU 1250 through the CIPU driver 1230, the video compressor module 1224 routes messages and media content between the video conference module 1202 and the encoder 1255 through the encoder driver 1235, and the video decompressor module 1226 routes messages and media content between the video conference module 1202 and the decoder 1260 through the decoder driver 1240. Some embodiments implement the TNR module 315 included in the media exchange module 310 (not shown in Figure 12) as part of the camera source module 1222, while other embodiments implement the TNR module 315 as part of the video compressor module 1224.
In some embodiments, the CIPU driver 1230 and the encoder driver 1235 are the same as the CIPU driver 305 and the encoder driver 320 illustrated in Figure 3. The decoder driver 1240 of some embodiments acts as a communication interface between the video decompressor module 1226 and the decoder 1260. In such embodiments, the decoder 1260 decodes images received through the networking manager 1214 from the other device of the video conference and routed through the video decompressor module 1226. After the images are decoded, they are sent back to the video conference module 1202 through the decoder driver 1240 and the video decompressor module 1226.
In addition to performing video processing during a video conference, the video conferencing and processing module 1200 of the dual camera mobile device of some embodiments also performs audio processing operations during the video conference. Figure 13 illustrates such a software architecture. As shown, the video conferencing and processing module 1200 includes the video conference module 1202 (which includes the video conference manager 1204, the image processing manager 1208, and the networking manager 1214), the media exchange module 1220, and the client application 1265. Other components and modules of the video conferencing and processing module 1200 shown in Figure 12 are omitted in Figure 13 to simplify the description. The video conferencing and processing module 1200 also includes frame buffers 1305 and 1310, an audio processing manager 1315, and an audio driver 1320. In some embodiments, the audio processing manager 1315 is implemented as a separate software module, while in other embodiments the audio processing manager 1315 is implemented as part of the media exchange module 1220.
The audio processing manager 1315 processes audio data captured by the dual camera mobile device for transmission to the other device in the video conference. For example, the audio processing manager 1315 receives, through the audio driver 1320, audio data captured by a microphone 1325, encodes the audio data, and then stores the encoded audio data in the buffer 1305 for transmission to the other device. The audio processing manager 1315 also processes audio data that is captured by, and received from, the other device in the video conference. For instance, the audio processing manager 1315 retrieves audio data from the buffer 1310, decodes the audio data, and then outputs the decoded audio data to a speaker 1330 through the audio driver 1320.
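The two audio paths just described (capture, encode, and queue for the network; dequeue, decode, and play) can be sketched as follows. This is a toy illustration: the buffers stand in for the buffers 1305 and 1310, and the "codec" is a trivial byte inversion rather than any real audio codec.

```python
from collections import deque

send_buffer = deque()      # stands in for buffer 1305 (encoded, outgoing)
receive_buffer = deque()   # stands in for buffer 1310 (encoded, incoming)
speaker_out = []           # what gets handed to the audio driver / speaker

def encode(samples):
    """Stand-in codec: invert each byte. Not a real audio codec."""
    return bytes(255 - b for b in samples)

def decode(payload):
    return bytes(255 - b for b in payload)

def on_microphone_capture(samples):
    """Capture path: encode microphone data and queue it for the network."""
    send_buffer.append(encode(samples))

def on_playback_tick():
    """Playback path: decode queued network audio out to the speaker."""
    while receive_buffer:
        speaker_out.append(decode(receive_buffer.popleft()))

on_microphone_capture(b"\x01\x02\x03")          # local mic -> send buffer
receive_buffer.append(encode(b"\x0a\x0b"))      # simulate remote audio arriving
on_playback_tick()
print(speaker_out)  # [b'\n\x0b']
```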
In some embodiments, the video conference module 1202, along with the audio processing manager 1315 and its associated buffers, is part of a larger conference module. When a multi-party audio conference is conducted among several devices without any video content being exchanged, the video conferencing and processing module 1200 uses the networking manager 1214 and the audio processing manager 1315 to facilitate the exchange of audio over an Internet Protocol (IP) layer.
The operation of the video conference manager 1204 of some embodiments will now be described by reference to Figure 14. Figure 14 conceptually illustrates a process 1400 performed by the video conference manager of some embodiments, such as the video conference manager 1204 illustrated in Figure 12. This can be equivalent to being performed by the management layer 635 of Figure 6. In some embodiments, the video conference manager 1204 performs the process 1400 when the user of the dual camera mobile device accepts a video conference request (e.g., through a user interface displayed on the dual camera mobile device) or when the user of another device accepts a request sent by the user of the dual camera mobile device.
The process 1400 begins by receiving (at 1405) instructions to start a video conference. In some embodiments, the instructions are received from the client application 1265, or are received from a user through a user interface displayed on the dual camera mobile device and forwarded to the video conference manager 1204 by the client application 1265. For example, in some embodiments, when the user of the dual camera mobile device accepts a video conference request, the instructions are received through the user interface and forwarded by the client application. On the other hand, when the user of the other device accepts a request sent from the local device, some embodiments receive the instructions from the client application without user interface interaction (although there may have been previous user interface interaction to send the initial request).
Next, the process 1400 initializes (at 1410) a first module that interacts with the video conference manager 1204. The modules of some embodiments that interact with the video conference manager 1204 include the CIPU 1250, the image processing manager 1208, the audio processing manager 1315, and the networking manager 1214.
In some embodiments, initializing the CIPU 1250 includes instructing the CIPU 1250 to start processing images captured by one or both cameras of the dual camera mobile device. Some embodiments initialize the image processing manager 1208 by instructing the image processing manager 1208 to start retrieving images from the buffer 1210 and to start processing and encoding the retrieved images. To initialize the audio processing manager 1315, some embodiments instruct the audio processing manager 1315 to start encoding audio data captured by the microphone 1325 and to start decoding audio data stored in the buffer 1310 (which was received from the other device) in order to output it to the speaker 1330. Initializing the networking manager 1214 in some embodiments includes instructing the networking manager 1214 to establish a network connection with the other device in the video conference.
The process 1400 then determines (at 1415) whether any modules remain to be initialized. When there are modules remaining to be initialized, the process 1400 returns to operation 1410 to initialize another of the modules. When all of the required modules have been initialized, the process 1400 generates (at 1420) composite images for displaying on the dual camera mobile device (i.e., local display). These composite images may include those illustrated in Figure 65 of the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212), and can include various combinations of images from the cameras of the local dual camera mobile device and images from cameras of the other device participating in the video conference.
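The initialization loop of operations 1410 and 1415 can be sketched as follows: iterate over the modules that interact with the video conference manager and initialize each in turn until none remain. The module names mirror the text; the `Module` class and its `start()` call are illustrative assumptions.

```python
# Sketch of operations 1410/1415 of process 1400: initialize each
# interacting module, looping until no modules remain.

class Module:
    def __init__(self, name):
        self.name = name
        self.initialized = False

    def start(self):
        # Stand-in for the per-module initialization instructions
        # (start capturing, start encoding, open connection, ...).
        self.initialized = True

def initialize_conference(modules):
    pending = list(modules)      # modules remaining to be initialized
    while pending:               # operation 1415: any module left?
        pending.pop(0).start()   # operation 1410: initialize the next one
    return all(m.initialized for m in modules)

modules = [Module(n) for n in
           ("CIPU", "image processing manager",
            "audio processing manager", "networking manager")]
print(initialize_conference(modules))  # True
```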
Next, the process 1400 determines (at 1425) whether a change has been made to the video conference. Some embodiments receive changes to the video conference through user interaction with the user interface displayed on the dual camera mobile device, while other embodiments receive changes to the video conference from the other device through the networking manager 1214 (i.e., remote control). In some embodiments, changes to the video conference settings may also be received from the client application 1265 or from other modules in the video conference module 1202. The video conference settings may also change due to changes in network conditions.
When a change has been made, the process 1400 determines (at 1430) whether the change to the video conference is a change to the network settings. In some embodiments, the change is either a network setting change or an image capture setting change. When the change to the video conference is a change to the network settings, the process modifies (at 1440) the network settings and then proceeds to operation 1445. Network setting changes of some embodiments include changing the bit rate at which images are encoded or the frame rate at which images are transmitted to the other device.
When the change to the video conference is not a change to the network settings, the process 1400 determines that the change is a change to the image capture settings and proceeds to operation 1435. The process 1400 then performs (at 1435) the change to the image capture settings. In some embodiments, changes to the image capture settings may include switching cameras (i.e., switching which camera on the dual camera mobile device captures video), focus adjustment, exposure adjustment, displaying or not displaying images from one or both cameras of the dual camera mobile device, and zooming in or out on the images displayed on the dual camera mobile device, among other setting changes.
At operation 1445, the process 1400 determines whether to end the video conference. When the process 1400 determines not to end the video conference, the process 1400 returns to operation 1420. When the process 1400 determines that the video conference will end, the process 1400 ends. Some embodiments of the process 1400 determine to end the video conference when the process 1400 receives instructions from the client application 1265 to end the video conference (i.e., instructions received through the user interface of the local dual camera mobile device or received from the other device participating in the video conference).
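The change-handling branch of the loop (operations 1425 to 1445) classifies each incoming change as either a network-settings change or an image-capture-settings change, as the text describes. The sketch below illustrates only that classification; the setting names are illustrative assumptions.

```python
# Sketch of the classification at operation 1430: each change is routed
# either to the network-settings path (1440) or to the image-capture
# settings path (1435). Setting names are illustrative.

NETWORK_SETTINGS = {"bit_rate", "frame_rate"}
CAPTURE_SETTINGS = {"switch_camera", "focus", "exposure", "zoom"}

def apply_changes(changes):
    applied = {"network": [], "capture": []}
    for change in changes:
        if change in NETWORK_SETTINGS:        # operation 1430 -> 1440
            applied["network"].append(change)
        elif change in CAPTURE_SETTINGS:      # otherwise -> 1435
            applied["capture"].append(change)
        else:
            raise ValueError(f"unrecognized setting change: {change}")
    return applied

result = apply_changes(["bit_rate", "switch_camera", "exposure"])
print(result["network"], result["capture"])
# ['bit_rate'] ['switch_camera', 'exposure']
```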
In some embodiments, the video conference manager 1204 performs various operations when the video conference ends that are not shown in the process 1400. Some embodiments instruct the CIPU 1250 to stop producing images, instruct the networking manager 1214 to tear down the network connection with the other device in the video conference, and instruct the image processing manager 1208 to stop processing and encoding images.
E. Image Processing Manager & Encoder
In addition to the temporal noise reduction and image processing operations performed by the CIPU and/or CIPU driver, some embodiments perform a variety of image processing operations at the image processing layer 630 of the video conference module 625. These image processing operations may include exposure adjustment, focus adjustment, perspective correction, dynamic range adjustment, and image resizing, among others.
Figure 15 conceptually illustrates a process 1500 for performing such image processing operations. In some embodiments, some or all of the operations of the process 1500 are performed by a combination of the image processing manager 1208 and the encoder driver 1235 of Figure 12. In some such embodiments, the image processing manager 1208 performs the pixel-based processing (e.g., resizing, dynamic range adjustment, perspective correction, etc.). Some embodiments perform the process 1500 during a video conference on images that are to be transmitted to the other device participating in the video conference.
The process 1500 will now be described by reference to Figure 12. The process begins by retrieving (at 1505) an image from the buffer 1206. In some embodiments, the retrieved image is an image of a video (i.e., an image in a sequence of images). This video may have been captured by a camera of the device on which the process 1500 is performed.
Next, the process 1500 performs (at 1510) exposure adjustment on the retrieved image. Some embodiments perform exposure adjustment through a user interface that is displayed on the dual camera mobile device. Figure 16 illustrates an example exposure adjustment operation of such embodiments.
Figure 16 illustrates the exposure adjustment operation by reference to three stages 1610, 1615 and 1620 of a UI 1605 of a device 1600. The first stage 1610 illustrates the UI 1605, which includes a display area 1625 and the display area 855. As shown, the display area 1625 displays an image 1630 of a sun and a man whose face and body are dark. The dark face and body indicate that the man is not properly exposed. The image 1630 may be a video image captured by a camera of the device 1600. As shown, the display area 855 includes a selectable UI item 1650 for ending the video conference. In some embodiments, the layout of the display area 855 is the same as the layout of the display area 855 of Figure 9 described above.
The second stage 1615 illustrates the user of the device 1600 initiating an exposure adjustment operation by selecting an area of the display area 1625. In this example, the selection is made by placing a finger 1635 anywhere within the display area 1625. In some embodiments, the user selects exposure adjustment from a menu of possible image setting adjustments.
The third stage 1620 shows an image 1640 of the man after the exposure adjustment operation has been completed. As shown, the image 1640 is similar to the image 1630, but the man in the image 1640 is properly exposed. In some embodiments, the properly exposed image is an image captured after the improperly exposed image. The exposure adjustment operation initiated in the second stage 1615 adjusts the exposure of subsequent images captured by the camera of the device 1600.
Returning to Figure 15, the process 1500 next performs (at 1515) focus adjustment on the image. Some embodiments perform focus adjustment through a user interface that is displayed on the dual camera mobile device. Figure 17 conceptually illustrates an example of such a focus adjustment operation.
Figure 17 illustrates the focus adjustment operation by reference to three different stages 1710, 1715 and 1720 of a UI 1705 of a device 1700. The first stage 1710 illustrates the UI 1705, which includes a display area 1725 and the display area 855. The display area 1725 presents a blurry image 1730 of a man captured by a camera of the device 1700. The blurriness indicates that the image 1730 of the man is out of focus. That is, the lens of the camera was not focused on the man when the image 1730 of the man was captured by the camera. Also, the image 1730 may be a video image captured by a camera of the device 1700. As shown, the display area 855 includes a selectable UI item 1750 for ending the video conference. In some embodiments, the layout of the display area 855 is the same as the layout of the display area 855 of Figure 9 described above.
The second stage 1715 illustrates the user of the device 1700 initiating a focus adjustment operation by selecting an area of the display area 1725. In this example, the selection is made by placing a finger 1735 anywhere within the display area 1725. In some embodiments, the user selects focus adjustment from a menu of possible image setting adjustments.
The third stage 1720 shows an image 1740 of the man after the focus adjustment operation has been completed. As shown, the image 1740 is the same as the image 1730, but the man in the image 1740 appears sharper. This indicates that the lens of the camera is properly focused on the man. In some embodiments, the properly focused image is an image captured after the improperly focused image. The focus adjustment operation initiated in the second stage 1715 adjusts the focus of subsequent images captured by the camera of the device 1700.
Returning to Figure 15, the process 1500 then performs (at 1520) image resizing on the image. Some embodiments resize the image to reduce the number of bits used to encode the image (i.e., to lower the bit rate). In some embodiments, the process 1500 performs image resizing as illustrated in Figure 26 of the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212).
The process 1500 next performs (at 1525) perspective correction on the image. In some embodiments, the process 1500 performs the perspective correction illustrated in Figure 24 of the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212). Such perspective correction involves using data obtained by one or more accelerometer and/or gyroscope sensors that identify the orientation and movement of the dual camera mobile device. This data is then used to modify the image in order to correct for an improper perspective.
After perspective correction is performed on the image, the process 1500 adjusts (at 1530) the dynamic range of the image. In some embodiments, the dynamic range of an image is the range of possible values that each pixel in the image can have. For example, an image with a dynamic range of 0-255 can be adjusted to a range of 0-128, or to any other range of values. Adjusting the dynamic range of an image can reduce the number of bits that will be used to encode the image (i.e., lower the bit rate) and thereby smooth out the image.
Adjusting the dynamic range of an image can also be used for various other purposes. One purpose is to reduce image noise (e.g., when the image was captured by a noisy camera sensor). To reduce noise, the dynamic range of the image can be adjusted so that the black level is redefined to include lighter blacks (i.e., to crush blacks). In this manner, the noise of the image is reduced. Another purpose of dynamic range adjustment is to adjust one or more colors or ranges of colors in order to enhance the image. For instance, some embodiments may assume that the image captured by the front camera is an image of a person's face. Accordingly, the dynamic range of the image can be adjusted to enhance the red and pink colors to make the person's cheeks appear rosy/rosier. The dynamic range adjustment operation can be used for other purposes as well.
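The two dynamic range operations discussed above, compressing the 0-255 range into a smaller one and crushing near-black values to reduce noise, can be sketched on a flat list of pixel values. This is a simplified per-pixel sketch; real pipelines operate per channel on full images, and the crush threshold here is an assumed value.

```python
def adjust_dynamic_range(pixels, new_max=128, black_crush=16):
    """Rescale 0-255 pixel values into 0..new_max, first crushing any
    value at or below black_crush to pure black.

    A simplified sketch of the dynamic range adjustment described above;
    the black_crush threshold of 16 is an illustrative assumption.
    """
    out = []
    for p in pixels:
        p = 0 if p <= black_crush else p        # crush near-blacks (sensor noise)
        out.append(round(p * new_max / 255))    # compress 0-255 into 0-new_max
    return out

print(adjust_dynamic_range([0, 10, 16, 64, 128, 255]))
# [0, 0, 0, 32, 64, 128]
```

Reducing the range this way leaves fewer distinct pixel values for the encoder to represent, which is why it tends to lower the encoded bit rate.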
Finally, the process 1500 determines (at 1535) one or more rate controller parameters that are used to encode the image. In some embodiments, such rate controller parameters may include a quantization parameter and a frame type (e.g., predictive frame, bi-directional frame, intra-coded frame). The process then ends.
While the various operations of the process 1500 are illustrated as being performed in a specific order, one of ordinary skill will recognize that many of these operations (exposure adjustment, focus adjustment, perspective correction, etc.) can be performed in any order and are not dependent on one another. That is, the process of some embodiments could perform focus adjustment before exposure adjustment, or similar modifications could be made to the process illustrated in Figure 15.
F. Networking Manager
Figure 18 conceptually illustrates the software architecture of a networking manager 1800 of some embodiments, such as the networking manager 1214 illustrated in Figure 12. As described above, the networking manager 1800 manages network connections (e.g., connection establishment, connection monitoring, connection adjustments, connection teardown, etc.) between the dual camera mobile device on which it operates and a remote device in a video conference. During the video conference, the networking manager 1800 of some embodiments also processes data for transmission to the remote device and processes data received from the remote device.
As shown in Figure 18, the networking manager 1800 includes a session negotiating manager 1805, a transmitter module 1815, a universal transmission buffer 1820, a universal transmission buffer manager 1822, a virtual transport protocol (VTP) manager 1825, a receiver module 1830, and a media transport manager 1835.
The session negotiating manager 1805 includes a protocol manager 1810. The protocol manager 1810 ensures that the transmitter module 1815 uses a correct communication protocol to transmit data to the remote device during the video conference and enforces the rules of the communication protocol that is used. Some embodiments of the protocol manager 1810 support a variety of communication protocols, such as a real-time transport protocol (RTP), a transmission control protocol (TCP), a user datagram protocol (UDP), and a hypertext transfer protocol (HTTP), among others.
The session negotiating manager 1805 is responsible for establishing connections between the dual camera mobile device and the one or more remote devices participating in the video conference, as well as for tearing down these connections after the conference. In some embodiments, the session negotiating manager 1805 is also responsible for establishing multimedia communication sessions (e.g., to transmit and receive video and/or audio streams) between the dual camera mobile device and the remote devices in the video conference (e.g., using a session initiation protocol (SIP)).
The session negotiating manager 1805 also receives feedback data from the media transport manager 1835 and, based on the feedback data, determines the operation of the universal transmission buffer 1820 (e.g., whether to transmit or drop packets/frames) through the universal transmission buffer manager 1822. In some embodiments, this feedback may include one-way latency and a bandwidth estimation bit rate. In other embodiments, the feedback includes packet loss information and round-trip delay time (e.g., determined based on packets sent to the remote device in the video conference and the receipt of acknowledgements from that device). Based on the information from the media transport manager 1835, the session negotiating manager 1805 can determine whether too many packets are being sent and instruct the universal transmission buffer manager 1822 to have the universal transmission buffer 1820 transmit fewer packets (i.e., to adjust the bit rate).
The transmitter module 1815 retrieves encoded images (e.g., as a bitstream) from a video buffer (e.g., the buffer 1212 of Figure 12) and packetizes the images for transmission to the remote device in the video conference through the universal transmission buffer 1820 and the virtual transport protocol manager 1825. The manner in which the encoded images are created and sent to the transmitter module 1815 can be based on instructions or data received from the media transport manager 1835 and/or the session negotiation manager 1805. In some embodiments, packetizing the images involves breaking the received bitstream into a group of packets, each having a particular size (i.e., a size specified by the session negotiation manager 1805 according to a particular protocol), and adding any required headers (e.g., address headers, protocol specification headers, etc.).
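The packetization step just described—breaking a bitstream into packets of a protocol-specified size and prepending any required headers—can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation; the function name, the 4-byte `HDR:` stand-in header, and the sizes are all hypothetical.

```python
def packetize(bitstream: bytes, payload_size: int, header: bytes) -> list:
    """Split an encoded-image bitstream into fixed-size payloads, each
    prefixed with the required header (address/protocol headers, etc.)."""
    packets = []
    for offset in range(0, len(bitstream), payload_size):
        packets.append(header + bitstream[offset:offset + payload_size])
    return packets

frame = bytes(2500)                       # stand-in for one encoded image
pkts = packetize(frame, payload_size=1000, header=b"HDR:")
```

The receiver can reconstruct the original bitstream by stripping the headers and concatenating the payloads, which mirrors the depacketization performed by the receiver module 1830 described later.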
The universal transmission buffer manager 1822 controls the operation of the universal transmission buffer 1820 based on data and/or instructions received from the session negotiation manager 1805. For example, the universal transmission buffer manager 1822 can be instructed to direct the universal transmission buffer 1820 to transmit data, stop transmitting data, drop data, and so on. As described above, in some embodiments, when a remote device participating in the conference appears to be losing packets, this condition can be identified from the acknowledgments received from the remote device. To reduce the packet loss, the universal transmission buffer manager 1822 can be instructed to transmit packets to the remote device at a lower rate.
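The transmit/stop/drop control relationship between the buffer manager and the buffer can be sketched as below. This is a hypothetical sketch of the described behavior only; the class and method names are invented, and a real buffer would be driven by network events rather than a scripted instruction list.

```python
from collections import deque

class UniversalTransmissionBuffer:
    """Holds packetized data; a buffer manager tells it to transmit or
    drop packets based on session-negotiation feedback (hypothetical)."""
    def __init__(self):
        self.queue = deque()
        self.sent, self.dropped = [], 0

    def enqueue(self, packet: bytes) -> None:
        self.queue.append(packet)

    def apply_instruction(self, instruction: str) -> None:
        # "transmit" forwards the oldest packet; anything else drops it
        if not self.queue:
            return
        pkt = self.queue.popleft()
        if instruction == "transmit":
            self.sent.append(pkt)
        else:
            self.dropped += 1

buf = UniversalTransmissionBuffer()
for p in (b"p1", b"p2", b"p3"):
    buf.enqueue(p)
# Remote device reported loss -> manager orders one drop, then transmits
for instr in ("drop", "transmit", "transmit"):
    buf.apply_instruction(instr)
```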
The universal transmission buffer 1820 stores the data received from the transmitter module 1815 and sends that data to the remote device through the VTP manager 1825. As noted above, the universal transmission buffer 1820 may drop data (e.g., images of the video) based on instructions received from the universal transmission buffer manager 1822.
In some embodiments, RTP is used to communicate data packets (e.g., audio packets and video packets) over UDP during the video conference. Other embodiments use RTP to communicate data packets over TCP during the video conference. Yet other transport-layer protocols can be used in various embodiments.
Some embodiments define a particular communication channel between two mobile devices by a pair of port numbers (i.e., a source port number and a destination port number). For instance, one communication channel between the mobile devices can be defined by one pair of port numbers (e.g., source port 50 and destination port 100), while another, different communication channel between the devices can be defined by a different pair of port numbers (e.g., source port 75 and destination port 150). Some embodiments also use a pair of Internet Protocol (IP) addresses in defining the communication channels. In some embodiments, different communication channels are used to transmit different types of data packets. For example, video data packets, audio data packets, and control signaling packets can be transmitted in separate communication channels. As such, a video communication channel transmits the video data packets and an audio communication channel transmits the audio data packets.
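The channel-definition scheme above can be sketched as a simple data structure: each channel is identified by an IP-address pair plus a port-number pair, and each traffic type is routed to its own channel. The names, addresses, and the port numbers reused from the example are illustrative only.

```python
from collections import namedtuple

# A channel is defined by a pair of IP addresses and a pair of port numbers
Channel = namedtuple("Channel", ["src_ip", "dst_ip", "src_port", "dst_port"])

# Separate channels per traffic type, using the example port pairs above
video_channel = Channel("10.0.0.1", "10.0.0.2", 50, 100)
audio_channel = Channel("10.0.0.1", "10.0.0.2", 75, 150)

def route(packet_type: str, channels: dict) -> Channel:
    """Pick the channel whose traffic type matches the packet."""
    return channels[packet_type]

channels = {"video": video_channel, "audio": audio_channel}
```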
In some embodiments, a control communication channel is used for messaging between the local mobile device and the remote device during the video conference. Examples of such messaging include sending and receiving requests, notifications, and acknowledgments of those requests and notifications. Another example of the messaging is the transmission of remote-control instruction messages from one device to the other. For instance, the remote-control operations described in the above-incorporated U.S. Patent Application **, entitled "Establishing Video Conference During a Phone Call" (attorney docket APLE.P0212) (e.g., instructing a device to send only images from one particular camera, or to capture images with only a particular camera) can be performed by sending instructions from the local device to the remote device through the local device's control communication channel to remotely control the operations of the remote device. Different embodiments implement this control communication using different protocols, such as the Real-time Transport Control Protocol (RTCP), an RTP extension, SIP, etc. For instance, some embodiments use an RTP extension to relay one set of control messages between the two mobile devices in the video conference, and use SIP packets to relay another set of control messages between the two devices during the video conference.
The VTP manager 1825 of some embodiments allows different types of data packets that are specified to be transmitted through different communication channels (e.g., using different pairs of port numbers) to be transmitted through a single communication channel (e.g., using the same pair of port numbers). One technique for doing this involves identifying the data packet types, identifying the communication channel through which each data packet is specified to be transmitted by extracting the data packet's specified pair of port numbers, and designating the data packet for transmission through the single communication channel by modifying the data packet's pair of port numbers to be the pair of port numbers of the single communication channel (i.e., all the data packets are then transmitted through the same pair of port numbers).
To keep track of the original pair of port numbers for each type of data packet, some embodiments store a mapping of the original pair of port numbers for each data packet type. Some of these embodiments then use the packet type field of the protocol to distinguish the different packets that are multiplexed into the one communication channel. For instance, some embodiments having the VTP manager multiplex audio, video, and control packets into one RTP stream use the RTP packet type field to distinguish the audio, video, and control packets that are transmitted to the other device in the video conference in the one RTP channel. In some of these embodiments, the VTP manager also routes control messaging in SIP packets to the other device.
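The multiplexing technique described above—rewrite each packet's port pair to the single channel's port pair, tag the packet with a type field, and remember the original ports in a mapping—can be sketched as follows. The port numbers, the dict-based packet representation, and the type codes are hypothetical; real RTP payload-type values would come from session negotiation.

```python
VIRTUAL_PORTS = (4000, 4001)            # the single channel's port pair
PAYLOAD_TYPES = {"audio": 0, "video": 1, "control": 2}

port_map = {}                           # packet type -> original port pair

def mux(packet_type: str, ports: tuple, payload: bytes) -> dict:
    """Rewrite the packet's port pair to the virtual channel's pair and
    tag it with a type field so it can be demultiplexed later; the
    original ports are saved in port_map for restoration on receipt."""
    ptype = PAYLOAD_TYPES[packet_type]
    port_map[ptype] = ports
    return {"ports": VIRTUAL_PORTS, "type": ptype, "payload": payload}

muxed = [mux("audio", (5004, 5005), b"a"),
         mux("video", (5006, 5007), b"v")]
```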
Some embodiments identify and examine the data packet signatures (i.e., packet header formats) to distinguish different packets that are communicated using different protocols (e.g., to differentiate packets transported using RTP from packets transported using SIP). In such embodiments, after the data packets of the different protocols are determined, the fields of the data packets that use the same protocol (e.g., the audio data and the video data using RTP) are examined as described above to identify the different data types. In this manner, the VTP manager 1825 transmits, through a single communication channel, different data packets that are intended to be transmitted through different communication channels.
Although one way of combining different types of data through a single communication channel is described above, other embodiments use other techniques to multiplex different packet types into one communication stream. For example, one technique of some embodiments involves keeping track of the original pair of port numbers of a data packet and storing that original pair of port numbers in the data packet itself, to be extracted later. Still other techniques exist for combining different types of data between two video conference participants into one port-pair channel.
When the VTP manager 1825 receives data packets from the remote device through the virtualized communication channel, the VTP manager 1825 examines the signatures of the data packets to identify the different packets that are sent using the different protocols. Such signatures can be used to differentiate SIP packets from RTP packets. The VTP manager of some embodiments also uses the packet type field of some or all of the packets to demultiplex the various different types of packets (e.g., audio, video, and control packets) that were multiplexed into the single virtualized channel. After identifying these different types of packets, the VTP manager associates each different type of packet with its corresponding pair of port numbers based on the mapping of port numbers and packet types that it keeps. The VTP manager 1825 then modifies the pair of port numbers of the data packets to be the identified pair of port numbers, and forwards the data packets to be depacketized. In other embodiments that use different techniques for multiplexing different packet types into the single channel, the VTP manager uses correspondingly different techniques for parsing out the packets.
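The receive-side counterpart—look up the packet's type field in the kept mapping and restore its original port pair so it can be delivered on its own channel—can be sketched like this. The sketch is self-contained (the mapping is written out literally here, whereas in the description it is populated at multiplexing time), and all names and port numbers are hypothetical.

```python
def demux(packet: dict, port_map: dict) -> dict:
    """Identify the packet's type field, look up the original port pair
    saved when the packet was multiplexed, and return a copy of the
    packet rewritten to travel on its own communication channel."""
    original_ports = port_map[packet["type"]]
    return dict(packet, ports=original_ports)

# Mapping kept by the VTP manager: packet type -> original port pair
port_map = {0: (5004, 5005), 1: (5006, 5007), 2: (5060, 5061)}

incoming = {"ports": (4000, 4001), "type": 1, "payload": b"frame"}
restored = demux(incoming, port_map)
```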
By using such techniques for multiplexing and demultiplexing the different packets, the VTP manager 1825 creates a single virtualized communication channel (e.g., a single pair of port numbers), transmits the video data, audio data, and control signaling data through that single virtualized communication channel, and receives audio, video, and control packets from the remote device through that same channel. Thus, from the perspective of the network, data is transmitted through this single virtualized communication channel, while from the perspective of the session negotiation manager 1805 and the protocol manager 1810, the video data, audio data, and control signaling data are transmitted through different communication channels.
Similar to the images that are transmitted to the remote device in the video conference, the images transmitted from the remote device in the video conference are received in packet format. The receiver module 1830 receives the packets and depacketizes them in order to reconstruct the images, before storing the images in a video buffer (e.g., the buffer 1216 of Figure 12) to be decoded. In some embodiments, depacketizing the images involves removing any headers and reconstructing, from the packets, a bitstream that has only image data (and possibly size data).
The media transport manager 1835 processes feedback data received from the network (e.g., one-way latency, bandwidth-estimation bit rate, packet-loss data, round-trip delay data, etc.) to dynamically and adaptively adjust the rate of data transmission (i.e., the bit rate). In some other embodiments, the media transport manager 1835 also controls error resilience based on the processed feedback data, and can send the feedback data to the video conference manager 1204 in order to adjust other operations of the video conference module 1202, such as scaling, resizing, and encoding. In addition to having the universal transmission buffer drop packets when the remote device in the conference cannot process all of the packets, the video conference module and encoder can use a lower bit rate for encoding the images so that fewer packets are sent for each image.
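A minimal sketch of the adaptive bit-rate adjustment described above: back off when the feedback shows packet loss above a threshold, otherwise probe gently upward while staying under the estimated bandwidth. The threshold, backoff factor, and probe factor are invented for illustration; they are not values given in the description.

```python
def adjust_bit_rate(current_kbps: float, feedback: dict,
                    loss_threshold: float = 0.02,
                    backoff: float = 0.75,
                    probe: float = 1.05) -> float:
    """Lower the bit rate when packet loss exceeds a threshold; otherwise
    increase slowly, capped by the bandwidth-estimation feedback."""
    if feedback["packet_loss"] > loss_threshold:
        return current_kbps * backoff
    return min(current_kbps * probe, feedback["bandwidth_kbps"])

# 5% loss reported by the remote device -> the rate is reduced
rate = adjust_bit_rate(800, {"packet_loss": 0.05, "bandwidth_kbps": 1200})
```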
In some embodiments, the media transport manager 1835 also monitors other variables of the device, such as power consumption and thermal levels, that may affect how the operational power modes of the cameras are configured, as discussed above. These data can also be used as additional inputs to the feedback data (e.g., if the device is getting too hot, the media transport manager 1835 can try to have the processing slowed down).
Several example operations of the networking manager 1800 will now be described by reference to Figure 12. The transmission of images captured by a camera of the dual-camera mobile device to a remote device in the video conference is described first, followed by the reception of images from the remote device. The transmitter module 1815 retrieves, from the buffer 1212, the encoded images that are to be transmitted to the remote device in the video conference.
The protocol manager 1810 determines the appropriate protocol to use (e.g., RTP for transmitting audio and video), and the session negotiation manager 1805 informs the transmitter module 1815 of that protocol. Next, the transmitter module 1815 packetizes the images and sends the packetized images to the universal transmission buffer 1820. The universal transmission buffer manager 1822 receives instructions from the session negotiation manager 1805 to direct the universal transmission buffer 1820 to transmit or drop the images. The VTP manager 1825 receives the packets from the universal transmission buffer 1820 and processes the packets in order to transmit them through a single communication channel to the remote device.
When receiving images from the remote device, the VTP manager 1825 receives the packetized images from the remote device through the virtualized single communication channel, and processes the packets in order to direct the images, through the communication channel allotted for receiving the images (e.g., the video communication channel), to the receiver module 1830.
The receiver module 1830 depacketizes the packets to reconstruct the images, and sends the images to the buffer 1216 to be decoded by the decoder 1260. The receiver module 1830 also forwards control signaling messages (e.g., acknowledgments of packets received from the remote device in the video conference) to the media transport manager 1835.
Several example operations of the networking manager 1800 were described above. These are only illustrative examples, as various other embodiments will perform these or different operations using different modules, or with the various functionalities distributed differently across the modules. Furthermore, additional operations, such as dynamic bit rate adjustment, may be performed by the modules of the networking manager 1800 or by other modules.
IV. In-Conference Adjustment and Control Operations
A. Picture-in-Picture Modifications
1. Rotation
Some embodiments rotate the PIP display that is presented during a video conference when a user of the mobile device used for the video conference rotates the device during the conference. Figure 19 illustrates the rotation of the UI 805 of a device 1900 when the device is rotated from a vertical position to a horizontal position. The device 1900 is held vertically when the long side of the screen is vertical, and is held horizontally when the long side of the screen is horizontal. In the example illustrated in Figure 19, the UI 805 rotates from a portrait view that is optimized for holding the device vertically to a landscape view that is optimized for holding the device 1900 horizontally. This rotation functionality allows the user to view the UI 805 displayed in an upright position whether the mobile device 1900 is held vertically or horizontally.
Figure 19 illustrates the rotation of the UI 805 in terms of six different operational stages 1910, 1915, 1920, 1925, 1930, and 1935. The first stage 1910 illustrates the UI 805 during a video conference between the local user of the device and a remote user of a remote device. The UI 805 in Figure 19 shows a PIP display 880 that is the same PIP display shown in the fifth stage of Figure 8 after the video conference has been established. In this example, the video captured by the local user's device is displayed in the inset display area 860, and the video captured by the remote user's device is displayed in the background display area 870. The display area 855 below the PIP display 880 includes a selectable UI item 1985 (e.g., an "End Conference" button 1985) that the user can select (e.g., through a single-finger tap) to end the video conference.
The second stage 1915 illustrates the UI 805 after the user begins to tilt the device 1900. In this example, the user begins to tilt the device 1900 from being held vertically toward being held horizontally, as indicated by the arrow 1960. The appearance of the UI 805 has not changed. In other situations, the user might instead want to tilt the device 1900 from being held horizontally to being held vertically; in these situations, the UI 805 switches from a horizontally optimized view to a vertically optimized view.
The third stage 1920 illustrates the UI 805 in a state after the device 1900 has been tilted from being held vertically to being held horizontally. In this state, the appearance of the UI 805 still has not changed. In some embodiments, the rotation operation is triggered after the device 1900 is tilted past a threshold amount and remains past that point for a duration of time. In the example illustrated in Figure 19, it is assumed that the threshold amount and the speed of the rotation do not cause the UI 805 to rotate until a short time interval after the device has been placed in the horizontal position. Different embodiments have different threshold amounts and waiting periods for triggering the rotation operation. For example, some embodiments may have such a low threshold for triggering the rotation operation that the UI 805 appears always to be displayed upright, regardless of the orientation of the device 1900. In other embodiments, the user of the device 1900 may specify when the rotation operation is to be triggered (e.g., through a menu preference setting). Also, some embodiments may not delay the rotation after the device is tilted past the threshold amount. Moreover, different embodiments may allow the rotation operation to be triggered in different ways, such as by toggling a switch on the mobile device, by giving voice commands, through a menu selection, etc.
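The threshold-plus-delay trigger logic described above can be sketched as a small predicate: rotate only once the tilt exceeds a threshold and has stayed past it for a waiting period. The 45° threshold and 0.3-second hold time are invented defaults for illustration; the description leaves both amounts embodiment-specific.

```python
def should_rotate(tilt_degrees: float, held_seconds: float,
                  threshold: float = 45.0, hold_time: float = 0.3) -> bool:
    """Trigger the rotation operation only after the device has tilted
    past the threshold and stayed there for the required interval."""
    return tilt_degrees > threshold and held_seconds >= hold_time

# Samples of (tilt, seconds past threshold) as the user turns the device
samples = [(10, 0.0), (50, 0.1), (88, 0.4)]
triggered = [should_rotate(t, s) for t, s in samples]
```

Setting `threshold` very low approximates the embodiments in which the UI always appears upright regardless of device orientation.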
The fourth stage 1925 illustrates the UI 805 after the rotation operation has started. Some embodiments animate the rotating display areas to provide the user with feedback regarding the rotation operation. Figure 19 illustrates an example of one such animation. Specifically, Figure 19 shows in its fourth stage 1925 that the display areas 880 and 855 start to rotate together. The display areas 880 and 855 rotate around an axis 1965 (i.e., the z-axis) that passes through the center of the UI 805. The display areas 880 and 855 rotate by the same amount, but in the direction opposite to the rotation of the device 1900 (e.g., through the tilting of the device 1900). In this example, since the device 1900 has been rotated 90° clockwise (by going from being held vertically to being held horizontally), the rotation operation causes the display areas 880 and 855 to rotate 90° counterclockwise. As the display areas 880 and 855 rotate, they shrink proportionally to fit the UI 805 so that the display areas 880 and 855 may still appear entirely within the UI 805. Some embodiments may provide a message to indicate the state of the device 1900 (e.g., by displaying the word "Rotating").
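One way to drive this animation frame by frame is to interpolate the counter-rotation angle while dipping the scale toward a minimum at the midpoint so the rotating areas stay entirely on screen, recovering to full size at the end. This is a hypothetical sketch of the described effect; the `min_scale` value and the linear easing are invented.

```python
def rotation_frame(progress: float, total_angle: float = -90.0,
                   min_scale: float = 0.75) -> tuple:
    """For an animation progress in [0, 1], return the display areas'
    rotation angle (counterclockwise, opposite the device's clockwise
    turn) and a proportional scale that keeps them fully on screen."""
    angle = total_angle * progress
    # Shrink toward the animation's midpoint, then recover to full size
    scale = 1.0 - (1.0 - min_scale) * (1.0 - abs(2.0 * progress - 1.0))
    return angle, scale
```

A renderer would apply the returned angle and scale to the display areas 880 and 855 each frame as `progress` advances from 0 to 1.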
The fifth stage 1930 illustrates the UI 805 after the display areas 880 and 855 have rotated 90° counterclockwise from portrait view to landscape view. In this stage, the display areas 880 and 855 have been rotated but have not yet expanded across the full width of the UI 805. The arrows 1975 indicate that, at the end of the fifth stage, the display areas 880 and 855 will start to laterally expand to fit the full width of the UI 805. Different embodiments may not include this stage, since the expansion could be performed simultaneously with the rotation in the fourth stage 1925.
The sixth stage 1935 illustrates the UI 805 after the display areas 880 and 855 have been expanded to occupy the full display of the UI 805. As mentioned above, other embodiments may implement this rotation differently. For some embodiments, simply rotating the screen of the device past a threshold amount triggers the rotation operation, regardless of the orientation of the device 1900.
Also, other embodiments might provide a different animation for indicating the rotation operation. The rotation operation performed in Figure 19 involves the display areas 880 and 855 rotating about the center of the UI 805. Alternatively, the display areas may be individually rotated about the central axes of their individual display areas. One such approach is shown in Figure 20, which illustrates an alternative method of animating the rotation of the display areas 870 and 860 of the PIP display 880 of the UI 805. The PIP display 880 illustrated in Figure 20 is the same PIP display 880 illustrated in Figure 8.
Figure 20 illustrates the rotation of the PIP display 880 in terms of six different operational stages 1910, 1915, 1920, 2025, 2030, and 2035. The operations of the UI 805 in the first three stages are identical to those of the first three stages described for the UI 805 in Figure 19. In the third stage of both Figures 19 and 20, the device 2000 has gone from being held vertically to being held horizontally, and the rotation of the UI 805 has not yet started.
The fourth stage 2025 illustrates the alternative method of animating the rotation. In this stage, the rotation operation has started. Specifically, the fourth stage 2025 shows the start of the rotation of the display areas 870 and 860. The display areas 870 and 860 rotate around axes 2067 and 2065, respectively (i.e., the z-axis), each of which passes through the center of its display area. The display areas 870 and 860 rotate by the same amount, but in the direction opposite to the rotation of the device 2000 (e.g., through the tilting of the device 2000). Similar to what is illustrated in the fourth stage 1925 of Figure 19 above, since the device 2000 has been rotated 90° clockwise (by going from being held vertically to being held horizontally), the rotation operation causes the display areas 870 and 860 to rotate 90° counterclockwise. As the display areas 870 and 860 rotate, they are scaled to fit the UI 805 so that the display areas 870 and 860 may still appear entirely on the UI 805.
The fifth stage 2030 illustrates the UI 805 after each of the display areas 870 and 860 has rotated 90° counterclockwise from portrait view to landscape view. In this stage, the display areas 870 and 860 have been rotated but have not yet expanded across the full width of the UI 805. Moreover, the display area 860 has not yet moved into its final position. The final position of the inset display area 860 in the PIP display 880 is determined by the position of the inset display area 860 in the PIP display 880 as shown in the first stage 1910 (e.g., the inset display area 860 in the lower left corner of the PIP display 880). In this stage, the inset display area 860 is still in the upper left corner of the UI 805.
The arrows 2080 indicate that, at the end of the fifth stage 2030, the display areas 870 and 860 will start to laterally expand until the main display area 870 fits the full width of the UI 805 for a device that is held horizontally. Moreover, the arrow 2075 indicates that the inset display area 860 will slide to the lower left corner of the PIP display 880.
Different embodiments may implement this differently. In some embodiments, the moving of the inset display area 860 may occur simultaneously with the expansion of the main display area 870, or sequentially. Moreover, some embodiments may resize the inset display area 860 before, during, or after the expansion of the main display area 870 to create the new PIP display 880. In this example, the display area 855 disappears while the display areas 860 and 870 are rotating. However, in some embodiments, the display area 855 may remain on the UI 805 during the rotation and rotate along with the display areas 860 and 870.
The sixth stage 2035 illustrates the UI 805 after the inset display area 860 has reached its new position and the display areas 860 and 870 have been properly expanded to fit the full width of the UI 805. In this example, the inset display area 860 is now in the lower left corner of the PIP display 880, overlapping the main display area 870. The PIP display 880 now has the same display arrangement as the PIP display 880 of the first stage 1910. The appearance of the display area 855 below the PIP display 880 in the sixth stage indicates that the rotation operation is completed. As noted above, simply rotating the screen of the device past a threshold amount may trigger the rotation operation, regardless of the orientation of the device 2000.
In the examples described above by reference to Figures 19 and 20, the orientation of the display area 870 also changes (i.e., from portrait to landscape). That is, after the display area 870 is rotated in the third stage 1920, the orientation of the display area 870 changes from portrait to landscape by laterally expanding the PIP display 880 so that it fills the entire UI 805. In some embodiments, when the device 2000 is rotated, the video captured by the remote device rotates, but the orientation of the display area that displays the remote device's captured video remains unchanged. One such example is illustrated in Figure 21, which is similar to Figure 20 except that the video displayed in the display area 870 rotates while the display area 870 remains displayed in portrait orientation.
Figure 21 also illustrates an example of a rotation operation in which the display area 855 remains in the same position (instead of rotating and expanding horizontally to fill the PIP display 880 as shown in Figure 20). Moreover, Figure 21 includes a layout of the display area 855 that is the same as the layout of the display area 855 described above in Figure 9. As shown, the display area 855 remains in the same position as the device 2000 rotates in the stages 2140, 2145, 2150, 2155, 2185, and 2190.
Some embodiments provide a rotation operation in which the orientation of the display area that displays video captured by the local device changes (instead of remaining in the same orientation as shown in Figure 20) to reflect the orientation of the local device after the rotation operation is performed on the local device. Figure 21 illustrates an example of this rotation operation of the UI 805 by reference to six different stages 2140, 2145, 2150, 2155, 2185, and 2190. In Figure 21, the first stage 2140 shows the inset display area 860, which displays video captured by a camera of the device 2000, in a portrait orientation. The second and third stages 2145 and 2150 are similar to the second and third stages 1915 and 1920 of Figure 20, as they show the tilting of the device 2000 at various stages of the rotation operation. At this point, the camera of the device 2000 is capturing images in a landscape orientation. To indicate this transition, some embodiments provide an animation as shown in the fourth and fifth stages 2155 and 2185, while other embodiments do not provide any animation at all.
In the fourth stage 2155, the image displayed in the inset display area 860 is rotated, but not the inset display area 860 itself, since the tilting of the device 2000 in the second and third stages 2145 and 2150 has already rotated the inset display area 860 to a landscape orientation. In the fifth stage 2185, the rotated image in the inset display area 860 is horizontally expanded to fill the inset display area 860, and the inset display area 860 starts to move toward the lower left region of the PIP display 880, in order to position the inset display area 860 in the same relative position in the PIP display as the inset display area 860 in the first stage 2140.
In some embodiments, the orientation of the display area that displays video captured by the remote device also changes to reflect the orientation of the remote device after a rotation operation is performed on the remote device. Figure 22 illustrates four different stages of the UI 805 of the device 2000, in which (1) the orientation of the display area that displays video captured by the local device (the display area 860 in this example) changes to reflect the orientation of the local device after a rotation operation is performed on the local device, and (2) the orientation of the display area that displays video captured by the remote device (the display area 870 in this example) changes to reflect the orientation of the remote device after a rotation operation is performed on the remote device.
In the first stage 2205, the UI 805 is the same as the UI 805 in Figure 21. Specifically, the first stage 2205 shows the display areas 860 and 870 in a portrait orientation, because the device 2000 is shown in a portrait orientation and the remote device is in a portrait orientation (not shown). From the first stage 2205 to the second stage 2210, a rotation operation is performed on the local device by rotating the device 2000 90° from an upright position to a sideways position. The second stage 2210 shows the UI 805 after the rotation operation of the device 2000 is completed. In this stage, the videos displayed in the display areas 870 and 860 have rotated to an upright position. However, only the display area 860 of the locally captured video has rotated from a portrait orientation to a landscape orientation, since the rotation operation was only performed on the local device (i.e., the device 2000). The display area 870 remains in the portrait orientation.
From the second stage 2210 to the third stage 2215, a rotation operation is performed on the remote device by rotating the remote device from an upright position to a sideways position (not shown). The third stage 2215 shows the UI 805 after the rotation operation of the remote device is completed. In this stage, the video displayed in the display area 870 and the display area 870 of the remotely captured video have rotated from a portrait orientation to a landscape orientation, since the rotation operation was only performed on the remote device. Thus, this stage of the UI 805 displays the display areas 870 and 860 of the locally and remotely captured videos both in landscape orientation.
From the third stage 2215 to the fourth stage 2220, a rotation operation is performed on the local device by rotating the device 2000 90° from a sideways position to an upright position. The fourth stage 2220 shows the UI 805 after the completion of this rotation operation. In this fourth stage 2220, the videos displayed in the display areas 860 and 870 have rotated to an upright position. However, only the display area 860 of the locally captured video has rotated from a landscape orientation to a portrait orientation, since the rotation operation was only performed on the local device (i.e., the device 2000). The display area 870 remains in the landscape orientation.
From the fourth stage 2220 to the first stage 2205, a rotation operation is performed on the remote device by rotating the remote device from a sideways position to an upright position (not shown). In this case, the first stage 2205 shows the display area 870 after the completion of this rotation operation. Therefore, the UI 805 of this stage shows the display areas 860 and 870 in a portrait orientation. Although Figure 22 illustrates a sequence of different rotation operations, other embodiments can perform any number of rotation operations in any number of different sequences.
Figures 19, 20, 21, and 22 describe rotation operations performed on local and remote devices during a video conference. When a rotation operation is performed on the local mobile device, some embodiments notify the remote device of the rotation operation so that the remote device can perform any modifications to the local device's video (such as rotating the display area that is displaying the local device's video). Similarly, when a rotation operation is performed on the remote device, the remote device notifies the local device of this operation so that the local device can perform any modifications to the remote device's video. Some embodiments provide a control communication channel for communicating the notification of rotation operations between the local and remote devices during the video conference.
Even though Figures 19, 20, 21, and 22 illustrate different manners in which the animation of a rotation can be performed, one of ordinary skill will realize that other embodiments may display the animation of the rotation in other manners. In addition, the animation of the rotation operation can cause changes to the image processing operations of the local mobile device, such as causing the video conference manager 1204 to re-composite the display areas at different angles in the UI 805 and to scale the images displayed in the display areas.
2. Identifying a Region of Interest
Some embodiments allow a user to identify a region of interest (ROI) in a displayed video during a video conference, in order to modify the image processing (e.g., the image processing manager 1208 in Figure 12), the encoding (e.g., the encoder 1255 in Figure 12), the behavior of the mobile devices and their cameras during the video conference, or a combination thereof. Different embodiments provide different techniques for identifying such a region of interest in a video. Figure 23 illustrates a user interface of some embodiments for identifying a region of interest in a video in order to improve the image quality of the video.
In Figure 23, a UI 2300 of a mobile device 2325 presents a PIP display 2365 during a video conference with a remote user of another mobile device. The PIP display in Figure 23 includes two video displays: a background main display 2330 and a foreground inset display 2335. In this example, the background main display 2330 presents a video of a tree and a person wearing a hat, which are assumed to be a tree and a person whose video is being captured by the remote device's front camera, or by the remote device's back camera. The foreground inset display 2335 presents a video of a man, who in this example is assumed to be a man whose video is being captured by the local device's front camera, or a person whose video is being captured by the local device's back camera. Below the PIP display is a display area 855 that includes a selectable UI item 2360 labeled "End Conference" (e.g., button 2360), which allows the user to end the video conference by selecting this item.
This PIP display just presents a kind of mode of the synthesis view of the video taken by remote equipment and local device.Some embodiments can provide other synthesis view.Such as, replace the comparatively overall background had for the video from remote equipment to show, larger background display can be the video from local device, and less prospect insertion display can be the video from remote equipment.In addition, some embodiments allow local video and long-distance video to appear in UI two viewing area (such as, left and right display window, or display window up and down) side by side, or in the viewing area of two diagonal angle arrangements.In other embodiments, PIP display also can comprise a larger background display, and the prospect insertion display that two less.In certain embodiments, PIP display mode or acquiescence display mode can be specified by user.
Figure 23 illustrates the ROI identification operation in terms of four operational stages of the UI 2300. As shown in the first stage 2305, the video presented in the background display 2330 has very low quality (i.e., the video images are blurry). In this example, the user of the mobile device 2325 would like to identify the area of the background display 2330 in which the person's face 2370 appears as the region of interest.
In the second stage 2310, the operation of identifying the region of interest is initiated. In this example, the operation is initiated by selecting, within the video presented in the background display 2330, the area that the user wants to identify as the region of interest (e.g., by tapping a finger 2350 on the device's screen at a position near the displayed person's face 2370 in the background display 2330).
As shown in the third stage 2315, the user's area selection causes the UI 2300 to draw an enclosure 2375 (e.g., a dashed rectangle 2375) around the area the user selected. The fourth stage 2320 shows the UI 2300 after the identification of the region of interest has been completed. As a result of this process, the quality of the video within the region of interest has been substantially improved compared to that in the first stage 2305. The removal of the enclosure 2375 indicates that the ROI selection operation is now complete. In some embodiments, the ROI identification process also causes the same changes to the display of the same video on the remote device as it causes on the local device 2325. In this example, for instance, the image quality within the region of interest of the same video displayed on the remote device is also substantially improved.
In some embodiments, the user may enlarge or shrink the enclosure 2375 in the third stage 2315 (e.g., by holding the finger 2350 down on the display and moving the finger 2350 toward the upper right corner of the screen to enlarge the enclosure 2375, or toward the lower left corner of the screen to shrink the enclosure 2375). Some embodiments also allow the user to relocate the enclosure 2375 in the third stage 2315 (e.g., by holding the finger 2350 down on the display and moving the finger 2350 horizontally or vertically across the display). In some other embodiments, the selection of the area may not cause the UI 2300 to draw the enclosure 2375 at all in the third stage 2315.
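The tap-and-adjust ROI selection of Figure 23 can be sketched as follows. The fixed initial box size, the drag-to-resize rule, and the helper names are assumptions made for illustration only; the specification does not fix these details.

```python
# Illustrative sketch of tap-based ROI selection with a resizable enclosure.

def roi_from_tap(tap_x, tap_y, frame_w, frame_h, box_w=120, box_h=120):
    """Return an enclosure (left, top, right, bottom) centered on the tap
    position and clamped to the frame, like the dashed rectangle 2375."""
    left = max(0, tap_x - box_w // 2)
    top = max(0, tap_y - box_h // 2)
    right = min(frame_w, left + box_w)
    bottom = min(frame_h, top + box_h)
    return (left, top, right, bottom)

def resize_roi(roi, dx, dy, frame_w, frame_h):
    """Dragging toward the upper right (dx > 0, dy < 0) enlarges the box;
    dragging toward the lower left shrinks it, mirroring the third stage."""
    left, top, right, bottom = roi
    grow = (dx - dy) // 2  # combine both drag axes into one scale step
    return (max(0, left - grow), max(0, top - grow),
            min(frame_w, right + grow), min(frame_h, bottom + grow))
```

A tap near the center of a 320x240 frame yields a centered box, and an upper-right drag of (20, -20) expands each side of the box by 20 pixels.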
Other embodiments provide different techniques for allowing a user to identify a region of interest in a video. Figure 24 illustrates one such other technique. In Figure 24, the user identifies a region of interest by drawing a shape that bounds the region. In this example the shape is a rectangle, but it can be other shapes (e.g., any other polygon, a circle, an ellipse, etc.). Some embodiments provide the alternative technique of Figure 24 in a device UI that also provides the technique illustrated in Figure 23. Other embodiments, however, do not provide both techniques in the same UI.
Figure 24 illustrates this ROI identification operation in terms of five operational stages of the UI 2300. The first stage 2305 in Figure 24 is identical to the first stage 2305 in Figure 23. Specifically, in the first stage 2305, the UI 2300 illustrates a PIP display 2365 with a larger background main display 2330 and a smaller foreground inset display 2335 at the bottom left corner of the PIP display 2365.
In the second stage 2410, the operation of identifying the region of interest is initiated. In this example, the operation is initiated by selecting for a duration of time a first position for defining the region of interest in the video presented in the background main display area 2330 (e.g., by holding a finger 2450 down on the device's screen for a duration of time at a position near the displayed person's face 2370 in the background display area 2330). In the third stage 2415, the UI 2300 indicates that the first position 2470 has been selected with a dot 2455 next to the selected first position on the background main display area 2330.
The fourth stage 2420 illustrates the UI 2300 after the user has selected a second position 2475 for defining the region of interest. In this example, the user selects the second position 2475 by dragging the finger 2450 across the device's screen from the first position after the dot 2455 appears, and stopping at a position between the displayed hat and the displayed tree in the background display area 2330, as indicated by an arrow 2460. As shown in the fourth stage, this dragging causes the UI 2300 to draw a rectangular border 2465 for the region of interest that has the first and second positions 2470 and 2475 at its opposite vertices.
The fifth stage 2425 illustrates the UI 2300 after the identification of the region of interest has been completed. In this example, the user completes the identification of the region of interest by stopping the dragging of the finger 2450 and removing the finger 2450 from the device's display screen once the desired region of interest has been identified. The fifth stage 2425 illustrates that as a result of this drawing process, the quality of the video within the region of interest has been substantially improved compared to that in the first stage 2305. In some embodiments, this drawing process also causes the same changes to the display on the remote device as it causes on the local device 2325. In this example, for instance, the image quality within the region of interest of the same video displayed on the remote device will be substantially improved.
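Since the first and second positions sit on opposite vertices of the rectangular border 2465, the drag gesture of Figure 24 reduces to a min/max over the two points, regardless of the direction in which the user drags. The function name below is an assumption for illustration.

```python
def roi_from_drag(p1, p2):
    """Build a rectangular enclosure (left, top, right, bottom) from the
    press position and the release position, which form opposite vertices."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

Note that the same rectangle results whether the user drags down-left or up-right, which is why the UI can draw the border 2465 continuously while the finger moves.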
The description of Figures 23 and 24 above illustrates different ways of identifying a region of interest in a video in order to improve the image quality of the identified region. In some embodiments, improving the image quality of the identified region of interest causes changes to the encoding operations of the dual camera mobile device, such as allocating more bits to the identified region when encoding the video.
Some embodiments allow the user to identify a region of interest in a video in order to make different changes to the mobile devices or their cameras. For example, Figure 25 illustrates an example of identifying a region of interest in a video in order to expand or shrink the region of interest on the display. In this approach, the user identifies a region of interest in a video by selecting an area on the display as the center of the region of interest and then expanding or shrinking that region of interest.
In Figure 25, the UI 2500 of a mobile device 2525 presents a PIP display 2365 during a video conference with a remote user of another mobile device. The PIP display 2365 in Figure 25 is substantially similar to the PIP display 2365 of Figure 23, but the foreground inset display 2335 of Figure 25 is located at the bottom left corner of the PIP display 2365.
Figure 25 illustrates the ROI selection operation in terms of four operational stages of the UI 2500. As shown in the first stage 2505, the background display area 2530 presents a video with a man on the left side of the background display 2530 and a tree 2540 on the right side of the background display area 2530. Moreover, the tree 2540 is relatively small and only occupies the right side of the background display area 2530. In this example, the user of the mobile device 2525 would like to identify the area of the background display area 2530 in which the tree 2540 appears as the region of interest.
In the second stage 2510, the operation of identifying the region of interest is initiated. In this example, the operation is initiated by selecting, within the video presented in the background display area 2530, the area 2540 that the user wants to identify as the region of interest (e.g., by placing two fingers 2545 and 2546 down on the background display area 2530 where the tree 2540 is displayed). At the second stage 2510, the user can expand the region of interest 2540 to take up a larger portion of the background display area 2530 by dragging the fingers 2545 and 2546 away from each other. The user can also shrink the region of interest 2540 to take up a smaller portion of the background display area 2530 by dragging the fingers 2545 and 2546 closer together.
The third stage 2515 illustrates the UI 2500 after the user has started to expand the region of interest 2540 to take up a larger portion of the background display area 2530 by moving the fingers 2545 and 2546 away from each other (i.e., the finger 2545 moving toward the upper left corner of the background display area 2530 and the finger 2546 moving toward the lower right corner of the background display area 2530), as indicated by arrows 2550. In some embodiments, the finger movement also causes the same changes to the remote device's display as it causes on the local device. In this example, for instance, the region of interest of the same video will expand to take up a larger portion of the remote device's background display area 2530. In some embodiments, the expansion of the region of interest in the local display and/or the remote display causes one or both mobile devices or their cameras to modify one or more of their other operations, as further described below.
The fourth stage 2520 shows the UI 2500 after the identification of the region of interest has been completed. In this example, the user completes the identification of the region of interest by stopping the dragging of the fingers 2545 and 2546 and removing the fingers 2545 and 2546 from the device's display screen once the region of interest has reached the desired proportion in the background display area 2530. As a result of this process, the region of interest takes up a majority of the background display area 2530. The identification of the region of interest is now complete.
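The two-finger expand/shrink gesture of Figure 25 can be modeled as scaling the region about its center by the ratio of the finger separations at the end and at the start of the gesture. All names and the specific scaling rule below are illustrative assumptions.

```python
import math

def pinch_scale_roi(roi, start_fingers, end_fingers, frame_w, frame_h):
    """Scale the region of interest (left, top, right, bottom) about its
    center by how far the two fingers moved apart (or together), clamping
    the result to the frame."""
    def separation(fingers):
        (x1, y1), (x2, y2) = fingers
        return math.hypot(x2 - x1, y2 - y1)

    scale = separation(end_fingers) / separation(start_fingers)
    left, top, right, bottom = roi
    cx, cy = (left + right) / 2, (top + bottom) / 2
    half_w = (right - left) / 2 * scale
    half_h = (bottom - top) / 2 * scale
    return (max(0, cx - half_w), max(0, cy - half_h),
            min(frame_w, cx + half_w), min(frame_h, cy + half_h))
```

Dragging the fingers from 20 pixels apart to 60 pixels apart triples the region's extent about its center, exactly the expansion behavior described for the second and third stages.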
Some of the examples above illustrate how a user may identify a region of interest in a video in order to improve the image quality within the selected region of interest in the video (e.g., by increasing the bit rate for encoding the region-of-interest portion of the video). In some embodiments, identifying a region of interest in the video causes changes to the image processing operations of the mobile device, such as exposure, scaling, focus, etc. For example, identifying a region of interest in the video can cause the video conference manager 1204 to scale and composite the images of the video differently (e.g., to identify a region of interest to which to zoom).
In other embodiments, identifying a region of interest in the video causes changes to the operations of the mobile device's camera(s) (e.g., frame rate, zoom, exposure, scaling, focus, etc.). In yet other embodiments, identifying a region of interest in the video causes changes to the encoding operations of the mobile device, such as allocating more bits to the identified region, scaling, etc. In addition, although the example ROI identification operations described above may cause only one of the above-described modifications to the mobile devices or their cameras, in some other embodiments the ROI identification operation may cause more than one of these modifications to the operations of the mobile devices or their cameras. Furthermore, in some embodiments, the layout of the display area 855 of Figures 23-25 is the same as the layout of the display area 855 of Figure 9, described above.
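"Allocating more bits to the identified region" is typically realized in a block-based encoder as a per-macroblock quantizer adjustment: blocks overlapping the ROI get a lower quantization parameter (QP), hence more bits. The sketch below is one hypothetical realization; the QP values, delta, and 16-pixel block size are assumptions, not details from the specification.

```python
def qp_map_for_roi(frame_w, frame_h, roi, base_qp=32, roi_qp_delta=-6, mb=16):
    """Per-macroblock quantizer map: macroblocks overlapping the ROI
    (left, top, right, bottom) get a lower QP (more bits); all others
    keep the base QP."""
    left, top, right, bottom = roi
    rows, cols = frame_h // mb, frame_w // mb
    qmap = []
    for r in range(rows):
        row = []
        for c in range(cols):
            x0, y0 = c * mb, r * mb
            overlaps = (x0 < right and x0 + mb > left and
                        y0 < bottom and y0 + mb > top)
            row.append(base_qp + roi_qp_delta if overlaps else base_qp)
        qmap.append(row)
    return qmap
```

Feeding such a map to the rate-control stage of the encoder is one way the encoder 1255 could sharpen the selected face or tree while leaving the rest of the frame at its previous quality.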
B. Switching Cameras
Some embodiments provide procedures for switching cameras (i.e., changing the camera with which images are captured) during a video conference. Different embodiments provide different procedures for performing the switch camera operation. Some embodiments provide a procedure performed by a dual camera mobile device for switching that device's own cameras (i.e., a local switch), while other embodiments provide a procedure by which the dual camera mobile device directs another dual camera mobile device in the video conference to switch the other device's cameras (i.e., a remote switch). Yet other embodiments provide procedures for both. Section IV.B.1 will describe a process for performing a local switch camera operation on a dual camera mobile device. Section IV.B.2 will describe a process for performing a remote switch camera operation on a dual camera mobile device.
1. Local Switch Camera
Figure 26 illustrates a process 2600 that some embodiments perform on a local dual camera mobile device to switch between the two cameras of the local device during a video conference with a remote mobile device that includes at least one camera. In some embodiments, the process 2600 is performed by the video conference manager 1204 shown in Figure 12. For purposes of the discussion below, one camera of the local dual camera mobile device will be referred to as camera 1 and the other camera of the local dual camera mobile device will be referred to as camera 2.
The process 2600 begins by starting (at 2605) a video conference between the local dual camera mobile device and the remote mobile device. Next, the process 2600 sends (at 2610) a video image from the currently selected camera (e.g., camera 1) of the local dual camera mobile device to the remote mobile device for display on the remote mobile device. At 2610, the process 2600 also generates and displays a composite image from this video image and the video image that it receives from the remote mobile device.
The process 2600 then determines (at 2615) whether a request to end the video conference has been received. As described above, in some embodiments a video conference can end at the request of the user of the local dual camera mobile device (e.g., through the user interface of the local dual camera mobile device) or at the request of the user of the remote mobile device (e.g., through the user interface of the remote mobile device). When the process 2600 receives a request to end the video conference, the process 2600 ends.
When the process 2600 does not receive a request to end the video conference, the process 2600 then determines (at 2620) whether the user of the local dual camera mobile device has directed the local device to switch cameras for the video conference. The process 2600 returns to operation 2610 when the process 2600 determines (at 2620) that the local device has not been directed to switch cameras. However, when the process 2600 determines (at 2620) that the local device has been directed to switch cameras, the process 2600 transitions to operation 2625.
At 2625, the process 2600 sends to the remote mobile device a notification indicating that the local dual camera mobile device will switch cameras. In some embodiments, the process 2600 sends this notification through the video conference control channel that, as described above, is multiplexed by the VTP manager 1825 with the audio and video channels.
After sending its notification, the process 2600 performs (at 2630) the switch camera operation. In some embodiments, performing (at 2630) the switch camera operation includes instructing the CIPU to stop capturing video images with camera 1 and to start capturing video images with camera 2. These instructions may simply direct the CIPU to switch to capturing images from the pixel array associated with camera 2 and to start processing those images. Alternatively, in some embodiments, the instructions to the CIPU are accompanied by a set of initialization parameters that direct the CIPU to: (1) operate camera 2 according to a particular set of settings, (2) capture video generated by camera 2 at a particular frame rate, and/or (3) process video images from camera 2 according to a particular set of settings (e.g., resolution, etc.).
In some embodiments, the switch camera instruction (at 2630) also includes an instruction to switch the previously unused camera to the fourth operating power mode described above. In this example, the switch camera instruction includes an instruction for camera 2 to switch to its fourth operating power mode. In addition, the switch camera instruction also includes an instruction for camera 1 to switch from its fourth operating power mode to another operating power mode, such as the first operating power mode in order to conserve power, or the third operating power mode so that camera 1 can quickly switch to the fourth operating power mode and start capturing images when requested to do so. The switch camera operation 2630 also involves compositing the images captured by camera 2 of the local dual camera mobile device (instead of the images captured by camera 1) with the images received from the remote mobile device for display on the dual camera mobile device.
After directing the camera switch at 2630, the process 2600 performs (at 2635) a switch camera animation on the local dual camera mobile device to display a transition between the display of images from camera 1 and the display of images from camera 2. Following the switch camera animation on the local dual camera mobile device, the process 2600 loops through operations 2610-2620 until a request to end the video conference or a new switch camera request is received.
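The control flow of process 2600 (operations 2610-2635) can be summarized in the following sketch. The event names and the class are illustrative stand-ins, not the patent's actual interfaces; operation numbers from Figure 26 are noted in comments.

```python
# Rough sketch of the process-2600 loop: transmit, check for end, check
# for a switch request, then notify, switch, and animate.

class LocalSwitchCameraProcess:
    def __init__(self):
        self.selected_camera = 1        # camera 1 initially captures (2610)
        self.notifications = []         # sent over the control channel (2625)
        self.animation_played = False   # switch camera animation (2635)

    def handle_event(self, event):
        """One pass through operations 2615/2620 of the loop."""
        if event == "end-conference":   # 2615: end request received
            return "ended"
        if event == "switch-camera":    # 2620: user directed a local switch
            self.notifications.append("local-will-switch")            # 2625
            self.selected_camera = 2 if self.selected_camera == 1 else 1  # 2630
            self.animation_played = True                              # 2635
        return "running"                # loop back to 2610
```

Each `handle_event` call stands in for one iteration of the loop; a real implementation would interleave it with image capture, compositing, and transmission at 2610.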
Figure 27 illustrates one example of how some embodiments allow a switch camera operation to be requested through the UI 805 of a dual camera device and how these embodiments animate the switch camera operation. Figure 27 illustrates the switch camera operation in terms of eight different operational stages 2710, 2715, 2720, 2725, 2730, 2735, 2740 and 2745 of the UI 805 of the device. The first four stages 2710, 2715, 2720 and 2725 of the UI 805 illustrate an example of receiving a user's request to switch cameras. In some embodiments of the invention, the user of the device has other mechanisms for making such a request.
The first stage 2710 is the same as the fifth stage 830 of the UI 805 of Figure 8, which shows the UI 805 after a video conference has been set up. At this stage, the UI 805 displays a PIP display that includes two video displays: a larger background display from the remote camera and a smaller foreground inset display from the local camera. In this example, the background main display area 870 presents a video of a lady, who in this example is assumed to be a lady whose video is being captured by the remote device, and the foreground inset display area 860 presents a video of a man, who in this example is assumed to be a man whose video is being captured by the local device's front camera.
The second stage 2715 then shows the initiation of the switch camera operation through a selection of the PIP display area 880 of the UI 805. As shown, the selection is made by placing the user's finger 2770 on the PIP display 880. The third stage 2720 shows the UI 805 with a selectable UI item 2775 (e.g., a switch camera button 2775) for requesting a switch between the cameras of the local device 2700 during the video conference. The fourth stage 2725 illustrates the UI 805 after the user of the local device 2700 selects (e.g., through a single finger tap) the selectable UI item 2775, and after this selection has been indicated through the highlighting of the selectable UI item 2775. By selecting this selectable UI item 2775, the user directs the device 2700 to switch from the front camera of the device 2700 to the back camera of the device 2700 during the video conference. In other examples where the back camera of the device 2700 is capturing video, the user's selection of the selectable UI item 2775 directs the device 2700 to switch from the back camera of the device 2700 to the front camera of the device 2700. After the fourth stage, the video conference manager sends instructions to the CIPU and the remote device to start the switch camera operation.
The last four stages 2730, 2735, 2740 and 2745 of the UI 805 illustrate an example of a switch camera animation on the local device. This animation is intended to give the impression that the videos captured by the front and back cameras of the local device are being concurrently displayed on two opposite sides of a viewing pane, of which only one side can be seen by the user at any given time. When a switch camera is requested in the middle of a video conference, this viewing pane appears to rotate about a vertical axis such that the side that was presenting the video of one camera to the user rotates away from the user until it is replaced by the other side of the pane, which displays the video of the other camera. This perceived rotation of the viewing pane is animated by (1) gradually shrinking the video image from one camera in the display area for that camera and applying a perspective correction operation to that video image, followed by (2) gradually expanding the video image from the other camera in the display area while reducing the perspective correction operation on that video image.
Accordingly, the fifth stage 2730 illustrates the start of the "rotation of the viewing pane" about the vertical axis 2782. To give the appearance of the rotation of the viewing pane, the UI 805 reduces the size of the front camera's video image in the video display area 860 and applies a perspective operation that makes the right side of the video image appear farther from the user than the left side of the video image.
The sixth stage 2735 illustrates that the viewing pane has rotated 90 degrees so that the user can only see the edge of the pane, as indicated by the thin line 2786 displayed in the middle of the display area 860. The seventh stage 2740 illustrates that the viewing pane continues to rotate so that the back side of the viewing pane 2788 now gradually appears to the user in order to show the video captured from the user's back camera. Again, in some embodiments, this presentation of the rotation animation is achieved by reducing the size of the back camera's video image in the video display area 2788 and applying a perspective operation that makes the left side of the video image appear farther from the user than the right side of the video image.
The eighth stage 2745 illustrates the completion of the switch camera animation. Specifically, this stage displays in the display area 860 the video image of a car that is being captured by the back camera of the device 2700.
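The geometry of the rotating "viewing pane" described above follows from scaling the pane's horizontal extent by the cosine of the rotation angle: the front video shrinks to a thin line at 90 degrees, then the back video grows out from the other side. The sketch below, with assumed names, illustrates this; it is not the specification's implementation.

```python
import math

def flip_frame(angle_deg, pane_width):
    """Apparent width of the rotating viewing pane and which camera's video
    it shows at a given angle of the flip: the front video until 90 degrees
    (where only the pane's edge, the thin line, is visible), the back video
    thereafter."""
    width = abs(math.cos(math.radians(angle_deg))) * pane_width
    side = "front" if angle_deg < 90 else "back"
    return round(width), side
```

Stepping `angle_deg` from 0 to 180 over the animation's duration reproduces the fifth through eighth stages; a full implementation would additionally skew the image vertically so the receding edge appears farther from the viewer (the perspective correction operation).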
The example described above by reference to Figure 27 invokes the switch camera operation through a switch camera user interface. Other embodiments invoke the switch camera operation differently. For example, some embodiments invoke the switch camera operation by having a switch camera selectable UI item permanently displayed on a UI during a video conference, such as the UI 805 of Figure 28. In Figure 28, a switch camera button 989 is shown in a display area 855 along with a mute button 985 and an end conference button 987. The layout of the display area 855 is the same as the layout of the display area 855 described above by reference to Figure 9.
Figure 28 illustrates the switch camera operation of the UI 805 in terms of six stages: 2710, 2890, 2730, 2735, 2740 and 2745. The first stage 2710 of Figure 28 is similar to the first stage 2710 of Figure 27, except that the layout of the display area 855 shows the mute button 985, the end conference button 987 and the switch camera button 989 instead of a single end conference button. The second stage 2890 illustrates the UI 805 after the user of the local device 2700 selects (e.g., through a single finger tap using a finger 2770) the switch camera selectable UI item 989. In this example, by selecting this selectable UI item 989, the user directs the device 2700 to switch from the front camera of the device 2700 to the back camera of the device 2700 during the video conference. The last four stages of Figure 28 are similar to the last four stages of Figure 27, except that the layout of the display area 855 is the same as the layout described above in the first stage 2710, and therefore will not be further described in order not to obscure the description of the invention with unnecessary detail.
In some embodiments, when the remote mobile device receives images from a different camera of the local dual camera mobile device (i.e., the local dual camera mobile device has switched cameras), the remote mobile device also performs a switch camera animation to display a transition between the display of images from one camera of the local dual camera mobile device and the display of images from the other camera of the local dual camera mobile device. Figure 29 illustrates an example of one such switch camera animation in terms of five operational stages 2910, 2915, 2920, 2925 and 2930 of a UI 2905. Figure 29 shows an example switch camera animation on a remote mobile device 2900. The operational stages are the same as the example animation of Figure 27, except that the animation is performed on the images displayed in a display area 2935, which is where the images from the local dual camera mobile device are displayed on the remote mobile device 2900. As such, the image of the man displayed in the display area 2935 is animated to appear to rotate 180 degrees about the vertical axis 2955 located in the middle of the display area 2950, in order to show the transition between the display of the image of the man in the display area 2935 and the display of the image of a car 2970. The implementation of the switch camera animation of some embodiments is the same as the implementation of the animation described above.
The above example illustrates a switch camera animation on a remote device with a particular user interface layout. Other embodiments may perform this switch camera animation on a remote device with a different user interface layout. For instance, Figure 30 illustrates one such example of a remote device 2900 with a different user interface layout 2905. In particular, the UI 2905 of Figure 30 has a mute button 985, an end conference button 987 and a switch camera button 989 included in a display area 855, which is permanently displayed on one side of the composite display 2950 during the video conference. The layout of these three buttons is described above by reference to Figure 29. The five stages 2910, 2915, 2920, 2925 and 2930 of Figure 30 are identical to the five stages 2910, 2915, 2920, 2925 and 2930 of Figure 29, except for the different user interface layout.
2. Remote Switch Camera
Figure 31 illustrates a process 3100 for switching between the two cameras of a remote dual camera device during a video conference. The process 3100 is performed by the video conference manager of a device that includes at least one camera. In the following discussion, the device through which the user directs the remote camera switch is referred to as the local device, and the device that switches between its two cameras is referred to as the remote device. Also, in the following discussion, the remote device is said to switch between its front camera (i.e., camera 1) and its back camera (i.e., camera 2).
The process 3100 of Figure 31 will be described by reference to Figures 32, 33, 34 and 35. Figure 32 illustrates a UI 3205 of a local device 3200 through which a user requests the remote device to switch between its two cameras during a video conference. Figure 32 illustrates eight different operational stages 3210, 3215, 3220, 3225, 3230, 3235, 3240 and 3245 of the UI 3205. Figure 35 illustrates a UI 3505 of a remote device 3500 that receives the switch camera request from the local device 3200. Figure 35 illustrates six different operational stages 3510, 3515, 3520, 3525, 3530 and 3535 of the UI 3505.
As shown in Figure 31, the process 3100 begins by starting (at 3105) a video conference between the local and remote devices. The process 3100 then receives (at 3110) images from one camera of each device (e.g., from the front camera of each device) and generates a composite view for the video conference based on these images. At 3110, the process 3100 also sends video images from the local device to the remote device.
Next, the process 3100 determines (at 3115) whether a request to end the video conference has been received. As described above, in some embodiments a video conference can end at the request of a user of the local or remote device. When the process 3100 receives a request to end the video conference, the process 3100 ends.
When the process 3100 does not receive a request to end the video conference, the process 3100 then determines (at 3120) whether the user of the device on which the process 3100 is executing (i.e., the user of the local device) has directed the device to request that the remote device switch between its cameras for the video conference. The process 3100 returns to operation 3110 when the process 3100 determines (at 3120) that it has not been directed to initiate a remote camera switch. When the process 3100 determines (at 3120) that it has been so directed, the process 3100 transitions to operation 3125, which is described further below.
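The remote switch camera request that the process sends at operation 3125 can be sketched as a message from the local device that, when delivered, toggles the remote device's active camera. The class names and the direct method call standing in for the control channel are assumptions for illustration.

```python
# Hypothetical sketch of a remote switch camera request and its handling.

class RemoteDevice:
    """The device that actually switches between its two cameras."""
    def __init__(self):
        self.active_camera = "front"  # camera 1 initially captures

    def on_switch_request(self):
        # Toggle between camera 1 (front) and camera 2 (back).
        self.active_camera = "back" if self.active_camera == "front" else "front"
        return self.active_camera

class LocalDevice:
    """The device whose user directs the remote switch (operation 3120)."""
    def __init__(self, peer):
        self.peer = peer

    def request_remote_switch(self):
        # Operation 3125: send the request over the video conference
        # control channel; the peer acts on it and its new camera's
        # images start arriving at the local device.
        return self.peer.on_switch_request()
```

In an actual implementation the request would travel over the same multiplexed control channel used for the rotation and local-switch notifications, rather than a direct method call.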
Front four-stage 3210,3215,3220 and 3225 graphic extension of the UI 3205 of Figure 32 receives the example of the request of the camera of the switching remote equipment of user.First and second stages 3200 are identical with 2715 with first and second stages 2710 of Figure 27 with 3215.Phase III 3220 is identical with the phase III 2720, switches the optional UI project 3275 of camera except the phase III 3220 not only comprises request local device 3200, and comprises request remote equipment 3200 and switch outside the optional UI project 3280 of camera.The user of fourth stage 3225 graphic extension local device 3200 select to ask remote equipment switch camera UI project 3280 (such as, by optional UI project 3280 singly refer to dub 3270).By highlighting optional UI project 3280, point out described selection.Figure 32 represents the example carrying out this operation, but other embodiment can differently carry out asking remote equipment to switch the operation of camera.
The example described above by reference to Figure 32 invokes the remote switch-camera operation through a remote switch-camera user interface. Other embodiments invoke the remote switch-camera operation differently. For example, some embodiments invoke the switch-camera operation through a selectable UI item that is permanently displayed on the UI during the video conference, such as the UI 3205 of Figure 33. In Figure 33, a remote switch-camera button 3388 is displayed in a display area 855 along with a mute button 3382, an End Conference button 3384, and a local switch-camera button 3386.
Figure 33 illustrates the remote switch-camera operation of the UI 3205 of the device 3200 in terms of six different stages 3210, 3390, 3230, 3235, 3240 and 3245. The first stage 3210 of Figure 33 is similar to the first stage 3210 of Figure 32, except that the layout of the display area 855 shows the mute button 3382, the local switch-camera button 3386, the remote switch-camera button 3388, and the End Conference button 3384. The second stage 3390 illustrates the UI 805 after the user of the local device 3200 selects (e.g., through a single-finger tap 3270) the remote switch-camera selectable UI item 3388. The last four stages of Figure 33 are similar to the last four stages of Figure 32, except that the layout of the display area 855 is the same as the layout described above in the first stage 3210; therefore, they are not described further, in order not to obscure the description of the invention with unnecessary detail.
Some embodiments provide a layout similar to the one illustrated in Figure 33, except that the remote switch-camera selectable UI item is displayed in the PIP display 3265 rather than in the display area 855. Figure 34 illustrates such a layout 3205. In particular, Figure 34 shows the PIP display with the remote switch-camera selectable UI item 3280, and the display area 855 with only the mute button 3382, the local switch-camera button 3386, and the End Conference button 3384.
As mentioned above, the process 3100 transitions to operation 3125 when the user requests a remote switch-camera operation. At operation 3125, the process 3100 sends a request to switch cameras to the remote device. In some embodiments, this request is sent through the video-conference control channel that, as described above, is multiplexed with the audio and video channels by the VTP manager 1825.
After the request to switch cameras is sent, the process 3100 determines (at 3130) whether the remote device has responded to the request. In some embodiments, the remote device automatically sends an acceptance response (i.e., an acknowledgement) to the local device through the video-conference control channel. In other embodiments, however, the user of the remote device has to accept this request through the user interface of the remote device.
The first two stage 3510 and 3515 graphic extension long-distance user of the UI 3505 of Figure 35 accepts the example of the request of the camera switching remote equipment 3500.First stage 3510 shows (1) for showing the viewing area 3540 described request being notified the text of long-distance user, (2) for accepting the optional UI project 3565 of the request of the camera switching remote equipment (such as, allow (allow) button 3565), (3) for refusing the optional UI project 3570 (such as, refusing (reject) button 3570) of the request of the camera switching remote equipment.The user that second stage 3515 is illustrated in remote equipment subsequently selects (such as, by singly referring to dub 3580) for accept switch camera request UI project 3565 after UI 3505, by highlighting optional UI project 3565, the described selection of indicating user.
When the process 3100 determines (at 3130) that it has not yet received a response from the remote device, the process 3100 determines (at 3135) whether a request to end the video conference has been received. If so, the process 3100 ends. Otherwise, the process receives (at 3140) images from the currently used cameras of the remote and local devices, generates a composite view of the video conference based on these images, transmits the local device's video images to the remote device, and then returns to operation 3130.
When the process 3100 determines (at 3130) that it has received a response from the remote device, the process 3100 determines (at 3145) whether the remote device accepted the request to switch cameras. If not, the process 3100 returns to operation 3110. Otherwise, the process 3100 receives (at 3150) images from the other camera of the remote device, and then performs (at 3155) a switch-camera animation on the local device to display a transition between the video of the previously used remote camera and the video of the currently used remote camera (i.e., the images received at operation 3150). After operation 3155, the process returns to operation 3110, which was described above.
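The branching logic of process 3100 can be summarized as a small state machine. The sketch below is illustrative only: the state names, event strings, and action labels are all invented for the example, and a real implementation would be driven by the video conference manager rather than by a pure function.

```python
from enum import Enum, auto

class ConfState(Enum):
    STREAMING = auto()                 # normal exchange of images (operation 3110)
    AWAITING_SWITCH_RESPONSE = auto()  # switch request sent, waiting (operation 3130)
    ENDED = auto()

def handle_control_event(state, event):
    """Advance the local device's state for one control-channel event,
    mirroring operations 3115-3155 of process 3100. Returns
    (new_state, action), where action names the side effect the device
    would perform (e.g., playing the switch-camera animation)."""
    if event == "end_conference":                     # operations 3115 / 3135
        return ConfState.ENDED, "stop_streams"
    if state is ConfState.STREAMING:
        if event == "user_requested_remote_switch":   # operation 3120
            return ConfState.AWAITING_SWITCH_RESPONSE, "send_switch_request"  # 3125
        return state, "composite_and_send"            # operation 3110
    if state is ConfState.AWAITING_SWITCH_RESPONSE:
        if event == "remote_accepted":                # operations 3145 / 3150 / 3155
            return ConfState.STREAMING, "play_switch_animation"
        if event == "remote_rejected":                # back to normal streaming
            return ConfState.STREAMING, "composite_and_send"
        return state, "composite_and_send"            # operation 3140
    return state, "noop"
```

A short walk-through: a user request moves the machine into the waiting state, an acceptance triggers the animation, and an end-conference event terminates from any state.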
The last four operational stages 3230, 3235, 3240 and 3245 of the UI 3205 illustrated in Figure 32 illustrate one example of such a remote switch-camera animation on the local device 3200. This example animation is similar to the example animation illustrated in the stages 2915, 2920, 2925 and 2930 of Figure 29, except that in the display area 3250 of Figure 32 the animation replaces the video of a woman captured by the remote device's front camera with the video of a tree captured by the remote device's back camera. The last four stages of Figures 33 and 34 illustrate the same animation as the one in Figure 32, except that the display areas 855 of Figures 33 and 34 contain different selectable UI items than the display area 855 in Figure 32.
In some embodiments, when the remote device switches cameras, the UI of the remote device also performs a switch-camera animation to display a transition between the two cameras. The last four operational stages 3520, 3525, 3530 and 3535 of the UI 3505 illustrated in Figure 35 illustrate an example of a switch-camera animation that is displayed on the remote device 3500 when the remote device 3500 switches cameras. This animation is similar to the animation illustrated in the stages 2730, 2735, 2740 and 2745 of Figure 27, except that the animation in the display area 3545 replaces the video of a woman captured by the remote device 3500's front camera with the video of a tree captured by the remote device 3500's back camera.
As mentioned above, Figures 27, 28, 29, 30, 32, 33, 34 and 35 show various examples of switch-camera animations performed on a user interface. In some embodiments, the switch-camera animation causes changes to the image processing operations of the corresponding dual-camera mobile device, such as the scaling, compositing, and perspective-distortion operations that can be performed by the video conference manager 1204 and the image processing manager 1208.
C. Exposure Adjustment
During a video conference between a dual-camera mobile device and another mobile device, different embodiments provide different techniques for adjusting the exposure of images captured by a camera of either mobile device. Some embodiments provide the user of the dual-camera mobile device with a technique for adjusting the exposure of images captured by a camera of the other device, while other embodiments provide the user with a technique for adjusting the exposure of images captured by a camera of the dual-camera mobile device itself. Several detailed examples are described below.
Figure 36 illustrates a process 3600 performed by the dual-camera mobile device of some embodiments for performing a remote exposure adjustment operation during a video conference. In the following discussion, the device through which a user directs the remote device to adjust its exposure is referred to as the local device. In some embodiments, the process 3600 is performed by the video conference manager of the local device. In addition, the process 3600 is described below by reference to Figures 37, 38 and 39, which illustrate various ways in which the user of the local device can request that the remote device perform an exposure adjustment operation.
As shown in Figure 36, the process 3600 starts by initiating (at 3605) a video conference between the local and remote devices. The process 3600 then receives (at 3610) video from the remote device for display on the display screen of the local device. Next, the process 3600 determines (at 3615) whether a request to end the video conference has been received. As described above, some embodiments can receive a request to end the video conference from a user of the local or the remote device. When the process 3600 receives a request to end the video conference, the process 3600 ends.
However, when the process 3600 does not receive a request to end the video conference, it then determines (at 3620) whether a request to adjust the exposure of the remote device's camera has been received. When the process 3600 determines that no such request has been received, the process 3600 returns to operation 3610 to receive additional video captured by the remote device. Figures 37, 38 and 39 illustrate three different examples of ways of providing the user with a means of producing such a request. In Figures 37, 38 and 39, the first stages 3710, 3810 and 3910 all show PIP displays 3725, 3850 and 3935 of the local devices 3700, 3800 and 3900 that display two videos: one captured by a camera of the local device and another captured by a camera of the remote device. In the first stages 3710, 3810 and 3910, the man in the background displays 3735, 3860 and 3945 is dark, indicating that the man is not properly exposed.
The second stage 3715 of Figure 37 illustrates one way for the user of the local device 3700 to request that the remote device perform an exposure adjustment, by selecting the remote device's video (e.g., by tapping on the background display 3735). In this way, the UI 3705 automatically associates the user's selection of a region of interest defined by a box 3745 with the user's desire to direct the remote device to perform an exposure adjustment on the region of interest, and thus directs the video conference manager of the local device to contact the remote device to perform the exposure adjustment operation. The defined region of interest is used by the remote device in calculating the exposure adjustment.
Like the second stage 3715 of Figure 37, the second stage 3815 of Figure 38 shows the local user's selection of the remote device's video, except that this selection directs the UI 3805 to display a selectable UI item 3870, as shown in the third stage 3820. The fourth stage 3825 illustrates the user of the local device selecting the selectable UI item 3870 to direct the remote device to perform an exposure adjustment operation, as described above.
The second stage 3915 of Figure 39 is similar to the second stage of Figure 38, but instead of the user's selection of the remote device's video directing the UI to display a single selectable UI item, the selection directs the UI 3905 to display a menu of selectable UI items 3955, 3960, 3965 and 3970, as shown in the third stage 3920. The selectable UI items include an "Auto Focus" item 3955, an "Auto Exposure" item 3960, a "Switch Camera" item 3965, and a "Cancel" item 3970. In some embodiments, the "Switch Camera" selectable UI item 3965 is used to request a local switch-camera operation, while in other embodiments it is used to request a remote switch-camera operation. The fourth stage 3925 illustrates the user selecting the "Auto Exposure" item 3960 to direct the remote device to perform an exposure adjustment operation, as described above.
When the process 3600 determines (at 3620) that the local user has directed the local device to request an exposure adjustment operation, the process 3600 sends (at 3625) a command to the remote device through the video-conference control channel to adjust the exposure of the video captured by the camera that is currently capturing and transmitting video to the local device. After operation 3625, the process 3600 returns to operation 3610, which was described above.
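As an illustration of such a control-channel command, the hedged sketch below serializes a command packet and tags it with a channel identifier so that it could be multiplexed alongside the audio and video channels (switch-camera and focus-adjustment requests would travel the same way). The channel byte, field names, and JSON encoding are all assumptions made for the example; the document does not specify a wire format.

```python
import json

# Hypothetical channel identifier: the text only says the control channel
# is multiplexed with the audio and video channels by the VTP manager.
CONTROL_CHANNEL = 0

def make_control_packet(command, params=None):
    """Serialize a video-conference control command (e.g., the exposure
    adjustment command sent at operation 3625) for the control channel."""
    payload = {"cmd": command, "params": params or {}}
    return bytes([CONTROL_CHANNEL]) + json.dumps(payload).encode("utf-8")

def parse_control_packet(packet):
    """Demultiplex on the receiving side: strip the channel byte and
    decode the command so the remote device can act on it."""
    if packet[0] != CONTROL_CHANNEL:
        raise ValueError("not a control-channel packet")
    return json.loads(packet[1:].decode("utf-8"))
```

For example, an exposure request carrying a region of interest might be built as `make_control_packet("adjust_exposure", {"roi": [40, 60, 120, 160]})` and decoded by the remote device with `parse_control_packet`.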
In some embodiments, the user of the remote device is required to provide permission before the remote device performs the exposure adjustment operation, while in other embodiments the remote device performs the exposure adjustment operation automatically upon receiving the request from the local device. Moreover, in some embodiments, some of the video conference functionalities are implemented by the video conference manager 1204. In some such embodiments, the video conference manager 1204 performs the exposure adjustment operation by instructing the CIPU 1250 to adjust the sensor settings of the remote device's camera that is being used.
The last stages 3720, 3830 and 3930 of Figures 37, 38 and 39 show the remote device's video displayed more brightly, indicating that the man is now properly exposed. Although Figures 37, 38 and 39 provide examples of receiving an exposure adjustment request to correct the exposure of the remote device, some embodiments provide the user of the local device with a way to request that the local device adjust the exposure of its own camera. Such a request can be produced similarly to the ways of requesting that the remote device adjust its camera's exposure illustrated in Figures 37, 38 and 39.
Figures 37-39, described above, show several user interfaces for performing exposure adjustment operations. In some embodiments, the exposure adjustment operation can cause changes to the image processing operations of the dual-camera mobile device, such as invoking the exposure adjustment process 4000, which is described in further detail below. The exposure adjustment operation can also cause changes to the operation of the camera of the dual-camera mobile device that is capturing the video, such as changing the camera's exposure setting.
Figure 40 conceptually illustrates an exposure adjustment process 4000 performed by the image processing manager of some embodiments, such as the image processing manager illustrated in Figure 12. In some embodiments, the process 4000 is part of the exposure adjustment operations described above by reference to Figures 36, 37, 38 and 39. In some such embodiments, the image processing manager 1208 performs the process 4000 and adjusts the camera's exposure settings by sending instructions to the video conference manager 1204, which instructs the CIPU 1250 to adjust the camera sensor 405a or 405b, as mentioned above.
In some embodiments, the process 4000 is performed by the image processing layer 630 shown in Figure 6, while in other embodiments it is performed by the statistics engine 465 shown in Figure 4. Some embodiments perform the process 4000 on images captured by cameras of the devices (local or remote) in the video conference, while other embodiments perform the process 4000 as part of the process 1500 (e.g., operation 1510) illustrated in Figure 15. Some embodiments perform the exposure adjustment operation to expose images captured by the cameras of the dual-camera mobile device that are neither too light nor too dark. In other words, the process 4000 is performed to capture images in a manner that maximizes the amount of detail as much as possible.
The process 4000 starts by receiving (at 4005) an image captured by a camera of the dual-camera mobile device. In some embodiments, when the received image is the first image captured by a camera of a device in the video conference, the process 4000 is not performed on that first image (i.e., there was no image before the first image from which to determine an exposure value). The process 4000 then reads (at 4010) the pixel values of a defined region in the received image. Different embodiments define the region differently. Some such embodiments define regions of different shapes, such as a square, rectangle, triangle, circle, etc., while other such embodiments define the region at different locations in the image, such as the center, upper center, lower center, etc.
Next, the process 4000 calculates (at 4015) the average of the pixel values in the defined region of the image. The process 4000 then determines (at 4020) whether the calculated average of the pixel values is equal to a particular defined value. Different embodiments define different particular values. For example, some embodiments define the particular value as the median pixel value of the image's dynamic range. In some embodiments, a range of values is defined rather than a single value. In such embodiments, the process 4000 determines (at 4020) whether the calculated average of the pixel values is within the defined range of values.
When the calculated average of the pixel values is not equal to the particular defined value, the process 4000 adjusts (at 4025) the exposure value based on the calculated average. When the calculated average of the pixel values equals the particular defined value, the process 4000 ends. In some embodiments, the exposure value represents the amount of time that the camera sensor is exposed to light. In some embodiments, the adjusted exposure value is used to expose the next image to be captured by the camera that captured the received image. After the exposure value is adjusted based on the calculated average, the process 4000 ends.
In some embodiments, the process 4000 is performed repeatedly until the calculated average of the pixel values equals the particular defined value (or falls within the defined range of values). Some embodiments perform the process 4000 continually during the video conference, while other embodiments perform it at defined intervals (e.g., every 5 seconds, 10 seconds, 30 seconds, etc.) during the video conference. Furthermore, during the video conference, the process 4000 of some embodiments dynamically re-defines the particular pixel value before performing the process 4000.
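A minimal sketch of one iteration of process 4000 might look as follows, assuming 8-bit luma values, a rectangular defined region, a target near the midpoint of the dynamic range with a tolerance band (standing in for the "range of values" some embodiments use), and an invented proportional update rule for operation 4025 (the document does not specify how the exposure value is computed from the average).

```python
def average_region(image, region):
    """Mean pixel value over the defined region (operations 4010/4015).
    image: 2-D list of 8-bit luma values; region: (top, left, bottom, right)."""
    top, left, bottom, right = region
    pixels = [image[r][c] for r in range(top, bottom) for c in range(left, right)]
    return sum(pixels) / len(pixels)

def adjust_exposure(exposure, image, region, target=128, tolerance=8, gain=0.05):
    """One pass of process 4000: compare the region's average to the target
    value (operation 4020) and, if it is outside the tolerance band, nudge
    the exposure time used for the next frame (operation 4025)."""
    mean = average_region(image, region)
    if abs(mean - target) <= tolerance:
        return exposure  # properly exposed: leave the exposure value alone
    # Proportional correction (an assumption): underexposed regions raise
    # the exposure time, overexposed regions lower it.
    return exposure * (1 + gain * (target - mean) / target)
```

Run repeatedly on successive frames, this converges toward the target average, matching the feedback-loop behavior described above.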
Figure 41 conceptually illustrates examples of the exposure adjustment operations of some embodiments. Each of the examples 4100, 4110 and 4115 shows on its left side an image 4120 captured by a camera of the dual-camera mobile device. Specifically, the image 4120 shows a dark person standing with his back to the sun. The dark person indicates that the exposure level of the image is not high enough to expose the person's face or body. The right side of each example 4100, 4110 and 4115 shows an image 4125, 4130 or 4135, respectively, captured after the image 4120. In some embodiments, the image 4120 and the images on the right are images of a video captured by the camera of the dual-camera mobile device. In other embodiments, the image 4120 and the images on the right are still images captured by the camera of the dual-camera mobile device at different times.
The first example 4100 illustrates the absence of any exposure adjustment operation. As such, the image 4125 appears the same as the image 4120. Since no exposure adjustment was performed, the person in the image 4125 remains dark, like the person in the image 4120.
In the second example 4110, an exposure adjustment operation is performed on the image 4120. In some embodiments, the exposure adjustment operation is performed by the process 4000 using the defined region 4140. Based on the exposure adjustment operation, the camera's exposure level is adjusted, and the camera captures the image 4130 using the adjusted exposure level. As shown in Figure 41, the person in the image 4130 is not as dark as in the image 4125. However, the person's face and body in the image 4130 are still not clear.
The third example 4115 shows an exposure adjustment operation performed on the image 4120. Similar to the second example 4110, the exposure adjustment operation of the example 4115 of some embodiments is performed by the process 4000 using the defined region 4145. Based on the exposure adjustment operation, the camera's exposure level is adjusted, and the camera captures the image 4135 using the adjusted exposure level. As shown in Figure 41, the person in the image 4135 is properly exposed, since the person's face and body are both visible.
In some embodiments, the selection of the defined region can be made by the user of the dual-camera mobile device. The device itself can also automatically adjust its defined region for the exposure adjustment operation through the exposure-adjustment feedback loop mentioned above in the CIPU 400. The statistics engine 465 in Figure 4 can collect data to determine whether the exposure level is appropriate for the captured images and adjust the camera sensor accordingly (e.g., through a direct connection with the sensor module 415).
D. Focus Adjustment
Figure 42 illustrates a process 4200 for adjusting the focus of a dual-camera mobile device during a video conference. In the following discussion, the device through which a user directs the remote device to adjust its camera focus is referred to as the local device. In some embodiments, the process 4200 of Figure 42 is performed by the video conference manager 1204 of the local device. In addition, this process is described below by reference to Figures 43 and 44, which illustrate two example ways for the user of the local device to request that the remote device perform a focus adjustment operation.
As shown in Figure 42, the process 4200 starts by initiating (at 4205) a video conference between the local and remote devices. The process 4200 then receives (at 4210) video from the remote device for display on the display screen of the local device. Next, at 4215, the process 4200 determines whether a request to end the video conference has been received. As described above, in some embodiments the video conference can end at the request of a user of the local or the remote device. When the process 4200 receives a request to end the video conference, the process 4200 ends.
Otherwise, the process 4200 determines (at 4220) whether it has received a request to adjust the focus of the remote camera of the remote device. When the process 4200 determines that it has not received such a request, the process 4200 returns to operation 4210 to receive additional video from the remote device. Figures 43, 44 and 45 illustrate three different ways that different embodiments provide to the user for producing such a request. In Figures 43, 44 and 45, the first stages 4310, 4410 and 4572 all show PIP displays 4325, 4435 and 4582 of local devices 4300, 4400 and 4571 that each display two videos: one captured by the local device and another captured by the remote device. The display areas 855 in Figures 43 and 44 show an End Conference button. However, in Figure 45, the layout of the display area 855 is the same as the layout of the display area 855 of Figure 9, described above. Moreover, the switch-camera button 4588 shown in the display area 855 can be selected to invoke a local switch-camera operation in some embodiments, or a remote switch-camera operation in other embodiments. As shown in the first stages 4310, 4410 and 4572, the video of the remote device displayed in the background displays 4335, 4445 and 4580 is blurry.
The second stage 4315 of Figure 43 illustrates an approach by which the user of the local device requests a focus adjustment from the remote device simply by selecting the remote device's video (e.g., through a single tap 4340 on the remote device's video). Under this approach, the UI 4305 automatically associates the user's selection of a region of interest defined by a box 4345 with the user's desire to direct the remote device to perform an operation (such as a focus adjustment operation) on the region of interest, and thus directs the video conference manager 1204 of the local device 4300 to contact the remote device to perform the adjustment operation (such as the focus adjustment operation). The defined region of interest is used by the remote device in calculating the focus adjustment.
The second stage 4415 of Figure 44 similarly shows the local user's selection of the remote video (e.g., through the user's tap on the remote device's video). However, unlike the example illustrated in Figure 43, this selection in Figure 44 directs the UI 4405 to display a menu of selectable UI items 4455, 4460, 4465 and 4470 (which can be implemented as selectable buttons), as shown in the third stage 4420. These selectable UI items include an "Auto Focus" item 4460, an "Auto Exposure" item 4465, a "Switch Camera" item 4470, and a "Cancel" item 4455. In some embodiments, the "Switch Camera" selectable UI item 4470 is used to request a local switch-camera operation, while in other embodiments it is used to request a remote switch-camera operation. The fourth stage 4425 then illustrates the local user selecting the "Auto Focus" item 4460.
The second stage 4574 of Figure 45 again similarly shows the local user's selection of the remote video (e.g., through the user's tap on the remote device's video). However, unlike the example illustrated in Figure 44, this selection in Figure 45 directs the UI 4578 to request a focus adjustment operation (i.e., in the second stage 4574). After the focus adjustment operation is completed, the UI 4578 displays a menu of selectable UI items 4584 and 4586 (i.e., in the third stage 4576), which can be implemented as selectable buttons. These selectable UI items include an "Auto Exposure" item 4586 and a "Cancel" item 4584.
When process 4200 determines that (4220) local user's instruction local device request Focussing operates, process 4200 is by video conference control channel, (4240) order is sent, to adjust the focal length that the camera of its video was caught and transmitted to remote equipment at present to remote equipment.After 4240, process returns operation 4210 described above.
In some embodiments, the user of the remote device has to provide permission before the remote device performs this operation, while in other embodiments the remote device performs this operation automatically upon receiving the request from the local device. Moreover, in some embodiments, the focus adjustment operation adjusts the focus settings of the remote device's camera that is being used during the video conference. In some such embodiments, as described above, some of the video conference functionalities are implemented by the video conference module 1202. In these embodiments, the video conference manager 1204 instructs the CIPU 1250 to adjust the sensor of the remote device's camera that is being used.
The last stages 4320, 4430 and 4576 of Figures 43, 44 and 45 show the remote device's video properly focused. Although Figures 43, 44 and 45 provide examples of receiving a focus adjustment request to correct the focus of the remote device, some embodiments allow the user of the local device to request that the local device adjust the focus of a camera of the local device. Such a request can be produced similarly to the ways of requesting that the remote device adjust its camera's focus shown in Figures 43, 44 and 45.
Figures 43, 44 and 45 illustrate three example user interfaces that allow a user to perform a focus adjustment operation. In some embodiments, the focus adjustment operation causes changes to the operation of the camera of the dual-camera mobile device that is capturing the video displayed in the UI, such as changing the camera's focus.
As described above for Figures 37 and 43, the defined region of interest is used by the remote mobile device in the calculations for the exposure adjustment and the focus adjustment of the video, respectively. However, in some other embodiments, the user's selection of a region of interest can be used to direct the remote device to perform one or more operations. For example, in some embodiments, both exposure adjustment and focus adjustment are performed based on the defined region of interest, thereby directing the remote device to perform both operations.
E. Frame Rate Control
During a video conference, some embodiments may wish to adjust or maintain the rate (i.e., the frame rate) at which video images captured by a camera of the dual-camera mobile device are transmitted to the other device in the video conference. For example, assuming a fixed bandwidth, some such embodiments reduce the frame rate of the video to increase the picture quality of the video images, while other such embodiments increase the frame rate of the video to smooth out the video (i.e., reduce jitter).
Different embodiments provide different techniques for controlling the frame rate of video images during a video conference. One example described above adjusts the VBI of the sensor module 415 of the camera in order to control the rate at which images captured by the camera are processed. As another example, some embodiments of the management layer 635 of the video conference module 625 shown in Figure 6 control the frame rate by dropping images. Similarly, some embodiments of the image processing layer 630 control the frame rate by dropping images. Some embodiments provide still other techniques for controlling the frame rate, such as dropping frames in the universal transmission buffer 1820.
V. Electronic System
Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer-readable storage medium (also referred to as a computer-readable medium). When these instructions are executed by one or more processing units (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer-readable media include, but are not limited to, CD-ROMs, flash drives, RAM chips, hard drives, EPROMs, etc. Computer-readable media do not include carrier waves or electronic signals passing wirelessly or over wired connections.
In this specification, the term "software" is meant to include firmware residing in read-only memory or applications stored in magnetic storage, which can be read into memory for processing by a processor. Also, in some embodiments, multiple software inventions can be implemented as sub-parts of a larger program while remaining distinct software inventions. In some embodiments, multiple software inventions can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software invention described here is within the scope of the invention. In some embodiments, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
In an environment in which calling program code interacts through one or more interfaces with other program code being called, some embodiments are implemented as software processes that include one or more application programming interfaces (APIs). Various function calls, messages, or other types of invocations, which further may include various kinds of parameters, can be transferred via the APIs between the calling program and the code being called. In addition, an API may provide the calling program code with the ability to use data types or classes defined in the API and implemented in the called program code.
At least certain embodiments include an environment with a calling software component interacting with a called software component through an API. A method for operating through an API in this environment includes transferring one or more function calls, messages, other types of invocations, or parameters via the API.
In certain embodiments, one or more API (API) can be used.Such as, some embodiments of exchange of media module 310 (or 610) provide one group of API to other component software, for accessing the various Video processing and encoding function that describe in Fig. 3 and 9.
An API is an interface implemented by a program code component or hardware component (hereinafter the "API-implementing component") that allows a different program code component or hardware component (hereinafter the "API-calling component") to access and use one or more functions, methods, procedures, data structures, classes, and/or other services provided by the API-implementing component. An API can define one or more parameters that are passed between the API-calling component and the API-implementing component.
An API allows a developer of an API-calling component (which may be a third-party developer) to leverage specified features provided by an API-implementing component. There may be one API-calling component, or there may be more than one such component. An API can be a source code interface that a computer system or program library provides in order to support requests for services from an application. An operating system (OS) can have multiple APIs to allow applications running on the OS to call one or more of those APIs, and a service (such as a program library) can have multiple APIs to allow an application that uses the service to call one or more of those APIs. An API can be specified in terms of a programming language that can be interpreted or compiled when an application is built.
In some embodiments, the API-implementing component may provide more than one API, each providing a different view of, or with different aspects that provide access to different aspects of, the functionality implemented by the API-implementing component. For example, one API of an API-implementing component can provide a first set of functions and can be exposed to third-party developers, while another API of the API-implementing component can be hidden (not exposed) and provide a subset of the first set of functions as well as another set of functions, such as testing or debugging functions that are not in the first set. In other embodiments, the API-implementing component may itself call one or more other components via an underlying API and thus be both an API-calling component and an API-implementing component.
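The layering described above, in which one component is an API-implementing component toward its callers and an API-calling component toward a lower layer, can be sketched as follows. This is an illustrative sketch only; the function names and values are assumptions for the example and do not come from the specification.

```python
# Illustrative sketch of API layering: a library implements an API for
# applications while itself calling an underlying OS-level API, so it
# plays both roles. All names and values are assumptions.

def os_read_sensor():
    # Lower-layer API exposed by the operating system.
    return 21

def library_get_reading():
    # API-implementing component toward the application, and an
    # API-calling component toward the OS API beneath it.
    raw = os_read_sensor()
    return raw * 2  # the library adds its own processing

def application():
    # Pure API-calling component: it uses only the library's API and
    # never calls the OS API directly.
    return library_get_reading()

print(application())
```

The application is insulated from the OS layer: if the library swapped in a different lower-level API, the application's calls would be unchanged.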
An API defines the language and parameters that API-calling components use when accessing and using specified features of the API-implementing component. For example, an API-calling component accesses the specified features of the API-implementing component through one or more API calls or invocations (embodied, for example, by function or method calls) exposed by the API, and passes data and control information using parameters via the API calls or invocations. The API-implementing component may return a value through the API to the API-calling component in response to an API call from the API-calling component. While the API defines the syntax and result of an API call (e.g., how to invoke the API call and what the API call does), the API may not reveal how the API call accomplishes the function specified by the call. Various API calls are transferred via the one or more application programming interfaces between the caller (the API-calling component) and the API-implementing component. Transferring the API calls may include issuing, initiating, invoking, calling, receiving, returning, or responding to the function calls or messages; in other words, transferring can describe actions by either the API-calling component or the API-implementing component. The function calls or other invocations of the API may send or receive one or more parameters through a parameter list or other structure. A parameter can be a constant, a key, a data structure, an object, an object class, a variable, a data type, a pointer, an array, a list, or a pointer to a function or method, or another way to reference data or another item to be passed via the API.
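The calling pattern just described, in which an API fixes the names and parameters of the callable surface while hiding the implementation, can be sketched minimally as follows. The class and method names (`MediaExchangeAPI`, `encode_frame`, and so on) are illustrative assumptions and are not taken from the patent.

```python
# Hypothetical sketch of the API pattern described above: an
# API-implementing component exposes a function whose name and
# parameters are fixed by the API; an API-calling component invokes it
# and receives a return value through the same interface.

class MediaExchangeAPI:
    """API-implementing component: defines the callable surface."""

    def encode_frame(self, frame, bitrate_kbps=500):
        # Data (frame) and control information (bitrate_kbps) arrive
        # as parameters of the API call; a value is returned.
        return {"data": frame, "bitrate": bitrate_kbps}


class ConferenceClient:
    """API-calling component: uses only the published calls."""

    def __init__(self, api):
        self.api = api

    def send(self, frame):
        # The call's syntax (name, parameters) is specified by the API;
        # how encode_frame works internally is opaque to this component.
        return self.api.encode_frame(frame, bitrate_kbps=700)


result = ConferenceClient(MediaExchangeAPI()).send("raw-pixels")
```

Because the client depends only on the call syntax, the implementing component could be replaced by any other object exposing the same `encode_frame` signature.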
Furthermore, data types or classes may be provided by the API and implemented by the API-implementing component. Thus, the API-calling component may declare variables, use pointers to, or use or instantiate constant values of such a type or class by using definitions provided in the API.
Generally, an API can be used to access a service or data provided by the API-implementing component or to initiate performance of an operation or computation provided by the API-implementing component. By way of example, the API-implementing component and the API-calling component may each be an operating system, a library, a device driver, an API, an application program, or other module (it should be understood that the API-implementing component and the API-calling component may be the same or different types of module from each other). API-implementing components may in some cases be embodied at least in part in firmware, microcode, or other hardware logic. In some embodiments, an API may allow a client program to use services provided by a software development kit (SDK) library. In other embodiments, an application or other client program may use an API provided by an application framework. In these embodiments, the application or client program may incorporate calls to functions or methods provided by the SDK and provided by the API, or may use data types or objects defined in the SDK and provided by the API. An application framework may, in these embodiments, provide a main event loop for a program that responds to various events defined by the framework. The API allows the application to specify the events and the responses to the events using the application framework. In some implementations, an API call can report to an application the capabilities or state of a hardware device, including those related to aspects such as input capabilities and state, output capabilities and state, processing capability, power state, storage capacity and state, communications capability, etc., and the API may be implemented in part by firmware, microcode, or other low-level logic that executes in part on the hardware component.
The API-calling component may be a local component (i.e., on the same data processing system as the API-implementing component) or a remote component (i.e., on a different data processing system from the API-implementing component) that communicates with the API-implementing component through the API over a network. It should be understood that an API-implementing component may also act as an API-calling component (i.e., it may make API calls to an API exposed by a different API-implementing component), and an API-calling component may also act as an API-implementing component by implementing an API that is exposed to a different API-calling component.
The API may allow multiple API-calling components written in different programming languages to communicate with the API-implementing component (thus the API may include features for translating calls and returns between the API-implementing component and the API-calling component); however, the API may be implemented in terms of a specific programming language. An API-calling component can, in one embodiment, call APIs from different providers, such as a set of APIs from an OS provider, another set of APIs from a plug-in provider, and another set of APIs from a further provider (e.g., the provider of a software library) or the creator of that further set of APIs.
Figure 46 is a block diagram illustrating an example API architecture that may be used in some embodiments of the invention. As shown in Figure 46, the API architecture 4600 includes the API-implementing component 4610 (e.g., an operating system, a library, a device driver, an API, an application, software, or other module) that implements the API 4620. The API 4620 specifies one or more functions, methods, classes, objects, protocols, data structures, formats, and/or other features of the API-implementing component that may be used by the API-calling component 4630. The API 4620 can specify at least one calling convention that specifies how a function in the API-implementing component 4610 receives parameters from the API-calling component 4630 and how the function returns a result to the API-calling component. The API-calling component 4630 (e.g., an operating system, a library, a device driver, an API, an application program, software, or other module) makes API calls through the API 4620 to access and use the features of the API-implementing component 4610 that are specified by the API 4620. The API-implementing component 4610 may return a value through the API 4620 to the API-calling component 4630 in response to an API call.
It will be appreciated that the API-implementing component 4610 may include additional functions, methods, classes, data structures, and/or other features that are not specified through the API 4620 and are not available to the API-calling component 4630. It should be understood that the API-calling component 4630 may be on the same system as the API-implementing component 4610, or may be located remotely and access the API-implementing component 4610 using the API 4620 over a network. While Figure 46 illustrates a single API-calling component 4630 interacting with the API 4620, it should be understood that other API-calling components, which may be written in languages different from (or the same as) the API-calling component 4630, may use the API 4620.
The API-implementing component 4610, the API 4620, and the API-calling component 4630 may be stored in a machine-readable medium, which includes any mechanism for storing information in a form readable by a machine (e.g., a computer or other data processing system). For example, a machine-readable medium includes magnetic disks, optical disks, random access memory, read-only memory, flash memory devices, etc.
Figure 47 is an example of a dual-camera mobile computing device architecture 4700. The implementation of a mobile computing device can include one or more processing units 4705, a memory interface 4710, and a peripherals interface 4715. Each of these components that make up the computing device architecture can be a separate component or integrated into one or more integrated circuits. These various components can also be coupled together by one or more communication buses or signal lines.
The peripherals interface 4715 can be coupled to various sensors and subsystems, including a camera subsystem 4720, a wireless communication subsystem 4725, an audio subsystem 4730, an I/O subsystem 4735, etc. The peripherals interface 4715 enables communication between the processors and the peripherals. Peripherals such as an orientation sensor 4745 or an acceleration sensor 4750 can be coupled to the peripherals interface 4715 to facilitate orientation and acceleration functions.
The camera subsystem 4720 can be coupled to one or more optical sensors 4740, e.g., a charge-coupled device (CCD) optical sensor or a complementary metal-oxide-semiconductor (CMOS) optical sensor. The camera subsystem 4720 coupled with the sensors may facilitate camera functions, such as image and/or video data capturing. The wireless communication subsystem 4725 may serve to facilitate communication functions. The wireless communication subsystem 4725 may include radio frequency receivers and transmitters, and optical receivers and transmitters. They may be implemented to operate over one or more communication networks, such as a GSM network, a Wi-Fi network, a Bluetooth network, etc. The audio subsystem 4730 is coupled to a speaker and a microphone to facilitate voice-enabled functions, such as voice recognition, digital recording, etc.
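The camera subsystem's role in the camera-switching behavior summarized in the abstract — transmit frames from the selected camera, then, on a switch request, stop using the first camera and continue with the second — can be sketched as a minimal state machine. The class and method names here are illustrative assumptions, not an implementation from the patent.

```python
# Minimal sketch, under assumed names, of the switch behavior from the
# abstract: frames from the active camera are transmitted during the
# conference, and a switch request changes which camera is captured.

class DualCameraDevice:
    def __init__(self):
        self.active = "front"   # the first camera is selected initially
        self.transmitted = []   # frames sent to the remote device

    def capture(self):
        # Stand-in for the camera subsystem capturing one frame from
        # the currently active optical sensor.
        return f"frame-from-{self.active}"

    def transmit_frame(self):
        self.transmitted.append(self.capture())

    def switch_camera(self):
        # Stop transmitting from the current camera and continue the
        # conference with the other one.
        self.active = "back" if self.active == "front" else "front"


device = DualCameraDevice()
device.transmit_frame()   # transmitted from the first camera
device.switch_camera()    # user selects the second camera mid-conference
device.transmit_frame()   # now transmitted from the second camera
```

The single transmit path with a swappable capture source mirrors the claim structure: transmission continues uninterrupted while only the image source changes.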
The I/O subsystem 4735 handles the transfer between input/output peripheral devices, such as a display, a touch screen, etc., and the data bus of the CPU through the peripherals interface. The I/O subsystem 4735 can include a touch-screen controller 4755 and other input controllers 4760 to facilitate these functions. The touch-screen controller 4755 can be coupled to the touch screen 4765 and can detect contact and movement on the screen using any of multiple touch-sensitivity technologies. The other input controllers 4760 can be coupled to other input/control devices, such as one or more buttons.
The memory interface 4710 can be coupled to memory 4770, which can include high-speed random access memory and/or non-volatile memory such as flash memory. The memory can store an operating system (OS) 4772. The OS 4772 can include instructions for handling basic system services and for performing hardware-dependent tasks.
The memory can also include communication instructions 4774 to facilitate communicating with one or more additional devices; graphical user interface instructions 4776 to facilitate graphical user interface processing; image/video processing instructions 4778 to facilitate image/video-related processing and functions; phone instructions 4780 to facilitate phone-related processes and functions; media exchange and processing instructions 4782 to facilitate processes and functions related to media communication and media exchange and processing; camera instructions 4784 to facilitate camera-related processes and functions; and video conferencing instructions 4786 to facilitate video conferencing processes and functions. The above-identified instructions need not be implemented as separate software programs or modules. Various functions of the mobile computing device can be implemented in hardware and/or in software, including in one or more signal processing and/or application-specific integrated circuits.
The embodiments described above may include a touch I/O device 4801 that can receive touch input for interacting with computing system 4803, as shown in Figure 48, via wired or wireless communication channel 4802. Touch I/O device 4801 may be used to provide user input to computing system 4803 in lieu of or in combination with other input devices such as a keyboard, a mouse, etc. One or more touch I/O devices 4801 may be used for providing user input to computing system 4803. Touch I/O device 4801 may be an integral part of computing system 4803 (e.g., a touch screen on a laptop) or may be separate from computing system 4803.
Touch I/O device 4801 may include a touch-sensitive panel that is wholly or partially transparent, semitransparent, or opaque, or any combination thereof. Touch I/O device 4801 may be embodied as a touch screen, a touch pad, a touch screen functioning as a touch pad (e.g., a touch screen replacing the touchpad of a laptop), a touch screen or touchpad combined or incorporated with any other input device (e.g., a touch screen or touchpad disposed on a keyboard), or any multi-dimensional object having a touch-sensitive surface for receiving touch input.
In one example, touch I/O device 4801 embodied as a touch screen may include a transparent and/or semitransparent touch-sensitive panel partially or wholly positioned over at least a portion of a display. According to this embodiment, touch I/O device 4801 functions to display graphical data transmitted from computing system 4803 (and/or another source) and also functions to receive user input. In other embodiments, touch I/O device 4801 may be embodied as an integrated touch screen in which touch-sensitive components/devices are integral with display components/devices. In still other embodiments, a touch screen may be used as a supplemental or additional display screen for displaying supplemental graphical data, or the same graphical data as a primary display, and for receiving touch input.
Touch I/O device 4801 may be configured to detect the location of one or more touches or near touches on device 4801 based on capacitive, resistive, optical, acoustic, inductive, mechanical, or chemical measurements, or on any phenomena that can be measured with respect to the occurrence of the one or more touches or near touches in proximity to device 4801. Software, hardware, firmware, or any combination thereof may be used to process the measurements of the detected touches to identify and track one or more gestures. A gesture may correspond to stationary or non-stationary, single or multiple, touches or near touches on touch I/O device 4801. A gesture may be performed by moving one or more fingers or other objects in a particular manner on touch I/O device 4801, such as tapping, pressing, rocking, scrubbing, twisting, changing orientation, or pressing with varying pressure, at essentially the same time, contiguously, or consecutively according to a predetermined pattern. A gesture may be characterized by, but is not limited to, a pinching, sliding, swiping, rotating, flexing, dragging, or tapping motion between or with any other finger or fingers. A single gesture may be performed with one or more hands, by one or more users, or any combination thereof.
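As one concrete (and deliberately simplified) illustration of the gesture tracking described above, a processed touch track can be classified from its geometry alone. The function name and threshold below are assumptions for the example, not part of the specification; real gesture recognizers would also consider timing, pressure, and multiple contacts.

```python
# Illustrative sketch: classify one touch track as a tap or a swipe
# from its start and end positions. The 30-unit threshold is an
# arbitrary assumption for the example.

def classify_gesture(points, swipe_threshold=30):
    """points: list of (x, y) samples of one touch, in order.

    Returns 'swipe' if the touch travelled at least swipe_threshold
    units between its first and last sample, else 'tap'.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "swipe" if dist >= swipe_threshold else "tap"


print(classify_gesture([(10, 10), (12, 11)]))  # a short track: tap
print(classify_gesture([(10, 10), (80, 10)]))  # a long track: swipe
```

The same pattern extends naturally: adding time stamps distinguishes a press from a tap, and tracking two simultaneous touches distinguishes a pinch from a drag.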
Computing system 4803 may drive a display with graphical data to display a graphical user interface (GUI). The GUI may be configured to receive touch input via touch I/O device 4801. Embodied as a touch screen, touch I/O device 4801 may display the GUI. Alternatively, the GUI may be displayed on a display separate from touch I/O device 4801. The GUI may include graphical elements displayed at particular locations within the interface. Graphical elements may include, but are not limited to, a variety of displayed virtual input devices, including virtual scroll wheels, a virtual keyboard, virtual knobs, virtual buttons, any virtual UI, and the like. A user may perform gestures at one or more particular locations on touch I/O device 4801 that are associated with the graphical elements of the GUI. In other embodiments, the user may perform gestures at one or more locations that are independent of the locations of the graphical elements of the GUI. Gestures performed on touch I/O device 4801 may directly or indirectly manipulate, control, modify, move, actuate, initiate, or generally affect graphical elements within the GUI, such as cursors, icons, media files, lists, text, or all or portions of images. For instance, in the case of a touch screen, a user may interact directly with a graphical element by performing a gesture over the graphical element on the touch screen. Alternatively, a touch pad generally provides indirect interaction. Gestures may also affect non-displayed GUI elements (e.g., causing user interfaces to appear) or may affect other actions within computing system 4803 (e.g., affect a state or mode of the GUI, an application, or the operating system). Gestures may or may not be performed on touch I/O device 4801 in conjunction with a displayed cursor. For instance, in the case in which gestures are performed on a touchpad, a cursor (or pointer) may be displayed on a display screen or touch screen, and the cursor may be controlled via touch input on the touchpad to interact with graphical objects on the display screen. In other embodiments in which gestures are performed directly on a touch screen, a user may interact directly with objects on the touch screen, with or without a cursor or pointer being displayed on the touch screen.
Feedback may be provided to the user via communication channel 4802 in response to, or based on, the touches or near touches on touch I/O device 4801. Feedback may be transmitted optically, mechanically, electrically, olfactorily, acoustically, or the like, or any combination thereof, and in a variable or non-variable manner.
The functions described above can be implemented in digital electronic circuitry, in computer software, firmware, or hardware. The techniques can be implemented using one or more computer program products. Programmable processors and computers can be included in or packaged as mobile devices. The processes and logic flows can be performed by one or more programmable processors and by one or more programmable logic circuits. General-purpose and special-purpose computing devices and storage devices can be interconnected through communication networks.
Some embodiments include electronic components, such as microprocessors, storage, and memory, that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as a computer-readable storage medium, machine-readable medium, or machine-readable storage medium). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid-state hard drives, read-only and recordable Blu-ray discs, ultra-density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media may store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
While the above discussion primarily refers to microprocessors or multi-core processors that execute software, some embodiments are performed by one or more integrated circuits, such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself.
As used in this specification and any claims of this application, the terms "computer", "server", "processor", and "memory" all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of the specification, the term "display" means displaying on an electronic device. As used in this specification and any claims of this application, the term "computer-readable medium" is entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
Figure 49 conceptually illustrates an example communication system 4900 used for connecting some participants of a video conference according to some embodiments. As shown, the communication system 4900 includes several mobile devices 4915, several cellular base stations (or Node Bs) 4910, several radio network controllers (RNCs) 4905, and a core network 4925. The cellular base stations and the RNCs are collectively referred to as a Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access Network (UTRAN) 4930. Each RNC 4905 is connected to one or more cellular base stations 4910 that, together, are referred to as a radio access network (RAN).
Each cellular base station 4910 covers a service region 4920. As shown, the mobile devices 4915 in each service region are wirelessly connected to the serving cellular base station 4910 of the service region 4920 through a Uu interface. The Uu interface uses a protocol stack that has two planes: a control plane and a user plane. The user plane supports circuit-switched, packet-switched, and broadcast data streams. The control plane carries the network's signaling messages.
Each cellular base station is connected to an RNC through an Iub interface. Each RNC 4905 is connected to the core network 4925 by Iu-cs and Iu-ps interfaces. The Iu-cs interface is used for circuit-switched services (e.g., voice) while the Iu-ps interface is used for packet-switched services (e.g., data). The Iur interface is used for connecting two RNCs together.
Accordingly, the communication system 4900 supports both circuit-switched services and packet-switched services. For example, circuit-switched services allow a telephone call to be conducted by transmitting the telephone call data (e.g., voice) through the circuit-switched equipment of the communication system 4900. Packet-switched services allow a video conference to be conducted by using a transport protocol layer (such as UDP or TCP) on top of an Internet-layer protocol (such as IP) to transmit video conference data through the packet-switched equipment of the communication system 4900. In some embodiments, the telephone-call-to-video-conference transition (e.g., handoff) described earlier in the video conference setup section uses the circuit-switched and packet-switched services supported by a communication system like the communication system 4900. That is, in such embodiments, the telephone call is conducted through the circuit-switched equipment of the communication system 4900, and the video conference is conducted through the packet-switched equipment of the communication system 4900.
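The call-to-conference handoff just described — tearing down the circuit-switched telephone call and conducting the conference over packet-switched equipment — can be sketched as a simple session-state transition. The field names and structure below are illustrative assumptions; the specification does not prescribe this representation.

```python
# Hedged sketch, under assumed field names, of the transition from a
# circuit-switched telephone call to a packet-switched video conference.

def transition_to_conference(session):
    # End the circuit-switched call and start a packet-switched
    # conference in its place.
    session["call_active"] = False
    session["conference_active"] = True
    session["transport"] = "packet-switched"  # e.g., UDP or TCP over IP
    return session


session = {
    "call_active": True,          # the voice call is in progress
    "conference_active": False,
    "transport": "circuit-switched",
}
session = transition_to_conference(session)
```

In the 4G case discussed later, both legs would already be packet-switched, so only the channel (and not the switching technology) changes.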
Although the example communication system in Figure 49 illustrates a third-generation (3G) technology UTRAN wireless mobile communication system, it should be noted that in some embodiments, second-generation (2G) communication systems, other 3G communication systems such as 3GPP2 Evolution-Data Optimized or Evolution-Data only (EV-DO) and 3rd Generation Partnership Project 2 (3GPP2) Code Division Multiple Access 1X (CDMA 1X), fourth-generation (4G) communication systems, wireless local area networks (WLANs), and Worldwide Interoperability for Microwave Access (WiMAX) communication systems can be used for connecting some of the participants of a conference. Examples of 2G systems include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), and Enhanced Data rates for GSM Evolution (EDGE). A 2G communication system architecture is similar to the architecture shown in Figure 49, except that the 2G communication system architecture uses base transceiver stations (BTSs) instead of Node Bs 4910 and base station controllers (BSCs) instead of RNCs 4905. In a 2G communication system, an A interface between the BSC and the core network is used for circuit-switched services, and a Gb interface between the BSC and the core network is used for packet-switched services.
In some embodiments, the communication system 4900 is operated by a service carrier that initially provisions a mobile device 4915 to allow the mobile device 4915 to use the communication system 4900. Some embodiments provision a mobile device 4915 by configuring and registering a subscriber identity module (SIM) card in the mobile device 4915. In other embodiments, the mobile device 4915 is instead provisioned by configuring and registering the mobile device 4915's memory. Moreover, additional services can be provisioned (after a customer purchases the mobile device 4915), such as data services like GPRS, multimedia messaging service (MMS), and instant messaging. Once provisioned, the mobile device 4915 is activated by the service carrier and is thereby allowed to use the communication system 4900.
The communication system 4900 is a private communication network in some embodiments. In such embodiments, the mobile devices 4915 can communicate (e.g., conduct voice calls, exchange data) with each other (e.g., among mobile devices 4915 that are provisioned for the communication system 4900). In other embodiments, the communication system 4900 is a public communication network. Thus, the mobile devices 4915 can communicate with other devices outside of the communication system 4900 in addition to the mobile devices 4915 provisioned for the communication system 4900. Some of the other devices outside of the communication system 4900 include phones, computers, and other devices that connect to the communication system 4900 through other networks, such as a public switched telephone network or another wireless communication network.
The Long Term Evolution (LTE) specification is used to define 4G communication systems. Figure 50 conceptually illustrates an example of a 4G communication system 5000 that is used for connecting some participants of a video conference in some embodiments. As shown, the communication system 5000 includes several mobile devices 4915, several evolved Node Bs (eNBs) 5005, a Mobility Management Entity (MME) 5015, a Serving Gateway (S-GW) 5020, a Packet Data Network (PDN) Gateway 5025, and a Home Subscriber Server (HSS) 5035. In some embodiments, the communication system 5000 includes one or more MMEs 5015, one or more S-GWs 5020, one or more PDN Gateways 5025, and one or more HSSs 5035.
The eNBs 5005 provide an air interface for the mobile devices 4915. As shown, each eNB 5005 covers a service region 5010. The mobile devices 4915 in each service region 5010 are wirelessly connected to the eNB 5005 of the service region 5010 through an LTE-Uu interface. Figure 50 also shows the eNBs 5005 connected to each other through an X2 interface. In addition, the eNBs 5005 are connected to the MME 5015 through an S1-MME interface and to the S-GW 5020 through an S1-U interface. The eNBs 5005 are collectively referred to as an Evolved UTRAN (E-UTRAN) 5030.
The eNBs 5005 provide functions such as radio resource management (e.g., radio bearer control, connection mobility control, etc.), routing of user plane data towards the S-GW 5020, signal measurement and measurement reporting, and MME selection at the time of mobile device attachment. The functions of the MME 5015 include idle-mode mobile device tracking and paging, activation and deactivation of radio bearers, selection of the S-GW 5020 at the time of mobile device attachment, Non-Access Stratum (NAS) signaling termination, user authentication by interacting with the HSS 5035, etc.
The functions of the S-GW 5020 include (1) routing and forwarding user data packets and (2) managing and storing mobile device contexts, such as parameters of the IP bearer service and network-internal routing information. The functions of the PDN Gateway 5025 include providing connectivity from the mobile devices to external packet data networks (not shown) by being the point of exit and entry of traffic for the mobile devices. A mobile station may have simultaneous connectivity with more than one PDN Gateway for accessing multiple packet data networks. The PDN Gateway 5025 also acts as the anchor for mobility between 3GPP and non-3GPP technologies, such as WiMAX and 3GPP2 (e.g., CDMA 1X and EV-DO).
As shown, the MME 5015 is connected to the S-GW 5020 through an S11 interface and to the HSS 5035 through an S6a interface. The S-GW 5020 and the PDN Gateway 5025 are connected through an S8 interface. The MME 5015, S-GW 5020, and PDN Gateway 5025 are collectively referred to as an Evolved Packet Core (EPC). The EPC is the main component of the System Architecture Evolution (SAE) architecture, which is the core network architecture of the 3GPP LTE wireless communication standard. The EPC is a pure packet system. For example, the EPC does not have a voice media gateway. Services, like voice and SMS, are packet-switched routed and are provided by application functions that make use of the EPC service. So using the telephone-call-to-video-conference transition described above as an example, in some embodiments both the telephone call and the video conference are conducted through the packet-switched equipment of the communication system 5000. In some such embodiments, the packet-switched channel used for the telephone call continues to be used for the audio data of the video conference after the telephone call ends. However, in other such embodiments, a different packet-switched channel is created (e.g., when the video conference is established), and the audio data is transmitted through the newly created packet-switched channel, instead of the packet-switched channel of the telephone call, when the telephone call ends.
Moreover, the amount of bandwidth provided by these different technologies ranges from 44 kilobits per second (kbps) for GPRS to over 10 megabits per second (Mbps) for LTE. Download rates of 100 Mbps and upload rates of 50 Mbps are predicted in the future for LTE.
While the invention has been described with reference to numerous specific details, one of ordinary skill in the art will recognize that the invention can be embodied in other specific forms without departing from the spirit of the invention. In addition, a number of the figures conceptually illustrate processes. The specific operations of these processes may not be performed in the exact order shown and described. Specific operations may not be performed in one continuous series of operations, and different specific operations may be performed in different embodiments. Furthermore, a process could be implemented using several sub-processes, or as part of a larger macro process.
Also, many embodiments were described above by reference to a video conference between two dual-camera mobile devices. However, one of ordinary skill in the art will realize that many of these embodiments may be used in cases involving a video conference between a dual-camera mobile device and another device, such as a single-camera mobile device, a computer, a phone with video conference capability, etc. Moreover, many of the embodiments described above can be used in single-camera mobile devices and other computing devices with video conference capabilities. Thus, one of ordinary skill in the art would understand that the invention is not to be limited by the foregoing illustrative details, but rather is to be defined by the appended claims.

Claims (19)

1. A first mobile device, comprising:
a first camera and a second camera;
means for transmitting images captured by the first camera to a second device during a video conference between the first mobile device and the second device;
means for receiving images captured by a camera of the second device during the video conference;
means for displaying the transmitted images and the received images during the video conference;
means for receiving, during the video conference, a selection of the second camera for capturing images;
means for ending, during the video conference, the transmission of images captured by the first camera and transmitting images captured by the second camera of the first mobile device to the second device; and
means for displaying an animation that provides a visual indication of the switch from transmitting images captured by the first camera to transmitting images captured by the second camera, wherein the animation begins with the display of an image captured by the first camera and ends with the display of an image captured by the second camera, and wherein the first mobile device continuously displays the received images captured by the camera of the second device while the animation is displayed.
2. The first mobile device of claim 1, wherein the means for displaying the images further comprises means for displaying, in a display area of the first mobile device, the images captured by each camera of the first mobile device as those images are transmitted to the second device.
3. The first mobile device of claim 1, wherein the animation comprises displaying the images captured by the first camera on a first side of a virtual viewing pane that appears to rotate out of view, while displaying the images captured by the second camera on a second side of the virtual viewing pane that appears to rotate into view.
4. The first mobile device of claim 2, wherein the display area is a first display area, the first mobile device further comprising:
means for displaying the received images captured by the camera of the second device in a second display area of the first mobile device, wherein the first display area partially overlaps the second display area.
5. The first mobile device of claim 1, further comprising:
means for concurrently displaying, in a display area of the first mobile device, the received images captured by the camera of the second device and the transmitted images captured by the second camera of the first mobile device during the video conference.
6. The first mobile device of claim 5, further comprising means for displaying a selectable UI item for directing the first mobile device to switch cameras during the video conference, while displaying the transmitted images captured by the first camera of the first mobile device and the images captured by the camera of the second device.
7. The first mobile device of claim 1, wherein the means for selecting the second camera comprises means for receiving, through a user interface of the first mobile device, an input directing the first mobile device to select the second camera.
8. The first mobile device of claim 1, wherein the means for selecting the second camera comprises means for receiving, from the second device, a request to switch from the first camera to the second camera.
9. The first mobile device of claim 8, wherein the first mobile device and the second device exchange control messages during the video conference, and the request from the second device is one of the control messages exchanged during the video conference.
10. A first mobile device, comprising:
a first camera and a second camera;
a display screen for:
displaying images captured by one of the cameras of the first mobile device and transmitted to a second device during a real-time video conference session; and
displaying images transmitted by the second device to the first mobile device during the real-time video conference session; and
a video conference module for switching between the first camera and the second camera during the real-time video conference session, and for receiving, from the second device, a request to switch between the cameras of the first mobile device during the real-time video conference session,
wherein in response to the request, the first mobile device ends the transmission of images captured by the one camera of the first mobile device, transmits images captured by the other camera of the first mobile device to the second device, and displays on the display screen an animation that provides a visual indication of the camera switch, and wherein the first mobile device continuously displays the images transmitted by the second device while the animation is displayed.
11. The first mobile device of claim 10, wherein the displayed images captured by either camera of the first mobile device partially overlap the displayed images transmitted by the second device.
12. The first mobile device of claim 10, wherein a selectable UI item for directing the first mobile device to switch cameras during the video conference session is displayed on the display screen along with the images displayed during the video conference session.
13. The first mobile device of claim 10, wherein the video conference module comprises a network manager for receiving, from the second device, the request to switch between the cameras of the first mobile device during the video conference session.
14. The first mobile device of claim 13, wherein the first mobile device and the second device exchange control messages during the video conference session, and the request from the second device is one of the control messages exchanged during the video conference session.
15. The first mobile device of claim 10, wherein the animation comprises a transition from displaying images captured by the one camera to displaying images captured by the other camera.
16. A method of conducting, at a first mobile device comprising a first camera and a second camera, a video conference between the first mobile device and a second device, the method comprising:
displaying images captured by one of the cameras of the first mobile device and transmitted to the second device during a real-time video conference session;
displaying images transmitted by the second device to the first mobile device during the real-time video conference session;
receiving, from the second device, a request to switch between the cameras of the first mobile device during the real-time video conference session;
in response to the received request, displaying a selectable UI item for switching between the cameras of the first mobile device during the real-time video conference session; and
upon receiving a selection of the selectable UI item, ending the transmission of images captured by the one camera of the first mobile device, transmitting images captured by the other camera of the first mobile device to the second device, and displaying at the first mobile device, during the real-time video conference session, an animation that provides a visual indication of the camera switch, wherein the first mobile device continuously displays the images transmitted by the second device while the animation is displayed.
17. The method of claim 16, wherein the animation comprises a transition from displaying images captured by the one camera to displaying images captured by the other camera.
18. The method of claim 16, further comprising concurrently displaying, during the real-time video conference session, a display area for displaying the images and the selectable UI item.
19. The method of claim 16, wherein the first mobile device and the second device exchange control messages during the real-time video conference session, and the request from the second device is one of the control messages exchanged during the real-time video conference session.
CN201010602687.8A 2010-04-07 2010-09-25 Switching cameras during a video conference of a multi-camera mobile device Active CN102215374B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US32187110P 2010-04-07 2010-04-07
US61/321,871 2010-04-07
US12/794,775 US8451994B2 (en) 2010-04-07 2010-06-06 Switching cameras during a video conference of a multi-camera mobile device
US12/794,775 2010-06-06

Publications (2)

Publication Number Publication Date
CN102215374A CN102215374A (en) 2011-10-12
CN102215374B true CN102215374B (en) 2015-09-02

Family

ID=44746474

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010602687.8A Active CN102215374B (en) 2010-04-07 2010-09-25 Switching cameras during a video conference of a multi-camera mobile device

Country Status (1)

Country Link
CN (1) CN102215374B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9954996B2 (en) 2007-06-28 2018-04-24 Apple Inc. Portable electronic device with conversation management for incoming instant messages
CN102957870A (en) * 2012-10-30 2013-03-06 中兴通讯股份有限公司 Photographing method and device
CN103841352B * (2012-11-27 2018-08-10) Lenovo (Beijing) Co Ltd An information processing method and a mobile terminal
CN103281492A (en) * 2013-05-23 2013-09-04 深圳锐取信息技术股份有限公司 Video picture switching method, video picture switching system, recording and broadcasting server and video recording and broadcasting system
CN104469243A (en) * 2013-09-13 2015-03-25 联想(北京)有限公司 Communication method and electronic equipment
US9185062B1 (en) 2014-05-31 2015-11-10 Apple Inc. Message user interfaces for capture and transmittal of media and location content
CN106605201B (en) 2014-08-06 2021-11-23 苹果公司 Reduced size user interface for battery management
CN115665320B (en) 2014-09-02 2024-10-11 苹果公司 Electronic device, storage medium, and method for operating electronic device
EP3189409B1 (en) 2014-09-02 2020-01-29 Apple Inc. Reduced-size interfaces for managing alerts
JP6425573B2 (en) 2015-02-04 2018-11-21 キヤノン株式会社 Electronic device and control method thereof
US10003938B2 (en) 2015-08-14 2018-06-19 Apple Inc. Easy location sharing
CN106713759A * (2016-12-31 2017-05-24) Shenzhen Tinno Wireless Technology Co Ltd Method and system for a camera to capture images intelligently
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
CN113660448B (en) * 2021-08-23 2022-07-15 珠海格力电器股份有限公司 Call processing method, device, terminal equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008040566A1 (en) * 2006-10-04 2008-04-10 Sony Ericsson Mobile Communications Ab An electronic equipment and method in an electronic equipment
CN101296356A (en) * 2007-04-24 2008-10-29 Lg电子株式会社 Video communication terminal and method of displaying images

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6879828B2 (en) * 2002-09-09 2005-04-12 Nokia Corporation Unbroken primary connection switching between communications services
KR100784971B1 (en) * 2006-07-06 2007-12-11 삼성전자주식회사 Upgrade system and method using remote control between portable terminal
KR101433157B1 (en) * 2007-10-26 2014-08-22 삼성전자주식회사 Mobile terminal and method for transmitting image thereof
CN101291379A (en) * 2008-06-05 2008-10-22 中兴通讯股份有限公司 Mobile terminal and picture-phone implementing method thereof

Also Published As

Publication number Publication date
CN102215374A (en) 2011-10-12

Similar Documents

Publication Publication Date Title
CN102215372B (en) Remote control operations in a video conference
CN102215374B (en) Switching cameras during a video conference of a multi-camera mobile device
CN102215373B (en) In-conference display adjustments
CN102215217B (en) Establishing a video conference during a phone call
JP6949917B2 (en) Establishing a video conference during a call
KR101970352B1 (en) Apparatus and method for providing video telephony service, and computer program for executing the method, Apparatus and method for controlling display and computer program for executing the method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1162796

Country of ref document: HK

C14 Grant of patent or utility model
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: GR

Ref document number: 1162796

Country of ref document: HK