CN105141833B - Terminal image pickup method and device - Google Patents
Terminal image pickup method and device
- Publication number
- CN105141833B (application number CN201510429077.5A)
- Authority
- CN
- China
- Prior art keywords
- camera
- image
- terminal
- image information
- shooting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses a terminal image pickup method, comprising: when a terminal opens a multi-camera shooting mode, separately obtaining the image shooting parameters of each camera; and, according to the image shooting parameters of each camera, controlling each camera to obtain image information of the same target object using its corresponding image shooting parameters. The invention also discloses a terminal shooting device. The invention enables multiple cameras to shoot independently, each according to its own different shooting parameters, and to obtain different image information, so that images with different effects can be obtained at the same moment, solving the technical problem that, when a terminal shoots with multiple cameras, it can only shoot with the same shooting parameters.
Description
Technical field
The present invention relates to the technical field of terminal devices, and in particular to a terminal image pickup method and device.
Background technique
At present, most terminal cameras shoot with a single camera: at any given moment, the terminal can take only one shot, according to that camera's preset shooting parameters, and obtain an image under only that one set of parameters. For time-limited shooting scenes such as a meteor shower or a lunar eclipse, if the shooting parameters are set improperly while shooting with a single camera, the result will be poor, and it is impossible to photograph the same target object at the same moment multiple times with different shooting parameters. Likewise, in scenes that require continuous shooting, a user who wants to see the results under different shooting parameters can, on a single-camera terminal, only shoot with one set of parameters and then try another, which is very inconvenient. And when a terminal does shoot with multiple cameras, the cameras all employ mutually coordinated shooting techniques to achieve effects such as a better depth of field or 3D capture; the cameras cannot carry out shooting work independently and can only shoot with the same shooting parameters.
Summary of the invention
The main purpose of the present invention is to provide a terminal image pickup method and device, intended to solve the technical problem that, when a terminal shoots with multiple cameras, it can only shoot with the same shooting parameters.
To achieve the above object, the present invention provides a terminal shooting device, comprising:
a parameter acquisition module, configured to separately obtain the image shooting parameters of each camera when the terminal opens the multi-camera shooting mode;
a control module, configured to control, according to the image shooting parameters of each camera, each camera to obtain image information of the same target object using its corresponding image shooting parameters.
Preferably, the terminal shooting device further comprises:
a judgment module, configured to judge whether the image information captured by each camera needs to be synthesized;
a processing module, configured, if the image information captured by each camera needs to be synthesized, to synthesize the pieces of image information and output a composite image; and otherwise to separately output the image generated from each piece of image information.
Preferably, the control module is further configured to control the cameras to obtain image information of the same target object in turn at preset time intervals.
Preferably, the control module is further configured to control each camera to complete a different number of shots within the same preset time.
Preferably, the terminal shooting device further comprises:
an execution module, configured, when the terminal opens the single-camera shooting mode, to obtain image information of a target object according to the image shooting parameters corresponding to the single camera.
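The two "preferable" behaviors of the control module — cameras firing in turn at a preset interval, and cameras completing different shot counts within the same window — can be sketched as simple schedules. This is a minimal illustration under assumptions of my own (camera identifiers, evenly spaced shots); the patent does not prescribe a scheduling algorithm.

```python
# Hypothetical sketch of the control module's two scheduling variants.

def staggered_capture(cameras, interval):
    """Fire each camera in turn, `interval` time units apart.
    Returns a list of (start_time, camera_id) capture events."""
    return [(i * interval, cam) for i, cam in enumerate(cameras)]

def counted_capture(shot_counts, window):
    """Within one shared window, give each camera its own number of
    evenly spaced shots. `shot_counts` maps camera id -> shot count."""
    schedule = {}
    for cam, n in shot_counts.items():
        schedule[cam] = [window * k / n for k in range(n)]
    return schedule

# Two cameras fire 0.5 time units apart; A takes 2 shots and B takes 4
# within the same 8-unit window.
events = staggered_capture(["A", "B"], interval=0.5)
plan = counted_capture({"A": 2, "B": 4}, window=8.0)
```

The point of both variants is the same as the claim: each camera's timing is derived from its own settings, not from a shared trigger.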
In addition, to achieve the above object, the present invention also provides a terminal image pickup method, comprising:
when the terminal opens the multi-camera shooting mode, separately obtaining the image shooting parameters of each camera;
according to the image shooting parameters of each camera, controlling each camera to obtain image information of the same target object using its corresponding image shooting parameters.
Preferably, after the step of controlling each camera to obtain image information of the same target object using its corresponding image shooting parameters, the method further comprises:
judging whether the image information captured by each camera needs to be synthesized;
if so, synthesizing the pieces of image information to output a composite image;
if not, separately outputting the image generated from each piece of image information.
Preferably, the step of controlling each camera to obtain image information of the same target object using its corresponding image shooting parameters comprises:
controlling the cameras to obtain image information of the same target object in turn at preset time intervals.
Preferably, the step of controlling each camera to obtain image information of the same target object using its corresponding image shooting parameters comprises:
controlling each camera to complete a different number of shots within the same preset time.
Preferably, the terminal image pickup method further comprises:
when the terminal opens the single-camera shooting mode, obtaining image information of a target object according to the image shooting parameters corresponding to the single camera.
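The claimed method — fetch each camera's own parameters, capture the same target independently, then either synthesize or output separately — can be sketched end to end as below. The camera stubs and the per-pixel-average "synthesis" are illustrative assumptions only; the patent does not specify a synthesis algorithm.

```python
# Illustrative sketch of the claimed multi-camera flow. Each "camera" is a
# stub whose captured data is shaped by its own shooting parameters;
# "synthesis" is modeled as a simple per-pixel average (an assumption).

def shoot(cameras, synthesize):
    # Step 1: obtain each camera's image shooting parameters separately.
    params = {cid: cam["params"] for cid, cam in cameras.items()}
    # Step 2: each camera captures the same target with its own parameters.
    images = {cid: cam["capture"](params[cid]) for cid, cam in cameras.items()}
    # Step 3: synthesize into one composite image, or output separately.
    if synthesize:
        frames = list(images.values())
        return [sum(px) / len(frames) for px in zip(*frames)]
    return images

cameras = {
    "A": {"params": {"iso": 100}, "capture": lambda p: [p["iso"]] * 3},
    "B": {"params": {"iso": 400}, "capture": lambda p: [p["iso"]] * 3},
}
composite = shoot(cameras, synthesize=True)   # averaged frame
separate = shoot(cameras, synthesize=False)   # one image per camera
```

The branch on `synthesize` mirrors the judgment step in the claim: the two captures always happen; only the output stage differs.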
In embodiments of the present invention, when shooting with the cameras of a terminal, the multi-camera shooting mode is opened, and image information of the same target object is obtained according to each camera's own image shooting parameters. Multiple cameras thus shoot independently, each with its own different shooting parameters, so that images with different effects can be obtained at the same moment, solving the technical problem that, when a terminal shoots with multiple cameras, it can only shoot with the same shooting parameters.
Detailed description of the invention
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a functional block diagram of a first embodiment of the terminal shooting device of the present invention;
Fig. 4 is a schematic diagram of an interface for configuring the camera parameters of the terminal according to the present invention;
Fig. 5 is a functional block diagram of a second embodiment of the terminal shooting device of the present invention;
Fig. 6 is a flow diagram of a first embodiment of the terminal image pickup method of the present invention;
Fig. 7 is a flow diagram of a second embodiment of the terminal image pickup method of the present invention.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
A mobile terminal implementing embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the description of the invention and have no specific meaning in themselves; "module" and "component" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, laptop computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. Hereinafter it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the constructions according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal having various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in more detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case the broadcast-related information may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H), and the like. The broadcast receiving module 111 can receive signal broadcasts by using various types of broadcast systems. In particular, the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), MediaFLO forward-link-only data broadcasting, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems that provide broadcast signals as well as the above-mentioned digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a Node B, and the like), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal. The module can be internally or externally coupled to the terminal. The wireless Internet access technologies involved in the module may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), the Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal. A typical example of the location information module is a GPS (Global Positioning System) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information to accurately calculate three-dimensional current location information according to longitude, latitude, and altitude. Currently, the method used for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location information in real time.
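As a hedged illustration of that last point — deriving speed by continuously recalculating position — the sketch below differences two timestamped position fixes. The equirectangular small-distance approximation is an assumption made here for brevity; an actual GPS module would use geodetic computations.

```python
import math

# Speed from two consecutive GPS fixes (lat, lon in degrees, t in seconds),
# using an equirectangular approximation valid over short distances.
EARTH_RADIUS_M = 6371000.0

def speed_between(fix1, fix2):
    (lat1, lon1, t1), (lat2, lon2, t2) = fix1, fix2
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(dx, dy) / (t2 - t1)   # meters per second

# Two fixes one second apart; a 0.00001-degree latitude shift is ~1.11 m.
v = speed_between((31.0, 121.0, 0.0), (31.00001, 121.0, 1.0))
```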
The A/V input unit 120 is for receiving audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151, may be stored in the memory 160 (or another storage medium), or may be sent via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 can receive sounds (audio data) via a microphone in operating modes such as a phone call mode, a recording mode, and a speech recognition mode, and can process such sounds into audio data. In the case of the phone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated in the course of sending and receiving audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touchpad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, and the like caused by contact), a jog wheel, a jog switch, and so on. In particular, when the touchpad is superposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying a user's use of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as an "identification device") may take the form of a smart card; the identification device can therefore be connected with the mobile terminal 100 via a port or other connecting means. The interface unit 170 can be used to receive input (for example, data information, electric power, and so on) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, and so on) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to a call or other communication (for example, text messaging, multimedia file downloading, and so on). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured images and/or received images, show the video or image together with a UI or GUI of the related functions, and so on.
Meanwhile when display unit 151 and trigger board the triggering screen superposed on one another with formation in the form of layer, display unit
151 may be used as input unit and output device.Display unit 151 may include liquid crystal display (LCD), thin film transistor (TFT)
In LCD (TFT-LCD), Organic Light Emitting Diode (OLED) display, flexible display, three-dimensional (3D) display etc. at least
It is a kind of.Some in these displays may be constructed such that transparence to allow user to watch from outside, this is properly termed as transparent
Display, typical transparent display can be, for example, TOLED (transparent organic light emitting diode) display etc..According to specific
Desired embodiment, mobile terminal 100 may include two or more display units (or other display devices), for example, moving
Dynamic terminal may include outernal display unit (not shown) and inner display unit (not shown).Triggering screen can be used for detecting triggering
Input pressure and triggering input position and triggering input area.
The audio output module 152 can, when the mobile terminal is in a mode such as a call signal reception mode, a call mode, a recording mode, a speech recognition mode, or a broadcast reception mode, transduce the audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, and so on). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different manners to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations executed by the controller 180, or can temporarily store data that has been output or will be output (for example, a phone book, messages, still images, video, and so on). Moreover, the memory 160 can store data about the vibrations and audio signals of the various modes that are output when the touch screen is touched.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (for example, SD or DX memory and the like), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 can cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 usually controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 can perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described herein can be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments can be implemented in the controller 180. For software implementation, embodiments such as processes or functions can be implemented with separate software modules that allow at least one function or operation to be performed. Software code can be implemented by a software application (or program) written in any appropriate programming language; the software code can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described according to its functions. In the following, for the sake of brevity, a slide-type mobile terminal, among various types of mobile terminals such as folder-type, bar-type, swing-type, and slide-type mobile terminals, will be taken as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 as shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that send data via frames or packets, as well as with satellite-based communication systems.
A communication system in which a mobile terminal according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long-term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
With reference to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also constructed to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointed in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz, and so on).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" can be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, each sector of a specific BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signal sent by the BT 295. In Fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful location information can be obtained with any number of satellites. The GPS module 115 as shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired location information. Instead of, or in addition to, GPS tracking technology, other technologies that can track the position of the mobile terminal can be used. In addition, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
As one typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 usually participate in calls, messaging, and other types of communication. Each reverse-link signal received by a particular base station 270 is processed within that particular BS 270, and the resulting data is forwarded to the relevant BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides the additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward-link signals to the mobile terminals 100.
Based on the above hardware structure of the mobile terminal and the communication system, the embodiments of the method of the present invention are proposed.
As shown in Fig. 3, a first embodiment of the terminal shooting device of the present invention is proposed. The terminal shooting device of this embodiment includes:
a parameter acquisition module 10, configured to separately obtain the image shooting parameters of each camera when the terminal opens the multi-camera shooting mode.
In this embodiment, the type of the terminal can be configured according to actual needs; for example, the terminal can be a mobile phone, a camera, a tablet computer, and so on. The following embodiments take a mobile phone as an example for detailed description. The mobile phone can be preset with a plurality of cameras, and the number of cameras can be set flexibly according to the specific situation; preferably, dual cameras can be provided. This embodiment is described in detail with a mobile phone provided with dual cameras, namely camera A and camera B. The positions at which camera A and camera B are arranged on the mobile phone can be configured according to actual needs. It should be noted that the mobile phone system has a built-in camera drive system unit that can be used to drive camera A and camera B separately, so that the image shooting parameters of camera A and camera B can be set respectively and each camera can shoot independently according to its own image shooting parameters.
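The role of the built-in camera drive system unit — keeping each camera's parameter set separate so the two can shoot independently — might be modeled as below. The parameter fields are examples drawn from the description (ISO, exposure time); the class itself is a hypothetical stand-in for the real driver, not the patented implementation.

```python
# Hypothetical model of the drive unit: one independent parameter record
# per camera, so configuring camera A never disturbs camera B.

class CameraDriver:
    def __init__(self):
        self.params = {}          # camera id -> its own parameter dict

    def configure(self, cam_id, **settings):
        self.params.setdefault(cam_id, {}).update(settings)

    def shoot(self, cam_id):
        # A real driver would program the sensor; here we just echo the
        # parameters the shot would be taken with.
        return dict(self.params.get(cam_id, {}))

driver = CameraDriver()
driver.configure("A", iso=100, exposure_ms=10)
driver.configure("B", iso=800, exposure_ms=40)
shot_a = driver.shoot("A")
shot_b = driver.shoot("B")
```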
When a user shoots with the camera application of the mobile phone, the parameter acquisition module 10 is first called to enter a shooting mode selection interface, where a dual-camera shooting mode is selected, or the dual-camera shooting mode is entered directly according to a default established during first use. In the dual-camera shooting mode, the user can set the shooting parameters of the A camera and the B camera separately and perform independent shooting.
The shooting parameters of the A camera and the B camera may be set in the following ways. 1) Mode one: enter the parameter setting interface of the A camera and set the parameters of the A camera first. For example, parameters such as the sensitivity, exposure time, white balance, and aperture size of camera A may be selected and set; a shooting mode, such as HDR (High-Dynamic Range) or night scene, may also be set; a shooting filter, such as LOMO or skin smoothing, may also be set. After the setting of camera A is completed, the parameter setting interface of the B camera may be entered, and the shooting parameters, shooting mode, or shooting filter of the B camera may be selected and set. Of course, when setting the parameters of the A camera and the B camera, it is also possible to enter directly a parameter setting interface covering both the A camera and the B camera, and set the parameters of the two cameras respectively in the same interface. 2) Mode two: when the user needs to set a certain shooting parameter, the user selects and enters the setting interface for that parameter. For example, the user enters the setting interface for the exposure time, which displays only the exposure time setting of the A camera or the B camera, or displays the exposure time settings of the A camera and the B camera at the same time, for the user to set. After the exposure time is set, the user may choose to enter the setting interface of the next shooting parameter.
An example is given below. Suppose the user needs to set the sensitivity of the A camera and the B camera. The user may choose to enter the sensitivity setting interface, in which the sensitivity parameters of the A camera and the B camera are displayed at the same time, as shown in Fig. 4. The sensitivity of the A camera is set by adjusting the sensitivity indication button corresponding to the A camera (CamA) located above, and the sensitivity of the B camera is set by adjusting the sensitivity indication button corresponding to the B camera (CamB) located below. When the user needs to set the shooting modes of the A camera and the B camera, the user may choose to enter the shooting mode setting interface; for example, the shooting mode setting interface in Fig. 4 displays the shooting modes of the A camera and the B camera at the same time, where six different shooting modes are provided for each of the A camera and the B camera and displayed in the form of icons, for the user to click and select the desired shooting mode. When the user needs to set the shooting filters of the A camera and the B camera, the upper and lower parts of the shooting filter setting interface in Fig. 4 display six different shooting filters for the A camera and for the B camera respectively in the form of icons, for the user to select the desired shooting filter for the A camera or the B camera.
The ways of setting the shooting parameters of the cameras described above are merely examples of specific embodiments; other ways of setting shooting parameters proposed by those skilled in the art also fall within the scope of the present invention.
When the mobile phone starts the camera and enters the multi-camera shooting mode, the parameter acquisition module 10 separately obtains, according to a shooting instruction, the preset image taking parameters of each of the above cameras.
a control module 20, configured to control, according to the image taking parameters of the cameras, each camera to obtain image information of a same target object according to its corresponding image taking parameters.
After the mobile phone enters the multi-camera shooting mode and obtains the shooting parameters of the A camera and the B camera, once the user clicks the shutter or another preset shooting button, the control module 20 controls the A camera and the B camera to shoot independently according to the image taking parameters that have been set, so as to obtain image information of the same target object, which may be static or dynamic.
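The independent triggering of both cameras on one shutter press can be sketched as follows. This is a minimal illustration using a stub capture function; the function names and the use of threads are assumptions, not details given in the patent.

```python
import threading

def shoot(camera_id, params, results):
    # Placeholder capture: a real camera driver call would go here.
    results[camera_id] = f"image({camera_id}, iso={params['iso']})"

def on_shutter(all_params):
    """Trigger every camera independently, each with its own parameters."""
    results, threads = {}, []
    for cam_id, params in all_params.items():
        t = threading.Thread(target=shoot, args=(cam_id, params, results))
        t.start()
        threads.append(t)
    for t in threads:
        t.join()  # wait until every camera has produced its image
    return results

images = on_shutter({"CamA": {"iso": 800}, "CamB": {"iso": 100}})
```

Each camera runs in its own thread, so neither capture blocks or influences the other, which matches the "independent shooting" described above.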
It can be understood that, during shooting with the camera application of the mobile phone, the user may update the shooting parameters that have been set according to personal preference or the specific situation, by re-entering the image taking parameter setting interface and resetting them.
When each camera of the mobile phone shoots according to its corresponding image taking parameters, control may be performed according to the specific parameters. In one embodiment, the control module 20 is further configured to control the cameras to obtain the image information of the same target object in sequence at a preset time interval.
When setting the image taking parameters as described above, the shooting order and shooting interval of the A camera and the B camera may be set. For example, the user may set the A camera and the B camera to shoot at the same time; may set a shooting order in which the A camera shoots first and the B camera shoots afterwards, or the B camera shoots first and the A camera shoots afterwards; or may set the A camera and the B camera to shoot at a preset time interval, that is, to obtain the image information of the same target object in sequence at the preset time interval. The preset time interval may be set flexibly according to the specific situation; for example, after the A camera completes shooting, the B camera shoots 3 s later.
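The sequential shooting at a preset interval can be sketched as a simple loop. The capture callable is a stand-in for the real driver call; the interval value is only an example.

```python
import time

def shoot_in_sequence(camera_ids, interval_s, capture):
    """Capture with each camera in turn, waiting `interval_s` between shots."""
    images = []
    for i, cam_id in enumerate(camera_ids):
        if i > 0:
            time.sleep(interval_s)  # preset interval before the next camera fires
        images.append(capture(cam_id))
    return images

# Example with a stub capture function and a short interval:
shots = shoot_in_sequence(["CamA", "CamB"], 0.01, lambda c: f"shot-{c}")
```

The order of `camera_ids` encodes the shooting order the user configured; an interval of 3.0 would reproduce the "B camera shoots 3 s later" example.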
The camera application of the mobile phone calls the control module 20 to obtain the image information of the same target object according to the configured shooting manner. Whether the two cameras shoot at the same time, in a set order, or at a preset time interval, the shooting is performed independently without mutual interference, and each camera generates its own captured image, which is saved to the gallery.
In another embodiment, the control module 20 is further configured to control the cameras to each complete a different number of shots within a same preset time.
When setting the shooting parameters as described above, the number of repeated shots of the A camera and the B camera within a same preset time may be set separately. For example, the A camera may be set to shoot 10 times in 3 s and the B camera to shoot 30 times in 1 s; the two cameras count independently without mutual interference. It should be noted that, if the captured images of the two cameras are set to be synthesized and the cameras shoot at a preset time interval, then when counting the shots, after the first camera completes a shot and the preset time interval elapses, the other camera shoots, and this pair is counted as the first completed shot. This continues in the same way until the two cameras, shooting in turn, reach the set number of shots. The camera application of the mobile phone calls the control module 20 to control the cameras to complete the set number of shots.
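The per-camera shot counts within one window can be pictured as independent capture schedules. The even spacing below is an illustrative scheduling choice, not something the patent specifies.

```python
def burst_schedule(duration_s, shots_per_camera):
    """Build independent capture timestamps for each camera.

    `shots_per_camera` maps a camera id to its shot count; each camera's
    shots are spread evenly over the same `duration_s` window.
    """
    schedule = {}
    for cam_id, n in shots_per_camera.items():
        step = duration_s / n
        schedule[cam_id] = [round(i * step, 4) for i in range(n)]
    return schedule

# The example from the text: A shoots 10 times, B shoots 30 times.
plan = burst_schedule(3.0, {"CamA": 10, "CamB": 30})
```

Because each camera's list is built from its own count alone, the two counters are independent and do not interfere, as described above.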
In the embodiment of the present invention, the mobile phone can use multiple cameras to shoot independently according to different shooting parameters, so as to obtain different image information of the same object. The mobile phone can thus capture images with different effects at the same moment, which is convenient and saves power, and avoids the trouble of missing the chance to re-shoot when a single shot fails to achieve the expected effect. Meanwhile, the multiple cameras can carry out the shooting work independently without mutual interference, each completing its own parameter setting, shooting, and image saving. This not only breaks the current rule that dual-camera shooting must be performed cooperatively, solving the technical problem that a terminal shooting with multiple cameras can only shoot with the same shooting parameters, but also allows the images captured by the cameras to be synthesized to obtain images with other special effects.
Further, as shown in Fig. 5, based on the above embodiment, the terminal filming apparatus in this embodiment further includes:
a judgment module 30, configured to judge whether the image information obtained by the cameras needs to be synthesized;
a processing module 40, configured to synthesize the image information obtained by the cameras to output a composite image if synthesis is needed, and otherwise to output the image generated from each piece of image information separately.
After the A camera and the B camera obtain the image information of the same target object, the judgment module 30 determines, according to the settings in the image taking parameters, whether the images captured by the A camera and the B camera need to be synthesized.
The setting of whether to synthesize the images of the A camera and the B camera may be made while the shooting parameters are being set as described above. If synthesis of the images was enabled when the shooting parameters were set, the processing module 40 outputs and saves the composite image; if synthesis was not selected when the shooting parameters were set, the processing module 40 outputs and saves the images captured by the A camera and the B camera separately. Alternatively, after the A camera and the B camera complete shooting, the mobile phone interface may pop up a dialog asking whether the images need to be synthesized. If the user selects "Yes", the image information obtained by the A camera and the B camera is synthesized; if the user selects "No", no synthesis is performed, and the images captured by the A camera and the B camera are saved separately. That is, when the two cameras each complete their shooting task, each generates and saves its own photo.
It should be noted that it is also possible to set, in the corresponding setting interface, that when images need to be synthesized, the images captured by the A camera and the B camera are saved separately first and the composite image is then saved. Of course, it is also possible to save only the composite image without saving the images captured by the A camera and the B camera, or to save the image captured by the B camera together with the composite image. When saving images, the saved images may be marked to distinguish which camera captured them and whether they are composite images.
Since the present invention can synthesize images as needed after the A camera and the B camera obtain the image information of the same target object, when the positions of the cameras are set, the A camera and the B camera should be kept on the same horizontal line and as close to each other as possible, so that the deviation between the images obtained by the two cameras is reduced to the point of being negligible and the composite image is more accurate.
An example is given below. Suppose the user sets different exposure times for the A camera and the B camera respectively; the resulting LDR (Low-Dynamic Range) images can then be synthesized into an HDR (High-Dynamic Range) image. If the user sets the display mode of the A camera to translucent and leaves the display mode of the B camera unset, the image captured by the A camera is superimposed on the image captured by the B camera, and the resulting composite is a special-effect image resembling a surrealist picture.
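The two synthesis effects just described can be sketched with toy per-pixel blends. Real HDR fusion is more involved (tone mapping, weighting by well-exposedness); the frames here are flat lists of 0-255 values and the averaging is purely illustrative.

```python
def fuse_exposures(dark, bright, alpha=0.5):
    """Toy HDR-style fusion: blend a short-exposure and a long-exposure
    frame pixel by pixel (frames are flat lists of 0-255 values)."""
    return [round(alpha * d + (1 - alpha) * b) for d, b in zip(dark, bright)]

def overlay_translucent(top, base, opacity=0.5):
    """Superimpose a translucent `top` frame onto `base` (same toy model),
    mimicking the translucent A-over-B special effect."""
    return [round(opacity * t + (1 - opacity) * b) for t, b in zip(top, base)]

cam_a = [10, 40, 200]   # short exposure (darker frame)
cam_b = [90, 160, 250]  # long exposure (brighter frame)
hdr = fuse_exposures(cam_a, cam_b)
```

The small camera baseline discussed above matters here: both blends assume the two frames are pixel-aligned, which holds only when the inter-camera deviation is negligible.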
Further, based on the above embodiment, the terminal filming apparatus in this embodiment further includes:
an execution module, configured to obtain image information of a target object according to image taking parameters corresponding to a single camera when the terminal enables a single-camera shooting mode.
In this embodiment, when the user does not want to shoot with the dual cameras at the same time, the user may choose, in the corresponding setting interface, to enable only one of the cameras for shooting and to disable the other. That is, when the user controls the mobile phone through the execution module to enter the single-camera shooting mode and selects one of the cameras, the mobile phone performs the shooting function according to the image taking parameters corresponding to the selected camera. The user may also modify the image taking parameters of that camera according to the specific situation to obtain the envisioned image. In this embodiment, shooting with a freely chosen camera increases the freedom with which the user shoots with the mobile phone cameras; it is both convenient to use and power-saving.
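The single-camera versus multi-camera choice amounts to deciding which cameras stay powered. A minimal sketch, with the function name and return convention assumed for illustration:

```python
def select_active_cameras(cameras, chosen=None):
    """Return the set of cameras to keep powered on: all of them in
    multi-camera mode, or only `chosen` in single-camera mode
    (the others are shut off to save power)."""
    if chosen is None:
        return set(cameras)  # multi-camera shooting mode
    assert chosen in cameras, "selected camera must exist on the terminal"
    return {chosen}          # single-camera mode

active = select_active_cameras(["CamA", "CamB"], chosen="CamB")
```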
Correspondingly, as shown in Fig. 6, a first embodiment of a terminal image pickup method of the present invention is shown. The terminal image pickup method of this embodiment includes:
Step S10: when the terminal enables a multi-camera shooting mode, separately obtaining the image taking parameters of each camera;
In this embodiment, the type of terminal may be set according to actual needs; for example, the terminal may be a mobile phone, a camera, a tablet computer, or the like. The following embodiments take a mobile phone as an example for detailed description. The mobile phone may be provided with multiple cameras in advance, and the number of cameras may be set flexibly according to the specific situation; preferably, dual cameras may be provided. This embodiment is described in detail by taking a mobile phone provided with dual cameras as an example, the dual cameras being an A camera and a B camera respectively. The positions of the A camera and the B camera on the mobile phone may be set according to actual needs. It should be noted that a camera driving unit is built into the mobile phone system and may be used to drive the A camera and the B camera separately, so that the image taking parameters of the A camera and the B camera can be set respectively, and the two cameras can shoot independently according to their respective image taking parameters.
When a user shoots with the camera application of the mobile phone, a shooting mode selection interface is first entered, where a dual-camera shooting mode is selected, or the dual-camera shooting mode is entered directly according to a default established during first use. In the dual-camera shooting mode, the user can set the shooting parameters of the A camera and the B camera separately and perform independent shooting.
The shooting parameters of the A camera and the B camera may be set in the following ways. 1) Mode one: enter the parameter setting interface of the A camera and set the parameters of the A camera first. For example, parameters such as the sensitivity, exposure time, white balance, and aperture size of camera A may be selected and set; a shooting mode, such as HDR (High-Dynamic Range) or night scene, may also be set; a shooting filter, such as LOMO or skin smoothing, may also be set. After the setting of camera A is completed, the parameter setting interface of the B camera may be entered, and the shooting parameters, shooting mode, or shooting filter of the B camera may be selected and set. Of course, when setting the parameters of the A camera and the B camera, it is also possible to enter directly a parameter setting interface covering both the A camera and the B camera, and set the parameters of the two cameras respectively in the same interface. 2) Mode two: when the user needs to set a certain shooting parameter, the user selects and enters the setting interface for that parameter. For example, the user enters the setting interface for the exposure time, which displays only the exposure time setting of the A camera or the B camera, or displays the exposure time settings of the A camera and the B camera at the same time, for the user to set. After the exposure time is set, the user may choose to enter the setting interface of the next shooting parameter.
An example is given below. Suppose the user needs to set the sensitivity of the A camera and the B camera. The user may choose to enter the sensitivity setting interface, in which the sensitivity parameters of the A camera and the B camera are displayed at the same time, as shown in Fig. 4. The sensitivity of the A camera is set by adjusting the sensitivity indication button corresponding to the A camera (CamA) located above, and the sensitivity of the B camera is set by adjusting the sensitivity indication button corresponding to the B camera (CamB) located below. When the user needs to set the shooting modes of the A camera and the B camera, the user may choose to enter the shooting mode setting interface; for example, the shooting mode setting interface in Fig. 4 displays the shooting modes of the A camera and the B camera at the same time, where six different shooting modes are provided for each of the A camera and the B camera and displayed in the form of icons, for the user to click and select the desired shooting mode. When the user needs to set the shooting filters of the A camera and the B camera, the upper and lower parts of the shooting filter setting interface in Fig. 4 display six different shooting filters for the A camera and for the B camera respectively in the form of icons, for the user to select the desired shooting filter for the A camera or the B camera.
The ways of setting the shooting parameters of the cameras described above are merely examples of specific embodiments; other ways of setting shooting parameters proposed by those skilled in the art also fall within the scope of the present invention.
When the mobile phone starts the camera and enters the multi-camera shooting mode, the mobile phone separately obtains, according to a shooting instruction, the preset image taking parameters of each of the above cameras.
Step S20: controlling, according to the image taking parameters of the cameras, each camera to obtain image information of a same target object according to its corresponding image taking parameters.
After the mobile phone enters the multi-camera shooting mode and obtains the shooting parameters of the A camera and the B camera, once the user clicks the shutter or another preset shooting button, the A camera and the B camera shoot independently according to the image taking parameters that have been set, so as to obtain image information of the same target object, which may be static or dynamic.
It can be understood that, during shooting with the camera application of the mobile phone, the user may update the shooting parameters that have been set according to personal preference or the specific situation, by re-entering the image taking parameter setting interface and resetting them.
When each camera of the mobile phone shoots according to its corresponding image taking parameters, control may be performed according to the specific parameters. In one embodiment, the cameras are controlled to obtain the image information of the same target object in sequence at a preset time interval.
When setting the image taking parameters as described above, the shooting order and shooting interval of the A camera and the B camera may be set. For example, the user may set the A camera and the B camera to shoot at the same time; may set a shooting order in which the A camera shoots first and the B camera shoots afterwards, or the B camera shoots first and the A camera shoots afterwards; or may set the A camera and the B camera to shoot at a preset time interval, that is, to obtain the image information of the same target object in sequence at the preset time interval. The preset time interval may be set flexibly according to the specific situation; for example, after the A camera completes shooting, the B camera shoots 3 s later.
The camera application of the mobile phone obtains the image information of the same target object according to the configured shooting manner. Whether the two cameras shoot at the same time, in a set order, or at a preset time interval, the shooting is performed independently without mutual interference, and each camera generates its own captured image, which is saved to the gallery.
In another embodiment, the cameras are controlled to each complete a different number of shots within a same preset time.
When setting the shooting parameters as described above, the number of repeated shots of the A camera and the B camera within a same preset time may be set separately. For example, the A camera may be set to shoot 10 times in 3 s and the B camera to shoot 30 times in 1 s; the two cameras count independently without mutual interference. It should be noted that, if the captured images of the two cameras are set to be synthesized and the cameras shoot at a preset time interval, then when counting the shots, after the first camera completes a shot and the preset time interval elapses, the other camera shoots, and this pair is counted as the first completed shot. This continues in the same way until the two cameras, shooting in turn, reach the set number of shots. The camera application of the mobile phone controls the cameras to complete the set number of shots.
In the embodiment of the present invention, the mobile phone can use multiple cameras to shoot independently according to different shooting parameters, so as to obtain different image information of the same object. The mobile phone can thus capture images with different effects at the same moment, which is convenient and saves power, and avoids the trouble of missing the chance to re-shoot when a single shot fails to achieve the expected effect. Meanwhile, the multiple cameras can carry out the shooting work independently without mutual interference, each completing its own parameter setting, shooting, and image saving. This not only breaks the current rule that dual-camera shooting must be performed cooperatively, solving the technical problem that a terminal shooting with multiple cameras can only shoot with the same shooting parameters, but also allows the images captured by the cameras to be synthesized to obtain images with other special effects.
Further, as shown in Fig. 7, based on the above embodiment, in this embodiment, the following steps may be included after the above step S20:
Step S30: judging whether the image information obtained by the cameras needs to be synthesized; if so, performing step S40; if not, performing step S50.
Step S40: synthesizing the image information to output a composite image.
Step S50: outputting the image generated from each piece of image information separately.
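Steps S30 to S50 form a simple branch, which can be sketched as follows. The merge function is passed in as a stand-in for whatever synthesis the user configured; names are illustrative.

```python
def handle_captures(images, synthesize, merge=None):
    """Steps S30-S50: either merge all captured frames into one composite
    image, or pass each camera's frame through unchanged."""
    if synthesize:              # step S30 decision
        return [merge(images)]  # step S40: a single composite output
    return list(images)         # step S50: one output image per camera

out = handle_captures(["imgA", "imgB"], synthesize=True,
                      merge=lambda imgs: "+".join(imgs))
```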
After the A camera and the B camera obtain the image information of the same target object, it is determined, according to the settings in the image taking parameters, whether the images captured by the A camera and the B camera need to be synthesized.
The setting of whether to synthesize the images of the A camera and the B camera may be made while the shooting parameters are being set as described above. If synthesis of the images was enabled when the shooting parameters were set, the composite image is output and saved; if synthesis was not selected when the shooting parameters were set, the images captured by the A camera and the B camera are output and saved separately. Alternatively, after the A camera and the B camera complete shooting, the mobile phone interface may pop up a dialog asking whether the images need to be synthesized. If the user selects "Yes", the image information obtained by the A camera and the B camera is synthesized; if the user selects "No", no synthesis is performed, and the images captured by the A camera and the B camera are saved separately. That is, when the two cameras each complete their shooting task, each generates and saves its own photo.
It should be noted that it is also possible to set, in the corresponding setting interface, that when images need to be synthesized, the images captured by the A camera and the B camera are saved separately first and the composite image is then saved. Of course, it is also possible to save only the composite image without saving the images captured by the A camera and the B camera, or to save the image captured by the B camera together with the composite image. When saving images, the saved images may be marked to distinguish which camera captured them and whether they are composite images.
Since the present invention can synthesize images as needed after the A camera and the B camera obtain the image information of the same target object, when the positions of the cameras are set, the A camera and the B camera should be kept on the same horizontal line and as close to each other as possible, so that the deviation between the images obtained by the two cameras is reduced to the point of being negligible and the composite image is more accurate.
An example is given below. Suppose the user sets different exposure times for the A camera and the B camera respectively; the resulting LDR (Low-Dynamic Range) images can then be synthesized into an HDR (High-Dynamic Range) image. If the user sets the display mode of the A camera to translucent and leaves the display mode of the B camera unset, the image captured by the A camera is superimposed on the image captured by the B camera, and the resulting composite is a special-effect image resembling a surrealist picture.
Further, based on the above embodiment, in this embodiment, the terminal image pickup method further includes:
when the terminal enables a single-camera shooting mode, obtaining image information of a target object according to image taking parameters corresponding to the single camera.
In this embodiment, when the user does not want to shoot with the dual cameras at the same time, the user may choose, in the corresponding setting interface, to enable only one of the cameras for shooting and to disable the other. That is, when the user controls the mobile phone to enter the single-camera shooting mode and selects one of the cameras, the mobile phone performs the shooting function according to the image taking parameters corresponding to the selected camera. The user may also modify the image taking parameters of that camera according to the specific situation to obtain the envisioned image. In this embodiment, shooting with a freely chosen camera increases the freedom with which the user shoots with the mobile phone cameras; it is both convenient to use and power-saving.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of the present invention.
Claims (10)
1. A terminal filming apparatus, characterized in that the terminal filming apparatus comprises:
a parameter acquisition module, configured to separately obtain image taking parameters of each camera when a terminal enables a multi-camera shooting mode; and
a control module, configured to control, according to the image taking parameters of the cameras, each camera to obtain image information of a same target object according to its corresponding image taking parameters;
wherein each camera can carry out shooting work independently, each completing image taking parameter setting, shooting, and image saving.
2. The terminal filming apparatus according to claim 1, characterized in that the terminal filming apparatus further comprises:
a judgment module, configured to judge whether the image information obtained by the cameras needs to be synthesized; and
a processing module, configured to synthesize the image information to output a composite image if the image information obtained by the cameras needs to be synthesized, and otherwise to output the image generated from each piece of image information separately.
3. The terminal filming apparatus according to claim 2, characterized in that the control module is further configured to control the cameras to obtain the image information of the same target object in sequence at a preset time interval.
4. The terminal filming apparatus according to any one of claims 1 to 3, characterized in that the control module is further configured to control the cameras to each complete a different number of shots within a same preset time.
5. The terminal filming apparatus according to claim 1, characterized in that the terminal filming apparatus further comprises:
an execution module, configured to obtain image information of a target object according to image taking parameters corresponding to a single camera when the terminal enables a single-camera shooting mode.
6. A terminal image pickup method, characterized in that the terminal image pickup method comprises the following steps:
when a terminal enables a multi-camera shooting mode, separately obtaining image taking parameters of each camera; and
controlling, according to the image taking parameters of the cameras, each camera to obtain image information of a same target object according to its corresponding image taking parameters;
wherein each camera can carry out shooting work independently, each completing image taking parameter setting, shooting, and image saving.
7. The terminal image pickup method according to claim 6, characterized in that, after the controlling, according to the image taking parameters of the cameras, each camera to obtain image information of a same target object according to its corresponding image taking parameters, the method further comprises:
judging whether the image information obtained by the cameras needs to be synthesized;
if so, synthesizing the image information to output a composite image;
if not, outputting the image generated from each piece of image information separately.
8. The terminal image pickup method of claim 7, wherein controlling each camera to acquire image
information of the same target object according to its corresponding image shooting parameters comprises:
controlling the cameras to successively acquire image information of the same target object at a preset time interval.
9. The terminal image pickup method of any one of claims 6-8, wherein controlling each camera to acquire
image information of the same target object according to its corresponding image shooting parameters comprises:
controlling each camera to complete a different number of shots within the same preset time period.
10. The terminal image pickup method of claim 6, wherein the terminal image pickup method further comprises:
when the terminal enters a single-camera shooting mode, acquiring image information of a target object
according to the image shooting parameters corresponding to the single camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510429077.5A CN105141833B (en) | 2015-07-20 | 2015-07-20 | Terminal image pickup method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510429077.5A CN105141833B (en) | 2015-07-20 | 2015-07-20 | Terminal image pickup method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105141833A (en) | 2015-12-09 |
CN105141833B (en) | 2018-12-07 |
Family
ID=54727031
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510429077.5A Active CN105141833B (en) | 2015-07-20 | 2015-07-20 | Terminal image pickup method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105141833B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI573459B (en) | 2016-03-18 | 2017-03-01 | 聚晶半導體股份有限公司 | Multi-camera electronic device and control method thereof |
CN105812677B (en) * | 2016-04-29 | 2019-07-30 | 宇龙计算机通信科技(深圳)有限公司 | A kind of image generating method and system |
CN105872148B (en) * | 2016-06-21 | 2019-05-17 | 维沃移动通信有限公司 | A kind of generation method and mobile terminal of high dynamic range images |
CN106060526A (en) * | 2016-07-04 | 2016-10-26 | 天脉聚源(北京)传媒科技有限公司 | Live broadcast method and device based on two cameras |
CN106161943A (en) * | 2016-07-29 | 2016-11-23 | 维沃移动通信有限公司 | A kind of kinescope method and mobile terminal |
CN106791732A (en) * | 2016-11-30 | 2017-05-31 | 努比亚技术有限公司 | A kind of image processing method and device |
CN106850964A (en) * | 2016-12-27 | 2017-06-13 | 宇龙计算机通信科技(深圳)有限公司 | A kind of multi-cam filming apparatus and its method |
CN107071274B (en) * | 2017-03-13 | 2020-08-21 | 麒和科技(南京)有限公司 | Distortion processing method and terminal |
CN108322670B (en) * | 2018-04-27 | 2019-05-28 | Oppo广东移动通信有限公司 | A kind of control method of multi-camera system, mobile terminal and storage medium |
CN111182093A (en) * | 2018-11-12 | 2020-05-19 | 奇酷互联网络科技(深圳)有限公司 | HDR photographing method based on three cameras, mobile terminal and storage medium |
CN111225126A (en) * | 2018-11-23 | 2020-06-02 | 华为技术有限公司 | Multi-channel video stream generation method and device |
CN110072056B (en) * | 2019-05-10 | 2022-02-01 | 北京迈格威科技有限公司 | Data processing method and device based on multiple camera modules |
CN110049257B (en) | 2019-05-31 | 2021-09-21 | 影石创新科技股份有限公司 | Method for determining semi-synchronous exposure parameters and electronic device |
CN114745508B (en) * | 2022-06-13 | 2023-10-31 | 荣耀终端有限公司 | Shooting method, terminal equipment and storage medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009066788A1 (en) * | 2007-11-22 | 2009-05-28 | Nec Corporation | Imaging device, information processing terminal, mobile telephone, program, and light emission control method |
CN103024272A (en) * | 2012-12-14 | 2013-04-03 | 广东欧珀移动通信有限公司 | Double camera control device, method and system of mobile terminal and mobile terminal |
CN103780840B (en) * | 2014-01-21 | 2016-06-08 | 上海果壳电子有限公司 | Two camera shooting image forming apparatus of a kind of high-quality imaging and method thereof |
CN103763477B (en) * | 2014-02-21 | 2016-06-08 | 上海果壳电子有限公司 | A kind of dual camera claps back focusing imaging device and method |
CN103905731B (en) * | 2014-03-26 | 2017-11-28 | 武汉烽火众智数字技术有限责任公司 | A kind of wide dynamic images acquisition method and system |
CN103986875A (en) * | 2014-05-29 | 2014-08-13 | 宇龙计算机通信科技(深圳)有限公司 | Image acquiring device, method and terminal and video acquiring method |
CN104780324B (en) * | 2015-04-22 | 2016-08-24 | 努比亚技术有限公司 | A kind of method and apparatus of shooting |
- 2015-07-20: CN application CN201510429077.5A filed; patent granted as CN105141833B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN105141833A (en) | 2015-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105141833B (en) | Terminal image pickup method and device | |
CN105404484B (en) | Terminal split screen device and method | |
CN104954689B (en) | A kind of method and filming apparatus that photo is obtained using dual camera | |
CN105120135B (en) | A kind of binocular camera | |
CN105100491B (en) | A kind of apparatus and method for handling photo | |
CN106888349A (en) | A kind of image pickup method and device | |
CN107018331A (en) | A kind of imaging method and mobile terminal based on dual camera | |
CN106791455B (en) | Panorama shooting method and device | |
CN106097284B (en) | A kind of processing method and mobile terminal of night scene image | |
CN106534543B (en) | Television control apparatus, mobile terminal and method | |
CN106686301A (en) | Picture shooting method and device | |
CN105100642B (en) | Image processing method and device | |
CN105472241B (en) | Image split-joint method and mobile terminal | |
CN105979148A (en) | Panoramic photographing device, system and method | |
CN106534552B (en) | Mobile terminal and its photographic method | |
CN106851128A (en) | A kind of video data handling procedure and device based on dual camera | |
CN105430258B (en) | A kind of method and apparatus of self-timer group photo | |
CN106231095B (en) | Picture synthesizer and method | |
CN106686213A (en) | Shooting method and apparatus thereof | |
CN106851113A (en) | A kind of photographic method and mobile terminal based on dual camera | |
CN107018334A (en) | A kind of applied program processing method and device based on dual camera | |
CN106911881A (en) | A kind of an action shot filming apparatus based on dual camera, method and terminal | |
CN106303044B (en) | A kind of mobile terminal and obtain the method to coke number | |
CN105959520B (en) | A kind of photo camera and method | |
CN106993134A (en) | A kind of video generation device and method, terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |