CN105141833A - Terminal photographing method and device

Info

Publication number: CN105141833A (application CN201510429077.5A; granted as CN105141833B)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 马英超, 张登康
Applicant / assignee: Nubia Technology Co Ltd
Legal status: Active (granted)
Prior art keywords: camera, image, terminal, image information, photographing

Landscapes

  • Studio Devices (AREA)

Abstract

The invention discloses a terminal photographing method comprising the steps of: when a terminal starts a multi-camera shooting mode, acquiring the image-capture parameters of each camera separately; and, according to the image-capture parameters of each camera, controlling each camera to acquire image information of the same target object using its own corresponding image-capture parameters. The invention also discloses a terminal photographing device. Multiple cameras thus shoot independently with their own, different capture parameters, so that different image information, and therefore images with different effects, can be obtained at the same moment. This solves the technical problem that a terminal shooting with multiple cameras can only shoot with a single, shared set of capture parameters.

Description

Terminal photographing method and apparatus
Technical field
The present invention relates to the technical field of terminal equipment, and in particular to a terminal photographing method and apparatus.
Background technology
At present, most terminal cameras use a single camera, which at any given moment can only perform a single capture with the parameters preset for that camera, producing an image under one set of capture parameters. For scenes with a strong time element, such as a meteor shower or a lunar eclipse, shooting with a single camera gives a poor result if the capture parameters are set badly, and it is impossible to photograph the target object at the same instant several times with different parameters. Likewise, in scenes that require continuous shooting, a single-camera terminal that wants to compare results under different parameters can only finish shooting with one set of parameters and then try another, which is very inconvenient. Where multiple cameras are used, they all shoot cooperatively in order to achieve effects such as a better depth of field or 3D capture; the cameras cannot work independently and can only shoot with the same capture parameters.
Summary of the invention
The main purpose of the present invention is to provide a terminal photographing method and apparatus, intended to solve the technical problem that a terminal using multiple cameras can only shoot with the same capture parameters.
To achieve the above object, the invention provides a terminal photographing device, comprising:
a parameter acquisition module, configured to acquire the image-capture parameters of each camera separately when the terminal starts the multi-camera shooting mode; and
a control module, configured to control, according to the image-capture parameters of each camera, each camera to acquire image information of the same target object using its own corresponding image-capture parameters.
Preferably, the terminal photographing device further comprises:
a judging module, configured to judge whether the image information captured by each camera needs to be synthesized; and
a processing module, configured to synthesize the pieces of image information into a composite image for output if synthesis is needed, and otherwise to output the image generated from each piece of image information separately.
Preferably, the control module is further configured to control the cameras to acquire image information of the same target object in sequence, separated by a preset time interval.
Preferably, the control module is further configured to control each camera to complete a different number of shots within the same preset time.
Preferably, the terminal photographing device further comprises:
an execution module, configured to acquire image information of the target object according to the image-capture parameters corresponding to a single camera when the terminal starts the single-camera shooting mode.
In addition, to achieve the above object, the invention also provides a terminal photographing method, comprising:
when the terminal starts the multi-camera shooting mode, acquiring the image-capture parameters of each camera separately; and
controlling, according to the image-capture parameters of each camera, each camera to acquire image information of the same target object using its own corresponding image-capture parameters.
Preferably, after the step of controlling each camera to acquire image information of the same target object using its own corresponding image-capture parameters, the method further comprises:
judging whether the image information captured by each camera needs to be synthesized;
if so, synthesizing the pieces of image information to output a composite image;
if not, outputting the image generated from each piece of image information separately.
Preferably, the step of controlling each camera to acquire image information of the same target object using its own corresponding image-capture parameters comprises:
controlling the cameras to acquire image information of the same target object in sequence, separated by a preset time interval.
Preferably, the step of controlling each camera to acquire image information of the same target object using its own corresponding image-capture parameters comprises:
controlling each camera to complete a different number of shots within the same preset time.
Preferably, the terminal photographing method further comprises:
when the terminal starts the single-camera shooting mode, acquiring image information of the target object according to the image-capture parameters corresponding to that single camera.
In the embodiments of the present invention, when shooting with the terminal's cameras, the multi-camera shooting mode is started and each camera acquires image information of the same target object using its own image-capture parameters. Multiple cameras therefore shoot independently with different capture parameters, images with different effects can be obtained at the same moment, and the technical problem that a terminal using multiple cameras can only shoot with the same capture parameters is solved.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a functional block diagram of a first embodiment of the terminal photographing device of the present invention;
Fig. 4 is a schematic diagram of setting camera parameters on a terminal according to the present invention;
Fig. 5 is a functional block diagram of a second embodiment of the terminal photographing device of the present invention;
Fig. 6 is a flow chart of a first embodiment of the terminal photographing method of the present invention;
Fig. 7 is a flow chart of a second embodiment of the terminal photographing method of the present invention.
The realization of the objects, functional characteristics and advantages of the invention are further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.
A mobile terminal implementing embodiments of the present invention is now described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are used only to facilitate the description of the invention and have no special meaning of their own; "module" and "part" may therefore be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. Below, the terminal is assumed to be a mobile terminal. However, those skilled in the art will understand that, apart from elements specifically intended for mobile use, the structure according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, etc., and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it can be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H). The broadcast receiving module 111 can receive broadcasts using various types of broadcast systems. In particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO forward-link media data broadcast system, integrated services digital broadcasting-terrestrial (ISDB-T), and so on. The broadcast receiving module 111 can be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcast systems. The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 can be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal. This module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved in this module may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), the Infrared Data Association (IrDA), ultra wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring the location information of the mobile terminal. A typical example of the location information module is GPS (global positioning system). According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, applies triangulation to the calculated information, and thereby calculates three-dimensional current location information by longitude, latitude and altitude with high accuracy. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using one additional satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location in real time.
The A/V input unit 120 is used to receive audio or video signals. It may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames can be displayed on the display unit 151. The image frames processed by the camera 121 can be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the structure of the mobile terminal. The microphone 122 can receive sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and processes it into audio data. In the phone call mode the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112. The microphone 122 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference produced while receiving and transmitting audio signals.
The user input unit 130 can generate key input data according to commands input by the user, to control the various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component detecting changes in resistance, pressure, capacitance, etc. caused by being touched), a jog wheel, a jog switch, and so on. In particular, when a touch pad is superimposed on the display unit 151 as a layer, a touch screen is formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration and direction of movement of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141, which is described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating a user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, a device with an identification module (hereinafter referred to as an "identification device") may take the form of a smart card, so the identification device can be connected with the mobile terminal 100 via a port or another connecting means. The interface unit 170 can be used to receive input from an external device (for example, data, power, etc.) and transfer the received input to one or more elements within the mobile terminal 100, or to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle can serve as signals for recognizing whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is constructed to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloads, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, a UI or GUI showing the video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays can be constructed to be transparent so that the user can see through them from the outside; these can be called transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. Depending on the desired implementation, the mobile terminal 100 may include two or more display units (or other display means); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in modes such as a call signal receiving mode, a call mode, a recording mode, a voice recognition mode or a broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 can provide audio output related to specific functions performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the user of the occurrence of an event of the mobile terminal 100. Typical events include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different ways to signal the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration: when a call, a message or some other incoming communication is received, the alarm unit 153 can provide tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been or will be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data about the various forms of vibration and audio signals output when the touch screen is touched.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 can cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be built into the controller 180 or configured separately from it. The controller 180 can perform pattern recognition processing so as to recognize handwriting input or drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the elements and components.
The various embodiments described here can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described here can be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described here; in some cases such embodiments can be implemented in the controller 180. For software implementation, embodiments such as processes or functions can be implemented with separate software modules that each perform at least one function or operation. The software code can be implemented by a software application (or program) written in any suitable programming language, and can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal is taken as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 can be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems can use different air interfaces and/or physical layers. For example, the air interfaces used by the communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution, LTE), the global system for mobile communications (GSM), and so on. As a non-limiting example, the description below relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is constructed to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also constructed to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul lines. The backhaul lines can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be appreciated that the system shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 can be constructed to support a plurality of frequency assignments, each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment can be called a CDMA channel. The BS 270 can also be called a base transceiver station (BTS) or another equivalent term. In such a case, the term "base station" can be used to refer broadly to a single BSC 275 and at least one BS 270. A base station can also be called a "cell site". Alternatively, the individual sectors of a particular BS 270 can each be called a cell site.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating in the system. The broadcast receiving module 111 shown in Fig. 1 is arranged at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. In Fig. 2, several global positioning system (GPS) satellites 300 are shown. The satellites 300 help to locate at least one of the plurality of mobile terminals 100.
A plurality of satellites 300 are depicted in Fig. 2, but it will be understood that useful positioning information can be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically constructed to cooperate with the satellites 300 to obtain the desired positioning information. Instead of, or in addition to, GPS tracking technology, other technologies capable of tracking the position of the mobile terminal can be used. In addition, at least one GPS satellite 300 can selectively or additionally handle satellite DMB transmission.
In a typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically participate in calls, messaging and other types of communication. Each reverse-link signal received by a particular BS 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of soft handoff processes between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
Based on the above hardware structure of the mobile terminal and the communication system, the various embodiments of the method of the invention are proposed.
As shown in Fig. 3, a first embodiment of a terminal photographing device of the present invention is proposed. The terminal photographing device of this embodiment comprises:
a parameter acquisition module 10, configured to acquire the image-capture parameters of each camera separately when the terminal starts the multi-camera shooting mode.
In this embodiment, the type of terminal can be chosen according to actual needs; for example, the terminal may be a mobile phone, a camera, a tablet computer, etc. The following embodiments are described in detail taking a mobile phone as the example. The mobile phone is provided in advance with a plurality of cameras, and the number of cameras can be set flexibly according to the specific situation; preferably, dual cameras are provided. This embodiment is described in detail for a mobile phone with dual cameras, namely camera A and camera B. The mounting positions of camera A and camera B on the mobile phone can be set according to actual needs. It should be noted that the mobile phone system has a built-in camera drive unit that can drive camera A and camera B separately, so that the image-capture parameters of camera A and camera B can be set separately and each camera can shoot independently according to its own image-capture parameters.
When the user shoots with the mobile phone's camera application, the parameter acquisition module 10 is first invoked to enter the shooting-mode selection interface and select the dual-camera shooting mode, or the dual-camera shooting mode is entered directly according to the default setting from the previous use. In the dual-camera shooting mode, the user can set the capture parameters of camera A and camera B separately, and the two cameras shoot independently.
The capture parameters of camera A and camera B may be set in the following ways. 1) Mode one: enter the parameter setting interface of camera A and set the parameters of camera A first. For example, parameters of camera A such as sensitivity (ISO), exposure time, white balance and aperture size can be set; a shooting mode such as HDR (high dynamic range) or night scene can be set; and a shooting filter such as LOMO or skin smoothing can also be set. After camera A has been set, the parameter setting interface of camera B can be entered to set the capture parameters, shooting mode or shooting filter of camera B. Of course, a parameter setting interface covering both camera A and camera B may also be entered directly, so that the parameters of camera A and camera B are set in the same interface. 2) Mode two: when the user needs to set a particular capture parameter, the setting interface of that parameter is entered; for example, the exposure-time setting interface is entered, and that interface shows only the exposure-time parameter of camera A or camera B, or shows the exposure-time parameters of camera A and camera B at the same time, for the user to set. After the exposure-time setting is completed, the setting interface of the next capture parameter can be entered.
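As a concrete illustration of how per-camera settings such as those of mode one and mode two might be held in software, the following Kotlin sketch models one parameter set per camera; all class, field and camera-id names are assumptions for illustration and are not defined by the patent.

```kotlin
// Hypothetical per-camera capture settings; names are illustrative, not from the patent.
enum class SceneMode { NORMAL, HDR, NIGHT }
enum class Filter { NONE, LOMO, SKIN_SMOOTHING }

data class CaptureParameters(
    val iso: Int,                 // sensitivity
    val exposureTimeMs: Long,     // exposure time
    val whiteBalanceK: Int,       // white balance (colour temperature in kelvin)
    val apertureF: Double,        // aperture size (f-number)
    val sceneMode: SceneMode = SceneMode.NORMAL,
    val filter: Filter = Filter.NONE
)

// Settings store keyed by camera id ("A", "B"). Mode one replaces one camera's
// whole parameter set; mode two adjusts a single field for one camera.
class CameraSettings {
    private val params = mutableMapOf(
        "A" to CaptureParameters(iso = 100, exposureTimeMs = 10, whiteBalanceK = 5500, apertureF = 2.0),
        "B" to CaptureParameters(iso = 800, exposureTimeMs = 100, whiteBalanceK = 5500, apertureF = 2.0)
    )

    fun get(cameraId: String): CaptureParameters = params.getValue(cameraId)

    // Mode one: set the full parameter set of one camera.
    fun setAll(cameraId: String, p: CaptureParameters) { params[cameraId] = p }

    // Mode two: adjust a single parameter (here, ISO) for one camera.
    fun setIso(cameraId: String, iso: Int) {
        params[cameraId] = params.getValue(cameraId).copy(iso = iso)
    }
}
```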
To illustrate with an example: suppose the user needs to set the sensitivity of camera A and camera B. The user can enter the dedicated sensitivity setting interface, which shows the sensitivity parameters of camera A and camera B at the same time, as shown in Fig. 4. The sensitivity of camera A can be set by adjusting the sensitivity control corresponding to camera A (CamA) at the top, and the sensitivity of camera B can be set by adjusting the sensitivity control corresponding to camera B (CamB) at the bottom. When the user needs to set the shooting modes of camera A and camera B, the user can enter the dedicated shooting-mode setting interface; as in the shooting-mode setting interface in Fig. 4, the shooting modes of camera A and camera B are shown at the same time, each offering six different shooting modes displayed as icons for the user to tap and select. When the user needs to set the shooting filters of camera A and camera B, the upper and lower parts of the dedicated filter setting interface in Fig. 4 show, in icon form, six different shooting filters for camera A and for camera B respectively, for the user to select the required filter for camera A or camera B.
The above ways of setting the capture parameters of the cameras are merely examples of the embodiment; other ways of setting the capture parameters proposed by those skilled in the art all fall within the protection scope of the present invention.
When the mobile phone starts the camera and enters the multi-camera shooting mode, the parameter acquisition module 10 acquires, according to the shooting instruction, the preset image-capture parameters of each of the above cameras respectively.
A control module 20 is configured to control, according to the image-capture parameters of each camera, each camera to acquire image information of the same target object using its own corresponding image-capture parameters.
After the mobile phone has entered the multi-camera shooting mode and acquired the capture parameters of camera A and camera B, and after the user presses the shutter or another button preset to trigger shooting, the control module 20 controls camera A and camera B to shoot independently according to the image-capture parameters that have been set, so as to acquire image information of the same target object. The target object may be static or dynamic.
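A minimal sketch of how the control module's behaviour could be realized is given below. It assumes a hypothetical Camera driver interface (the patent does not define one) that applies a parameter set and returns image data, and it reuses the CaptureParameters and CameraSettings types from the sketch above; each camera is driven on its own thread so the captures stay independent.

```kotlin
import kotlin.concurrent.thread
import java.util.concurrent.ConcurrentHashMap

// Assumed driver-level abstraction; not specified by the patent.
interface Camera {
    val id: String
    fun applyParameters(p: CaptureParameters)   // CaptureParameters as sketched earlier
    fun capture(): ByteArray                     // encoded image data of the target object
}

// Control module: each camera shoots the same target independently,
// using its own parameters, at (approximately) the same time.
fun captureIndependently(cameras: List<Camera>, settings: CameraSettings): Map<String, ByteArray> {
    val results = ConcurrentHashMap<String, ByteArray>()
    val workers = cameras.map { cam ->
        thread {
            cam.applyParameters(settings.get(cam.id))  // per-camera parameters
            results[cam.id] = cam.capture()            // independent shot for this camera
        }
    }
    workers.forEach { it.join() }
    return results
}
```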
It will be understood that, while shooting with the mobile phone's camera application, the user can update the set capture parameters according to personal preference or the specific situation, i.e. re-enter the image-capture parameter setting interface and set them again.
When each camera of the mobile phone shoots according to its own image-capture parameters, the control can be carried out according to the specific parameters. In one embodiment, the control module 20 is further configured to control the cameras to acquire image information of the same target object in sequence, separated by a preset time interval.
When the image-capture parameters are set as described above, a shooting order and a shooting interval can be configured for camera A and camera B. For example, the user can set camera A and camera B to shoot simultaneously; camera A can be set to shoot first and camera B afterwards, or the order camera B then camera A can be used; or camera A and camera B can be set to shoot at a preset time interval, i.e. the image information of the same target object is acquired in sequence at a preset interval. The preset interval can be set flexibly according to the specific situation, for example camera B shoots 3 s after camera A has finished shooting.
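Under the same assumed Camera interface and CameraSettings store from the earlier sketches, the ordering options described here (simultaneous, A then B, B then A, or a preset delay between shots) might be scheduled roughly as follows; the 3-second figure is only the example given above.

```kotlin
// Shoot the cameras in a configured order, waiting a preset interval between shots.
// Assumes the hypothetical Camera interface and CameraSettings sketched earlier.
fun captureInSequence(order: List<Camera>, settings: CameraSettings, intervalMs: Long): Map<String, ByteArray> {
    val results = linkedMapOf<String, ByteArray>()
    order.forEachIndexed { index, cam ->
        if (index > 0) Thread.sleep(intervalMs)        // e.g. 3000 ms after the previous camera finishes
        cam.applyParameters(settings.get(cam.id))
        results[cam.id] = cam.capture()
    }
    return results
}

// Example (hypothetical objects): camera A first, camera B 3 s later.
// captureInSequence(listOf(cameraA, cameraB), settings, intervalMs = 3_000)
```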
The mobile phone's camera application invokes the control module 20 to acquire the image information of the same target object according to the configured shooting scheme: the two cameras shoot simultaneously, or in a set order, or at the preset time interval. The shots are carried out independently without interfering with each other, and each camera generates its own captured image and saves it in the gallery.
In another embodiment, the control module 20 is further configured to control each camera to complete a different number of shots within the same preset time.
When the capture parameters are set as described above, the number of repeated shots within the same preset time can be set for camera A and camera B separately. For example, camera A can be set to shoot 10 times within 3 s and camera B to shoot 30 times within 1 s; the two cameras count independently and do not interfere with each other. It should be noted that, if the images captured by the two cameras are set to be synthesized and shooting proceeds at a preset time interval, then when counting the shots, after the first camera completes a shot and the preset interval has elapsed, the other camera shoots and the first shot is counted; and so on, until the two cameras have alternately shot the set number of times. The mobile phone's camera application invokes the control module 20 to make the cameras complete the configured number of shots.
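A rough sketch of the independent shot counting described here (e.g. camera A 10 shots in 3 s, camera B 30 shots in 1 s), again under the assumed Camera interface from the earlier sketches; each camera spaces its shots evenly across its own time window and keeps its own count.

```kotlin
import kotlin.concurrent.thread

// One camera completes its own number of shots within its own preset time window,
// counting independently of the other camera.
fun burst(cam: Camera, p: CaptureParameters, shots: Int, windowMs: Long): List<ByteArray> {
    cam.applyParameters(p)
    val spacing = windowMs / shots                    // even spacing inside the window
    return (1..shots).map { i ->
        val frame = cam.capture()
        if (i < shots) Thread.sleep(spacing)
        frame
    }
}

// Example with the hypothetical figures from the text: A shoots 10 times in 3 s, B 30 times in 1 s.
fun runBursts(camA: Camera, camB: Camera, settings: CameraSettings) {
    val tA = thread { burst(camA, settings.get("A"), shots = 10, windowMs = 3_000) }
    val tB = thread { burst(camB, settings.get("B"), shots = 30, windowMs = 1_000) }
    tA.join(); tB.join()
}
```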
In the embodiments of the present invention, the mobile phone can shoot independently with multiple cameras using different capture parameters, obtaining different image information of the same target. The mobile phone can thus obtain images with different effects at the same moment, which is convenient and saves power, and avoids the trouble of missing the moment when a single shot does not achieve the intended effect. At the same time, the multiple cameras can shoot independently without interfering with each other, each completing its own parameter setting, shooting and image saving. This not only breaks the current rule that dual-camera shooting must be cooperative, solving the technical problem that a terminal using multiple cameras can only shoot with the same capture parameters, but also allows the images captured by the cameras to be synthesized, so that images with other special effects can be obtained.
Further, as shown in Fig. 5, based on the above embodiment, in this embodiment the terminal photographing device further comprises:
a judging module 30, configured to judge whether the image information captured by each camera needs to be synthesized; and
a processing module 40, configured to synthesize the pieces of image information into a composite image for output if synthesis is needed, and otherwise to output the image generated from each piece of image information separately.
After camera A and camera B have acquired the image information of the same target object, the judging module 30 judges, according to the image-capture parameter settings, whether the images captured by camera A and camera B need to be synthesized.
Whether the images of camera A and camera B are to be synthesized can be configured when the capture parameters are set as described above. If synthesis was selected when the capture parameters were set, the processing module 40 outputs and saves the composite image; if synthesis was not selected, the processing module 40 outputs and saves the images captured by camera A and camera B separately. Alternatively, after camera A and camera B finish shooting, the phone interface may pop up a dialog asking whether the images should be synthesized; if the user selects "Yes", the image information obtained by camera A and camera B is synthesized; if the user selects "No", the images are not synthesized and the images captured by camera A and camera B are saved separately. That is, when the two cameras have each completed their shooting task, each generates and saves its own photo.
It should be noted that, when the images are to be synthesized, it can also be configured in a dedicated setting interface that the images captured by camera A and camera B are saved separately and the composite image is saved as well. Of course, it can also be configured that the images captured by camera A and camera B are not saved and only the composite image is saved, or that the image captured by camera B is saved together with the composite image. When saving images, the saved images can be labelled so as to distinguish which camera captured each image and whether an image is a composite.
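The judging and processing behaviour (synthesize or output separately, with labels recording the source camera or composite status) could be sketched as follows. The function and type names are assumptions, and the merge function is left as a placeholder since the patent does not fix a particular synthesis algorithm.

```kotlin
// Output policy after both cameras have shot; names and the merge placeholder are assumptions.
data class SavedImage(val label: String, val data: ByteArray)

fun processCaptures(
    captures: Map<String, ByteArray>,           // camera id -> image data
    synthesize: Boolean,                        // from the parameter settings or the pop-up dialog
    keepOriginals: Boolean = true,              // optionally also keep the per-camera images
    merge: (Collection<ByteArray>) -> ByteArray // synthesis algorithm, left open here
): List<SavedImage> {
    val out = mutableListOf<SavedImage>()
    if (!synthesize || keepOriginals) {
        captures.forEach { (id, data) -> out += SavedImage("camera-$id", data) }
    }
    if (synthesize) {
        out += SavedImage("composite", merge(captures.values))
    }
    return out   // each entry is labelled so the gallery can tell the images apart
}
```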
Since the invention can synthesize the images as needed after camera A and camera B have acquired the image information of the same target object, camera A and camera B should be placed on the same horizontal line and as close together as possible when their positions are set, so that the deviation between the images obtained by the two cameras is reduced and can even be neglected, making the composite image more accurate.
To give an example: suppose the user sets different exposure times for camera A and camera B so that each captures an LDR (low dynamic range) image; these can then be synthesized into an HDR (high dynamic range) image. If the user sets the display mode of camera A to semi-transparent and leaves the display mode of camera B unchanged, the image captured by camera A is superimposed on the image captured by camera B to obtain the composite image; in this way images with special effects can be synthesized, giving a somewhat surrealistic-looking image.
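As one concrete, simplified illustration of the overlay case, the sketch below alpha-blends camera A's frame (treated as semi-transparent) over camera B's frame. It is not an HDR algorithm; real HDR synthesis from two LDR exposures would use a proper exposure-fusion method, which the patent does not specify.

```kotlin
// Simplified composite: overlay frame A (semi-transparent) on frame B.
// Frames are assumed to be same-sized ARGB pixel buffers.
fun overlay(frameA: IntArray, frameB: IntArray, alphaA: Double = 0.5): IntArray {
    require(frameA.size == frameB.size) { "frames must have the same dimensions" }
    return IntArray(frameA.size) { i ->
        val a = frameA[i]
        val b = frameB[i]
        fun blend(shift: Int): Int {
            val ca = (a shr shift) and 0xFF
            val cb = (b shr shift) and 0xFF
            return (ca * alphaA + cb * (1 - alphaA)).toInt() and 0xFF
        }
        (0xFF shl 24) or (blend(16) shl 16) or (blend(8) shl 8) or blend(0)
    }
}
```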
Further, based on the above embodiments, in this embodiment the terminal photographing device also comprises:
an execution module, configured to acquire image information of the target object according to the image-capture parameters corresponding to a single camera when the terminal starts the single-camera shooting mode.
In this embodiment, when the user does not want to shoot with both cameras at the same time, the user can choose, in a dedicated setting interface, to open only one of the cameras for shooting and close the other. That is, when the user controls the mobile phone, via the execution module, to enter shooting mode, open the single-camera shooting mode and select one of the cameras, the mobile phone performs the shooting function according to the image-capture parameters corresponding to the selected camera. The user can also modify the image-capture parameters of this camera according to the specific situation, so as to obtain the intended image. By allowing the camera to be chosen freely, this embodiment increases the user's freedom when shooting with the phone's cameras; it is not only convenient but also saves power.
Correspondingly, as shown in Fig. 6, a first embodiment of a terminal photographing method of the present invention is shown. The terminal photographing method of this embodiment comprises:
Step S10: when the terminal starts the multi-camera shooting mode, acquire the image-capture parameters of each camera separately.
In this embodiment, the type of terminal can be chosen according to actual needs; for example, the terminal may be a mobile phone, a camera, a tablet computer, etc. The following embodiments are described in detail taking a mobile phone as the example. The mobile phone is provided in advance with a plurality of cameras, and the number of cameras can be set flexibly according to the specific situation; preferably, dual cameras are provided. This embodiment is described in detail for a mobile phone with dual cameras, namely camera A and camera B. The mounting positions of camera A and camera B on the mobile phone can be set according to actual needs. It should be noted that the mobile phone system has a built-in camera drive unit that can drive camera A and camera B separately, so that the image-capture parameters of camera A and camera B can be set separately and each camera can shoot independently according to its own image-capture parameters.
When the user shoots with the mobile phone's camera application, the shooting-mode selection interface is first entered and the dual-camera shooting mode is selected, or the dual-camera shooting mode is entered directly according to the default setting from the previous use. In the dual-camera shooting mode, the user can set the capture parameters of camera A and camera B separately, and the two cameras shoot independently.
The capture parameters of camera A and camera B may be set in the following ways. 1) Mode one: enter the parameter setting interface of camera A and set the parameters of camera A first. For example, parameters of camera A such as sensitivity (ISO), exposure time, white balance and aperture size can be set; a shooting mode such as HDR (high dynamic range) or night scene can be set; and a shooting filter such as LOMO or skin smoothing can also be set. After camera A has been set, the parameter setting interface of camera B can be entered to set the capture parameters, shooting mode or shooting filter of camera B. Of course, a parameter setting interface covering both camera A and camera B may also be entered directly, so that the parameters of camera A and camera B are set in the same interface. 2) Mode two: when the user needs to set a particular capture parameter, the setting interface of that parameter is entered; for example, the exposure-time setting interface is entered, and that interface shows only the exposure-time parameter of camera A or camera B, or shows the exposure-time parameters of camera A and camera B at the same time, for the user to set. After the exposure-time setting is completed, the setting interface of the next capture parameter can be entered.
To illustrate with an example: suppose the user needs to set the sensitivity of camera A and camera B. The user can enter the dedicated sensitivity setting interface, which shows the sensitivity parameters of camera A and camera B at the same time, as shown in Fig. 4. The sensitivity of camera A can be set by adjusting the sensitivity control corresponding to camera A (CamA) at the top, and the sensitivity of camera B can be set by adjusting the sensitivity control corresponding to camera B (CamB) at the bottom. When the user needs to set the shooting modes of camera A and camera B, the user can enter the dedicated shooting-mode setting interface; as in the shooting-mode setting interface in Fig. 4, the shooting modes of camera A and camera B are shown at the same time, each offering six different shooting modes displayed as icons for the user to tap and select. When the user needs to set the shooting filters of camera A and camera B, the upper and lower parts of the dedicated filter setting interface in Fig. 4 show, in icon form, six different shooting filters for camera A and for camera B respectively, for the user to select the required filter for camera A or camera B.
The above ways of setting the capture parameters of the cameras are merely examples of the embodiment; other ways of setting the capture parameters proposed by those skilled in the art all fall within the protection scope of the present invention.
When the mobile phone starts the camera and enters the multi-camera shooting mode, the mobile phone acquires, according to the shooting instruction, the preset image-capture parameters of each of the above cameras respectively.
Step S20: according to the image-capture parameters of each camera, control each camera to acquire image information of the same target object using its own corresponding image-capture parameters.
After the mobile phone has entered the multi-camera shooting mode and acquired the capture parameters of camera A and camera B, and after the user presses the shutter or another button preset to trigger shooting, camera A and camera B shoot independently according to the image-capture parameters that have been set, so as to acquire image information of the same target object. The target object may be static or dynamic.
It will be understood that, while shooting with the mobile phone's camera application, the user can update the set capture parameters according to personal preference or the specific situation, i.e. re-enter the image-capture parameter setting interface and set them again.
When each camera of the mobile phone shoots according to its own image-capture parameters, the control can be carried out according to the specific parameters. In one embodiment, the cameras are controlled to acquire image information of the same target object in sequence, separated by a preset time interval.
When the image-capture parameters are set as described above, a shooting order and a shooting interval can be configured for camera A and camera B. For example, the user can set camera A and camera B to shoot simultaneously; camera A can be set to shoot first and camera B afterwards, or the order camera B then camera A can be used; or camera A and camera B can be set to shoot at a preset time interval, i.e. the image information of the same target object is acquired in sequence at a preset interval. The preset interval can be set flexibly according to the specific situation, for example camera B shoots 3 s after camera A has finished shooting.
The mobile phone's camera application acquires the image information of the same target object according to the configured shooting scheme: the two cameras shoot simultaneously, or in a set order, or at the preset time interval. The shots are carried out independently without interfering with each other, and each camera generates its own captured image and saves it in the gallery.
In another embodiment, each camera is controlled to complete a different number of shots within the same preset time.
When the capture parameters are set as described above, the number of repeated shots within the same preset time can be set for camera A and camera B separately. For example, camera A can be set to shoot 10 times within 3 s and camera B to shoot 30 times within 1 s; the two cameras count independently and do not interfere with each other. It should be noted that, if the images captured by the two cameras are set to be synthesized and shooting proceeds at a preset time interval, then when counting the shots, after the first camera completes a shot and the preset interval has elapsed, the other camera shoots and the first shot is counted; and so on, until the two cameras have alternately shot the set number of times. The mobile phone's camera application controls the cameras to complete the configured number of shots.
In the embodiments of the present invention, the mobile phone can shoot independently with multiple cameras using different capture parameters, obtaining different image information of the same target. The mobile phone can thus obtain images with different effects at the same moment, which is convenient and saves power, and avoids the trouble of missing the moment when a single shot does not achieve the intended effect. At the same time, the multiple cameras can shoot independently without interfering with each other, each completing its own parameter setting, shooting and image saving. This not only breaks the current rule that dual-camera shooting must be cooperative, solving the technical problem that a terminal using multiple cameras can only shoot with the same capture parameters, but also allows the images captured by the cameras to be synthesized, so that images with other special effects can be obtained.
Further, as shown in Fig. 7, based on the above embodiment, in this embodiment the following may be included after step S20:
Step S30: judge whether the image information captured by each camera needs to be synthesized; if so, perform step S40; if not, perform step S50.
Step S40: synthesize the pieces of image information to output a composite image.
Step S50: output the image generated from each piece of image information separately.
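Pulling steps S10 through S50 together, a high-level sketch of the whole method might look like the following. It reuses the hypothetical Camera, CameraSettings, captureIndependently and processCaptures pieces sketched in the device embodiment above, so none of these names come from the patent itself.

```kotlin
// End-to-end flow: S10 read per-camera parameters, S20 capture the same target
// independently, S30 decide whether to synthesize, S40/S50 output accordingly.
fun terminalPhotographingMethod(
    cameras: List<Camera>,
    settings: CameraSettings,                    // S10: per-camera image-capture parameters
    synthesize: Boolean,                         // S30: from the settings or the pop-up dialog
    merge: (Collection<ByteArray>) -> ByteArray  // synthesis algorithm, left open
): List<SavedImage> {
    // S20: each camera captures the same target independently with its own parameters.
    val captures = captureIndependently(cameras, settings)
    // S30-S50: output one composite image, or output each camera's image separately.
    return processCaptures(captures, synthesize, merge = merge)
}
```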
After above-mentioned A camera and B camera obtain the image information of same destination object, according in the arranging of image taking parameter, judge the need of synthesis the need of the image obtained captured by A camera and B camera.
In the setting of whether synthesizing A camera and B camera, can be above-mentioned acquisition parameters is arranged time arrange, be provided with when carrying out acquisition parameters and arranging if above-mentioned and image synthesized, then now exported and preserve composograph; Do not select when carrying out acquisition parameters and arranging to synthesize image if above-mentioned, then now export respectively and preserve A camera and the image captured by B camera.Certainly, also can being that mobile phone interface will eject the interface the need of synthesizing image after A camera and B camera complete shooting, when user selects "Yes", then the image information that A camera and B camera obtain being synthesized; Otherwise if select "No", then do not synthesize image, now, preservation A camera and B camera take the image obtained respectively.Namely, when two cameras all complete respective shooting task, each self-generating photo is also preserved.
It should be noted that, when needs synthesize image, also can arranging and preserve A camera and B camera at the interface that arranges of specifying respectively and take the image obtained, then preserve the image of synthesis.Certainly, also can be set to not preserve A camera and B camera takes the image obtained, only preserve the image after synthesis; Or, preserve B camera take the image obtained synthesize with preservation after image.When carrying out Image Saving, can arrange and be marked by preserved image, be which camera to take image that is that obtain or that be whether synthesis by distinguish.
Since the present invention may synthesize the images after camera A and camera B obtain image information of the same target object, the two cameras should be mounted on the same horizontal line and placed as close together as possible, so that the deviation between the images obtained by the two cameras is reduced to the point of being negligible, making the synthesized image more accurate.
To illustrate: suppose the user sets camera A and camera B to capture LDR (Low Dynamic Range) images with different exposure times; these can then be synthesized into an HDR (High Dynamic Range) image. If the user sets the display mode of camera A to translucent and leaves the display mode of camera B unset, the image captured by camera A is overlaid onto the image captured by camera B to obtain a composite image with a special effect, producing a somewhat surrealistic image.
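For illustration only, the following sketch approximates these two effects with OpenCV, assuming two already-captured frames of the same resolution stored under hypothetical file names; Mertens exposure fusion stands in for the HDR synthesis, and a 50% weighted blend stands in for the translucent overlay.

    import cv2

    # Two LDR frames with different exposure times (hypothetical file names).
    short_exp = cv2.imread("camera_a_short_exposure.jpg")
    long_exp = cv2.imread("camera_b_long_exposure.jpg")

    # HDR-like result via Mertens exposure fusion of the two LDR frames.
    fusion = cv2.createMergeMertens().process([short_exp, long_exp])
    cv2.imwrite("hdr_like.jpg", (fusion * 255).clip(0, 255).astype("uint8"))

    # "Translucent" overlay: camera A blended at 50% opacity over camera B.
    overlay = cv2.addWeighted(short_exp, 0.5, long_exp, 0.5, 0)
    cv2.imwrite("overlay.jpg", overlay)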
Further, based on the above embodiment, in the present embodiment the terminal photographing method further comprises:
When the terminal enables the single-camera shooting mode, acquiring image information of the target object according to the image shooting parameters corresponding to the single camera.
In the present embodiment, when the user does not want to shoot with both cameras at the same time, the user can choose, in a designated settings interface, to enable only one of the cameras and close the other. That is, when the user controls the phone to enter shooting mode, enables the single-camera shooting mode, and selects one camera, the phone performs the shooting function according to the image shooting parameters corresponding to the selected camera. The user can also modify the image shooting parameters of that camera as the situation requires, in order to obtain the intended image. By allowing free choice of camera, the present embodiment gives the user more freedom when shooting with the phone camera; it is not only convenient to use but also saves power.
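As a hedged sketch of this mode selection (the CameraController class and capture_with_own_params() are hypothetical stand-ins for the phone's camera application, not anything defined by this patent):

    class CameraController:
        def __init__(self, cameras):
            self.cameras = cameras              # e.g. {"A": cam_a, "B": cam_b}

        def shoot(self, mode, selected=None):
            if mode == "single":
                cam = self.cameras[selected]    # only the chosen camera is opened
                return {selected: cam.capture_with_own_params()}
            # Multi-camera mode: every camera shoots with its own parameters.
            return {name: cam.capture_with_own_params()
                    for name, cam in self.cameras.items()}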
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. This computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not limit the scope of its claims. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. A terminal photographing device, characterized in that the terminal photographing device comprises:
a parameter acquisition module, configured to, when the terminal enables a multi-camera shooting mode, acquire image shooting parameters of each camera respectively;
a control module, configured to, according to the image shooting parameters of each camera, control each camera to acquire image information of the same target object according to its corresponding image shooting parameters.
2. The terminal photographing device according to claim 1, characterized in that the terminal photographing device further comprises:
a judging module, configured to judge whether the image information captured by each camera needs to be synthesized;
a processing module, configured to, if the image information captured by each camera needs to be synthesized, synthesize the pieces of image information and output a composite image; otherwise, output the images generated from each piece of image information separately.
3. The terminal photographing device according to claim 2, characterized in that the control module is further configured to control the cameras to acquire image information of the same target object in turn at a preset time interval.
4. The terminal photographing device according to any one of claims 1-3, characterized in that the control module is further configured to control each camera to complete a different number of shots within the same preset time.
5. The terminal photographing device according to claim 1, characterized in that the terminal photographing device further comprises:
an execution module, configured to, when the terminal enables a single-camera shooting mode, acquire image information of the target object according to the image shooting parameters corresponding to the single camera.
6. A terminal photographing method, characterized in that the terminal photographing method comprises the following steps:
when the terminal enables a multi-camera shooting mode, acquiring image shooting parameters of each camera respectively;
according to the image shooting parameters of each camera, controlling each camera to acquire image information of the same target object according to its corresponding image shooting parameters.
7. The terminal photographing method according to claim 6, characterized in that, after controlling each camera to acquire image information of the same target object according to its corresponding image shooting parameters, the method further comprises:
judging whether the image information captured by each camera needs to be synthesized;
if so, synthesizing the pieces of image information and outputting a composite image;
if not, outputting the images generated from each piece of image information separately.
8. The terminal photographing method according to claim 7, characterized in that controlling each camera to acquire image information of the same target object according to its corresponding image shooting parameters comprises:
controlling the cameras to acquire image information of the same target object in turn at a preset time interval.
9. The terminal photographing method according to any one of claims 6-8, characterized in that controlling each camera to acquire image information of the same target object according to its corresponding image shooting parameters comprises:
controlling each camera to complete a different number of shots within the same preset time.
10. The terminal photographing method according to claim 6, characterized in that the terminal photographing method further comprises:
when the terminal enables a single-camera shooting mode, acquiring image information of the target object according to the image shooting parameters corresponding to the single camera.
CN201510429077.5A 2015-07-20 2015-07-20 Terminal image pickup method and device Active CN105141833B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510429077.5A CN105141833B (en) 2015-07-20 2015-07-20 Terminal image pickup method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510429077.5A CN105141833B (en) 2015-07-20 2015-07-20 Terminal image pickup method and device

Publications (2)

Publication Number Publication Date
CN105141833A true CN105141833A (en) 2015-12-09
CN105141833B CN105141833B (en) 2018-12-07

Family

ID=54727031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510429077.5A Active CN105141833B (en) 2015-07-20 2015-07-20 Terminal image pickup method and device

Country Status (1)

Country Link
CN (1) CN105141833B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100255875A1 (en) * 2007-11-22 2010-10-07 Keisuke Oozeki Imaging device, information processing terminal, mobile telephone, program, and light emission control method
CN103024272A (en) * 2012-12-14 2013-04-03 广东欧珀移动通信有限公司 Double camera control device, method and system of mobile terminal and mobile terminal
CN103780840A (en) * 2014-01-21 2014-05-07 上海果壳电子有限公司 High-quality imaging double camera shooting and imaging device and method thereof
CN103763477A (en) * 2014-02-21 2014-04-30 上海果壳电子有限公司 Double-camera after-shooting focusing imaging device and method
CN103905731A (en) * 2014-03-26 2014-07-02 武汉烽火众智数字技术有限责任公司 Broadband dynamic image collection method and system
CN103986875A (en) * 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 Image acquiring device, method and terminal and video acquiring method
CN104780324A (en) * 2015-04-22 2015-07-15 努比亚技术有限公司 Shooting method and device

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107205109A (en) * 2016-03-18 2017-09-26 聚晶半导体股份有限公司 The method of electronic installation and its control with many photographing modules
US10798288B2 (en) 2016-03-18 2020-10-06 Altek Semiconductor Corp. Multi-camera electronic device and control method thereof
CN105812677B (en) * 2016-04-29 2019-07-30 宇龙计算机通信科技(深圳)有限公司 A kind of image generating method and system
CN105812677A (en) * 2016-04-29 2016-07-27 宇龙计算机通信科技(深圳)有限公司 Image generating method and system
CN105872148A (en) * 2016-06-21 2016-08-17 维沃移动通信有限公司 Method and mobile terminal for generating high dynamic range images
CN105872148B (en) * 2016-06-21 2019-05-17 维沃移动通信有限公司 A kind of generation method and mobile terminal of high dynamic range images
CN106060526A (en) * 2016-07-04 2016-10-26 天脉聚源(北京)传媒科技有限公司 Live broadcast method and device based on two cameras
CN106161943A (en) * 2016-07-29 2016-11-23 维沃移动通信有限公司 A kind of kinescope method and mobile terminal
CN106791732A (en) * 2016-11-30 2017-05-31 努比亚技术有限公司 A kind of image processing method and device
CN106850964A (en) * 2016-12-27 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of multi-cam filming apparatus and its method
CN107071274A (en) * 2017-03-13 2017-08-18 努比亚技术有限公司 A kind of distortion processing method and terminal
CN107071274B (en) * 2017-03-13 2020-08-21 麒和科技(南京)有限公司 Distortion processing method and terminal
CN108322670B (en) * 2018-04-27 2019-05-28 Oppo广东移动通信有限公司 A kind of control method of multi-camera system, mobile terminal and storage medium
CN108322670A (en) * 2018-04-27 2018-07-24 Oppo广东移动通信有限公司 A kind of control method of multi-camera system, mobile terminal and storage medium
CN111182093A (en) * 2018-11-12 2020-05-19 奇酷互联网络科技(深圳)有限公司 HDR photographing method based on three cameras, mobile terminal and storage medium
WO2020103786A1 (en) * 2018-11-23 2020-05-28 华为技术有限公司 Method for generating multiple video streams and device
CN110072056A (en) * 2019-05-10 2019-07-30 北京迈格威科技有限公司 Data processing method and device based on multiple camera modules
CN110072056B (en) * 2019-05-10 2022-02-01 北京迈格威科技有限公司 Data processing method and device based on multiple camera modules
CN110049257A (en) * 2019-05-31 2019-07-23 深圳岚锋创视网络科技有限公司 The method and electronic device of a kind of synchronous exposure parameter of determination half
CN110049257B (en) * 2019-05-31 2021-09-21 影石创新科技股份有限公司 Method for determining semi-synchronous exposure parameters and electronic device
US11758277B2 (en) 2019-05-31 2023-09-12 Arashi Vision Inc. Method for determining semi-synchronous exposure parameters and electronic device
CN114745508A (en) * 2022-06-13 2022-07-12 荣耀终端有限公司 Shooting method, terminal device and storage medium
CN114745508B (en) * 2022-06-13 2023-10-31 荣耀终端有限公司 Shooting method, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN105141833B (en) 2018-12-07

Similar Documents

Publication Publication Date Title
CN105141833A (en) Terminal photographing method and device
CN104954689A (en) Method and shooting device for acquiring photo through double cameras
CN105227837A (en) A kind of image combining method and device
CN105404484A (en) Terminal screen splitting device and method
CN105554383A (en) Mobile terminal and method for controlling camera shooting by utilizing mobile terminal
CN105120135A (en) Binocular camera
CN104660912A (en) Photographing method and photographing device
CN105159594A (en) Touch photographing device and method based on pressure sensor, and mobile terminal
CN105100491A (en) Device and method for processing photo
CN104735255A (en) Split screen display method and system
CN104731472A (en) Rapid icon clearing-up method and device
CN104811532A (en) Terminal screen display parameter adjustment method and device
CN105138261A (en) Shooting parameter adjustment apparatus and method
CN105243126A (en) Cross-screen screen capture method and apparatus
CN105338242A (en) Image synthesis method and device
CN105072351A (en) Photographing device and method based on front-facing camera
CN104917965A (en) Shooting method and device
CN105227865A (en) A kind of image processing method and terminal
CN104968033A (en) Terminal network processing method and apparatus
CN106851113A (en) A kind of photographic method and mobile terminal based on dual camera
CN105100642A (en) Image processing method and apparatus
CN104951236A (en) Wallpaper configuration method for terminal device, and terminal device
CN105100673A (en) Voice over long term evolution (VoLTE) based desktop sharing method and device
CN105338245A (en) Photo-taking sharing method, photo-taking sharing terminal and photo-taking sharing system
CN105100619A (en) Apparatus and method for adjusting shooting parameters

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant