CN106713716A - Dual-camera shooting control method and device - Google Patents
- Publication number
- CN106713716A CN106713716A CN201710058274.XA CN201710058274A CN106713716A CN 106713716 A CN106713716 A CN 106713716A CN 201710058274 A CN201710058274 A CN 201710058274A CN 106713716 A CN106713716 A CN 106713716A
- Authority
- CN
- China
- Prior art keywords
- image
- camera
- touch
- touch area
- control operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
The present invention discloses a dual-camera shooting control method and device. The method includes: invoking two cameras with partially overlapping fields of view to capture a first image and a second image; synthesizing a wide-angle image according to the first image and the second image; while the preview interface displays the wide-angle image, detecting the touch area of a touch control operation; and identifying the camera corresponding to the image in the touch area and controlling that camera to execute the touch control operation. By invoking two cameras to capture images separately and synthesizing the two images, the invention can display and shoot a wide-angle image with a field of view wider than that of either original image; while the wide-angle image is displayed, the shooting parameters of both cameras can be adjusted simultaneously according to the touch control operation. Because the touch control of the cameras is performed on the basis of the wide-angle image, the shooting parameters of the two cameras can be set with reference to each other, improving shooting quality.
Description
Technical field
The present invention relates to the technical field of mobile terminals, and more particularly to a dual-camera shooting control method and device.
Background technology
With the rise of dual-camera mobile terminals, users can control the two cameras separately when taking pictures, so that the mobile terminal shoots the photographic subject with each camera according to the user's settings. At present, dual-camera shooting is controlled as follows: the two cameras are invoked to capture pictures separately, two viewfinders are shown in the display interface to display the pictures captured by the two cameras respectively, and the user operates on the two viewfinder regions to set the shooting parameters of the two cameras, which then take pictures according to those parameters.
However, the current approach to dual-camera shooting control has the following problems. During installation, a certain distance exists between the two cameras, so their fields of view differ; but because the pictures captured by the two cameras are displayed independently and shot independently, and the shooting parameters of the two cameras are controlled independently, the dual cameras cannot be used to enlarge the field of view of a single image, nor can the information captured by one camera be used to control the shooting parameters of the other, and thus image shooting quality cannot be improved.
Summary of the invention
The primary objective of the present invention is to propose a dual-camera shooting control method and device, intended to solve the problem in the prior art that the two cameras display their captured pictures independently, shoot independently, and have their shooting parameters controlled independently, so that image shooting quality cannot be improved.
The present invention solves the above technical problem through the following technical solutions:
The present invention provides a dual-camera shooting control method, the method including: invoking two cameras with partially overlapping fields of view to capture a first image and a second image respectively; synthesizing a wide-angle image according to the first image and the second image; while the preview interface displays the wide-angle image, detecting the touch area of a touch control operation; and identifying the camera corresponding to the image in the touch area and controlling that camera to execute the touch control operation.
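To make the claimed flow concrete: the patent describes the method abstractly, so the sketch below is an illustration only — the `Camera` class, its methods, and the region-based representation of the wide-angle image are all assumptions, not part of the patent.

```python
# Hypothetical sketch of the claimed flow: capture from two cameras,
# synthesize a wide-angle image whose regions remember their source
# camera(s), then route a touch operation to the right camera.
class Camera:
    def __init__(self, name):
        self.name = name

    def capture(self):
        # Stand-in for a real capture call; returns pixels plus camera info.
        return {"pixels": f"{self.name}-frame", "camera": self.name}

    def apply(self, operation):
        # Stand-in for applying a touch-driven setting (focus, metering, ...).
        return f"{self.name} executed {operation}"


def synthesize_wide_angle(first, second):
    # Placeholder for real stitching: each region of the wide-angle
    # image carries the info of the camera(s) it came from.
    return [
        {"region": "left", "cameras": [first["camera"]]},
        {"region": "overlap", "cameras": [first["camera"], second["camera"]]},
        {"region": "right", "cameras": [second["camera"]]},
    ]


def handle_touch(wide_image, touched_region, operation):
    # Identify the camera(s) behind the touched region and forward
    # the touch control operation to them.
    region = next(r for r in wide_image if r["region"] == touched_region)
    return [Camera(name).apply(operation) for name in region["cameras"]]


cam1, cam2 = Camera("first"), Camera("second")
wide = synthesize_wide_angle(cam1.capture(), cam2.capture())
print(handle_touch(wide, "left", "focus"))  # only the first camera reacts
```

A touch in the overlap region would be forwarded to both cameras, which is what lets the two cameras' settings reference each other.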
Optionally, invoking the two cameras with partially overlapping fields of view to capture the first image and the second image respectively includes: the two cameras with partially overlapping fields of view comprise a first camera and a second camera; invoking the first camera to capture a first image carrying information of the first camera; and invoking the second camera to capture a second image carrying information of the second camera.
Optionally, synthesizing the wide-angle image according to the first image and the second image includes: determining the intersection image and the difference images of the first image and the second image; and synthesizing the wide-angle image according to the intersection image and the difference images of the first image and the second image, wherein, when the synthesis operation is performed, the intersection image is made to carry both the information of the first camera and the information of the second camera.
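A minimal sketch of this tagging idea, under the assumption that the wide-angle image can be represented as a list of per-column camera-info tags; the widths and the dict-based representation are illustrative, not from the patent.

```python
def synthesize(first_w, second_w, overlap_w):
    """Build a wide-angle image as a list of per-column camera tags.

    Columns from the difference portions carry one camera's info; the
    intersection columns carry both, as the claim requires.
    """
    wide = []
    # Difference portion contributed only by the first image.
    wide += [{"cameras": {"first"}} for _ in range(first_w - overlap_w)]
    # Intersection: carries first AND second camera information.
    wide += [{"cameras": {"first", "second"}} for _ in range(overlap_w)]
    # Difference portion contributed only by the second image.
    wide += [{"cameras": {"second"}} for _ in range(second_w - overlap_w)]
    return wide


wide = synthesize(first_w=8, second_w=8, overlap_w=3)
assert len(wide) == 13                             # 8 + 8 - 3 columns
assert wide[6]["cameras"] == {"first", "second"}   # inside the overlap
```

With the tags in place, later touch handling only has to read the camera info carried by the touched columns.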
Optionally, identifying the camera corresponding to the image in the touch area includes: identifying the camera corresponding to the image in the touch area according to the camera information carried by the image in the touch area.
Optionally, identifying the camera corresponding to the image in the touch area includes: if the touch control operation is a sliding touch control operation, and the touch area of the touch control operation crosses the intersection image and a difference image, identifying the camera corresponding to the image in the initial touch area or the final touch area of the touch path.
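This fallback rule could be sketched as follows; the column-based region representation and the `prefer` parameter are assumptions introduced for illustration, not details from the patent.

```python
def camera_for_touch(wide, path, prefer="start"):
    """Resolve which camera a sliding touch should control.

    `wide` is a list of per-column camera-info sets; `path` is the
    sequence of touched column indices.  If the slide stays within one
    kind of region, that region's camera info answers directly; if it
    crosses the intersection and a difference portion, fall back to the
    camera of the initial (or final) touch area, as the claim states.
    """
    touched = {frozenset(wide[i]["cameras"]) for i in path}
    if len(touched) == 1:
        (cams,) = touched
        return set(cams)
    anchor = path[0] if prefer == "start" else path[-1]
    return wide[anchor]["cameras"]


# Columns 0-4 belong to the first camera only, 5-7 to both (overlap),
# 8-12 to the second camera only.
wide = ([{"cameras": {"first"}}] * 5
        + [{"cameras": {"first", "second"}}] * 3
        + [{"cameras": {"second"}}] * 5)
assert camera_for_touch(wide, [2, 3, 4]) == {"first"}       # no crossing
assert camera_for_touch(wide, [3, 5, 9]) == {"first"}       # start wins
```

Whether the initial or the final touch area governs is a design choice the patent leaves open, hence the `prefer` switch.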
The present invention further provides a dual-camera shooting control device, the device including: an invoking module for invoking two cameras with partially overlapping fields of view to capture a first image and a second image respectively; a synthesis module for synthesizing a wide-angle image according to the first image and the second image; a detection module for detecting the touch area of a touch control operation while the preview interface displays the wide-angle image; and a control module for identifying the camera corresponding to the image in the touch area and controlling that camera to execute the touch control operation.
Optionally, the invoking module is used for: the two cameras with partially overlapping fields of view comprise a first camera and a second camera; invoking the first camera to capture a first image carrying information of the first camera; and invoking the second camera to capture a second image carrying information of the second camera.
Optionally, the synthesis module is used for: determining the intersection image and the difference images of the first image and the second image; and synthesizing the wide-angle image according to the intersection image and the difference images of the first image and the second image, wherein, when the synthesis operation is performed, the intersection image is made to carry both the information of the first camera and the information of the second camera.
Optionally, the control module is used for: identifying the camera corresponding to the image in the touch area according to the camera information carried by the image in the touch area.
Optionally, the control module is used for: if the touch control operation is a sliding touch control operation, and the touch area of the touch control operation crosses the intersection image and a difference image, identifying the camera corresponding to the image in the initial touch area or the final touch area of the touch path.
The beneficial effects of the present invention are as follows: the present invention invokes two cameras to capture images separately and, by synthesizing the two images, can display and shoot a wide-angle image with a field of view wider than that of either original image; while the wide-angle image is displayed, the shooting parameters of both cameras can be adjusted simultaneously according to a touch control operation; and because the touch control of the cameras is performed on the basis of the wide-angle image, the shooting parameters of the two cameras can be set with reference to each other, thereby improving shooting quality.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal for realizing each optional embodiment of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a flowchart of the dual-camera shooting control method according to the first embodiment of the present invention;
Fig. 4 is a detailed flowchart of the dual-camera shooting control method according to the second embodiment of the present invention;
Fig. 5 is a schematic diagram of the first image captured by the first camera according to the second embodiment of the present invention;
Fig. 6 is a schematic diagram of the second image captured by the second camera according to the second embodiment of the present invention;
Fig. 7 is a schematic diagram of the display of the wide-angle image according to the second embodiment of the present invention;
Fig. 8 is a schematic diagram of the display of the metering frame according to the second embodiment of the present invention;
Fig. 9 is a schematic diagram of the realization of the metering function according to the second embodiment of the present invention;
Fig. 10 shows the dual-camera shooting control device according to the third embodiment of the present invention.
The realization of the objects, functional characteristics and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
The mobile terminal of each embodiment of the present invention is now described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are employed only to facilitate the description of the invention and have no specific meaning in themselves. Therefore, "module" and "part" can be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. Hereinafter it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, except for elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal for realizing each optional embodiment of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that implementing all of the illustrated components is not required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components allowing radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like; moreover, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it can be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and the like. The broadcast receiving module 111 can receive signal broadcasts by using various types of broadcast systems. In particular, the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the MediaFLO™ forward link media data broadcast system, integrated services digital broadcasting-terrestrial (ISDB-T), and so on. The broadcast receiving module 111 may be constructed to be suitable for various broadcast systems providing broadcast signals, as well as the above-mentioned digital broadcast systems. The broadcast signal and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless Internet access technologies to which the module relates may include WLAN (wireless local area network, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the position information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system) module. According to current technology, the GPS module 115 computes distance information from three or more satellites together with accurate time information, and applies triangulation to the computed information so as to accurately calculate three-dimensional current position information in terms of longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using a further satellite. In addition, the GPS module 115 can calculate speed information by continuously computing the current position in real time.
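As a generic illustration of that last point (not part of the patent), speed can be estimated from two consecutive position fixes using the haversine great-circle distance; the fix format below is an assumption.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def speed_mps(fix_a, fix_b):
    """Speed in m/s from two (lat, lon, timestamp-in-seconds) fixes."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)


# Two fixes one second apart, 0.001 deg of latitude (~111 m) apart.
v = speed_mps((0.0, 0.0, 0.0), (0.001, 0.0, 1.0))
assert 110 < v < 112
```

Real GPS modules typically smooth such estimates over many fixes, since individual fixes are noisy.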
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in operational modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data. In the case of the phone call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by being touched), a jog wheel, a jog switch, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can be connected with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user's use of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card; therefore, the identification device can be connected with the mobile terminal 100 via a port or other connecting means. The interface unit 170 can be used to receive input (for example, data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or the power input from the cradle can serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display a captured image and/or a received image, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of a layer to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be constructed to be transparent to allow viewing from the outside; these may be termed transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. According to a particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data on the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 can cooperate, via a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 can also perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required for operating each element and component.
The various implementations described herein may be implemented with a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the implementations described herein may be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein; in some cases, such implementations may be implemented in the controller 180. For a software implementation, implementations such as procedures or functions may be implemented with a separate software module that allows at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any suitable programming language; the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Hereinafter, for the sake of brevity, a slide-type mobile terminal among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals, will be described as an example. Accordingly, the present invention can be applied to any type of mobile terminal, and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in Fig. 1 may be configured to operate with communication systems, such as wired and wireless communication systems and satellite-based communication systems, that transmit data via frames or packets.
A communication system in which the mobile terminal according to the present invention can operate is now described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), with each sector covered by an omnidirectional antenna or an antenna pointed in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a specific spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or another equivalent term. In such a case, the term "base station" may be used to broadly refer to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided in the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300, which facilitate locating at least one of the plurality of mobile terminals 100.
Although several satellites 300 are depicted in Fig. 2, it will be understood that useful position information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired position information. In place of or in addition to GPS tracking techniques, other technologies capable of tracking the position of the mobile terminal may be used. Additionally, at least one GPS satellite 300 may alternatively or additionally handle satellite DMB transmissions.
In a typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse-link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoffs between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
Based on the mobile terminal hardware structure and communication system described above, embodiments of the method of the present invention are proposed.
Embodiment one
An embodiment of the present invention provides a shooting control method for dual cameras. The executing entity of this embodiment is a mobile terminal provided with dual cameras.
Fig. 3 is a flowchart of the shooting control method for dual cameras according to the first embodiment of the present invention.
Step S310: calling dual cameras whose viewing angles partially overlap to capture a first image and a second image, respectively.
In this embodiment, the dual cameras are located on the same side of the mobile terminal; for example, both are located on the front of the mobile terminal, or both on its back. The viewing angles of the dual cameras partially overlap, meaning that when framing a scene, the two cameras capture identical image content in the region where their viewing angles coincide.
One camera of the dual cameras is called to capture the first image, and the other camera is called to capture the second image.
Step S320: synthesizing a wide-angle image from the first image and the second image.
In this embodiment, because the viewing angles of the dual cameras partially overlap, the first image and the second image captured by the dual cameras are partly identical and partly different. The identical portion shared between the first image and the second image may be referred to as the overlapping-part image, and the differing portions may be referred to as the difference-part images. In other words, a difference-part image is the remainder of an image after the overlapping-part image is excluded.
When synthesizing the wide-angle image, the overlapping-part images in the first image and the second image can be matched by an image-matching technique; based on the overlapping-part image, the difference-part image in the first image and the difference-part image in the second image are determined, respectively. A single image is then synthesized from the difference-part image of the first image, the overlapping-part image shared by the two images, and the difference-part image of the second image. Because this image has a wider viewing angle than either the first image or the second image, it is referred to as the wide-angle image.
When synthesizing the wide-angle image, if the shooting parameters of the overlapping-part image in the first image and the overlapping-part image in the second image are identical, the difference-part image of the first image, the overlapping-part image shared by the first and second images, and the difference-part image of the second image can be directly spliced into the wide-angle image according to their positional relationship.
When synthesizing the wide-angle image, if the shooting parameters of the overlapping-part images in the first and second images differ — for example, their brightness or chroma differ — then the overlapping-part image in the first image and the overlapping-part image in the second image are first synthesized into a shared overlapping-part image, and the difference-part image of the first image, the shared overlapping-part image, and the difference-part image of the second image are then spliced into the wide-angle image according to their positional relationship. Further, when synthesizing the pixels of the overlapping-part image of the first image with the pixels at the corresponding positions of the overlapping-part image of the second image, the average of each pair of corresponding pixels is taken for each parameter, e.g., the average brightness and the average chroma.
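The per-pixel averaging of the two overlap regions can be sketched as below — a minimal illustration in which grayscale values stand in for the brightness/chroma parameters being averaged:

```python
import numpy as np

def blend_overlap(ov1, ov2):
    """Per-pixel mean of the two cameras' overlapping-part images,
    as the embodiment describes for mismatched shooting parameters."""
    mean = (ov1.astype(np.float32) + ov2.astype(np.float32)) / 2
    return mean.astype(np.uint8)

ov1 = np.full((2, 3), 100, dtype=np.uint8)  # overlap as seen by camera 1
ov2 = np.full((2, 3), 140, dtype=np.uint8)  # same scene, brighter exposure
shared = blend_overlap(ov1, ov2)
print(shared[0, 0])  # 120
```

Casting to float before averaging avoids uint8 overflow; the shared overlap then splices between the two difference parts.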
Step S330: when the preview interface displays the wide-angle image, detecting the touch area of a touch control operation.
A touch control operation is a control operation, such as a tap or a sliding touch, performed by the user on the preview interface.
A touch control operation is used to perform its corresponding function. For example, a touch control operation may turn the camera's built-in flash on or off. As another example, a touch control operation may adjust shooting parameters such as focus, brightness, and exposure.
After the wide-angle image is synthesized, the wide-angle image synthesized from the images captured by the two cameras can be previewed on the preview interface. While viewing the wide-angle image, the user can adjust the cameras' shooting parameters — for example, the image brightness — to compensate for shortcomings of the wide-angle image.
In this embodiment, because the wide-angle image is synthesized from the first image and the second image, the user can touch different regions of the image to adjust the shooting parameters of different cameras. This embodiment therefore detects the touch area of the user's touch control operation after receiving that operation.
Step S340: identifying the camera corresponding to the image in the touch area, and controlling that camera to execute the touch control operation.
In this embodiment, when the dual cameras capture the first image and the second image, each pixel can be tagged with camera information identifying its source camera. When the wide-angle image is synthesized, each pixel of a difference-part image carries the information of the camera it came from, while the overlapping-part image carries the information of both cameras. When a touch control operation is received, the coordinate information of the touch area is determined; from that coordinate information, the corresponding pixels in the synthesized wide-angle image can be determined, and the camera information carried by those pixels then identifies the camera corresponding to the image in the touch area.
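This per-pixel camera-information scheme can be sketched as a tag map parallel to the wide-angle image; the bitmask encoding, region widths, and camera names below are illustrative assumptions, not part of the patent:

```python
import numpy as np

CAM1, CAM2 = 1, 2  # bit flags for the two source cameras

def build_tag_map(w_diff1, w_overlap, w_diff2, height):
    """Parallel map where every wide-image pixel records its source camera(s)."""
    tags = np.empty((height, w_diff1 + w_overlap + w_diff2), dtype=np.uint8)
    tags[:, :w_diff1] = CAM1                             # left difference part
    tags[:, w_diff1:w_diff1 + w_overlap] = CAM1 | CAM2   # shared overlap
    tags[:, w_diff1 + w_overlap:] = CAM2                 # right difference part
    return tags

def cameras_at(tags, x, y):
    """Resolve a touch coordinate to the camera(s) behind that pixel."""
    t = tags[y, x]
    return [name for name, bit in (("camera1", CAM1), ("camera2", CAM2)) if t & bit]

tags = build_tag_map(w_diff1=4, w_overlap=2, w_diff2=4, height=3)
print(cameras_at(tags, 1, 0))  # ['camera1']
print(cameras_at(tags, 5, 0))  # ['camera1', 'camera2']
print(cameras_at(tags, 8, 0))  # ['camera2']
```

A touch in the overlap resolves to both cameras, matching the step-S340 behavior of controlling both when both are identified.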
Alternatively, in this embodiment, when the wide-angle image is synthesized, the region ranges of the difference-part images on the two sides of the wide-angle image and the camera from which each difference-part image originates can be recorded, along with the region range of the overlapping-part image in the central position of the wide-angle image; by default, the overlapping-part image corresponds to both cameras.
If the touch control operation is a sliding touch control operation and its touch area spans the overlapping-part image and a difference-part image, the camera corresponding to the image at the initial touch region or the final touch region of the touch area is identified.
If the touch control operation is a tap touch control operation and its touch area spans the overlapping-part image and a difference-part image, the camera corresponding to the difference-part image and/or the cameras corresponding to the overlapping-part image within the touch area are identified.
If two cameras are identified, both cameras are controlled to execute the touch control operation.
After the identified camera executes the touch control operation, an image capture operation is performed.
In this embodiment, two cameras are called to capture images separately, and the two images are synthesized, so that a wide-angle image with a viewing angle broader than either original image can be displayed and photographed. When the wide-angle image is displayed, the shooting parameters of both cameras can be adjusted simultaneously according to the touch control operation; and because this embodiment controls the cameras by touch based on the wide-angle image, the shooting parameters of the two cameras can be cross-referenced, which in turn improves shooting quality.
Embodiment two
To make the technical solution of the present invention clearer, the method of identifying the camera is further described below. In this embodiment, the dual cameras whose viewing angles partially overlap comprise a first camera and a second camera.
Fig. 4 is a detailed flowchart of the shooting control method for dual cameras according to the second embodiment of the present invention.
Step S410: calling the first camera to capture a first image carrying the first camera's information.
Step S420: calling the second camera to capture a second image carrying the second camera's information.
Step S430: determining the overlapping-part image and the difference-part images of the first image and the second image.
Fig. 5 is a schematic diagram of the first image captured by the first camera, and Fig. 6 is a schematic diagram of the second image captured by the second camera. Comparing Fig. 5 and Fig. 6 shows that the image to the right of the dashed line in Fig. 5 is identical to the image to the left of the dashed line in Fig. 6: these are captured by the coinciding portion of the dual cameras' viewing angles, and together they constitute the overlapping-part image. The image to the left of the dashed line in Fig. 5 and the image to the right of the dashed line in Fig. 6 differ: these are captured by the non-coinciding portions of the viewing angles, and they are the difference-part images.
Step S440: synthesizing the wide-angle image from the overlapping-part image and the difference-part images of the first image and the second image.
In this embodiment, the difference-part image of the first image, the overlapping-part image of the first and second images, and the difference-part image of the second image are spliced into the wide-angle image according to the positional relationship between the images.
When performing the synthesis, the difference-part image from the first image can be made to carry the first camera's information, the difference-part image from the second image can be made to carry the second camera's information, and the overlapping-part image can be made to carry both the first camera's information and the second camera's information. The first and second camera information may be the cameras' unique codes. Here, making an image carry camera information means making the pixels in the image carry the camera information.
To enable the user to identify the source of each part of the wide-angle image, the region range of the overlapping-part image and the two region ranges of the difference-part images can be marked in the wide-angle image when it is displayed. Fig. 7 is a schematic display of the wide-angle image: reference numeral 1 denotes the mobile terminal, marking frame Z1 denotes the overlapping-part image shared by the first and second images, marking frame Z2 denotes the difference-part image of the first image, and marking frame Z3 denotes the difference-part image of the second image.
Step S450: when the preview interface displays the wide-angle image, detecting the touch area of a touch control operation.
The user inputs operation information to the mobile terminal via touch control operations, such as taps and slides, on the preview interface to adjust the cameras' shooting parameters.
In this embodiment, a plurality of operation boxes can be displayed on the preview interface. For example, a light-metering frame can be displayed in each of the three regions Z1, Z2, and Z3.
Touch control operations can be performed separately in the three regions (Z1, Z2, and Z3). For example, as shown in Fig. 8, a light-metering frame is displayed in each of the regions Z1, Z2, and Z3, and the user can operate each light-metering frame through a touch control operation to control the three regions separately.
Step S460: identifying the camera corresponding to the image in the touch area according to the camera information carried by that image.
For example, the pixel corresponding to the coordinate information of the touch area is determined from that coordinate information, and the camera corresponding to the image in the touch area is identified from the camera information carried by the pixel.
As another example, the region ranges of marking frames Z1, Z2, and Z3 in Fig. 7 are recorded: marking frame Z1 corresponds to the first camera and the second camera, marking frame Z2 corresponds to the first camera, and marking frame Z3 corresponds to the second camera. Because each marking frame has a correspondence with a camera, the camera corresponding to the image in the touch area is determined from the marking frame in which the touch area lies. For example, if the touch area lies in marking frame Z1, the image in the touch area corresponds to both cameras.
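The marking-frame lookup in the second example can be sketched as a small region table; the pixel ranges and camera names below are illustrative assumptions:

```python
# Hypothetical region table for step S460: each marking frame maps an
# x-range of the wide-angle preview to the camera(s) whose image fills it.
REGIONS = {
    "Z2": (0, 400, {"camera1"}),               # difference part from camera 1
    "Z1": (400, 600, {"camera1", "camera2"}),  # shared overlapping part
    "Z3": (600, 1000, {"camera2"}),            # difference part from camera 2
}

def cameras_for_touch(x):
    """Return the marking frame containing x and its associated cameras."""
    for name, (lo, hi, cams) in REGIONS.items():
        if lo <= x < hi:
            return name, cams
    return None, set()

name, cams = cameras_for_touch(450)
print(name, sorted(cams))  # Z1 ['camera1', 'camera2']
```

Storing ranges once per frame, rather than tagging every pixel, is the lighter-weight of the two identification approaches the embodiment describes.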
Step S470: after the identified camera executes the touch control operation, performing an image capture operation.
If the touch control operation is found to lie within region Z1, the image in the touch area is identified as corresponding to the first camera and the second camera; the first camera and the second camera are then controlled simultaneously according to the touch control operation to adjust their shooting parameters synchronously and to shoot synchronously.
If the touch control operation is found to lie within region Z2 or Z3, the image in the touch area is identified as corresponding to the first camera or the second camera, respectively; the first camera or the second camera is then controlled according to the touch control operation to adjust its shooting parameters, after which the first and second cameras shoot synchronously.
If the touch control operation is a sliding touch control operation and its touch area crosses the overlapping-part image and a difference-part image, the camera corresponding to the image at the initial touch region or the final touch region of the touch area is identified.
Taking the light-metering function shown in Fig. 8 as an example, suppose the user slides the light-metering frame in region Z3 to the position of the circle in region Z2. The slide crosses regions Z3, Z1, and Z2; that is, the touch control operation is a sliding touch control operation whose touch area crosses the overlapping-part image and the difference-part images. The start coordinate and the end coordinate of the light-metering frame are determined, from which the cameras corresponding to the start coordinate and the end coordinate can each be identified. If the identified cameras differ, then when the touch control operation is executed, the current photometric parameter of the camera corresponding to the end coordinate is obtained, and the photometric parameter of the camera corresponding to the start region is adjusted according to it.
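The end-to-start transfer of the photometric parameter can be sketched as follows; the `Camera` class and its `metering` attribute are hypothetical stand-ins for the terminal's camera control interface:

```python
class Camera:
    """Hypothetical handle to one of the terminal's two cameras."""
    def __init__(self, name, metering):
        self.name = name
        self.metering = metering  # current photometric parameter

def apply_slide_metering(start_cam, end_cam):
    """A slide that starts over one camera's region and ends over another's
    copies the end camera's current metering value onto the start camera."""
    if start_cam is not end_cam:
        start_cam.metering = end_cam.metering

cam1 = Camera("camera1", metering=0.8)  # e.g. sees better light
cam2 = Camera("camera2", metering=0.3)
apply_slide_metering(start_cam=cam2, end_cam=cam1)
print(cam2.metering)  # 0.8
```

This mirrors the Fig. 8 example: camera 2 is made to shoot with light information obtained by camera 1, without moving the terminal.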
This embodiment broadens the range of adjustment available when the mobile terminal shoots, providing shooting parameters that cannot be obtained when a single camera shoots. Because there is a certain distance between the two cameras, one camera can obtain data that the other cannot; for example, the first camera may obtain better light data that the second camera cannot. With this embodiment, the second camera can then be controlled to shoot using the light information obtained by the first camera, without moving or rotating the mobile terminal.
Embodiment three
An embodiment of the present invention provides a shooting control apparatus for dual cameras. The shooting control apparatus of this embodiment can be arranged in a mobile terminal provided with dual cameras whose viewing angles partially overlap.
Fig. 10 shows the shooting control apparatus for dual cameras according to the third embodiment of the present invention.
A calling module 1010, configured to call dual cameras whose viewing angles partially overlap to capture a first image and a second image, respectively;
a synthesis module 1020, configured to synthesize a wide-angle image from the first image and the second image;
a detection module 1030, configured to detect the touch area of a touch control operation when the preview interface displays the wide-angle image;
a control module 1040, configured to identify the camera corresponding to the image in the touch area and control that camera to execute the touch control operation.
Further, the calling module 1010 is configured to: where the dual cameras whose viewing angles partially overlap comprise a first camera and a second camera, call the first camera to capture a first image carrying the first camera's information, and call the second camera to capture a second image carrying the second camera's information.
Further, the synthesis module 1020 is configured to: determine the overlapping-part image and the difference-part images of the first image and the second image, and synthesize the wide-angle image from them; wherein, when the synthesis operation is performed, the overlapping-part image is made to carry the first camera's information and the second camera's information.
Further, the control module 1040 is configured to: identify the camera corresponding to the image in the touch area according to the camera information carried by that image.
Further, the control module 1040 is configured to: if the touch control operation is a sliding touch control operation and the touch area of the touch control operation crosses the overlapping-part image and a difference-part image, identify the camera corresponding to the image at the initial touch region or the final touch region of the touch area.
The functions of the apparatus described in this embodiment have already been described in the embodiments shown in Figs. 1 to 9; for details not covered in this embodiment, reference may be made to the relevant description in the preceding embodiments, which is not repeated here.
The apparatus described in this embodiment can call two cameras to capture images separately and synthesize the two images, so that a wide-angle image with a viewing angle broader than either original image can be displayed and photographed. When the wide-angle image is displayed, the shooting parameters of both cameras can be adjusted simultaneously according to the touch control operation; and because the present invention controls the cameras by touch based on the wide-angle image, the shooting parameters of the two cameras can be cross-referenced, which in turn improves shooting quality.
It should be noted that, as used herein, the terms "comprising", "including", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes that element.
The serial numbers of the embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or, of course, by hardware, though in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present invention — in essence, the part contributing to the prior art — can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not limit its scope of protection. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, falls within the scope of protection of the present invention.
Claims (10)
1. A shooting control method for dual cameras, characterized in that the method comprises:
calling dual cameras whose viewing angles partially overlap to capture a first image and a second image, respectively;
synthesizing a wide-angle image from the first image and the second image;
when a preview interface displays the wide-angle image, detecting the touch area of a touch control operation;
identifying the camera corresponding to the image in the touch area, and controlling that camera to execute the touch control operation.
2. The method according to claim 1, characterized in that calling the dual cameras whose viewing angles partially overlap to capture the first image and the second image, respectively, comprises:
the dual cameras whose viewing angles partially overlap comprising a first camera and a second camera;
calling the first camera to capture a first image carrying the first camera's information;
calling the second camera to capture a second image carrying the second camera's information.
3. The method according to claim 2, characterized in that synthesizing the wide-angle image from the first image and the second image comprises:
determining the overlapping-part image and the difference-part images of the first image and the second image;
synthesizing the wide-angle image from the overlapping-part image and the difference-part images of the first image and the second image; wherein,
when the synthesis operation is performed, the overlapping-part image is made to carry the first camera's information and the second camera's information.
4. The method according to claim 3, characterized in that identifying the camera corresponding to the image in the touch area comprises:
identifying the camera corresponding to the image in the touch area according to the camera information carried by that image.
5. The method according to claim 3 or 4, characterized in that identifying the camera corresponding to the image in the touch area comprises:
if the touch control operation is a sliding touch control operation and the touch area of the touch control operation crosses the overlapping-part image and a difference-part image, identifying the camera corresponding to the image at the initial touch region or the final touch region of the touch area.
6. A shooting control apparatus for dual cameras, characterized in that the apparatus comprises:
a calling module, configured to call dual cameras whose viewing angles partially overlap to capture a first image and a second image, respectively;
a synthesis module, configured to synthesize a wide-angle image from the first image and the second image;
a detection module, configured to detect the touch area of a touch control operation when a preview interface displays the wide-angle image;
a control module, configured to identify the camera corresponding to the image in the touch area and control that camera to execute the touch control operation.
7. The apparatus according to claim 6, characterized in that the calling module is configured to:
where the dual cameras whose viewing angles partially overlap comprise a first camera and a second camera,
call the first camera to capture a first image carrying the first camera's information;
call the second camera to capture a second image carrying the second camera's information.
8. The apparatus according to claim 7, characterized in that the synthesis module is configured to:
determine the overlapping-part image and the difference-part images of the first image and the second image;
synthesize the wide-angle image from the overlapping-part image and the difference-part images of the first image and the second image; wherein,
when the synthesis operation is performed, the overlapping-part image is made to carry the first camera's information and the second camera's information.
9. The apparatus according to claim 8, characterized in that the control module is configured to:
identify the camera corresponding to the image in the touch area according to the camera information carried by that image.
10. The apparatus according to claim 8 or 9, characterized in that the control module is configured to:
if the touch control operation is a sliding touch control operation and the touch area of the touch control operation crosses the overlapping-part image and a difference-part image, identify the camera corresponding to the image at the initial touch region or the final touch region of the touch area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710058274.XA CN106713716B (en) | 2017-01-23 | 2017-01-23 | Shooting control method and device for double cameras |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106713716A true CN106713716A (en) | 2017-05-24 |
CN106713716B CN106713716B (en) | 2020-03-31 |
Family
ID=58910248
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710058274.XA Active CN106713716B (en) | 2017-01-23 | 2017-01-23 | Shooting control method and device for double cameras |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106713716B (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108008889A (en) * | 2017-11-30 | 2018-05-08 | 努比亚技术有限公司 | Photographic method, mobile terminal and the computer-readable recording medium of flexible screen |
CN108062072A (en) * | 2017-11-23 | 2018-05-22 | 哈尔滨理工大学 | A kind of dual camera image collecting device and the flat-bottom milling cutter of image mosaic abrasion online test method |
CN109120818A (en) * | 2017-06-23 | 2019-01-01 | 华为技术有限公司 | A kind of image processing method, device and equipment |
CN110266983A (en) * | 2019-06-30 | 2019-09-20 | 联想(北京)有限公司 | A kind of image processing method, equipment and storage medium |
WO2019184667A1 (en) * | 2018-03-30 | 2019-10-03 | 深圳岚锋创视网络科技有限公司 | Color correction method for panoramic image and electronic device |
CN110933297A (en) * | 2019-11-12 | 2020-03-27 | 武汉联一合立技术有限公司 | Photographing control method and device of intelligent photographing system, storage medium and system |
CN112333386A (en) * | 2020-10-29 | 2021-02-05 | 维沃移动通信(杭州)有限公司 | Shooting method and device and electronic equipment |
CN112714255A (en) * | 2020-12-30 | 2021-04-27 | 维沃移动通信(杭州)有限公司 | Shooting method, shooting device, electronic equipment and readable storage medium |
US11409434B2 (en) | 2019-06-30 | 2022-08-09 | Lenovo (Beijing) Co., Ltd. | Image collection and processing method, apparatus, and storage medium |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101930162A (en) * | 2009-06-22 | 2010-12-29 | Fujifilm Corporation | Imaging device and control method thereof |
CN102467341A (en) * | 2010-11-04 | 2012-05-23 | LG Electronics Inc. | Mobile terminal and method of controlling an image photographing therein |
CN103916582A (en) * | 2013-01-07 | 2014-07-09 | Huawei Technologies Co., Ltd. | Image processing method and device |
US20140285618A1 (en) * | 2013-03-21 | 2014-09-25 | LG Electronics Inc. | Display device and method for controlling the same |
CN104123732A (en) * | 2014-07-14 | 2014-10-29 | Institute of Information Engineering, Chinese Academy of Sciences | Online target tracking method and system based on multiple cameras |
CN105376396A (en) * | 2014-08-07 | 2016-03-02 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
US20160127642A1 (en) * | 2014-10-30 | 2016-05-05 | Technion Research & Development Foundation Limited | Wide-scale terrestrial light-field imaging of the sky |
CN105791701A (en) * | 2016-04-27 | 2016-07-20 | Nubia Technology Co., Ltd. | Image photographing device and image photographing method |
CN105991930A (en) * | 2016-07-19 | 2016-10-05 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Zoom processing method and device for dual cameras, and mobile terminal |
CN106060386A (en) * | 2016-06-08 | 2016-10-26 | Vivo Mobile Communication Co., Ltd. | Preview image generation method and mobile terminal |
CN106131449A (en) * | 2016-07-27 | 2016-11-16 | Vivo Mobile Communication Co., Ltd. | Photographing method and mobile terminal |
CN106131416A (en) * | 2016-07-19 | 2016-11-16 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Zoom processing method and device for dual cameras, and mobile terminal |
CN106161964A (en) * | 2016-08-31 | 2016-11-23 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Photographing method and device |
KR101678861B1 (en) * | 2015-07-28 | 2016-11-23 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
CN106257909A (en) * | 2015-06-16 | 2016-12-28 | LG Electronics Inc. | Mobile terminal and control method thereof |
US20170006220A1 (en) * | 2015-06-30 | 2017-01-05 | Gopro, Inc. | Image stitching in a multi-camera array |
CN106341522A (en) * | 2015-07-08 | 2017-01-18 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
- 2017-01-23: CN application CN201710058274.XA filed; granted as CN106713716B (status: Active)
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11095812B2 (en) | 2017-06-23 | 2021-08-17 | Huawei Technologies Co., Ltd. | Image processing method, apparatus, and device |
CN109120818A (en) * | 2017-06-23 | 2019-01-01 | Huawei Technologies Co., Ltd. | Image processing method, device, and equipment |
CN109120818B (en) * | 2017-06-23 | 2020-10-16 | Huawei Technologies Co., Ltd. | Image processing method, device and equipment |
CN108062072A (en) * | 2017-11-23 | 2018-05-22 | Harbin University of Science and Technology | Dual-camera image acquisition device and image-stitching-based online wear detection method for flat-bottom milling cutters |
CN108008889A (en) * | 2017-11-30 | 2018-05-08 | Nubia Technology Co., Ltd. | Photographing method for a flexible screen, mobile terminal, and computer-readable storage medium |
WO2019184667A1 (en) * | 2018-03-30 | 2019-10-03 | Shenzhen Arashi Vision Co., Ltd. | Color correction method for panoramic image and electronic device |
US11948228B2 (en) | 2018-03-30 | 2024-04-02 | Arashi Vision Inc. | Color correction method for panoramic image and electronic device |
CN110266983A (en) * | 2019-06-30 | 2019-09-20 | Lenovo (Beijing) Co., Ltd. | Image processing method, device, and storage medium |
US11409434B2 (en) | 2019-06-30 | 2022-08-09 | Lenovo (Beijing) Co., Ltd. | Image collection and processing method, apparatus, and storage medium |
CN110933297A (en) * | 2019-11-12 | 2020-03-27 | Wuhan Lianyi Heli Technology Co., Ltd. | Photographing control method and device for an intelligent photographing system, storage medium, and system |
CN112333386A (en) * | 2020-10-29 | 2021-02-05 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Shooting method and device, and electronic device |
CN112714255A (en) * | 2020-12-30 | 2021-04-27 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Shooting method and device, electronic device, and readable storage medium |
CN112714255B (en) * | 2020-12-30 | 2023-02-21 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Shooting method and device, electronic device, and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN106713716B (en) | 2020-03-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106502693B (en) | Image display method and device | |
CN106713716A (en) | Double cameras shooting control method and device | |
CN106454121A (en) | Double-camera shooting method and device | |
CN106909274A (en) | Image display method and device | |
CN107018331A (en) | Imaging method based on dual cameras, and mobile terminal | |
CN106888349A (en) | Photographing method and device | |
CN106851128A (en) | Video data processing method and device based on dual cameras | |
CN105430258B (en) | Method and device for taking a group selfie | |
CN106873936A (en) | Electronic device and information processing method | |
CN107016639A (en) | Image processing method and device | |
CN107071329A (en) | Method and device for automatically switching cameras during a video call | |
CN106909681A (en) | Information processing method and device | |
CN106911881A (en) | Snapshot photographing device, method, and terminal based on dual cameras | |
CN107018334A (en) | Application processing method and device based on dual cameras | |
CN108668071A (en) | Photographing method, device, and system, and mobile terminal | |
CN106851125A (en) | Mobile terminal and multiple-exposure photographing method | |
CN106534552A (en) | Mobile terminal and photographing method thereof | |
CN106937056A (en) | Focusing method and device for dual cameras, and mobile terminal | |
CN106993134A (en) | Image generation device, method, and terminal | |
CN107018326A (en) | Photographing method and device | |
CN106791016A (en) | Photographing method and terminal | |
CN106790994A (en) | Control triggering method and mobile terminal | |
CN107087108A (en) | Image processing method and device based on dual cameras | |
CN106657619A (en) | Screenshot method and device | |
CN106651823A (en) | Device and method for eliminating light spots in images, and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||