CN106454087A - Shooting device and method - Google Patents

Shooting device and method

Info

Publication number
CN106454087A
Authority
CN
China
Prior art keywords
eyeball
visual field
head
change
eyes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610881993.7A
Other languages
Chinese (zh)
Other versions
CN106454087B (en)
Inventor
陈小翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201610881993.7A
Publication of CN106454087A
Application granted
Publication of CN106454087B
Status: Active
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Abstract

The invention discloses a shooting device and method. The device includes a detection module, a determining module and a shooting module. The detection module detects the change angles of the head and eyeballs through a front-facing camera in a preset visual-field tracking shooting mode; the determining module determines the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs; and the shooting module shoots, through a rear camera, a photo consistent with the visual field of the eyes. With the scheme of the embodiments of the invention, photos consistent with the user's visual field can be shot according to changes in the user's visual field, thereby improving the user experience and the user's satisfaction with shooting.

Description

Shooting device and method
Technical field
The present invention relates to the field of terminal application design, and in particular to a shooting device and method.
Background technology
With the wide availability of the camera function in terminals, people's requirements for its shooting effect are becoming higher and higher. For example, a user often catches sight of scenery or an object that appeals to them in a fleeting moment, but by the time the user actually shoots, the terminal may not find an accurate focus and therefore cannot capture the effect the user wants. Because of this limitation of the terminal's functions, beautiful scenery is often missed, which degrades the user's experience. How to accurately shoot the scenery within the user's visual field is therefore a problem that technical personnel urgently need to solve.
Content of the invention
The main object of the present invention is to provide a shooting device and method which can shoot, according to changes in a user's visual field, photos consistent with the user's visual field, thereby improving the user experience and the user's satisfaction with shooting.
To achieve the above object, the present invention provides a shooting device, which includes a detection module, a determining module and a shooting module.
The detection module is configured to detect, in a preset visual-field tracking shooting mode, the change angles of the head and the eyeballs through a front-facing camera.
The determining module is configured to determine the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs.
The shooting module is configured to shoot, through a rear camera, a photo consistent with the visual field of the eyes.
Optionally, the detection module detecting the change angles of the head and the eyeballs through the front-facing camera includes:
detecting the change angle of the head through a preset head-position detection technique; and
detecting the change angle of the eyeballs by combining the change angle of the head with a preset eyeball detection technique.
Optionally, the determining module determining the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs includes:
acquiring initial positions of a first eyeball and a second eyeball, and determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus; and
determining the changed positions of the first eyeball and the second eyeball according to the initial positions, the change angle of the head and the change angle of the eyeballs, and determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to that changed visual focus.
Optionally, the determining module determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus includes:
acquiring a first initial position of the first eyeball and a second initial position of the second eyeball;
in a preset simulation system, determining a first line connecting a first simulated pupil center point and a first retina simulation point according to the first initial position of the first eyeball, and determining a second line connecting a second simulated pupil center point and a second retina simulation point according to the second initial position of the second eyeball;
taking the intersection of the extension lines of the first line and the second line as the initial visual focus of the first eyeball and the second eyeball; and
determining, as the initial visual field, the spatial range enclosed by a first included angle whose bisector is the first line and a second included angle whose bisector is the second line.
Optionally, the determining module determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to that changed visual focus includes:
in the preset simulation system, determining a third line connecting the first simulated pupil center point and the first retina simulation point according to the changed third position of the first eyeball, and determining a fourth line connecting the second simulated pupil center point and the second retina simulation point according to the changed fourth position of the second eyeball;
taking the intersection of the extension lines of the third line and the fourth line as the changed visual focus of the first eyeball and the second eyeball; and
determining, as the changed visual field, the spatial range enclosed by a third included angle whose bisector is the third line and a fourth included angle whose bisector is the fourth line.
Optionally, the shooting module shooting, through the rear camera, a photo consistent with the visual field of the eyes includes:
shooting a photo consistent with the final visual field of the eyes, or shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
Optionally, the shooting module shooting, through the rear camera, a photo consistent with the final visual field of the eyes includes:
adjusting the lens focus of the rear camera to the final visual focus; and
shooting the photo according to preset shooting parameters.
Optionally, the determining module is further configured to:
when the shutter start operation is completed, take the last visual focus detected by the front-facing camera as the final visual focus; or, when the front-facing camera detects that the eyes gaze at one visual focus for a time greater than or equal to a preset time threshold, determine that visual focus as the final visual focus.
Optionally, the shooting module shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes includes:
taking the initial visual focus and the one or more changed visual focuses in turn as the lens focus of the rear camera, and shooting a plurality of photos with preset shooting parameters; and
stitching the sequentially arranged photos to obtain a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
In addition, to achieve the above object, the present invention also provides a shooting method, which includes:
in a preset visual-field tracking shooting mode, detecting the change angles of the head and the eyeballs through a front-facing camera;
determining the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs; and
shooting, through a rear camera, a photo consistent with the visual field of the eyes.
Optionally, detecting the change angles of the head and the eyeballs through the front-facing camera includes:
detecting the change angle of the head through a preset head-position detection technique; and
detecting the change angle of the eyeballs by combining the change angle of the head with a preset eyeball detection technique.
Optionally, determining the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs includes:
acquiring initial positions of a first eyeball and a second eyeball, and determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus; and
determining the changed positions of the first eyeball and the second eyeball according to the initial positions, the change angle of the head and the change angle of the eyeballs, and determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to that changed visual focus.
Optionally, determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus includes:
acquiring a first initial position of the first eyeball and a second initial position of the second eyeball;
in a preset simulation system, determining a first line connecting a first simulated pupil center point and a first retina simulation point according to the first initial position of the first eyeball, and determining a second line connecting a second simulated pupil center point and a second retina simulation point according to the second initial position of the second eyeball;
taking the intersection of the extension lines of the first line and the second line as the initial visual focus of the first eyeball and the second eyeball; and
determining, as the initial visual field, the spatial range enclosed by a first included angle whose bisector is the first line and a second included angle whose bisector is the second line.
Optionally, determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to the changed visual focus includes:
in the preset simulation system, determining a third line connecting the first simulated pupil center point and the first retina simulation point according to the changed third position of the first eyeball, and determining a fourth line connecting the second simulated pupil center point and the second retina simulation point according to the changed fourth position of the second eyeball;
taking the intersection of the extension lines of the third line and the fourth line as the changed visual focus of the first eyeball and the second eyeball; and
determining, as the changed visual field, the spatial range enclosed by a third included angle whose bisector is the third line and a fourth included angle whose bisector is the fourth line.
Optionally, shooting, through the rear camera, a photo consistent with the visual field of the eyes includes:
shooting a photo consistent with the final visual field of the eyes, or shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
Optionally, shooting, through the rear camera, a photo consistent with the final visual field of the eyes includes:
adjusting the lens focus of the rear camera to the final visual focus; and
shooting the photo according to preset shooting parameters.
Optionally, the method further includes:
when the shutter start operation is completed, taking the last visual focus detected by the front-facing camera as the final visual focus; or, when the front-facing camera detects that the eyes gaze at one visual focus for a time greater than or equal to a preset time threshold, determining that visual focus as the final visual focus.
Optionally, shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes includes:
taking the initial visual focus and the one or more changed visual focuses in turn as the lens focus of the rear camera, and shooting a plurality of photos with preset shooting parameters; and
stitching the sequentially arranged photos to obtain a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
The present invention provides a shooting device and method. The device includes a detection module, a determining module and a shooting module. The detection module detects the change angles of the head and the eyeballs through a front-facing camera in a preset visual-field tracking shooting mode; the determining module determines the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs; and the shooting module shoots, through a rear camera, a photo consistent with the visual field of the eyes. With the scheme of the embodiments of the present invention, photos consistent with the user's visual field can be shot according to changes in the user's visual field, thereby improving the user experience and the user's satisfaction with shooting.
Brief description
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in Fig. 1;
Fig. 3 is a schematic structural diagram of a shooting device according to an embodiment of the present invention;
Fig. 4 is a flowchart of a shooting method according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a shooting method according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of determining the initial visual focus in the shooting method according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of determining the initial visual field in the shooting method according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of determining the changed visual focus in the shooting method according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of determining the changed visual field in the shooting method according to an embodiment of the present invention.
The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be understood that the specific embodiments described herein are only intended to explain the present invention and are not intended to limit it.
An optional mobile terminal for implementing embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are only for facilitating the explanation of the present invention and have no specific meaning in themselves; "module" and "part" may therefore be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190 and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it is received by the mobile communication module 112. The broadcast signal may exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H). The broadcast receiving module 111 can receive signal broadcasts using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the forward link media (MediaFLO) data broadcast system and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for various broadcast systems providing broadcast signals as well as the above digital broadcast systems. The broadcast signal and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (for example, an access point or node B), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless internet access technologies involved in this module may include WLAN (Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access) and the like.
The short-range communication module 114 is a module for supporting short-range communication. Examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra wideband (UWB), ZigBee™ and so on.
The location information module 115 is a module for checking or acquiring the location information of the mobile terminal; a typical example is a GPS (global positioning system) module. According to the current technology, the GPS module 115 calculates distance information from three or more satellites and accurate time information, and applies triangulation to the calculated information to accurately calculate three-dimensional current location information according to longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current location information in real time.
The A/V input unit 120 is used for receiving audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in an operation mode such as a phone call mode, recording mode or voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance and the like caused by being touched), a jog wheel, a jog switch and so on. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration and direction of movement of the mobile terminal 100 and the like, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and so on. The identification module may store various information for verifying the user's use of the mobile terminal 100 and may include a user identification module (UIM), a subscriber identification module (SIM), a universal subscriber identification module (USIM) and so on. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card; the identification device can therefore be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 can be used to receive input (for example, data information, electric power and so on) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle can serve as signals for identifying whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals and so on) in a visual, audio and/or tactile manner, and may include a display unit 151, an audio output module 152, an alarm unit 153 and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading and so on). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 can display captured and/or received images, and a UI or GUI showing the video or image and related functions.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display and the like. Some of these displays may be constructed to be transparent to allow the user to view from the outside; these may be called transparent displays, a typical transparent display being, for example, a TOLED (transparent organic light-emitting diode) display. According to the particular intended embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in a mode such as a call signal receiving mode, call mode, recording mode, voice recognition mode or broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound and so on). The audio output module 152 may include a speaker, a buzzer and the like.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input and so on. In addition to audio or video output, the alarm unit 153 can provide output in a different manner to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 can provide a tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs of the processing and control operations executed by the controller 180, or temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video and so on). Moreover, the memory 160 can store data on vibrations and audio signals of various modes output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 can perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate electric power required to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware or any combination thereof. For a hardware implementation, the embodiments described herein can be implemented using at least one of an application specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor and an electronic unit designed to perform the functions described herein; in some cases, such an embodiment can be implemented in the controller 180. For a software implementation, an embodiment such as a process or function can be implemented with a separate software module that allows at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any suitable programming language, and can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal among various types of mobile terminals such as folder-type, bar-type, swing-type and slide-type mobile terminals will be described as an example. Therefore, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 as shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention is operable will now be described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), global system for mobile communications (GSM) and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching applies equally to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290, and is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any one of several known interfaces including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that a system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support multiple frequency allocations, each frequency allocation having a specific spectrum (for example, 1.25 MHz, 5 MHz and so on).
The intersection of a sector and a frequency allocation may be called a CDMA channel. The BS 270 may also be called a base transceiver subsystem (BTS) or another equivalent term. In this case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be called a "cell site"; alternatively, each sector of a specific BS 270 may be called a cell site.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating in the system. The broadcast receiving module 111 as shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals transmitted by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300, which help to locate at least one of the plurality of mobile terminals 100.
Although several satellites 300 are depicted in Fig. 2, it should be understood that useful location information can be obtained with any number of satellites. The GPS module 115 as shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired location information. Instead of or in addition to GPS tracking techniques, other techniques that can track the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 can optionally or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging and other types of communication. Each reverse link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including the coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the above optional mobile terminal hardware structure and communication system, the embodiments of the method of the present invention are proposed.
As shown in Fig. 3, a first embodiment of the present invention provides a shooting device 1, which includes a detection module 01, a determining module 02 and a shooting module 03.
The detection module 01 is configured to detect, in a preset visual-field tracking shooting mode, the change angles of the head and the eyeballs through a front-facing camera.
The determining module 02 is configured to determine the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs.
The shooting module 03 is configured to shoot, through a rear camera, a photo consistent with the visual field of the eyes.
Optionally, the detection module 01 detecting the change angles of the head and the eyeballs through the front-facing camera includes:
detecting the change angle of the head through a preset head-position detection technique; and
detecting the change angle of the eyeballs by combining the change angle of the head with a preset eyeball detection technique.
Optionally, the determining module 02 determining the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs includes:
acquiring initial positions of a first eyeball and a second eyeball, and determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus; and
determining the changed positions of the first eyeball and the second eyeball according to the initial positions, the change angle of the head and the change angle of the eyeballs, and determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to that changed visual focus.
Optionally, the determining module 02 determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus includes:
acquiring a first initial position of the first eyeball and a second initial position of the second eyeball;
in a preset simulation system, determining a first line connecting a first simulated pupil center point and a first retina simulation point according to the first initial position of the first eyeball, and determining a second line connecting a second simulated pupil center point and a second retina simulation point according to the second initial position of the second eyeball;
taking the intersection of the extension lines of the first line and the second line as the initial visual focus of the first eyeball and the second eyeball; and
determining, as the initial visual field, the spatial range enclosed by a first included angle whose bisector is the first line and a second included angle whose bisector is the second line.
Optionally, the determining module 02 determining, according to the changed positions, the changed visual focus of the first eyeball and the second eyeball and the changed visual field corresponding to that changed visual focus includes:
in the preset simulation system, determining a third line connecting the first simulated pupil center point and the first retina simulation point according to the changed third position of the first eyeball, and determining a fourth line connecting the second simulated pupil center point and the second retina simulation point according to the changed fourth position of the second eyeball;
taking the intersection of the extension lines of the third line and the fourth line as the changed visual focus of the first eyeball and the second eyeball; and
determining, as the changed visual field, the spatial range enclosed by a third included angle whose bisector is the third line and a fourth included angle whose bisector is the fourth line.
Optionally, the shooting module 03 shooting, through the rear camera, a photo consistent with the visual field of the eyes includes:
shooting a photo consistent with the final visual field of the eyes, or shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
Optionally, the shooting module 03 shooting, through the rear camera, a photo consistent with the final visual field of the eyes includes:
adjusting the lens focus of the rear camera to the final visual focus; and
shooting the photo according to preset shooting parameters.
Optionally, the determining module 02 is further configured to:
when the shutter start operation is completed, take the last visual focus detected by the front-facing camera as the final visual focus; or, when the front-facing camera detects that the eyes gaze at one visual focus for a time greater than or equal to a preset time threshold, determine that visual focus as the final visual focus.
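As an illustration only, the following Python sketch shows one possible way to realize this final-focus selection together with the focus-and-shoot step described above. The gaze-sample format, the time threshold value and the rear-camera wrapper with its `set_focus`/`capture` methods are hypothetical placeholders introduced for the sketch, not part of the patent.

```python
FIXATION_THRESHOLD_S = 0.8  # preset time threshold (assumed value, seconds)

def select_final_focus(gaze_samples, shutter_done):
    """Pick the final visual focus from timestamped gaze samples.

    gaze_samples: list of (timestamp, focus_point) tuples, with focus points
    already quantized to stable candidates (hypothetical input format).
    Returns the focus gazed at for >= threshold, otherwise the last detected
    focus once the shutter start operation has completed.
    """
    current, dwell_start = None, None
    for ts, focus in gaze_samples:
        if focus != current:
            current, dwell_start = focus, ts          # gaze moved: restart dwell timer
        elif ts - dwell_start >= FIXATION_THRESHOLD_S:
            return focus                              # dwell-time rule
    if shutter_done and gaze_samples:
        return gaze_samples[-1][1]                    # last-detected-focus rule
    return None

def shoot_final_view(camera, final_focus, params):
    """Adjust the rear-camera lens focus to the final visual focus and shoot."""
    camera.set_focus(final_focus)                     # hypothetical driver call
    return camera.capture(**params)                   # hypothetical driver call
```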
Optionally, the shooting module 03 shooting a photo consistent with the entire visual field covered during the change of the visual field of the eyes includes:
taking the initial visual focus and the one or more changed visual focuses in turn as the lens focus of the rear camera, and shooting a plurality of photos with preset shooting parameters; and
stitching the sequentially arranged photos to obtain a photo consistent with the entire visual field covered during the change of the visual field of the eyes.
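This capture-and-stitch step can be illustrated with a short Python sketch using OpenCV's high-level panorama stitcher. It is only one possible realization under stated assumptions; the `camera` object and its `set_focus`/`capture` methods are hypothetical placeholders rather than an API defined by the patent.

```python
import cv2  # OpenCV, used here only as one possible stitching backend

def shoot_whole_view(camera, focus_points, params):
    """Shoot one photo per visual focus and stitch them into a single photo
    covering the whole visual field swept by the eyes."""
    photos = []
    for focus in focus_points:                # initial + changed visual focuses, in order
        camera.set_focus(focus)               # hypothetical driver call
        photos.append(camera.capture(**params))
    stitcher = cv2.Stitcher_create()          # OpenCV panorama stitcher
    status, panorama = stitcher.stitch(photos)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```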
In addition, as shown in Fig. 4 and Fig. 5, a second embodiment of the present invention provides a shooting method, which can be applied to a terminal with dual cameras and includes steps S101-S103.
S101: in a preset visual-field tracking shooting mode, detect the change angles of the head and the eyeballs through a front-facing camera.
In the embodiment of the present invention, in order to distinguish it from the ordinary shooting mode, the scheme of the embodiment can be carried out in a preset visual-field tracking shooting mode. The terminal can enter the visual-field tracking shooting mode by detecting a preset trigger operation, which may include one or more of a finger operation, an in-air gesture, a voice command and so on; the specific form of the trigger operation is not limited in the scheme of the embodiment of the present invention.
In the embodiment of the present invention, after entering the preset visual-field tracking shooting mode, in order to accurately obtain the change in the user's visual field, the visual-field detection can be completed by one of the two cameras, and the detection result transferred to the other camera, which shoots according to the visual-field detection result. Before this, the change angles of the head and/or the eyeballs must first be detected by preset detection techniques, so that the change in the visual field of the eyes can be determined from those change angles. The specific detection process for the change angles of the head and/or the eyeballs can be realized by the following flow.
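Purely as an illustration of this division of labour, the following Python sketch wires the three roles of Fig. 3 together as callables; none of the names, data formats or placeholder values below come from the patent.

```python
from typing import Callable, Tuple

Angles = Tuple[float, float]        # (head change angle, eyeball change angle), degrees

def visual_field_tracking_shot(detect: Callable[[], Angles],
                               determine: Callable[[Angles], object],
                               shoot: Callable[[object], bytes]) -> bytes:
    """One pass of the visual-field tracking shooting mode.

    detect    -- detection module: reads the front-facing camera, returns change angles
    determine -- determining module: converts angles into the eyes' visual focus / field
    shoot     -- shooting module: drives the rear camera, returns the captured photo
    """
    angles = detect()
    focus_and_field = determine(angles)
    return shoot(focus_and_field)

# Illustrative wiring with placeholder callables:
# photo = visual_field_tracking_shot(
#     detect=lambda: (5.0, 12.0),
#     determine=lambda angles: {"focus": (0.1, 2.4), "field": 120.0},
#     shoot=lambda view: b"...jpeg bytes...")
```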
Optionally, detecting the change angles of the head and the eyeballs through the front-facing camera includes steps S201-S202.
S201: detect the change angle of the head through a preset head-position detection technique.
In the embodiment of the present invention, while a user watches a scene, the head usually twists along with the eyes so as to take in more of the scene. Therefore, when calculating the change angle of the eyeballs, the rotation angle of the user's head needs to be taken into account in order to obtain an accurate eyeball rotation angle. Accordingly, the change angle of the photographer's head needs to be detected first. In the embodiment of the present invention, the change angle of the photographer's head can be detected by a preset head-position detection technique, for example computer vision technology.
Computer vision is a technology that studies how to make machines "see": it uses cameras and computers instead of human eyes to identify, track and measure targets, and further processes the resulting images into a form better suited to human observation or to transmission to detection instruments. With this technology, the head positions before and after rotation can be obtained accurately, and the change angle of the head determined from those positions.
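The "preset head-position detection technique" is not specified further in the patent. As one illustration, a conventional computer-vision approach is to estimate head pose from facial landmarks with a perspective-n-point solver; the Python/OpenCV sketch below is such an example under stated assumptions (rough generic face-model coordinates, a crude focal-length guess, landmarks supplied by any face-landmark detector). The change angle of the head would then be the difference between the angles estimated for two frames.

```python
import cv2
import numpy as np

# Rough generic 3D face model (millimetres, illustrative values only):
# nose tip, chin, left/right eye outer corners, left/right mouth corners.
MODEL_POINTS = np.array([
    (0.0,     0.0,    0.0),
    (0.0,  -330.0,  -65.0),
    (-225.0, 170.0, -135.0),
    (225.0,  170.0, -135.0),
    (-150.0, -150.0, -125.0),
    (150.0,  -150.0, -125.0),
], dtype=np.float64)

def head_angles(image_points, frame_size):
    """Estimate head pitch/yaw/roll (degrees) from six 2D facial landmarks.

    image_points: 6x2 landmark pixel coordinates, in the same order as
    MODEL_POINTS, produced by any face-landmark detector (assumed input).
    frame_size: (height, width) of the front-camera frame.
    """
    h, w = frame_size
    focal = w                                    # crude focal-length guess
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0,     0,     1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))               # assume no lens distortion
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS,
                               np.asarray(image_points, dtype=np.float64),
                               camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot_mat, _ = cv2.Rodrigues(rvec)
    euler, *_ = cv2.RQDecomp3x3(rot_mat)         # (pitch, yaw, roll) in degrees
    return euler
```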
S202: detect the change angle of the eyeballs by combining the change angle of the head with a preset eyeball detection technique.
In the embodiment of the present invention, on the basis of the obtained change angle of the user's head, the change angle of the eyeballs needs to be obtained further in order to accurately track the change in the user's visual field.
Since the change in the visual field of the eyes is realized by the rotation of the eyeballs, the change in the visual field can be detected by detecting the rotation of the eyeballs. Specifically, this can be done with a preset eyeball detection technique, which may be an eye-tracking technique.
Eye tracking is an applied scientific technique which, in principle, mainly studies the acquisition, modelling and simulation of eyeball movement information. When a person's eyes look in different directions, the eyes undergo subtle changes, and these changes produce extractable features. A computer can extract these features through image capture or scanning, so as to track the changes of the eyes in real time, predict the user's state and needs, and respond to them, thereby achieving the goal of controlling a device with the eyes. Methods for acquiring eyeball movement information include: tracking according to feature changes of the eyeball and its surroundings; tracking according to changes in the iris angle; and actively projecting beams such as infrared light onto the iris to extract features. The equipment for acquiring eyeball movement information can therefore be, besides infrared equipment, the camera on an image-acquisition device, or even an ordinary computer or mobile phone with suitable software support.
In the embodiment of the present invention, after the movement information of the eyeballs is acquired by the above eye-tracking technique, the position-change information of the eyeballs over the whole movement can be obtained from that movement information, and the angle-change information of the eyeballs can then be calculated from the position-change information.
In the embodiment of the present invention, the eye-tracking technique can be realized by any one or more of the above methods, and its specific implementation is not limited.
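The patent does not fix a formula for converting the eyeball's position change into an angle change. A minimal sketch of one common approximation is given below: if the pupil centre shifts by a distance d across a roughly spherical eyeball of radius r, the eye-in-head rotation is about arcsin(d/r), and the head change angle from S201 is added so the total gaze change reflects both movements. The radius value and the head-stabilised coordinate assumption are illustrative, not taken from the patent.

```python
import math

EYEBALL_RADIUS_MM = 12.0   # approximate adult eyeball radius, illustrative value

def gaze_change_angle(initial_pupil_mm, changed_pupil_mm, head_change_deg=0.0):
    """Approximate total gaze change (degrees) from the pupil-centre shift.

    The pupil positions are assumed to be measured in head-stabilised eye
    coordinates (millimetres), e.g. as delivered by an eye tracker, so the
    arcsin term gives the eye-in-head rotation and the head change from S201
    is added on top.
    """
    dx = changed_pupil_mm[0] - initial_pupil_mm[0]
    dy = changed_pupil_mm[1] - initial_pupil_mm[1]
    shift = min(math.hypot(dx, dy), EYEBALL_RADIUS_MM)   # clamp to keep asin defined
    eye_in_head = math.degrees(math.asin(shift / EYEBALL_RADIUS_MM))
    return head_change_deg + eye_in_head
```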
S102: determine the change in the visual field of the eyes according to the detected change angle of the head and/or the eyeballs.
In the embodiment of the present invention, after the change angles of the photographer's head and/or eyeballs have been determined by the above scheme, the changed visual field can be obtained from the changed eyeball angles; specifically, this can be realized by the following flow.
Optionally, determining the change in the visual field of the eyes according to the detected change angles of the head and/or the eyeballs includes steps S301-S302.
S301: acquire the initial positions of the first eyeball and the second eyeball, and determine, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus.
In the embodiment of the present invention, when determining the change in the visual field of the eyes from the change angle of the eyeballs, the initial visual field of the eyes can first be determined from the initial positions of the eyeballs. This can be realized by the following flow, with reference to Fig. 6.
Optionally, determining, according to the initial positions, the initial visual focus of the first eyeball and the second eyeball and the initial visual field corresponding to that initial visual focus includes steps S401-S404.
S401: acquire a first initial position of the first eyeball and a second initial position of the second eyeball.
In the embodiment of the present invention, the determination of the first initial position of the first eyeball and the second initial position of the second eyeball can be carried out while the change angle of the eyeballs is being detected by the eye-tracking technique; in the embodiment of the present invention this specifically concerns determining the positions of the first simulated pupil center point and the second simulated pupil center point described below.
In the embodiment of the present invention, when the shutter start operation begins to be detected, the current position of the first eyeball detected by the front-facing camera can be taken as the first initial position and the current position of the second eyeball as the second initial position. Alternatively, the detection moment for the initial eyeball positions can be determined by detecting a preset finger operation, voice command, in-air gesture or software command. For example, a finger operation may be a detected trigger operation on a preset key (including hardware keys and software keys) or a preset touch operation detected by a preset fingerprint recognition device; a voice command may be a command such as "start shooting" or "begin"; an in-air gesture may be a preset in-air gesture detected by a preset proximity sensor; and a software command may be a start-detection command issued by the central processing unit a preset time after the shutter start operation is detected.
S402: in a preset simulation system, determine a first line connecting the first simulated pupil center point and a first retina simulation point according to the first initial position of the first eyeball, and determine a second line connecting the second simulated pupil center point and a second retina simulation point according to the second initial position of the second eyeball.
In the embodiment of the present invention, because the human eye sees an object by means of the pupil together with the retina, the visual field of the eyes after a change of eyeball position can be accurately obtained by simulating the eyes with a pre-established simulation system. The simulation system contains a simulated first eyeball and second eyeball, a first simulated pupil center and first retina simulation point corresponding to the first eyeball, and a second simulated pupil center and second retina simulation point corresponding to the second eyeball. The first retina simulation point and the second retina simulation point are fixed, while the first simulated pupil center and the second simulated pupil center change correspondingly with the detected change in eyeball position. In this simulation system, the first simulated pupil center point and the first retina simulation point can be connected to form a first simulated light ray, that is, the above first line; similarly, the second simulated pupil center point and the second retina simulation point are connected to form a second simulated light ray, that is, the above second line.
It should be noted that, in other embodiments, the above simulation system need not be pre-established; the above simulation process can also be completed purely by corresponding algorithms.
S403: take the intersection of the extension lines of the first line and the second line as the initial visual focus of the first eyeball and the second eyeball.
In the embodiment of the present invention, because the light reflected by an object enters the two eyeballs, the reverse extension lines of the two rays entering the eyes must intersect on that object. According to this principle, the intersection of the reverse extension lines of the first simulated light ray and the second simulated light ray must be the object the user is looking at; that is, the intersection of the extension lines of the first line and the second line must be the user's visual focus. Therefore, the intersection of the extension lines of the first line and the second line is taken as the initial visual focus of the first eyeball and the second eyeball.
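Restricting the geometry to a horizontal plane for simplicity, the visual focus can be computed as the intersection of the two simulated rays, each defined by a retina simulation point and a simulated pupil centre. The following Python sketch only illustrates that principle; the coordinate values in the usage comment are made-up numbers, not data from the patent.

```python
import numpy as np

def visual_focus(retina1, pupil1, retina2, pupil2):
    """Intersection of the two gaze rays in a 2D (top-down) simulation plane.

    Each ray starts at a retina simulation point and passes through the
    corresponding simulated pupil centre; their forward extensions meet at
    the visual focus. Points are (x, y) pairs. Returns None for parallel rays.
    """
    p1 = np.asarray(retina1, float)
    d1 = np.asarray(pupil1, float) - p1          # direction of the first line
    p2 = np.asarray(retina2, float)
    d2 = np.asarray(pupil2, float) - p2          # direction of the second line
    # Solve p1 + t1*d1 == p2 + t2*d2 for (t1, t2).
    a = np.array([d1, -d2]).T
    if abs(np.linalg.det(a)) < 1e-9:
        return None                              # parallel gaze lines: no finite focus
    t1, _ = np.linalg.solve(a, p2 - p1)
    return p1 + t1 * d1

# Illustrative use: eyes 60 mm apart, both turned slightly to the right.
# focus = visual_focus(retina1=(-30, -24), pupil1=(-28, 0),
#                      retina2=( 30, -24), pupil2=( 26, 0))
```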
S404: determine, as the initial visual field, the spatial range enclosed by a first included angle whose bisector is the first line and a second included angle whose bisector is the second line.
In the embodiment of the present invention, each person's eyes have a certain field of view. In general, in the horizontal plane the visual region of a human eye lies roughly within 60 degrees to the left and right; therefore, when the user changes the viewing angle of the eyes horizontally, both the first included angle and the second included angle can be set to 120°. In the vertical direction, if the standard line of sight is assumed to be horizontal and this horizontal line of sight is taken as 0 degrees, the absolute visual field range extends about 50 degrees above and 70 degrees below the horizontal line of sight; accordingly, when the user changes the viewing angle of the eyes vertically, both the first included angle and the second included angle can be set to 100°.
In the embodiment of the present invention, after the field-of-view range of each of the photographer's eyes has been determined by the above scheme, the spatial range enclosed between the first included angle and the second included angle, centered on the above initial visual focus, can be determined as the photographer's initial visual field, as shown in Fig. 7.
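As an illustration of this step, the sketch below builds the angular sector ("included angle") around each gaze line and overlaps the two sectors to approximate the visual field in the horizontal simulation plane. The 120° aperture follows the figure given above; everything else (coordinate conventions, the simple bearing overlap) is an assumption made only for the sketch.

```python
import math

HORIZONTAL_APERTURE_DEG = 120.0   # first/second included angle in the horizontal plane

def gaze_sector(pupil, focus, aperture_deg=HORIZONTAL_APERTURE_DEG):
    """Angular sector around one gaze line, as (min_bearing, max_bearing) in degrees.

    The gaze line from the simulated pupil centre towards the visual focus is
    the bisector of the sector; bearings are measured counter-clockwise from +x.
    Wrap-around at +/-180 degrees is ignored for brevity.
    """
    bearing = math.degrees(math.atan2(focus[1] - pupil[1], focus[0] - pupil[0]))
    half = aperture_deg / 2.0
    return bearing - half, bearing + half

def initial_visual_field(pupil1, pupil2, focus):
    """Approximate the initial visual field as the overlap of the two sectors."""
    lo1, hi1 = gaze_sector(pupil1, focus)
    lo2, hi2 = gaze_sector(pupil2, focus)
    return max(lo1, lo2), min(hi1, hi2)   # overlapping bearing range around the focus
```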
S302, determine that according to initial position, head angle changing and eyeball angle changing the first eyeball and the second eyeball become Position after change, and according to change after position determine the visual focus after the first eyeball and the change of the second eyeball and with this change The visual field after the corresponding change of visual focus after change.
In embodiments of the present invention, determine the initial visual field of photographer by such scheme after, according to same principle, Just the visual field after the eyeball angles change of photographer can be obtained, as shown in Figure 8, Figure 9.
Alternatively, according to change after position determine the visual focus after the first eyeball and the change of the second eyeball and with change The visual field after the corresponding change of visual focus after change includes S501-S502:
S501: in the preset simulation system, determine a third line connecting the first simulated pupil center and the first retina simulation point according to the third position of the first eyeball after the change, and determine a fourth line connecting the second simulated pupil center and the second retina simulation point according to the fourth position of the second eyeball after the change.
In this embodiment of the present invention, for the same reason as above, when the first eyeball changes to the third position, connecting the first simulated pupil center and the first retina simulation point again yields the third line, and when the second eyeball changes to the fourth position, connecting the second simulated pupil center and the second retina simulation point yields the fourth line, giving the two simulated light rays after the eyeball positions change.
S502: take the intersection of the extensions of the third line and the fourth line as the visual focus of the first eyeball and the second eyeball after the change.
In this embodiment of the present invention, the intersection of the extensions of the two simulated light rays, i.e. the third line and the fourth line, must be the object the user sees after the eyeball positions change, that is, the visual focus after the change.
S503: determine, as the changed visual field, the spatial range enclosed by a third angle whose bisector is the third line and a fourth angle whose bisector is the fourth line.
In this embodiment of the present invention, the spatial range enclosed by the third angle and the fourth angle, centered on the changed visual focus, is determined as the photographer's changed visual field, as shown in Figure 9. It should be noted that, because the first angle and the third angle both correspond to the field of view of the first eyeball, and the second angle and the fourth angle both correspond to the field of view of the second eyeball, the first angle and the third angle are the same preset value, and the second angle and the fourth angle are the same preset value. In general, for the same person and for ease of calculation, the difference in field of view between the two eyeballs can be ignored, so the first, second, third and fourth angles can all be set to the same preset value.
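Continuing the illustrative sketch above (same assumed names): recomputing the changed visual focus only requires updating the simulated pupil centers, since the retina simulation points and the angle presets stay fixed.

# Third and fourth positions after the change: only the simulated pupil centers move.
left_eye.pupil_center = detected_left_pupil_position    # assumed input from the front-facing camera
right_eye.pupil_center = detected_right_pupil_position  # assumed input from the front-facing camera

# Third and fourth lines, and the changed visual focus, via the same routines as before.
changed_focus = visual_focus(left_eye.sight_ray(), right_eye.sight_ray())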
S103: shoot, through the rear camera, a photo consistent with the visual field of the eyes.
In this embodiment of the present invention, once the change in the photographer's visual field has been obtained by the above scheme, a photo consistent with the visual field of the eyes can be shot according to that change. In concrete shooting, the following two shooting situations can be distinguished.
Optionally, shooting, through the rear camera, a photo consistent with the visual field of the eyes includes:
shooting a photo consistent with the final visual field of the eyes, or shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
In this embodiment of the present invention, with the gradual spread of panoramic shooting techniques, the scheme of this embodiment can track the change in the photographer's visual field and shoot a photo of the visual field the user finally settles on, or it can track the change in the photographer's visual field and capture the entire range swept by the photographer's eyes. This process is not limited to shooting photos and can also be applied to video.
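One possible way to keep the two behaviours apart (the mode names and interfaces below are assumptions) is a simple mode switch that dispatches to the two capture flows sketched further below.

from enum import Enum, auto

class ShootingMode(Enum):
    SINGLE_VISUAL_FIELD = auto()   # shoot only the finally determined visual field (S601-S602)
    FULL_VISUAL_FIELD = auto()     # capture and stitch everything the eyes sweep over (S701-S702)

def capture(mode, tracker, camera):
    # tracker and camera are assumed wrappers around the front-facing and rear cameras.
    if mode is ShootingMode.SINGLE_VISUAL_FIELD:
        return capture_final_view(tracker, camera)
    return capture_whole_sweep(tracker, camera)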
The embodiment in which the change in the photographer's visual field is tracked and a photo of the user's finally determined visual field is shot is described in detail first.
Optionally, shooting, through the rear camera, a photo consistent with the final visual field of the eyes includes S601-S602:
S601: adjust the lens focus of the rear camera to the final visual focus.
In this embodiment of the present invention, having determined by the above scheme the visual focus after each change of the photographer's visual field and the corresponding visual field, the final visual focus after the last change of the visual field, and the corresponding final visual field, can likewise be determined. It should be explained here that which visual field is the final one can be determined by the following flow.
Optionally, the method further includes:
when the shutter is activated, taking the last visual focus detected by the front-facing camera as the final visual focus; or, when the time for which the front-facing camera detects the eyes gazing at one visual focus is greater than or equal to a preset time threshold, determining that visual focus as the final visual focus.
In this embodiment of the present invention, if the terminal detects that the shutter key is fully pressed, i.e. the shutter is activated, this indicates that the photographer has decided to start shooting. In that case, the last visual focus detected by the front-facing camera can be taken as the final visual focus, and the field of view corresponding to that final visual focus as the final visual field.
In other embodiments, the hold time of a visual focus can also be detected, i.e. the time for which the eyes gaze at one visual focus; when that time is greater than or equal to a preset time threshold, the visual focus is determined as the final visual focus. The preset time threshold here can be customized for different application scenarios.
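A minimal sketch of this dwell-time rule, assuming a stream of (timestamp, focus) samples from the front-facing camera; the threshold and tolerance values are illustrative only.

import numpy as np

DWELL_THRESHOLD_S = 1.0    # assumed preset time threshold, customizable per scenario
FOCUS_TOLERANCE = 0.05     # assumed distance below which two samples count as the same focus

def final_focus_by_dwell(focus_stream):
    # focus_stream yields (timestamp_in_seconds, focus_point) pairs.
    anchor_time, anchor_focus = None, None
    for ts, focus in focus_stream:
        if anchor_focus is None or np.linalg.norm(focus - anchor_focus) > FOCUS_TOLERANCE:
            anchor_time, anchor_focus = ts, focus      # gaze moved: restart the timer
        elif ts - anchor_time >= DWELL_THRESHOLD_S:
            return anchor_focus                        # gazed long enough: final visual focus
    return anchor_focus                                # fallback: the last detected visual focus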
S602: shoot the photo according to preset shooting parameters.
In this embodiment of the present invention, once the final visual focus has been determined and the lens focus of the rear camera has been set on that final visual focus, the photo can be taken according to the preset shooting parameters.
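A sketch of this single-visual-field flow, under the same assumed tracker and camera interfaces as above.

def capture_final_view(tracker, camera):
    focus = tracker.final_visual_focus()             # from shutter activation or the dwell rule above
    camera.set_focus(focus)                          # S601: lens focus on the final visual focus
    return camera.shoot(camera.preset_parameters)    # S602: shoot with the preset shooting parameters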
The embodiment in which the change in the photographer's visual field is tracked and the entire range swept by the photographer's eyes is captured is described in detail below.
Optionally, shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes includes S701-S702:
S701: taking the initial visual focus and the one or more changed visual foci in turn as the lens focus of the rear camera, shoot multiple photos, each with the preset shooting parameters.
In this embodiment of the present invention, in order to distinguish this style of shooting from the shooting scheme of steps S601-S602 above, two shooting modes can be defined, for example a full-visual-field shooting mode and a single-visual-field shooting mode: the single-visual-field shooting mode is selected when implementing the scheme of the embodiment above, and the full-visual-field shooting mode is selected when implementing the scheme of this embodiment.
In the full-visual-field shooting mode, the initial visual focus and the visual focus after each change of the user's visual field (including the final visual focus) can be determined, and after each change a photo covering the field of view of that visual focus can be shot. After multiple changes of the visual field, multiple photos are obtained; these photos are arranged in the order in which the visual field changed, in preparation for the subsequent photo stitching.
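A sketch of S701 under the same assumed interfaces: one photo per visual focus, kept in the order in which the visual field changed.

def capture_whole_sweep(tracker, camera):
    photos = []
    for focus in tracker.visual_foci():              # initial focus first, then each changed focus
        camera.set_focus(focus)
        photos.append(camera.shoot(camera.preset_parameters))
    return photos                                    # already ordered for stitching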
S702: stitch the sequentially arranged photos to obtain a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
In this embodiment of the present invention, after the multiple photos have been shot by the above scheme, they can be stitched to obtain a scene photo covering the photographer's whole range of visual field change. The specific stitching method and algorithm are not limited here; any method and algorithm capable of implementing the scheme of this embodiment of the present invention falls within the protection scope of the embodiments of the present invention.
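The patent deliberately leaves the stitching method open; as one possible illustration, OpenCV's high-level stitcher could be applied to the ordered photos.

import cv2

def stitch_sweep(photos):
    # photos: list of images (e.g. BGR arrays) in the order the visual field changed.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(photos)
    if status != cv2.Stitcher_OK:
        raise RuntimeError("stitching failed with status %d" % status)
    return panorama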
This concludes the description of the basic features of the embodiments of the present invention. It should be noted that the above schemes are only one or more specific embodiments of the present invention; other embodiments may also be adopted. Any embodiment that is the same as or similar to an embodiment of the present invention, and any combination of the basic features of the embodiments of the present invention, falls within the protection scope of the embodiments of the present invention.
The present invention provides a shooting device and method. The device includes a detection module, a determining module and a shooting module. In a preset visual-field tracking shooting mode, the detection module detects head and eyeball change angles through a front-facing camera; the determining module determines the change in the visual field of the eyes according to the detected head and/or eyeball change angles; and the shooting module shoots, through a rear camera, a photo consistent with the visual field of the eyes. With the scheme of the embodiments of the present invention, a photo consistent with the user's visual field can be shot according to the change in the user's visual field, improving the user's experience and shooting satisfaction.
It should be noted that, as used herein, the terms "comprise", "include" and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes that element.
The numbering of the above embodiments of the present invention is for description only and does not indicate the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, or by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not limit the scope of the claims; any equivalent structural or flow transformation made using the contents of the description and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.

Claims (10)

1. A shooting device, characterised in that the device comprises: a detection module, a determining module and a shooting module;
the detection module is configured to detect, in a preset visual-field tracking shooting mode, head and eyeball change angles through a front-facing camera;
the determining module is configured to determine a change in the visual field of the eyes according to the detected head and/or eyeball change angles;
the shooting module is configured to shoot, through a rear camera, a photo consistent with the visual field of the eyes.
2. The shooting device according to claim 1, characterised in that the detection module detecting head and eyeball change angles through the front-facing camera comprises:
detecting the head change angle by a preset head position detection technique;
detecting the eyeball change angle in combination with the head change angle and a preset eyeball detection technique.
3. The shooting device according to claim 2, characterised in that the determining module determining the change in the visual field of the eyes according to the detected head and/or eyeball change angles comprises:
obtaining initial positions of a first eyeball and a second eyeball, and determining, according to the initial positions, an initial visual focus of the first eyeball and the second eyeball and an initial visual field corresponding to the initial visual focus;
determining positions of the first eyeball and the second eyeball after a change according to the initial positions, the head change angle and the eyeball change angle, and determining, according to the changed positions, a visual focus of the first eyeball and the second eyeball after the change and a changed visual field corresponding to the changed visual focus.
4. The shooting device according to claim 1 or 3, characterised in that the shooting module shooting, through the rear camera, a photo consistent with the visual field of the eyes comprises:
shooting a photo consistent with a final visual field of the eyes, or shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
5. The shooting device according to claim 4, characterised in that
the shooting module shooting, through the rear camera, a photo consistent with the final visual field of the eyes comprises:
adjusting a lens focus of the rear camera to a final visual focus;
shooting the photo according to preset shooting parameters;
and the shooting module shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes comprises:
taking the initial visual focus and the one or more changed visual foci in turn as lens foci of the rear camera, and shooting multiple photos, each with the preset shooting parameters;
stitching the sequentially arranged photos to obtain a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
6. A shooting method, characterised in that the method comprises:
detecting, in a preset visual-field tracking shooting mode, head and eyeball change angles through a front-facing camera;
determining a change in the visual field of the eyes according to the detected head and/or eyeball change angles;
shooting, through a rear camera, a photo consistent with the visual field of the eyes.
7. The shooting method according to claim 6, characterised in that detecting head and eyeball change angles through a front-facing camera comprises:
detecting the head change angle by a preset head position detection technique;
detecting the eyeball change angle in combination with the head change angle and a preset eyeball detection technique.
8. The shooting method according to claim 7, characterised in that determining the change in the visual field of the eyes according to the detected head and/or eyeball change angles comprises:
obtaining initial positions of a first eyeball and a second eyeball, and determining, according to the initial positions, an initial visual focus of the first eyeball and the second eyeball and an initial visual field corresponding to the initial visual focus;
determining positions of the first eyeball and the second eyeball after a change according to the initial positions, the head change angle and the eyeball change angle, and determining, according to the changed positions, a visual focus of the first eyeball and the second eyeball after the change and a changed visual field corresponding to the changed visual focus.
9. The shooting method according to claim 6 or 8, characterised in that shooting, through a rear camera, a photo consistent with the visual field of the eyes comprises:
shooting a photo consistent with a final visual field of the eyes, or shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
10. The shooting method according to claim 9, characterised in that
shooting, through the rear camera, a photo consistent with the final visual field of the eyes comprises:
adjusting a lens focus of the rear camera to a final visual focus;
shooting the photo according to preset shooting parameters;
and shooting a photo consistent with the whole visual field swept during the change of the visual field of the eyes comprises:
taking the initial visual focus and the one or more changed visual foci in turn as lens foci of the rear camera, and shooting multiple photos, each with the preset shooting parameters;
stitching the sequentially arranged photos to obtain a photo consistent with the whole visual field swept during the change of the visual field of the eyes.
CN201610881993.7A 2016-10-09 2016-10-09 A kind of filming apparatus and method Active CN106454087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610881993.7A CN106454087B (en) 2016-10-09 2016-10-09 A kind of filming apparatus and method

Publications (2)

Publication Number Publication Date
CN106454087A (en) 2017-02-22
CN106454087B CN106454087B (en) 2019-10-29

Family

ID=58172980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610881993.7A Active CN106454087B (en) 2016-10-09 2016-10-09 A kind of filming apparatus and method

Country Status (1)

Country Link
CN (1) CN106454087B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102914932A (en) * 2011-08-03 2013-02-06 浪潮乐金数字移动通信有限公司 Photographic device and method for focusing by eyes of photographic device user
CN103338331A (en) * 2013-07-04 2013-10-02 上海斐讯数据通信技术有限公司 Image acquisition system adopting eyeballs to control focusing
CN103780839A (en) * 2014-01-21 2014-05-07 宇龙计算机通信科技(深圳)有限公司 Shooting method and terminal
CN104702919A (en) * 2015-03-31 2015-06-10 小米科技有限责任公司 Play control method and device and electronic device
CN105007424A (en) * 2015-07-22 2015-10-28 深圳市万姓宗祠网络科技股份有限公司 Automatic focusing system, method and wearable device based on eye tracking
CN205485923U (en) * 2016-02-16 2016-08-17 广东小天才科技有限公司 Virtual reality equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107277376A (en) * 2017-08-03 2017-10-20 上海闻泰电子科技有限公司 The method and device that camera is dynamically shot
CN107589837A (en) * 2017-08-22 2018-01-16 努比亚技术有限公司 A kind of AR terminals picture adjusting method, equipment and computer-readable recording medium
CN110099219A (en) * 2019-06-13 2019-08-06 Oppo广东移动通信有限公司 Panorama shooting method and Related product
CN110099219B (en) * 2019-06-13 2021-10-08 Oppo广东移动通信有限公司 Panoramic shooting method and related product

Also Published As

Publication number Publication date
CN106454087B (en) 2019-10-29

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant