CN108574801A - A kind of image-pickup method and terminal - Google Patents
- Publication number
- CN108574801A (application CN201710142557.2A)
- Authority
- CN
- China
- Prior art keywords
- terminal
- target
- target area
- image
- acquisition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
Abstract
The embodiments of the present application provide an image capture method and a terminal. The method is applied to a client and specifically includes: when the client is in a foreground running state, judging whether the terminal on which the client runs is in a target area; if so, starting timing for the period during which the terminal is in the target area, and detecting whether the timed duration reaches a target duration threshold corresponding to the target area; and if so, capturing images. With the solution provided by the embodiments of the present application, the user does not need to input an image capture instruction, so user experience can be improved.
Description
Technical field
The present application relates to the field of computer technology, and in particular to an image capture method and a terminal.
Background technology
Devices with an image capture function, such as mobile phones, cameras, and tablet computers, are increasingly common. When people come across beautiful scenery, they usually record the fine moment by taking photos or recording video.
In the prior art, to help users capture images and improve user experience, images are usually captured only after an image capture instruction input by the user is received. For example, every time a user wants to capture an image, the user must tap the corresponding button on a mobile phone screen, or press the corresponding button on a camera, to start capturing. Although this method implements the image capture function, the user has to shift attention to image capture in order to issue the instruction, and therefore cannot fully enjoy the fine moment; the user experience is not good enough.
Invention content
The embodiments of the present application aim to provide an image capture method and a terminal, so that no image capture instruction needs to be input by the user and user experience is improved. The specific technical solutions are as follows.
To achieve the above object, an embodiment of the present application discloses an image capture method applied to a client, the method including:
when the client is in a foreground running state, judging whether the terminal on which the client runs is in a target area;
if so, starting timing for the period during which the terminal is in the target area, and detecting whether the timed duration reaches a target duration threshold corresponding to the target area;
if so, capturing images.
Optionally, after it is detected that the timed duration reaches the target duration threshold, the method further includes:
determining a first relative position of the terminal in the target area;
obtaining a first geographic orientation toward which the terminal points;
judging whether the correspondence between the first relative position and the first geographic orientation satisfies a preset correspondence between relative positions and geographic orientations;
if so, executing the step of capturing images.
Optionally, the step of capturing images includes:
capturing images according to a target capture mode corresponding to the target area.
Optionally, the step of capturing images according to the target capture mode corresponding to the target area includes:
obtaining the target capture mode corresponding to the target area;
obtaining a current target motion state of the terminal;
judging whether the correspondence between the target capture mode and the target motion state satisfies a preset correspondence between capture modes and motion states;
if so, capturing images.
Optionally, the method further includes:
storing the captured images in a storage space corresponding to the target capture mode.
Optionally, before the step of judging whether the terminal on which the client runs is in the target area, the method further includes:
receiving a target area input by the user; and/or
receiving a target duration threshold, input by the user, corresponding to the target area; and/or
receiving a target capture mode, input by the user, corresponding to the target area; and/or
receiving a storage space, input by the user, corresponding to the target capture mode.
Optionally, after the step of capturing images, the method further includes:
stopping capturing images when it is detected that the terminal has left the target area.
Optionally, the step of judging whether the terminal is in the target area includes:
receiving the position of the terminal sent by a positioning device;
judging, according to the received position, whether the terminal is in the target area.
To achieve the above object, an embodiment of the present application discloses a terminal, the terminal including a processor and an image collector;
the processor is configured to: when a client on the terminal is in a foreground running state, obtain the position of the terminal acquired by a locator; judge, according to the obtained position, whether the terminal is in a target area; if so, start timing for the period during which the terminal is in the target area, and detect whether the timed duration reaches a target duration threshold corresponding to the target area; and if so, send an image capture instruction to the image collector;
the image collector is configured to capture images after receiving the image capture instruction sent by the processor.
Optionally, the processor is further configured to: after detecting that the timed duration reaches the target duration threshold, determine a first relative position of the terminal in the target area; obtain a first geographic orientation, detected by a position detector, toward which the terminal points; judge whether the correspondence between the first relative position and the first geographic orientation satisfies a preset correspondence between relative positions and geographic orientations; and if so, send the image capture instruction to the image collector.
Optionally, the processor is specifically configured to determine a target capture mode corresponding to the target area, generate an image capture instruction according to the target capture mode, and send the image capture instruction to the image collector;
the image collector is specifically configured to receive the image capture instruction sent by the processor and capture images according to the image capture instruction.
Optionally, the terminal further includes a motion sensor;
the motion sensor is configured to detect the current target motion state of the terminal and send the detected target motion state to the processor;
the processor is further configured to obtain the target capture mode corresponding to the target area, obtain the target motion state sent by the motion sensor, and judge whether the correspondence between the target capture mode and the target motion state satisfies a preset correspondence between capture modes and motion states; if so, send the image capture instruction to the image collector.
Optionally, the image collector is further configured to store the captured images in a memory corresponding to the target capture mode.
Optionally, the processor is further configured to:
before judging whether the terminal is in the target area, receive a target area input by the user; and/or
before judging whether the terminal is in the target area, receive a target duration threshold, input by the user, corresponding to the target area; and/or
before judging whether the terminal is in the target area, receive a target capture mode, input by the user, corresponding to the target area; and/or
before judging whether the terminal is in the target area, receive a memory, input by the user, corresponding to the target capture mode.
Optionally, the processor is further configured to: after sending the image capture instruction to the image collector, send an instruction to stop capturing images to the image collector when it is detected that the terminal has left the target area.
Optionally, the processor is specifically configured to receive the position of the terminal sent by the locator and judge, according to the received position, whether the terminal is in the target area.
With the image capture method and terminal provided by the embodiments of the present application, when the client is in a foreground running state, it can be judged whether the terminal on which the client runs is in a target area; if so, timing is started for the period during which the terminal is in the target area, and it is detected whether the timed duration reaches a target duration threshold corresponding to the target area; if so, image capture starts. That is, with the solution provided by the embodiments of the present application, the moment to start capturing images is determined when the terminal is in the target area and the timed duration reaches the threshold, and capture then starts automatically, without the user manually inputting an image capture instruction; user experience can therefore be improved.
Description of the drawings
In order to illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, to embodiment or will show below
There is attached drawing needed in technology description to be briefly described.It should be evident that the accompanying drawings in the following description is only this
Some embodiments of application for those of ordinary skill in the art without creative efforts, can be with
Obtain other attached drawings according to these attached drawings.
Fig. 1 is a schematic flowchart of an image capture method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of judging a terminal's relative position and geographic orientation provided by an embodiment of the present application;
Fig. 3a is a schematic diagram of a scenario involving a target area and a terminal position;
Fig. 3b is a schematic diagram of a specific way of dividing a target area;
Fig. 4 is a schematic flowchart of an implementation of step S101 in Fig. 1;
Fig. 5 is a schematic structural diagram of an image capture system provided by an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
Specific implementation mode
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
The embodiments of the present application provide an image capture method and a terminal, so that no image capture instruction needs to be input by the user and user experience is improved. The present application is described in detail below through specific embodiments.
Fig. 1 is a schematic flowchart of an image capture method provided by an embodiment of the present application. The method is applied to a client and includes the following steps S101 to S103.
Step S101: when the client is in a foreground running state, judge whether the terminal on which the client runs is in a target area; if so, execute step S102; otherwise, take no action.
Here, the terminal is a device with an image capture function; specifically, the terminal may be a smartphone, a tablet computer, a computer, a camera, a smart wearable device, a UAV (unmanned aerial vehicle, commonly called a "drone"), or the like. The client is installed on the terminal; this embodiment is specifically executed by the client, which is used for capturing images.
Specifically, judging whether the terminal on which the client runs is in the target area may include: obtaining the geographic location of the terminal on which the client runs, and judging whether that geographic location is within the target area; if so, determining that the terminal is in the target area. The terminal may or may not have a geographic location detection module. When the terminal has a geographic location detection module, the step of obtaining the geographic location of the terminal may include: directly obtaining the geographic location of the terminal. When the terminal does not have a geographic location detection module, the step of obtaining the geographic location of the terminal may include: receiving a geographic location sent by an external geographic location detection module and determining it as the geographic location of the terminal. In one implementation, the geographic location detection module may be a GPS (Global Positioning System) module or the like.
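The in-area judgment above can be sketched as a point-in-rectangle check. This is only a minimal illustration under the assumption of a rectangular target area given as a bounding box; the function name and tuple layout are hypothetical, and a real target area could be any region.

```python
def in_target_area(lon, lat, area):
    """Return True if the terminal's (longitude, latitude) lies inside a
    rectangular target area given as a (west, south, east, north) tuple
    in degrees. The rectangle shape is an assumption for illustration."""
    west, south, east, north = area
    return west <= lon <= east and south <= lat <= north

# The West Lake-style rectangle used later in the description.
area = (120.10, 30.00, 120.30, 30.10)
print(in_target_area(120.29, 30.05, area))  # inside  -> True
print(in_target_area(119.90, 30.05, area))  # outside -> False
```

The client would feed each newly obtained geographic location through such a check before deciding whether to start timing.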
The client being in the foreground running state may be understood as: the client is running and its interface is at the front of the terminal screen. A non-foreground running state is also called a background running state; the client being in a non-foreground running state may be understood as: the client is not running, or the client is running but its interface is not at the front of the terminal screen.
The foreground running state of the client may also be understood as a standby state, i.e., the terminal is ready to capture images: programs related to image capture may be running, while programs unrelated to image capture may be not running. Programs related to image capture may include the program corresponding to the geographic location detection module, and so on. The background running state of the client may be understood as a state in which the terminal is not ready to capture images, including the terminal being in a powered-off or power-saving state, or the lens cap covering the lens. In general, all functions of the device are turned off in the powered-off state; in the power-saving state, no component works except the remote wake-up component.
For example, when the terminal is a camera, the client in the camera may be the camera's photographing application. When the camera's client is in the foreground running state, the camera is in standby, the lens cap is open, and the GPS module in the camera is running. When the camera is powered off or in a power-saving state, or the lens cap is covering the lens, the camera's client is considered to be in the background running state.
The application scenarios of the embodiments of the present application are illustrated below by examples.
One specific application scenario is: the terminal on which the client currently runs is in the target area, the client is in a non-foreground running state, and after receiving a state switching instruction, the client switches from the non-foreground running state to the foreground running state. At this point, the client can start judging whether the terminal is in the target area.
Another specific application scenario is: the terminal on which the client currently runs is not in the target area, the client is in the foreground running state, and the user carries the terminal into the target area, with the client remaining in the foreground running state throughout. In this case, once the terminal is in the target area, the client can detect that the terminal is in the target area.
The present application is illustrated only with the above scenarios as examples; in practice, the application scenarios of the embodiments of the present application are not limited thereto.
The target area may be a default, for example preset inside the terminal when the device leaves the factory, or it may be determined on the fly, for example according to user input after the device is manufactured. As a specific implementation, before the step of judging whether the terminal on which the client runs is in the target area, the method of this embodiment may further include: receiving a target area input by the user. The target area may be one region or at least two regions; this is not specifically limited in the embodiments of the present application. When the target area is input by the user, it can better match the user's needs.
Step S102: start timing for the period during which the terminal is in the target area, and detect whether the timed duration reaches the target duration threshold corresponding to the target area; if so, execute step S103; otherwise, take no action.
Here, starting timing for the period during which the terminal is in the target area may be understood as starting timing while the terminal is in the target area.
As a specific implementation, the judgment in step S101 of whether the terminal is in the target area may be performed in real time, and timing starts as soon as the judgment result is positive. During timing, if it is judged that the terminal has left the target area, timing stops, and it is judged whether the timed duration has reached the target duration threshold. If, from the start of timing until the timed duration reaches the target duration threshold, it is never judged that the terminal has left the target area, the detection result can be determined as "reached".
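The dwell-timing logic of step S102 can be sketched as a small state machine fed by repeated in-area judgments. The class and method names are illustrative assumptions, not the patent's implementation.

```python
import time

class DwellTimer:
    """Sketch of step S102: time how long the terminal stays in the
    target area, resetting whenever the terminal leaves it."""

    def __init__(self, threshold_s):
        self.threshold_s = threshold_s
        self.entered_at = None  # None means "not currently in the area"

    def update(self, in_area, now=None):
        """Feed one position-check result; return True once the terminal
        has remained in the area for at least threshold_s seconds."""
        now = time.monotonic() if now is None else now
        if not in_area:
            self.entered_at = None        # left the area: reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now         # just entered: start timing
        return now - self.entered_at >= self.threshold_s

timer = DwellTimer(threshold_s=30)
print(timer.update(True, now=0))   # enters the area -> False (0 s elapsed)
print(timer.update(True, now=31))  # still inside after 31 s -> True
```

A real client would call `update` from its real-time position-judgment loop and trigger step S103 on the first `True`.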
The target duration threshold may be a default or may be determined on the fly. As a specific implementation, before the step of judging whether the terminal on which the client runs is in the target area, the method of this embodiment may further include: receiving a target duration threshold, input by the user, corresponding to the target area. When the target duration threshold is input by the user, it can better match the user's needs. In addition, the target duration threshold may be information the client receives together with the target area.
It can be understood that when the judgment results of both step S101 and step S102 are positive, the current moment can be considered a suitable moment to start capturing images, so capture can start.
Step S103: capture images.
When capturing images, the client may take photos or record video; this embodiment does not specifically limit the client's image capture operation. Specifically, the client capturing images may include: capturing images according to a target capture mode corresponding to the target area. The target capture mode may be a default or may be determined on the fly. Different target areas may correspond to different target capture modes.
As a specific implementation, before the step of judging whether the terminal on which the client runs is in the target area, the method of this embodiment may further include: receiving a target capture mode, input by the user, corresponding to the target area. When the target capture mode is input by the user, it can better match the user's needs. In addition, the target capture mode may be information the client receives together with the target area and/or the target duration threshold.
Specifically, the target capture mode can be understood as a working mode with particular content. The target capture mode may include information such as: photographing or video recording; interval snapshot or continuous snapshot; interval video recording or continuous video recording; and how long photographing or video recording lasts. Both interval snapshot and continuous snapshot can be understood as taking photos at a preset first time interval; the difference is that interval snapshot takes photos at a long interval, for example several minutes or more, while continuous snapshot takes photos at a short interval, for example several seconds or less. Interval video recording can be understood as recording video at a preset second time interval; continuous video recording can be understood as recording without interruption. Of course, the target capture mode may also include information limiting the duration of the current capture session; for example, the target capture mode may be interval snapshot for 3 minutes, or continuous video recording for 5 minutes.
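The interval/continuous snapshot modes and the session-length limit described above can be sketched as a schedule of shot times; the function name and parameters are illustrative, not an API from the patent.

```python
def shot_times(interval_s, session_s):
    """Times (seconds from session start) at which a photo would be taken
    when shooting at a fixed interval for a limited session.

    An "interval snapshot" mode would use a long interval (minutes) and a
    "continuous snapshot" mode a short one (seconds); both are the same
    schedule with different parameters."""
    times = []
    t = 0.0
    while t < session_s:
        times.append(t)
        t += interval_s
    return times

# Continuous snapshot: one photo every 5 s during a 20 s session.
print(shot_times(5, 20))    # [0.0, 5.0, 10.0, 15.0]
# Interval snapshot for 3 minutes at a (hypothetical) 60 s interval.
print(shot_times(60, 180))  # [0.0, 60.0, 120.0]
```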
After images are captured, the method may further include: storing the captured images in a storage space corresponding to the target capture mode. Specifically, different target capture modes may correspond to different storage spaces. For example, when the target capture mode is photographing, the amount of captured image data is small, and the storage space may be the terminal's storage card; when the target capture mode is video recording, the amount of captured image data is large, and the storage space may be cloud storage space on a cloud server. Storing data in different ways according to different capture modes makes the use of storage resources more reasonable.
The storage space corresponding to the target capture mode may be a default or may be determined on the fly. As a specific implementation, before the step of judging whether the terminal on which the client runs is in the target area, the method of this embodiment may further include: receiving a storage space, input by the user, corresponding to the target capture mode. When the storage space is input by the user, the storage method can better match the user's needs. In addition, the storage space may be information the client receives together with the target area and/or the target duration threshold and/or the target capture mode.
As an example, suppose a user plans to capture images with a smartphone while touring the West Lake in Hangzhou. In order not to disturb the excitement of enjoying the scenery while still recording it, the user sets the target areas, the corresponding target duration thresholds, and other information on the phone before arriving at the West Lake. The target areas and the corresponding target duration thresholds, target capture modes, and storage spaces received by the client are shown in Table 1.
Table 1
Target area | Target duration threshold | Target capture mode | Storage space |
Scenic spot 1: Autumn Moon over the Calm Lake | 20 seconds | Interval snapshot | Storage card |
Scenic spot 2: Three Pools Mirroring the Moon | 30 seconds | Continuous video recording | Cloud storage |
Scenic spot 3: Leifeng Pagoda in Evening Glow | 30 seconds | Continuous snapshot + video recording | Storage card and cloud storage |
Here, for the target duration threshold, the client may prompt the user to set it according to the popularity of the target area. According to the settings in Table 1, when the client is in the foreground running state, it judges in real time whether the geographic location of its terminal is in a target area. When it judges that the terminal is within the area of Three Pools Mirroring the Moon, timing starts; when the timed duration reaches 30 seconds and the terminal has remained within that area throughout the 30 seconds, video recording starts. After recording ends, the video data is sent to the cloud storage space so that the cloud storage space stores it. The handling for the other two scenic spots is similar and is not repeated here.
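Per-area settings like those of Table 1 can be sketched as a lookup table keyed by target area. The dictionary layout, key names, and mode identifiers below are illustrative assumptions, not a data format from the patent.

```python
# Hypothetical encoding of the Table 1 settings.
AREA_CONFIG = {
    "Autumn Moon over the Calm Lake": {
        "threshold_s": 20, "mode": "interval_snapshot", "storage": "storage_card"},
    "Three Pools Mirroring the Moon": {
        "threshold_s": 30, "mode": "continuous_video", "storage": "cloud"},
    "Leifeng Pagoda in Evening Glow": {
        "threshold_s": 30, "mode": "snapshot_plus_video", "storage": "card_and_cloud"},
}

def settings_for(area):
    """Look up the duration threshold, capture mode, and storage space
    the client would apply once the terminal dwells in `area`."""
    return AREA_CONFIG[area]

cfg = settings_for("Three Pools Mirroring the Moon")
print(cfg["threshold_s"], cfg["mode"], cfg["storage"])  # 30 continuous_video cloud
```

On entering an area, the client would read the threshold to parameterize step S102 and the mode and storage space to parameterize step S103.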
As a specific implementation, after the step of capturing images, the method may further include: stopping capturing images when it is detected that the terminal has left the target area.
As can be seen from the above, with the image capture method and terminal provided by this embodiment, when the client is in a foreground running state, it can be judged whether the terminal on which the client runs is in a target area; if so, timing is started for the period during which the terminal is in the target area, and it is detected whether the timed duration reaches the target duration threshold corresponding to the target area; if so, image capture starts. That is, with the solution provided by the embodiments of the present application, the moment to start capturing images is determined when the terminal is in the target area and the timed duration reaches the threshold, and capture then starts, without the user manually inputting an image capture instruction; user experience can therefore be improved.
As one application scenario, the solution provided by this embodiment can be applied to a drone. A drone is an unmanned aerial vehicle operated by radio remote control equipment or by its own pre-programmed control device. In general, a drone can be used for video recording or photographing.
In prior-art applications, when a drone is used for recording or photographing, one approach is to keep the camera on the drone always on, so that when the drone arrives at its destination the target area can be recorded or photographed; another approach is to manually control the camera on the drone, i.e., when the drone flies to the target area, the operator remotely turns on the camera to start recording or photographing. However, the first approach obviously shoots a large amount of useless video or photos, increasing useless data volume, and keeping the camera in the shooting state also increases power consumption; the second approach requires a person to monitor in real time whether the drone has reached the target area, which is labor-intensive, and the user experience is not good.
With the image capture method provided by this embodiment, the drone can start capturing images when it judges that it is in the target area and the timed duration reaches the threshold. This embodiment does not require the camera on the drone to stay in the shooting state, so data volume and power consumption can be reduced; meanwhile, no one needs to monitor whether the drone has reached the target area, which saves manpower and can improve user experience.
In an implementation based on the embodiment shown in Fig. 1, after it is detected that the timed duration reaches the target duration threshold, i.e., when the judgment result of step S102 is "reached", the method may further include the process of judging the terminal's relative position and geographic orientation shown in Fig. 2, specifically including the following steps S201 to S203.
Step S201: determine a first relative position of the terminal in the target area.
Specifically, determining the first relative position of the terminal in the target area may include: obtaining a first position of the terminal, and determining the relative position of the first position in the target area as the first relative position. Here, the first position can be understood as the absolute position of the terminal, i.e., the terminal's coordinates on the ground, which may be expressed in longitude and latitude. The target area may be a region expressed in absolute positions. As a specific implementation, when determining the first relative position of the first position in the target area, the target area may be normalized in the longitudinal and transverse directions respectively, and the location of the first position in the normalized target area is determined as the first relative position.
As an example, in Fig. 3a, the boxed region is the target area; the coordinates of its top-left and bottom-right vertices are (120.10° E, 30.10° N) and (120.30° E, 30.00° N) respectively, which are absolute positions. Here, E denotes east longitude and N denotes north latitude. The first position of the terminal is point A, with coordinates (120.29° E, 30.05° N), also an absolute position. The coordinates of the target area are then normalized, i.e., the top-left vertex of the target area is mapped to (0, 0) and the bottom-right vertex to (1, 1). The first relative position (x, y) of point A in the target area can then be calculated from the above information as: x = (120.29 − 120.1)/(120.3 − 120.1), y = (30.1 − 30.05)/(30.1 − 30.0); that is, (x, y) is (0.95, 0.5), which is a relative position.
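The normalization above can be written out directly; the function name and argument order are illustrative assumptions.

```python
def relative_position(lon, lat, top_left, bottom_right):
    """Normalize an absolute (lon, lat) into the target area's unit square,
    mapping the top-left vertex to (0, 0) and the bottom-right to (1, 1):
    x grows eastward with longitude, y grows southward as latitude drops."""
    tl_lon, tl_lat = top_left       # (120.10, 30.10) in the example
    br_lon, br_lat = bottom_right   # (120.30, 30.00)
    x = (lon - tl_lon) / (br_lon - tl_lon)
    y = (tl_lat - lat) / (tl_lat - br_lat)
    return x, y

x, y = relative_position(120.29, 30.05, (120.10, 30.10), (120.30, 30.00))
print(round(x, 2), round(y, 2))  # 0.95 0.5, matching point A above
```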
Of course, as a specific implementation, the first relative position may also be expressed as a relatively abstract location, for example: in the middle, at the left edge, at the right edge, at the upper edge, or at the lower edge. That is, the target area may be divided as shown in Fig. 3b into the middle, the left edge, the right edge, the upper edge, and the lower edge. Continuing the example above, after the coordinates (0.95, 0.5) are obtained, the first relative position can be determined to be the right edge according to the preset range of each region. Here, the ranges of the regions (each expressed by its top-left and bottom-right vertices) are: middle [(0.1, 0.1), (0.90, 0.90)], right edge [(0.90, 0), (1, 0.90)], left edge [(0, 0), (0.10, 1)], upper edge [(0.10, 0), (0.90, 0.10)], and lower edge [(0.10, 0.90), (1, 1)].
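The region ranges above translate into a straightforward lookup; the list ordering and the `classify` name are illustrative.

```python
# Ranges from the description, each as (name, top-left, bottom-right) in
# normalized coordinates; checked in order, first match wins.
REGIONS = [
    ("middle",     (0.10, 0.10), (0.90, 0.90)),
    ("right edge", (0.90, 0.00), (1.00, 0.90)),
    ("left edge",  (0.00, 0.00), (0.10, 1.00)),
    ("upper edge", (0.10, 0.00), (0.90, 0.10)),
    ("lower edge", (0.10, 0.90), (1.00, 1.00)),
]

def classify(x, y):
    """Map a normalized position to one of the coarse regions of Fig. 3b."""
    for name, (x0, y0), (x1, y1) in REGIONS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "unknown"

print(classify(0.95, 0.5))  # -> right edge, as in the example above
print(classify(0.5, 0.5))   # -> middle
```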
Step S202: obtain a first geographic orientation toward which the terminal points.
Here, the first geographic orientation may be east, south, west, north, or N degrees south of due east, N degrees north of due east, N degrees south of due west, N degrees north of due west, and so on. N may take a value in the range of 0° to 90°. The first geographic orientation can be understood as the direction of the lens of the image capture module in the terminal. When the client is in the foreground running state, the picture shown on its interface is part of the actual scene in the direction toward which the terminal points.
Specifically, the first geographic orientation may be determined by an orientation detection module. The terminal may or may not have an orientation detection module. When the terminal has an orientation detection module, obtaining the first geographic orientation toward which the terminal points may include: directly obtaining the first geographic orientation toward which the terminal points. When the terminal does not have an orientation detection module, obtaining the first geographic orientation toward which the terminal points may include: receiving, from an external orientation detection module, the first geographic orientation toward which the terminal points. Here, the orientation detection module may be a device such as an electronic compass.
Step S203: Judge whether the correspondence between the first relative position and the first geographic orientation satisfies the preset correspondence between relative position and geographic orientation. If it is satisfied, execute step S103, that is, acquire an image; if not, do nothing further.
Specifically, when the preset geographic orientation is expressed as an angular range, the step of judging whether the correspondence between the first relative position and the first geographic orientation satisfies the preset correspondence between relative position and geographic orientation may include: determining, from the preset correspondence between relative positions and geographic orientations, the target geographic orientation corresponding to the first relative position, and judging whether the first geographic orientation falls within the range of the target geographic orientation; if it does, determining that the correspondence between the first relative position and the first geographic orientation satisfies the preset correspondence. As an example, the preset correspondence between relative positions and geographic orientations can be as shown in Table 2.
Table 2
Relative position | Geographic orientation |
Upper edge | West by south α to east by south α (a range of less than 180°) |
Lower edge | West by north α to east by north α (a range of less than 180°) |
Left edge | North by east α to south by east α (a range of less than 180°) |
Right edge | North by west α to south by west α (a range of less than 180°) |
Middle area | Any angle |
The angle α can take a value of less than 90°, for example a value in the range of 0° to 10°. The geographic orientations in Table 2 follow the map convention of north at the top, south at the bottom, west on the left, and east on the right.
For example, suppose α is 10°, and the first relative position and the first geographic orientation are the upper edge and west by north 30°, respectively. West by north 30° does not fall within the sub-180° range from west by south 10° to east by south 10°, so it can be determined that the correspondence between the first relative position and the first geographic orientation does not satisfy the preset correspondence between relative position and geographic orientation. If the first relative position and the first geographic orientation are the upper edge and west by south 30°, respectively, then west by south 30° does fall within the sub-180° range from west by south 10° to east by south 10°, so it can be determined that the correspondence between the first relative position and the first geographic orientation satisfies the preset correspondence between relative position and geographic orientation.
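One way to implement the Table 2 check is to represent each orientation as a compass bearing in degrees clockwise from north and test membership in an arc. This is an assumed encoding for illustration only; all names are hypothetical, and α is fixed at 10° as in the example.

```python
# Check whether the terminal's bearing falls inside the arc that Table 2
# assigns to its relative position. Bearings: north = 0, east = 90,
# south = 180, west = 270 (degrees, clockwise).

ALPHA = 10.0  # the angle alpha from Table 2, chosen here as 10 degrees

# Arc assigned to each region as (start, end) bearings; the arc runs
# clockwise from start to end. None means any angle is acceptable.
ARCS = {
    "upper edge": (90.0 + ALPHA, 270.0 - ALPHA),   # east by south a -> west by south a
    "lower edge": (270.0 + ALPHA, 90.0 - ALPHA),   # west by north a -> east by north a
    "left edge":  (ALPHA, 180.0 - ALPHA),          # north by east a -> south by east a
    "right edge": (180.0 + ALPHA, 360.0 - ALPHA),  # south by west a -> north by west a
    "middle":     None,
}

def bearing_in_arc(bearing, arc):
    """True if bearing (degrees) lies on the clockwise arc start -> end."""
    if arc is None:
        return True
    start, end = arc
    b = bearing % 360.0
    if start <= end:
        return start <= b <= end
    return b >= start or b <= end  # arc wraps past north (0 degrees)

def orientation_matches(relative_position, bearing):
    return bearing_in_arc(bearing, ARCS[relative_position])

# West by south 30 degrees is bearing 240; west by north 30 degrees is 300.
print(orientation_matches("upper edge", 240.0))  # → True
print(orientation_matches("upper edge", 300.0))  # → False
```

The two calls reproduce the worked example: at the upper edge, west by south 30° satisfies the preset correspondence while west by north 30° does not.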
It should be noted that when the terminal is in an edge region such as the upper edge, the lower edge, the left edge, or the right edge, and the geographic orientation the terminal faces is as shown in Table 2, the terminal can be considered to be facing the middle of the target area; the user is then likely to want to acquire an image, so image acquisition starts. Conversely, when the terminal is in such an edge region but the geographic orientation it faces is outside the ranges shown in Table 2, the terminal can be considered to be facing away from the middle of the target area; the user is then unlikely to want to acquire an image, so no image is acquired. As an example, in Fig. 3b, if the user is at point A and the terminal faces right, the user is not facing the target area and is unlikely to be taking a picture, so no image need be acquired. If the user is at point A and the terminal faces left, the user is facing the target area and is likely to be taking a picture, so an image can be acquired. When the terminal is in the middle area, the user may want to acquire an image regardless of the direction the terminal faces, so in that case the direction of the terminal is not restricted.
In summary, in the technical solution provided by this embodiment, before an image is acquired, it is judged whether the relative position of the terminal in the target area and the geographic orientation the terminal faces satisfy a preset condition, and an image is acquired only if the condition is satisfied. This more accurately determines the user's photo opportunity and improves the accuracy of image acquisition by the client.
It can be understood that when the client takes a picture, if the terminal where the client is located is stationary, a higher-quality image can be captured, whereas if the terminal is moving, the captured image is of lower quality and may even be blurred. Therefore, to improve the quality of the acquired image, in the embodiment shown in Fig. 1, one implementation of acquiring an image in step S103 includes: acquiring an image according to the target acquisition mode corresponding to the target area. Further, this step may specifically include the following steps 1 to 4:
Step 1: Obtain the target acquisition mode corresponding to the target area.
Specifically, the target acquisition mode can be preset, or can be determined by receiving a target acquisition mode corresponding to the target area that is input by the user. The target acquisition mode may include a photographing mode, a recording mode, and so on.
Step 2: Obtain the current target motion state of the terminal.
Specifically, the terminal can include a motion state detection module, and the step of obtaining the terminal's current target motion state may include: obtaining the terminal's current kinematic parameter detected by the motion state detection module, and determining the terminal's current target motion state according to the kinematic parameter. The motion state detection module can be an acceleration detection device and/or an angular acceleration detection device, and the kinematic parameter may include an acceleration value and/or an angular acceleration value. As a specific implementation, the target motion state may include a stationary state and a non-stationary state, and the step of determining the terminal's current target motion state according to the kinematic parameter may include: judging whether the kinematic parameter is less than a preset parameter threshold; if it is, determining that the terminal's current target motion state is the stationary state; if not, determining that the terminal's current target motion state is the non-stationary state.
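The threshold test in step 2 can be sketched as below. This is an illustrative sketch; the function name and the threshold values are assumptions, not values from the patent.

```python
# Classify the terminal as stationary or non-stationary by comparing its
# kinematic parameters against preset parameter thresholds.

ACCEL_THRESHOLD = 0.5       # m/s^2, hypothetical preset parameter threshold
ANG_ACCEL_THRESHOLD = 0.2   # rad/s^2, hypothetical preset parameter threshold

def motion_state(acceleration, angular_acceleration):
    """Return 'stationary' only if every kinematic parameter is below its threshold."""
    if acceleration < ACCEL_THRESHOLD and angular_acceleration < ANG_ACCEL_THRESHOLD:
        return "stationary"
    return "non-stationary"

print(motion_state(0.1, 0.05))  # → stationary
print(motion_state(2.3, 0.05))  # → non-stationary
```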
Step 3: Judge whether the correspondence between the target acquisition mode and the target motion state satisfies the preset correspondence between acquisition mode and motion state; if so, execute step 4; otherwise, do nothing further.
Step 4: Acquire an image.
Specifically, the step of judging whether the correspondence between the target acquisition mode and the target motion state satisfies the preset correspondence between acquisition mode and motion state may include: determining, from the preset correspondence between acquisition modes and motion states, the first motion state corresponding to the target acquisition mode, and judging whether the first motion state matches the target motion state; if it does, determining that the correspondence between the target acquisition mode and the target motion state satisfies the preset correspondence between acquisition mode and motion state.
It can be understood that the photographing mode is more demanding of the motion state: an image taken while the terminal is stationary is of higher quality. The recording mode is less demanding and need not restrict the terminal's motion state. As a specific implementation, the preset correspondence between acquisition modes and motion states can be as shown in Table 3.
Table 3
Acquisition mode | Motion state |
Photographing mode | Stationary state |
Recording mode | Any state |
For example, if the target acquisition mode is the photographing mode and the target motion state is the stationary state, it can be determined from the correspondence in Table 3 that the correspondence between the target acquisition mode and the target motion state satisfies the preset correspondence between acquisition mode and motion state. If the target acquisition mode is the recording mode and the target motion state is a moving state, it can likewise be determined from Table 3 that the preset correspondence is satisfied. If the target acquisition mode is the photographing mode and the target motion state is a moving state, it can be determined from Table 3 that the correspondence between the target acquisition mode and the target motion state does not satisfy the preset correspondence between acquisition mode and motion state.
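One possible encoding of the Table 3 lookup is sketched below. This is an assumption about how the correspondence could be stored, not the patent's implementation; the mode and state labels are hypothetical.

```python
# Map each acquisition mode to the motion state it requires; None means
# any motion state is acceptable (Table 3).

REQUIRED_STATE = {
    "photographing": "stationary",  # photos require a stationary terminal
    "recording": None,              # video recording accepts any state
}

def mode_matches_state(acquisition_mode, target_motion_state):
    """True if the target motion state satisfies the mode's requirement."""
    required = REQUIRED_STATE[acquisition_mode]
    return required is None or required == target_motion_state

print(mode_matches_state("photographing", "stationary"))      # → True
print(mode_matches_state("recording", "non-stationary"))      # → True
print(mode_matches_state("photographing", "non-stationary"))  # → False
```

The three calls reproduce the three worked examples above.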
A specific application scenario: the user is in the target area, shooting images with a handheld camera. After one shot, the user turns the camera lens toward another direction to continue shooting. While the user is turning, the camera is in motion, so the client in the camera does not acquire images; when the camera comes to rest, image acquisition starts. This avoids capturing blurred pictures while the camera is rotating.
In summary, in the solution provided by this embodiment, before an image is acquired, it is judged whether the correspondence between the acquisition mode and the motion state satisfies the preset correspondence, and the image is acquired only if it does. In this way, the quality of the acquired images can be improved as much as possible.
In another embodiment based on the embodiment shown in Fig. 1, the step of judging whether the terminal is in the target area in step S101 can be carried out according to the flow diagram shown in Fig. 4, and specifically includes steps S101A and S101B:
Step S101A: Receive the position of the terminal sent by a positioning device.
In this embodiment, the terminal may have no geographic-location detection module of its own and may instead perform geographic positioning through a positioning device with which it communicates. Specifically, both the positioning device and the terminal are portable devices carried by the user. The positioning device first detects its current position and sends the detected position to the terminal as the terminal's position, and the terminal receives this position from the positioning device. As an example, the terminal can be a device such as an ordinary camera or a smart wearable camera, and the positioning device can be a device such as a smartphone or a tablet computer. In one application scenario, the user carries both a camera and a mobile phone; the phone generally includes a geographic-positioning module, while the camera may have none. Through communication between the camera and the phone, the phone's detected current location can be sent to the camera and used as the camera's own position.
Specifically, an APP (Application, a third-party application program) that interacts with the terminal can be installed on the positioning device; the APP sends the detected position to the terminal as the terminal's position, and the terminal receives the position sent by the positioning device through the APP.
Step S101B: Judge, according to the received position, whether the terminal is in the target area.
Specifically, judging whether the terminal is in the target area includes: judging whether the received position lies within the target area, and if so, determining that the terminal is in the target area. When the terminal receives the position sent by the positioning device, it treats that position as its own position and accordingly judges whether it is in the target area.
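Step S101B can be sketched as a simple containment test. This is an illustrative sketch that assumes the target area is a latitude/longitude rectangle; the class name and example bounds are assumptions for illustration.

```python
# Judge whether the position received from the positioning device lies
# inside the target area, treating the area as a lat/lon rectangle.
from dataclasses import dataclass

@dataclass
class TargetArea:
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat, lon):
        """True if the received position lies within this target area."""
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

area = TargetArea(30.0, 30.1, 120.1, 120.3)  # example bounds from the text
print(area.contains(30.05, 120.29))  # → True  (terminal is in the area)
print(area.contains(29.90, 120.29))  # → False (terminal is outside)
```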
In summary, in the solution provided by this embodiment, the terminal judges whether it is in the target area according to the position sent by the positioning device. That is, the terminal need not have a geographic-positioning module of its own, and can determine its own position by receiving location information from another device.
Fig. 5 is a structural schematic diagram of an image capturing system provided by an embodiment of the present application. The system includes a terminal 501, a positioning device 502, and a cloud server 503.
The terminal 501 is configured to: when the client of the terminal 501 is in the foreground running state, receive the position of the terminal sent by the positioning device 502 and judge, according to the received position, whether the terminal 501 is in the target area; if it is, start timing for the terminal 501 being in the target area and detect whether the timed duration reaches the target duration threshold corresponding to the target area; and if it does, acquire an image and send the acquired image to the cloud server 503.
The positioning device 502 is configured to detect the position of the terminal 501 and send the detected position to the terminal 501.
The cloud server 503 is configured to receive the image acquired by the terminal 501 and store the image.
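The terminal-side behavior in the Fig. 5 system can be sketched as a loop over incoming positions. This is a hypothetical composition for illustration only: the function names, the injected clock, and the callback interfaces are all assumptions, and uploading to the cloud server is reduced to a callback.

```python
# Terminal loop: receive positions, start timing once inside the target
# area, and acquire images after the area's duration threshold is reached.
import time

def run_terminal(positions, in_target_area, duration_threshold, acquire,
                 clock=time.monotonic):
    """positions: iterable of (lat, lon); acquire: callback capturing one image."""
    entered_at = None
    for lat, lon in positions:
        if not in_target_area(lat, lon):
            entered_at = None          # left the area: reset the timer
            continue
        if entered_at is None:
            entered_at = clock()       # just entered: start timing
        if clock() - entered_at >= duration_threshold:
            acquire()                  # threshold reached: capture an image

# Usage with a fake clock so the sketch runs instantly:
ticks = iter(range(100))
captured = []
run_terminal(
    positions=[(30.05, 120.29)] * 5,   # five position samples inside the area
    in_target_area=lambda lat, lon: 30.0 <= lat <= 30.1 and 120.1 <= lon <= 120.3,
    duration_threshold=3,              # "seconds" on the fake clock
    acquire=lambda: captured.append("image"),
    clock=lambda: next(ticks),
)
print(len(captured))  # → 3
```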
Since the above system embodiment is obtained on the basis of the method embodiment, it has the same technical effects, which are therefore not repeated here. Because the system embodiment is substantially similar to the method embodiment, its description is relatively brief; for relevant details, refer to the corresponding description of the method embodiment.
Fig. 6 is a structural schematic diagram of a terminal provided by an embodiment of the present application, corresponding to the method embodiment shown in Fig. 1. The terminal includes a processor 601 and an image collector 602.
The processor 601 is configured to: when the client of the terminal is in the foreground running state, obtain the position of the terminal acquired by a locator and judge, according to the obtained position, whether the terminal is in the target area; if it is, start timing for the terminal being in the target area and detect whether the timed duration reaches the target duration threshold corresponding to the target area; and if it does, send an image capture instruction to the image collector 602.
The image collector 602 is configured to acquire an image after receiving the image capture instruction sent by the processor 601.
The locator is used to acquire the position of the terminal and send the acquired position to the terminal. The terminal may or may not include the locator. As a specific implementation, the locator can be a GPS device or a BeiDou device, and the image collector can be a device such as a camera.
In one specific implementation of the embodiment shown in Fig. 6, the processor 601 can be further configured to: after detecting that the timed duration reaches the target duration threshold, determine the first relative position of the terminal in the target area; obtain the first geographic orientation, detected by a position detector, of the direction the terminal faces; and judge whether the correspondence between the first relative position and the first geographic orientation satisfies the preset correspondence between relative position and geographic orientation, and if so, send an image capture instruction to the image collector 602.
The terminal may or may not include the position detector. As a specific implementation, the position detector can be a device such as an electronic compass.
In one specific implementation of the embodiment shown in Fig. 6, the processor 601 is specifically configured to determine the target acquisition mode corresponding to the target area, generate an image capture instruction according to the target acquisition mode, and send the image capture instruction to the image collector 602; the image collector 602 is specifically configured to receive the image capture instruction sent by the processor 601 and acquire an image according to the instruction.
In one specific implementation of the embodiment shown in Fig. 6, the terminal can further include a motion sensor (not shown in the figure).
The motion sensor is configured to detect the terminal's current target motion state and send the detected target motion state to the processor 601.
The processor 601 is further configured to: obtain the target acquisition mode corresponding to the target area; obtain the target motion state sent by the motion sensor; and judge whether the correspondence between the target acquisition mode and the target motion state satisfies the preset correspondence between acquisition mode and motion state, and if so, send an image capture instruction to the image collector 602.
The motion sensor can be an acceleration detection device or an angular acceleration detection device.
In one specific implementation of the embodiment shown in Fig. 6, the image collector 602 is further configured to store the acquired image to a memory corresponding to the target acquisition mode.
The terminal may or may not include this memory; when it does not, the memory can be cloud storage space on a cloud server.
In one specific implementation of the embodiment shown in Fig. 6, the processor 601 can be further configured to, before judging whether the terminal is in the target area: receive a target area input by the user; and/or receive a target duration threshold, input by the user, corresponding to the target area; and/or receive a target acquisition mode, input by the user, corresponding to the target area; and/or receive a memory, input by the user, corresponding to the target acquisition mode.
In one specific implementation of the embodiment shown in Fig. 6, the processor 601 is further configured to, after sending the image capture instruction to the image collector 602, send a stop-acquisition instruction to the image collector 602 when it detects that the terminal has left the target area.
In one specific implementation of the embodiment shown in Fig. 6, the processor 601 is specifically configured to receive the position of the terminal sent by the locator and judge, according to the received position, whether the terminal is in the target area.
As can be seen from the above, in the terminal provided by this embodiment, when the client of the terminal is in the foreground running state, the processor obtains the position of the terminal acquired by the locator and judges, according to the obtained position, whether the terminal is in the target area; if it is, the processor starts timing for the terminal being in the target area and detects whether the timed duration reaches the target duration threshold corresponding to the target area; if it does, the processor sends an image capture instruction to the image collector, which starts to acquire images after receiving the instruction. That is, with the solution provided by this embodiment, the moment to start acquiring images is determined when the terminal is in the target area and the timed duration reaches the threshold, and acquisition then starts without the user manually inputting an image acquisition instruction, so the user experience can be improved.
As the terminal embodiment is substantially similar to the method embodiment, its description is relatively brief; for relevant details, refer to the corresponding description of the method embodiment.
It should be noted that, herein, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
The embodiments in this specification are described in a related, progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others.
The above are merely preferred embodiments of the present application and are not intended to limit its scope of protection. Any modification, equivalent substitution, improvement, and the like made within the spirit and principles of this application shall fall within its scope of protection.
Claims (16)
1. An image acquisition method, characterized in that it is applied to a client, the method comprising:
when the client is in a foreground running state, judging whether the terminal where the client is located is in a target area;
if it is, starting timing for the terminal being in the target area, and detecting whether the timed duration reaches a target duration threshold corresponding to the target area;
if it does, acquiring an image.
2. The method according to claim 1, characterized in that, after it is detected that the timed duration reaches the target duration threshold, the method further comprises:
determining a first relative position of the terminal in the target area;
obtaining a first geographic orientation of the direction the terminal faces;
judging whether the correspondence between the first relative position and the first geographic orientation satisfies a preset correspondence between relative position and geographic orientation;
if it is satisfied, executing the step of acquiring an image.
3. The method according to claim 2, characterized in that the step of acquiring an image comprises:
acquiring an image according to a target acquisition mode corresponding to the target area.
4. The method according to claim 3, characterized in that the step of acquiring an image according to a target acquisition mode corresponding to the target area comprises:
obtaining the target acquisition mode corresponding to the target area;
obtaining a current target motion state of the terminal;
judging whether the correspondence between the target acquisition mode and the target motion state satisfies a preset correspondence between acquisition mode and motion state;
if it does, acquiring an image.
5. The method according to claim 3, characterized in that the method further comprises:
storing the acquired image to a storage space corresponding to the target acquisition mode.
6. The method according to claim 5, characterized in that, before the step of judging whether the terminal where the client is located is in the target area, the method further comprises:
receiving a target area input by the user; and/or
receiving a target duration threshold, input by the user, corresponding to the target area; and/or
receiving a target acquisition mode, input by the user, corresponding to the target area; and/or
receiving a storage space, input by the user, corresponding to the target acquisition mode.
7. The method according to claim 3, characterized in that, after the step of acquiring an image, the method further comprises:
stopping image acquisition when it is detected that the terminal has left the target area.
8. The method according to claim 2, characterized in that the step of judging whether the terminal is in the target area comprises:
receiving the position of the terminal sent by a positioning device;
judging, according to the received position, whether the terminal is in the target area.
9. A terminal, characterized in that the terminal comprises a processor and an image collector;
the processor is configured to: when the client of the terminal is in a foreground running state, obtain the position of the terminal acquired by a locator and judge, according to the obtained position, whether the terminal is in a target area; when the terminal is in the target area, start timing for the terminal being in the target area and detect whether the timed duration reaches a target duration threshold corresponding to the target area; and when the timed duration reaches the target duration threshold corresponding to the target area, send an image capture instruction to the image collector;
the image collector is configured to acquire an image after receiving the image capture instruction sent by the processor.
10. The terminal according to claim 9, characterized in that
the processor is further configured to: after detecting that the timed duration reaches the target duration threshold, determine a first relative position of the terminal in the target area; obtain a first geographic orientation, detected by a position detector, of the direction the terminal faces; and judge whether the correspondence between the first relative position and the first geographic orientation satisfies a preset correspondence between relative position and geographic orientation, and if so, send an image capture instruction to the image collector.
11. The terminal according to claim 10, characterized in that
the processor is specifically configured to determine a target acquisition mode corresponding to the target area, generate an image capture instruction according to the target acquisition mode, and send the image capture instruction to the image collector;
the image collector is specifically configured to receive the image capture instruction sent by the processor and acquire an image according to the instruction.
12. The terminal according to claim 11, characterized in that the terminal further comprises a motion sensor;
the motion sensor is configured to detect the terminal's current target motion state and send the detected target motion state to the processor;
the processor is further configured to: obtain the target acquisition mode corresponding to the target area; obtain the target motion state sent by the motion sensor; and judge whether the correspondence between the target acquisition mode and the target motion state satisfies a preset correspondence between acquisition mode and motion state, and if so, send an image capture instruction to the image collector.
13. The terminal according to claim 11, characterized in that
the image collector is further configured to store the acquired image to a memory corresponding to the target acquisition mode.
14. The terminal according to claim 13, characterized in that the processor is further configured to, before judging whether the terminal is in the target area:
receive a target area input by the user; and/or
receive a target duration threshold, input by the user, corresponding to the target area; and/or
receive a target acquisition mode, input by the user, corresponding to the target area; and/or
receive a memory, input by the user, corresponding to the target acquisition mode.
15. The terminal according to claim 11, characterized in that
the processor is further configured to, after sending the image capture instruction to the image collector, send a stop-acquisition instruction to the image collector when it detects that the terminal has left the target area.
16. The terminal according to claim 10, characterized in that the processor is specifically configured to receive the position of the terminal sent by the locator and judge, according to the received position, whether the terminal is in the target area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710142557.2A CN108574801A (en) | 2017-03-10 | 2017-03-10 | A kind of image-pickup method and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108574801A true CN108574801A (en) | 2018-09-25 |
Family
ID=63577433
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710142557.2A Pending CN108574801A (en) | 2017-03-10 | 2017-03-10 | A kind of image-pickup method and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108574801A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110300136A (en) * | 2018-03-22 | 2019-10-01 | 杭州萤石软件有限公司 | A kind of cradle head control optimization method and system |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
JPH04324768A (en) * | 1991-04-24 | 1992-11-13 | Olympus Optical Co Ltd | Contrast information detection system and automatic focusing system
JP2005065021A (en) * | 2003-08-18 | 2005-03-10 | Honda Motor Co Ltd | Image photographing apparatus of mobile robot using portable terminal device, and method thereof
JP2005167504A (en) * | 2003-12-01 | 2005-06-23 | Canon Inc | System and method for photograph management
US20080107411A1 (en) * | 2006-11-07 | 2008-05-08 | Sony Ericsson Mobile Communications Ab | User defined autofocus area
US20100208123A1 (en) * | 2009-02-13 | 2010-08-19 | Fujitsu Limited | Photographing device, photographing method, and portable terminal apparatus
CN102141628A (en) * | 2009-12-22 | 2011-08-03 | Casio Computer Co., Ltd. | Positioning device and positioning method
CN102595017A (en) * | 2011-01-14 | 2012-07-18 | Konica Minolta Business Technologies, Inc. | Image processing system including portable terminal
CN103200366A (en) * | 2013-04-15 | 2013-07-10 | China United Network Communications Group Co., Ltd. | Long-distance shooting control method, network server and terminal equipment
CN103581528A (en) * | 2012-07-19 | 2014-02-12 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method for preprocessing in photographing process of mobile terminal and mobile terminal
CN103685944A (en) * | 2013-11-26 | 2014-03-26 | Yikang Intelligent Technology (Shanghai) Co., Ltd. | Positioning photography method for photography equipment
JP2014107750A (en) * | 2012-11-28 | 2014-06-09 | Olympus Imaging Corp | Photographing apparatus
CN104038684A (en) * | 2013-03-08 | 2014-09-10 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device
CN104853104A (en) * | 2015-06-01 | 2015-08-19 | Shenzhen Weidui Information Technology Co., Ltd. | Method and system for automatically tracking and shooting moving object
CN104902170A (en) * | 2015-05-19 | 2015-09-09 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Photo-taking method and terminal
CN105049720A (en) * | 2015-07-08 | 2015-11-11 | Chengdu Xike Technology Co., Ltd. | Method for automatically starting and stopping motion camera according to motion scene
CN106454104A (en) * | 2016-10-28 | 2017-02-22 | Vivo Mobile Communication Co., Ltd. | Photographing method and mobile terminal
Application Events
- 2017-03-10: Application CN201710142557.2A filed in China (CN); published as CN108574801A, status Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110300136A (en) * | 2018-03-22 | 2019-10-01 | Hangzhou Ezviz Software Co., Ltd. | A kind of cradle head control optimization method and system
CN110300136B (en) * | 2018-03-22 | 2021-12-24 | Hangzhou Ezviz Software Co., Ltd. | Cloud deck control optimization method and system
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104571532B (en) | A kind of method and device for realizing augmented reality or virtual reality | |
JP5659304B2 (en) | Image generating apparatus and image generating method | |
CN103907340B (en) | Video generation device and image generating method | |
JP5769813B2 (en) | Image generating apparatus and image generating method | |
JP5865388B2 (en) | Image generating apparatus and image generating method | |
CN105578027B (en) | A kind of photographic method and device | |
CN106550182A (en) | Shared unmanned plane viewing system | |
CN106406343B (en) | Control method, device and system of unmanned aerial vehicle | |
US10051180B1 (en) | Method and system for removing an obstructing object in a panoramic image | |
CN103442436A (en) | Indoor positioning terminal, network, system and method | |
CN107343165A (en) | A kind of monitoring method, equipment and system | |
CN109520500A (en) | One kind is based on the matched accurate positioning of terminal shooting image and streetscape library acquisition method | |
CN101872243B (en) | System and method for realizing 360-degree panoramic play following real space direction | |
CN107197153B (en) | Shooting method and shooting device for photo | |
EP2993894A1 (en) | Image capturing method, panorama image generating method and electronic apparatus | |
CN105827928A (en) | Focusing area selection method and focusing area selection device | |
CN105979140A (en) | Image generation device and image generation method | |
CN105678289A (en) | Control method and device of unmanned aerial vehicle | |
WO2011096343A1 (en) | Photographic location recommendation system, photographic location recommendation device, photographic location recommendation method, and program for photographic location recommendation | |
WO2012004622A1 (en) | An augmented reality method, and a corresponding system and software | |
CN110741625B (en) | Motion estimation method and photographic equipment | |
CN108574801A (en) | A kind of image-pickup method and terminal | |
CN107343142A (en) | The image pickup method and filming apparatus of a kind of photo | |
CN105653664B (en) | A kind of visual information processing method and system | |
CN109981973A (en) | Prevent the method, apparatus and storage medium of dangerous self-timer |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | Application publication date: 2018-09-25
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication |