CN109391762A - Tracking shooting method and apparatus - Google Patents
Tracking shooting method and apparatus
- Publication number
- CN109391762A (application CN201710656922.1A)
- Authority
- CN
- China
- Prior art keywords
- image
- monitoring target
- shooting
- target
- shooting direction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
        - H04N23/60—Control of cameras or camera modules
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N7/00—Television systems
        - H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The invention discloses a tracking shooting method and apparatus, belonging to the field of computer technology. The method includes: obtaining a first image captured by an image capture component; determining the position of a monitoring target in the first image; determining first shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the first image and a preset image reference position; and adjusting the shooting direction of the image capture component based on the first shooting direction adjustment information. With the present invention, the quality of tracking shooting can be improved.
Description
Technical field
The present disclosure relates to the field of computer technology, and in particular to a tracking shooting method and apparatus.
Background art
In daily life, there are often situations that call for tracking shooting, for example tracking and filming a traffic police officer at work, or tracking and filming wild animals to produce a documentary.
The common approach to tracking shooting uses a PTZ camera: an operator watches the target to be tracked and manually rotates or slides the pan-tilt head so that the tracked target stays within the camera's shooting range.
In the course of implementing the present invention, the inventors found at least the following problems in the prior art:
In the above process, the operator must rely entirely on observation and then manipulate the pan-tilt head to keep shooting the tracked target. This places high demands on the operator, and the operator may also make mistakes that let the tracked target move out of the shooting range, which lowers the quality of the tracking shooting.
Summary of the invention
To solve the problems in the prior art, embodiments of the present invention provide a tracking shooting method and apparatus. The technical solution is as follows:
According to a first aspect of the embodiments of the present disclosure, a tracking shooting method is provided. The method includes:
obtaining a first image captured by an image capture component;
determining the position of a monitoring target in the first image;
determining first shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the first image and a preset image reference position;
adjusting the shooting direction of the image capture component based on the first shooting direction adjustment information.
Optionally, determining the position of the monitoring target in the first image includes:
determining the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
Optionally, determining the position of the monitoring target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target includes:
determining the position of at least one target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target;
in the displayed first image, displaying a label for each target according to that target's position in the first image, and, when a selection instruction for a first target is received, determining the first target's position in the first image as the position of the monitoring target in the first image.
Optionally, the method further includes:
after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, determining the position of the monitoring target in the second image according to image feature information of the monitoring target, where the image feature information of the monitoring target is extracted from the frame preceding the second image;
determining second shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the second image and the preset image reference position;
adjusting the shooting direction of the image capture component based on the second shooting direction adjustment information.
Optionally, the method further includes:
calculating the shooting distance between the image capture component and the monitoring target according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height;
determining the target focal length corresponding to the calculated shooting distance according to a pre-stored correspondence between shooting distance and focal length;
adjusting the focal length of the image capture component to the target focal length.
Optionally, the image reference position is the image center position.
According to a second aspect, a tracking shooting apparatus is provided. The apparatus includes:
an obtaining module, configured to obtain a first image captured by an image capture component;
a determining module, configured to determine the position of a monitoring target in the first image, and further configured to determine first shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the first image and a preset image reference position;
an adjusting module, configured to adjust the shooting direction of the image capture component based on the first shooting direction adjustment information.
Optionally, the determining module is configured to:
determine the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
Optionally, the determining module is further configured to:
determine the position of at least one target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target;
in the displayed first image, display a label for each target according to that target's position in the first image, and, when a selection instruction for a first target is received, determine the first target's position in the first image as the position of the monitoring target in the first image.
Optionally, the determining module is further configured to: after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, determine the position of the monitoring target in the second image according to image feature information of the monitoring target, where the image feature information of the monitoring target is extracted from the frame preceding the second image; and determine second shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the second image and the preset image reference position.
The adjusting module is further configured to adjust the shooting direction of the image capture component based on the second shooting direction adjustment information.
Optionally, the determining module is further configured to: calculate the shooting distance between the image capture component and the monitoring target according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height; and determine the target focal length corresponding to the calculated shooting distance according to a pre-stored correspondence between shooting distance and focal length.
The adjusting module is further configured to adjust the focal length of the image capture component to the target focal length.
Optionally, the image reference position is the image center position.
According to a third aspect of the embodiments of the present disclosure, a terminal is provided. The terminal includes a processor and a memory, the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the tracking shooting method according to the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided. The storage medium stores at least one instruction, and the instruction is loaded and executed by a processor to implement the tracking shooting method according to the first aspect.
The technical solution provided by the embodiments of the present invention has the following beneficial effects:
In the embodiments of the present invention, a first image captured by the image capture component is obtained, the position of the monitoring target in the first image is determined, first shooting direction adjustment information for the image capture component is determined according to the relative position information between the monitoring target's position in the first image and a preset image reference position, and the shooting direction of the image capture component is adjusted based on the first shooting direction adjustment information. In this way, the operator does not need to manipulate the pan-tilt head: the PTZ camera can automatically track and shoot the monitoring target, so shooting mistakes are avoided and the quality of the tracking shooting is improved.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a tracking shooting method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a display interface for a target detection method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a display interface for selecting a monitoring target by clicking according to an embodiment of the present invention;
Fig. 4 is a flowchart of a focal length adjustment method according to an embodiment of the present invention;
Fig. 5 is a flowchart of a method for determining a monitoring target according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a tracking shooting apparatus according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
An embodiment of the present invention provides a tracking shooting method that can be implemented by a terminal. The terminal may have a built-in image capture component or be connected to an external one, and the image capture component may be a PTZ camera or the like. This embodiment is described in detail using a terminal connected to an external PTZ camera as an example; other cases are similar and are not repeated here.
The terminal may include components such as a processor, a memory, and a screen. The processor may be a CPU (Central Processing Unit) or the like, and may be used to receive instructions and control the display. In this embodiment of the disclosure, the processor may receive the images sent by the PTZ camera, identify the targets in each image and send the related information to the display screen for display, and also calculate from the image information the angle by which the PTZ camera needs to turn and control the PTZ camera to rotate accordingly. The memory may be RAM (Random Access Memory), Flash memory, or the like, and may be used to store received data, data needed during processing, and data generated during processing; in this embodiment of the disclosure, the memory may store the images captured by the PTZ camera and may also store the image reference position, etc. The screen may be a touch screen that can display device lists and control pages and detect touch signals; in this embodiment of the disclosure, the display screen is used to show the images captured by the PTZ camera and the rectangular boxes with which the processor marks the identified targets.
The terminal may also include a transceiver, an image detection component, an audio output component, an audio input component, and the like. The transceiver may be used for data transmission with other devices and may include an antenna, a matching circuit, a modem, and so on. The image detection component may be a camera. The audio output component may be a speaker or headphones, and the audio input component may be a microphone.
The monitoring target to be tracked and shot may vary widely, for example a wild animal, a person, or a vehicle. The embodiments of the present disclosure describe the solution in detail using the tracking shooting of a traffic police officer as an example; other cases are similar and are not repeated here.
The processing flow shown in Fig. 1 is described in detail below with reference to an embodiment. The content may be as follows:
In step 101, a first image captured by the image capture component is obtained.
Here, the first image is any image frame of the captured video at the stage when the monitoring target is being detected.
In implementation, a traffic police officer mounts the PTZ camera on top of the patrol car and installs the terminal inside the car. The terminal is connected to the PTZ camera for data transmission and shooting control. When enforcing the law, the officer opens the tracking shooting application installed on the terminal, chooses to start shooting, and points the shooting direction of the PTZ camera at the accident scene; the PTZ camera then continuously sends the captured video to the terminal. After the officer gets out of the car and walks to the accident scene to begin enforcement, the PTZ camera keeps sending the captured video to the terminal. The terminal can obtain each image frame of the video captured by the PTZ camera, and the first image may be any image frame of that video.
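A minimal sketch of step 101, assuming the PTZ camera streams its video over RTSP; the stream address below is a placeholder and not part of the patent.

```python
import cv2

cap = cv2.VideoCapture("rtsp://192.168.1.64/stream1")  # assumed camera address

while cap.isOpened():
    ok, first_image = cap.read()   # each received frame can serve as the "first image"
    if not ok:
        break
    # hand the frame to the detection step (step 102) here
cap.release()
```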
In step 102, the position of the monitoring target in the first image is determined.
Here, the position of the monitoring target in the first image may be the coordinates of the center point of the identified monitoring target in the first image, or the coordinates of the contour points of the monitoring target in the image, etc.
In implementation, for every image obtained from the PTZ camera, the terminal can use a preset image recognition method to identify a monitoring target with certain features in the image and thereby determine the position of the monitoring target in the image (the first image may be any image that the terminal has obtained from the PTZ camera and run recognition on). For example, if the monitoring target is a traffic police officer, the officer's yellow vest can be recognized in the first image; or, if the monitoring target is a zebra, black-and-white stripes can be recognized in the first image. When the traffic police officer walks to the accident scene and enters the shooting range of the PTZ camera, the terminal can detect the monitoring target in the images captured by the PTZ camera and determine its position in the image.
If the terminal cannot identify the monitoring target in the first image, the subsequent processing can be skipped for that image. In this case, the terminal can do nothing, or control the PTZ camera to change its shooting direction at random and continue detecting the monitoring target in subsequent images; once the monitoring target is captured, the processing of this flow is carried out.
Optionally, the position of the monitoring target may be detected by a target detection algorithm model. Correspondingly, the specific processing of step 102 may be as follows: determining the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
Here, the target detection algorithm model may be a HOG+AdaBoost algorithm model (AdaBoost being the name of an iterative algorithm) or the like. The input of the HOG+AdaBoost algorithm model is the image data. The model scans the image with a moving rectangular box; at each position, the output indicates whether the image inside the rectangular box matches the image feature information of the target. If the model outputs "yes", a target has been detected in the image, and the terminal obtains the coordinates of the four vertices of the current rectangular box. The terminal can take the coordinates of these four vertices as the position of the monitoring target, or compute the coordinates of the center point of the rectangular box from the four vertices and use that as the position of the monitoring target. If the model outputs "no", no target is detected in the current image, and the related processing is as described for step 102 above.
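The sketch below illustrates the sliding-window detection idea using OpenCV's built-in HOG + linear-SVM pedestrian detector as a stand-in for the trained HOG+AdaBoost model described above; it is not the patent's actual model, only an example of obtaining detection boxes and their center points.

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_targets(frame):
    """Return a list of ((x, y, w, h), centre) pairs, one per detected target."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    results = []
    for (x, y, w, h) in boxes:
        centre = (x + w // 2, y + h // 2)   # centre point used as the target position
        results.append(((x, y, w, h), centre))
    return results
```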
Optionally, multiple targets (for example, multiple traffic police officers) may be detected in the image. In that case the monitoring target can be chosen from among the multiple targets by manual selection; the corresponding steps are as follows:
In step 102A, the position of at least one target in the first image is determined according to the pre-trained target detection algorithm model for detecting the position of a monitoring target.
In implementation, when the terminal obtains the first image captured by the image capture component, it can input the first image into the trained HOG+AdaBoost algorithm model. Every time the model's scan detects a target, it outputs a "yes" result, and the terminal obtains the coordinates of the four vertices of the rectangular box that scanned that target. That is, when an image contains multiple targets, the terminal can obtain the four vertex coordinates of the rectangular box for each target. The terminal can take the four vertex coordinates of each target as that target's position, or compute the center-point coordinates of each rectangular box from its four vertices and use those as each target's position.
In step 102B, as shown in Fig. 2, in the displayed first image, a label is shown for each target according to that target's position in the first image. When a selection instruction for a first target is received, the first target's position in the first image is determined as the position of the monitoring target in the first image.
Here, the label may be a rectangular box or the like.
In implementation, the terminal displays the video received from the PTZ camera on the screen in real time. In the displayed first image, a label is drawn at the position of each target detected by the target detection algorithm model. For example, if a target's position is the coordinates of the four vertices of the rectangular box bounding the target's image region, the displayed label can be that rectangular box; or, if the target's position is the coordinates of the center point of those four vertices, the displayed label can be a dot, a small box, or a circle at that center point. When the user (for example, another officer sitting in the car) manually selects one target as the monitoring target, the terminal treats that target as the monitoring target and determines the position of the monitoring target in the first image.
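A minimal sketch of step 102B: draw a rectangular label for each detected target and let the operator click one of them as the monitoring target. The window name and callback are illustrative assumptions, not part of the patent.

```python
import cv2

def choose_target(frame, boxes):
    """boxes: list of (x, y, w, h). Returns the clicked box, or None if aborted."""
    selected = []

    def on_mouse(event, mx, my, flags, param):
        if event == cv2.EVENT_LBUTTONDOWN:
            for (x, y, w, h) in boxes:
                if x <= mx <= x + w and y <= my <= y + h:
                    selected.append((x, y, w, h))

    shown = frame.copy()
    for (x, y, w, h) in boxes:
        cv2.rectangle(shown, (x, y), (x + w, y + h), (0, 255, 0), 2)  # label per target
    cv2.namedWindow("select target")
    cv2.setMouseCallback("select target", on_mouse)
    while not selected:
        cv2.imshow("select target", shown)
        if cv2.waitKey(20) == 27:          # Esc aborts the selection
            break
    cv2.destroyWindow("select target")
    return selected[0] if selected else None
```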
In step 103, as shown in Fig. 3, first shooting direction adjustment information for the image capture component is determined according to the relative position information between the monitoring target's position in the first image and a preset image reference position.
Here, the image reference position is the image center position, but it may also be any other position in the image, chosen freely according to actual needs. The image reference position is preset as the position where the monitoring target is expected to appear in the image.
In implementation, the monitoring target's position in the first image and the preset image reference position are determined; then, in the first image, the direction of the monitoring target's position relative to the image reference position is determined and used as their relative position information. This direction can further be determined as the first shooting direction adjustment information.
In step 104, the shooting direction of the image capture component is adjusted based on the first shooting direction adjustment information.
In implementation, after obtaining the first shooting direction adjustment information, the terminal can send a direction adjustment notification carrying the first shooting direction adjustment information to the image capture component. After receiving the direction adjustment notification, the image capture component adjusts its own shooting direction according to the first shooting direction adjustment information.
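A sketch of steps 103 and 104: compare the target position with the image reference position (here the image center) and derive a pan/tilt adjustment. Converting a pixel offset into an angle needs the camera's field of view, which is an assumed parameter below, not a value given in the patent.

```python
def direction_adjustment(target_xy, frame_w, frame_h,
                         hfov_deg=60.0, vfov_deg=35.0):
    """Return (pan_deg, tilt_deg) that would move the target toward the centre."""
    ref_x, ref_y = frame_w / 2.0, frame_h / 2.0      # preset image reference position
    dx = target_xy[0] - ref_x                        # + means target is right of centre
    dy = target_xy[1] - ref_y                        # + means target is below centre
    pan_deg = dx / frame_w * hfov_deg                # pan right if positive
    tilt_deg = dy / frame_h * vfov_deg               # tilt down if positive
    return pan_deg, tilt_deg

# Example: a target at (1500, 400) in a 1920x1080 frame asks the PTZ head to
# pan right and tilt up by a few degrees.
print(direction_adjustment((1500, 400), 1920, 1080))
```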
Optionally, besides the shooting direction adjustment described above, the image capture component may also adjust its focal length. The corresponding processing may be as shown in Fig. 4 and includes the following steps:
In step 401, the shooting distance between the image capture component and the monitoring target is calculated according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height.
Here, the pitch angle of the image capture component is its pitch angle after it has been adjusted according to the shooting direction adjustment information.
In implementation, the image capture component is first adjusted according to the shooting direction adjustment information, and the adjusted pitch angle is then used to calculate the shooting distance between the image capture component and the monitoring target. This calculation uses right-triangle trigonometry: the mounting height of the image capture component minus the preset monitoring target height gives the length of one leg of a right triangle, and the pitch angle of the image capture component is the angle between the hypotenuse to be found and that known leg. Knowing one leg of a right triangle and one of its acute angles, the length of the hypotenuse can be obtained; the hypotenuse is the shooting distance between the image capture component and the monitoring target.
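A sketch of step 401 under the geometry described above. It assumes the pitch angle is measured between the line of sight and the vertical leg of the triangle (the height difference); with that convention the hypotenuse follows from basic trigonometry.

```python
import math

def shooting_distance(mount_height_m, target_height_m, pitch_angle_deg):
    """Distance along the line of sight from the camera to the target."""
    leg = mount_height_m - target_height_m            # vertical leg of the right triangle
    return leg / math.cos(math.radians(pitch_angle_deg))

# Example: camera mounted at 3.0 m, preset target height 1.7 m,
# pitch angle 75 degrees -> roughly a 5 m shooting distance.
print(round(shooting_distance(3.0, 1.7, 75.0), 2))
```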
In step 402, the target focal length corresponding to the calculated shooting distance is determined according to a pre-stored correspondence between shooting distance and focal length.
In implementation, to achieve a good tracking effect, the size of the monitoring target in the image should stay close to some fixed value. A technician can first set this value and then, based on it, set up the correspondence between shooting distance and focal length and store it in the terminal, for example in the form of a mapping table as shown in Table 1.
Table 1
After the shooting distance between the image capture component and the monitoring target has been calculated, the focal length corresponding to that shooting distance, i.e. the target focal length, can be looked up in the above mapping table.
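A sketch of step 402: look up the target focal length for a computed shooting distance in a pre-stored distance-to-focal-length table. The values below are placeholders standing in for Table 1, which is not reproduced here.

```python
DISTANCE_TO_FOCAL_MM = [   # (shooting distance in m, focal length in mm) - placeholder values
    (5.0, 12.0),
    (10.0, 25.0),
    (20.0, 50.0),
    (40.0, 100.0),
]

def target_focal_length(distance_m):
    """Pick the focal length whose stored shooting distance is closest to distance_m."""
    return min(DISTANCE_TO_FOCAL_MM, key=lambda row: abs(row[0] - distance_m))[1]

print(target_focal_length(12.3))   # -> 25.0 under these placeholder values
```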
In step 403, the focal length of the image capture component is adjusted to the target focal length.
In implementation, after the terminal determines the target focal length, it can send a focal length adjustment message carrying the target focal length to the image capture component. After receiving the focal length adjustment message, the image capture component sets its own focal length to the target focal length.
Optionally, for the case described above where the user manually selects the monitoring target, image features of the monitoring target can be obtained from the image and used to determine the position of the monitoring target in subsequently captured images. The corresponding processing may be as shown in Fig. 5 and includes the following steps:
In step 501, after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, the position of the monitoring target in the second image is determined according to the image feature information of the monitoring target, where the image feature information of the monitoring target is extracted from the frame preceding the second image.
Here, the second image is any image frame captured by the image capture component after the user has manually selected the monitoring target, i.e. after the terminal has received the selection instruction for the first target.
In implementation, the user manually selects the first target among the targets, and the terminal then receives the selection instruction for the first target. Afterwards, whenever the terminal receives another image from the image capture component (i.e. the second image), it determines the position in the second image that corresponds to the monitoring target's position in the frame preceding the second image, and constructs a search range around it: the center of this range is the same as the center of the monitoring target in the previous frame, and its area is a preset multiple of the monitoring target's area in the previous frame. The images at different positions within this range are matched against the image of the monitoring target in the previous frame, and the position of the monitoring target in the second image is thereby determined.
For every frame received after the selection instruction, the position of the monitoring target can be identified in this way.
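A sketch of step 501: search for the target near its previous position by matching the patch extracted from the previous frame inside a window centered on the old target center, whose size is a multiple of the old target size. The margin factor and function names are assumptions made for illustration.

```python
import cv2

def locate_in_next_frame(second_image, template, prev_box, margin=1.0):
    """prev_box: (x, y, w, h) in the previous frame. Returns the new box."""
    x, y, w, h = prev_box
    pad_x, pad_y = int(w * margin), int(h * margin)        # assumed search margin
    x0, y0 = max(0, x - pad_x), max(0, y - pad_y)
    x1 = min(second_image.shape[1], x + w + pad_x)
    y1 = min(second_image.shape[0], y + h + pad_y)
    search = second_image[y0:y1, x0:x1]

    result = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_loc = cv2.minMaxLoc(result)               # best match inside the window
    new_x, new_y = best_loc[0] + x0, best_loc[1] + y0       # back to full-image coordinates
    return (new_x, new_y, w, h)
```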
In step 502, second shooting direction adjustment information for the image capture component is determined according to the relative position information between the monitoring target's position in the second image and the preset image reference position.
In step 503, the shooting direction of the image capture component is adjusted based on the second shooting direction adjustment information.
In implementation, the processing of steps 502 and 503 is similar to that of steps 103 and 104, respectively; refer to the content of the embodiment above.
The whole process can be divided into two stages. In the first stage, before the user manually selects the monitoring target from one or more targets, the position of the monitoring target is detected with the target detection algorithm model and the shooting direction is then adjusted; this process corresponds to steps 101 to 104. In the second stage, after the user has manually selected the monitoring target from one or more targets, the position of the monitoring target in the current image frame is detected based on the image of the monitoring target in the previous image frame and the shooting direction is then adjusted; this process corresponds to steps 501 to 503.
In the embodiments of the present invention, a first image captured by the image capture component is obtained, the position of the monitoring target in the first image is determined, first shooting direction adjustment information for the image capture component is determined according to the relative position information between the monitoring target's position in the first image and a preset image reference position, and the shooting direction of the image capture component is adjusted based on the first shooting direction adjustment information. In this way, the operator does not need to manipulate the pan-tilt head: the PTZ camera can automatically track and shoot the monitoring target, so shooting mistakes are avoided and the quality of the tracking shooting is improved.
Another exemplary embodiment of the disclosure provides a tracking shooting apparatus. As shown in Fig. 6, the apparatus includes an obtaining module 610, a determining module 620, and an adjusting module 630.
The obtaining module 610 is configured to obtain a first image captured by an image capture component.
The determining module 620 is configured to determine the position of a monitoring target in the first image, and to determine first shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the first image and a preset image reference position.
The adjusting module 630 is configured to adjust the shooting direction of the image capture component based on the first shooting direction adjustment information.
Optionally, the determining module 620 is further configured to:
determine the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
Optionally, the determining module 620 is further configured to:
determine the position of at least one target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target;
in the displayed first image, display a label for each target according to that target's position in the first image, and, when a selection instruction for a first target is received, determine the first target's position in the first image as the position of the monitoring target in the first image.
Optionally, the determining module 620 is further configured to: after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, determine the position of the monitoring target in the second image according to image feature information of the monitoring target, where the image feature information of the monitoring target is extracted from the frame preceding the second image; and determine second shooting direction adjustment information for the image capture component according to the relative position information between the monitoring target's position in the second image and the preset image reference position.
The adjusting module 630 is further configured to adjust the shooting direction of the image capture component based on the second shooting direction adjustment information.
Optionally, the determining module 620 is further configured to: calculate the shooting distance between the image capture component and the monitoring target according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height; and determine the target focal length corresponding to the calculated shooting distance according to a pre-stored correspondence between shooting distance and focal length.
The adjusting module 630 is further configured to adjust the focal length of the image capture component to the target focal length.
Optionally, the image reference position is the image center position.
With regard to the apparatus in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method and is not elaborated here.
In the embodiments of the present invention, a first image captured by the image capture component is obtained, the position of the monitoring target in the first image is determined, first shooting direction adjustment information for the image capture component is determined according to the relative position information between the monitoring target's position in the first image and a preset image reference position, and the shooting direction of the image capture component is adjusted based on the first shooting direction adjustment information. In this way, the operator does not need to manipulate the pan-tilt head: the PTZ camera can automatically track and shoot the monitoring target, so shooting mistakes are avoided and the quality of the tracking shooting is improved.
It should be noted that when the tracking shooting apparatus provided by the above embodiment performs tracking shooting, the division into the above functional modules is only used as an example. In practical applications, the above functions can be allocated to different functional modules as needed, i.e. the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above. In addition, the tracking shooting apparatus provided by the above embodiment and the tracking shooting method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which are not repeated here.
Referring to Fig. 7, it shows a schematic structural diagram of the terminal involved in the embodiments of the present invention. The terminal can be used to implement the tracking shooting method provided in the above embodiments. Specifically:
The terminal 700 may include an RF (Radio Frequency) circuit 710, a memory 720 including one or more computer-readable storage media, an input unit 730, a display unit 740, a sensor 750, an audio circuit 760, a WiFi (Wireless Fidelity) module 770, a processor 780 including one or more processing cores, a power supply 790, and other components. A person skilled in the art will understand that the terminal structure shown in Fig. 7 does not limit the terminal, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components. Specifically:
The RF circuit 710 may be used to receive and send signals during information transmission and reception or during a call. In particular, after receiving downlink information from a base station, it passes the information to one or more processors 780 for processing; it also sends uplink data to the base station. In general, the RF circuit 710 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and so on. In addition, the RF circuit 710 can also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and so on.
The memory 720 may be used to store software programs and modules, and the processor 780 executes various functional applications and data processing by running the software programs and modules stored in the memory 720. The memory 720 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system and the applications needed for at least one function (such as a sound playback function or an image playback function), and the data storage area may store data created according to the use of the terminal 700 (such as audio data or a phone book). In addition, the memory 720 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices. Correspondingly, the memory 720 may also include a memory controller to provide the processor 780 and the input unit 730 with access to the memory 720.
The input unit 730 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. Specifically, the input unit 730 may include a touch-sensitive surface 731 and other input devices 732. The touch-sensitive surface 731, also called a touch display screen or a touch pad, can collect the user's touch operations on or near it (such as operations performed on or near the touch-sensitive surface 731 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection devices according to a preset program. Optionally, the touch-sensitive surface 731 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 780, and can receive and execute commands sent by the processor 780. Moreover, the touch-sensitive surface 731 can be implemented using various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 731, the input unit 730 may also include other input devices 732, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys or a power key), a trackball, a mouse, a joystick, and so on.
The display unit 740 may be used to display information entered by the user or information provided to the user, as well as the various graphical user interfaces of the terminal 700, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 740 may include a display panel 741, which may optionally be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 731 may cover the display panel 741; when the touch-sensitive surface 731 detects a touch operation on or near it, the operation is passed to the processor 780 to determine the type of the touch event, and the processor 780 then provides a corresponding visual output on the display panel 741 according to the type of the touch event. Although in Fig. 7 the touch-sensitive surface 731 and the display panel 741 are shown as two separate components implementing the input and output functions, in some embodiments the touch-sensitive surface 731 and the display panel 741 may be integrated to implement both the input and output functions.
The terminal 700 may also include at least one sensor 750, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 741 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 741 and/or the backlight when the terminal 700 is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition related functions (such as a pedometer or tap detection). Other sensors that may also be configured on the terminal 700, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described here.
The audio circuit 760, a speaker 761, and a microphone 762 can provide an audio interface between the user and the terminal 700. The audio circuit 760 can transmit the electrical signal converted from the received audio data to the speaker 761, which converts it into a sound signal for output; on the other hand, the microphone 762 converts the collected sound signal into an electrical signal, which the audio circuit 760 receives and converts into audio data; after the audio data is processed by the processor 780, it is sent, for example, to another terminal through the RF circuit 710, or output to the memory 720 for further processing. The audio circuit 760 may also include an earphone jack to provide communication between an external earphone and the terminal 700.
WiFi is a short-range wireless transmission technology. Through the WiFi module 770, the terminal 700 can help the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. Although Fig. 7 shows the WiFi module 770, it can be understood that it is not an essential part of the terminal 700 and can be omitted as needed without changing the essence of the invention.
The processor 780 is the control center of the terminal 700. It connects all parts of the entire device using various interfaces and lines, and performs the various functions of the terminal 700 and processes data by running or executing the software programs and/or modules stored in the memory 720 and calling the data stored in the memory 720, thereby monitoring the device as a whole. Optionally, the processor 780 may include one or more processing cores; preferably, the processor 780 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and so on, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 780.
The terminal 700 also includes a power supply 790 (such as a battery) that supplies power to the components. Preferably, the power supply can be logically connected to the processor 780 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 790 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown, the terminal 700 may also include a camera, a Bluetooth module, and so on, which are not described here. Specifically, in this embodiment, the display unit of the terminal 700 is a touch-screen display, and the terminal 700 further includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors so as to perform the tracking shooting method described in each of the above embodiments.
A person of ordinary skill in the art will understand that all or part of the steps of the above embodiments can be completed by hardware, or by a program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
Claims (14)
1. A tracking shooting method, characterized in that the method comprises:
obtaining a first image captured by an image capture component;
determining the position of a monitoring target in the first image;
determining first shooting direction adjustment information for the image capture component according to relative position information between the monitoring target's position in the first image and a preset image reference position;
adjusting the shooting direction of the image capture component based on the first shooting direction adjustment information.
2. The method according to claim 1, characterized in that determining the position of the monitoring target in the first image comprises:
determining the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
3. The method according to claim 2, characterized in that determining the position of the monitoring target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target comprises:
determining the position of at least one target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target;
in the displayed first image, displaying a label for each target according to that target's position in the first image, and, when a selection instruction for a first target is received, determining the first target's position in the first image as the position of the monitoring target in the first image.
4. The method according to claim 3, characterized in that the method further comprises:
after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, determining the position of the monitoring target in the second image according to image feature information of the monitoring target, wherein the image feature information of the monitoring target is extracted from the frame preceding the second image;
determining second shooting direction adjustment information for the image capture component according to relative position information between the monitoring target's position in the second image and the preset image reference position;
adjusting the shooting direction of the image capture component based on the second shooting direction adjustment information.
5. The method according to claim 1, characterized in that the method further comprises:
calculating the shooting distance between the image capture component and the monitoring target according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height;
determining the target focal length corresponding to the calculated shooting distance according to a pre-stored correspondence between shooting distance and focal length;
adjusting the focal length of the image capture component to the target focal length.
6. The method according to claim 1, characterized in that the image reference position is the image center position.
7. A tracking shooting apparatus, characterized in that the apparatus comprises:
an obtaining module, configured to obtain a first image captured by an image capture component;
a determining module, configured to determine the position of a monitoring target in the first image, and to determine first shooting direction adjustment information for the image capture component according to relative position information between the monitoring target's position in the first image and a preset image reference position;
an adjusting module, configured to adjust the shooting direction of the image capture component based on the first shooting direction adjustment information.
8. The apparatus according to claim 7, characterized in that the determining module is configured to:
determine the position of the monitoring target in the first image according to a pre-trained target detection algorithm model for detecting the position of a monitoring target.
9. The apparatus according to claim 8, characterized in that the determining module is configured to:
determine the position of at least one target in the first image according to the pre-trained target detection algorithm model for detecting the position of a monitoring target;
in the displayed first image, display a label for each target according to that target's position in the first image, and, when a selection instruction for a first target is received, determine the first target's position in the first image as the position of the monitoring target in the first image.
10. The apparatus according to claim 9, characterized in that the determining module is further configured to: after the selection instruction for the first target is received, when a second image captured by the image capture component is obtained, determine the position of the monitoring target in the second image according to image feature information of the monitoring target, wherein the image feature information of the monitoring target is extracted from the frame preceding the second image; and determine second shooting direction adjustment information for the image capture component according to relative position information between the monitoring target's position in the second image and the preset image reference position;
and the adjusting module is further configured to adjust the shooting direction of the image capture component based on the second shooting direction adjustment information.
11. The apparatus according to claim 7, characterized in that the determining module is further configured to: calculate the shooting distance between the image capture component and the monitoring target according to the pitch angle of the image capture component, the mounting height of the image capture component, and a preset monitoring target height; and determine the target focal length corresponding to the calculated shooting distance according to a pre-stored correspondence between shooting distance and focal length;
and the adjusting module is further configured to adjust the focal length of the image capture component to the target focal length.
12. The apparatus according to claim 7, characterized in that the image reference position is the image center position.
13. A terminal, characterized in that the terminal comprises a processor and a memory, the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the tracking shooting method according to any one of claims 1 to 6.
14. A computer-readable storage medium, characterized in that the storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the tracking shooting method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710656922.1A CN109391762B (en) | 2017-08-03 | 2017-08-03 | Tracking shooting method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109391762A true CN109391762A (en) | 2019-02-26 |
CN109391762B CN109391762B (en) | 2021-10-22 |
Family
ID=65412997
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710656922.1A Active CN109391762B (en) | 2017-08-03 | 2017-08-03 | Tracking shooting method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109391762B (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101888479A (en) * | 2009-05-14 | 2010-11-17 | 汉王科技股份有限公司 | Method and device for detecting and tracking target image |
CN103248799A (en) * | 2012-02-01 | 2013-08-14 | 联想(北京)有限公司 | Photographing method, photographing device and electronic equipment all for tracking target object |
US20130222607A1 (en) * | 2012-02-24 | 2013-08-29 | Kyocera Corporation | Camera device, camera system and camera calibration method |
CN103149939A (en) * | 2013-02-26 | 2013-06-12 | 北京航空航天大学 | Dynamic target tracking and positioning method of unmanned plane based on vision |
CN103197491A (en) * | 2013-03-28 | 2013-07-10 | 华为技术有限公司 | Method capable of achieving rapid automatic focusing and image acquisition device |
CN103248824A (en) * | 2013-04-27 | 2013-08-14 | 天脉聚源(北京)传媒科技有限公司 | Method and device for determining shooting angle of camera and picture pick-up system |
CN106303195A (en) * | 2015-05-28 | 2017-01-04 | 中兴通讯股份有限公司 | Capture apparatus and track up method and system |
CN105898136A (en) * | 2015-11-17 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | Camera angle adjustment method, system and television |
CN105357442A (en) * | 2015-11-27 | 2016-02-24 | 小米科技有限责任公司 | Shooting angle adjustment method and device for camera |
CN105718887A (en) * | 2016-01-21 | 2016-06-29 | 惠州Tcl移动通信有限公司 | Shooting method and shooting system capable of realizing dynamic capturing of human faces based on mobile terminal |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111756990A (en) * | 2019-03-29 | 2020-10-09 | 阿里巴巴集团控股有限公司 | Image sensor control method, device and system |
CN111656403A (en) * | 2019-06-27 | 2020-09-11 | 深圳市大疆创新科技有限公司 | Method and device for tracking target and computer storage medium |
CN110456829A (en) * | 2019-08-07 | 2019-11-15 | 深圳市维海德技术股份有限公司 | Positioning and tracing method, device and computer readable storage medium |
CN110719403A (en) * | 2019-09-27 | 2020-01-21 | 北京小米移动软件有限公司 | Image processing method, device and storage medium |
CN110719406A (en) * | 2019-10-15 | 2020-01-21 | 腾讯科技(深圳)有限公司 | Shooting processing method, shooting equipment and computer equipment |
CN110719406B (en) * | 2019-10-15 | 2022-06-14 | 腾讯科技(深圳)有限公司 | Shooting processing method, shooting equipment and computer equipment |
CN111123959A (en) * | 2019-11-18 | 2020-05-08 | 亿航智能设备(广州)有限公司 | Unmanned aerial vehicle control method based on gesture recognition and unmanned aerial vehicle adopting same |
CN111243030A (en) * | 2020-01-06 | 2020-06-05 | 浙江大华技术股份有限公司 | Target focusing dynamic compensation method and device and storage device |
CN111243030B (en) * | 2020-01-06 | 2023-08-11 | 浙江大华技术股份有限公司 | Target focusing dynamic compensation method and device and storage device |
CN113766175A (en) * | 2020-06-04 | 2021-12-07 | 杭州萤石软件有限公司 | Target monitoring method, device, equipment and storage medium |
CN112648476A (en) * | 2020-07-06 | 2021-04-13 | 深圳市寻视光电有限公司 | Automatic tracking cradle head support and tracking method thereof |
CN112648477A (en) * | 2020-07-06 | 2021-04-13 | 深圳市寻视光电有限公司 | Automatic tracking cradle head support and tracking method thereof |
CN112648476B (en) * | 2020-07-06 | 2022-10-18 | 深圳市寻视光电有限公司 | Automatic tracking cradle head support and tracking method thereof |
CN112648477B (en) * | 2020-07-06 | 2022-12-27 | 深圳市寻视光电有限公司 | Automatic tracking cradle head support and tracking method thereof |
CN111862620A (en) * | 2020-07-10 | 2020-10-30 | 浙江大华技术股份有限公司 | Image fusion processing method and device |
CN112017210A (en) * | 2020-07-14 | 2020-12-01 | 创泽智能机器人集团股份有限公司 | Target object tracking method and device |
CN113489893A (en) * | 2020-07-31 | 2021-10-08 | 深圳技术大学 | Real-time target object tracking and positioning method and real-time target object tracking and positioning device |
CN113489893B (en) * | 2020-07-31 | 2023-04-07 | 深圳技术大学 | Real-time target object tracking and positioning method and real-time target object tracking and positioning device |
CN111901528A (en) * | 2020-08-05 | 2020-11-06 | 深圳市浩瀚卓越科技有限公司 | Shooting equipment stabilizer |
CN112070061A (en) * | 2020-09-22 | 2020-12-11 | 苏州臻迪智能科技有限公司 | Unmanned aerial vehicle-based motion monitoring method and device |
CN112843734A (en) * | 2020-12-31 | 2021-05-28 | 上海米哈游天命科技有限公司 | Picture shooting method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109391762B (en) | 2021-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109391762A (en) | A kind of method and apparatus of track up | |
CN103702029B (en) | The method and device of focusing is pointed out during shooting | |
CN105005457B (en) | Geographical location methods of exhibiting and device | |
US20170187566A1 (en) | Alerting Method and Mobile Terminal | |
CN107124555B (en) | Method and device for controlling focusing, computer equipment and computer readable storage medium | |
CN107770451A (en) | Take pictures method, apparatus, terminal and the storage medium of processing | |
CN106371086B (en) | A kind of method and apparatus of ranging | |
CN104967790B (en) | Method, photo taking, device and mobile terminal | |
CN107038681A (en) | Image weakening method, device, computer-readable recording medium and computer equipment | |
US11394871B2 (en) | Photo taking control method and system based on mobile terminal, and storage medium | |
CN106504303B (en) | A kind of method and apparatus playing frame animation | |
CN111857793B (en) | Training method, device, equipment and storage medium of network model | |
CN105989572B (en) | Picture processing method and device | |
WO2016173350A1 (en) | Picture processing method and device | |
CN106851119B (en) | Picture generation method and equipment and mobile terminal | |
CN106204423A (en) | A kind of picture-adjusting method based on augmented reality, device and terminal | |
KR101848696B1 (en) | A method of superimposing location information on a collage, | |
CN107124556A (en) | Focusing method, device, computer-readable recording medium and mobile terminal | |
CN104122981B (en) | Photographic method, camera arrangement and mobile terminal applied to mobile terminal | |
CN108541015A (en) | A kind of signal strength reminding method and mobile terminal | |
CN110209245A (en) | Face identification method and Related product | |
CN105635553B (en) | Image shooting method and device | |
CN105120158B (en) | The image pickup method and device of mobile terminal | |
CN107147823A (en) | Exposure method, device, computer-readable recording medium and mobile terminal | |
CN107330867B (en) | Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||