CN103365294A - Unmanned aerial vehicle control system and method thereof - Google Patents

Unmanned aerial vehicle control system and method thereof

Info

Publication number
CN103365294A
CN103365294A · CN2012100876543A · CN201210087654A
Authority
CN
China
Prior art keywords
unmanned vehicle
image
user
dimensional human
human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100876543A
Other languages
Chinese (zh)
Inventor
李后贤
李章荣
罗治平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hongfujin Precision Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Original Assignee
Hongfujin Precision Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hongfujin Precision Industry Shenzhen Co Ltd, Hon Hai Precision Industry Co Ltd filed Critical Hongfujin Precision Industry Shenzhen Co Ltd
Priority to CN2012100876543A priority Critical patent/CN103365294A/en
Publication of CN103365294A publication Critical patent/CN103365294A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides an unmanned aerial vehicle (UAV) control system and method. The system is installed on and runs in a host machine. Before use, a 3D humanoid sample database for recognizing the user's body postures and movements must be established. A depth camera captures the scene in front of the host, including the user, to obtain a scene image; the user's 3D humanoid image is detected in the scene image and compared against the 3D humanoid sample database to derive the user's body posture and movement information. The system converts this information into the corresponding UAV control instruction and controls the flight state of the UAV accordingly.

Description

Unmanned aerial vehicle control system and method
Technical field
The present invention relates to flight control systems and methods, and in particular to an unmanned aerial vehicle control system and method based on 3D humanoid detection technology.
Background technology
Traditional unmanned aerial vehicle control systems let the user issue flight instructions only through the joysticks, buttons, and other controls of a physical remote controller. Such a controller may be misplaced among household clutter or lost. The user must also memorize the position and function of every joystick and button before operating the aircraft's altitude, heading, and other functions. A user unfamiliar with these controls can easily make operating errors, causing injury or damage to the aircraft, which makes traditional systems inconvenient to use.
Summary of the invention
In view of the above, it is necessary to provide an unmanned aerial vehicle control system and method that allow the user to control the flight state of the aircraft directly with body postures and movements.
The unmanned aerial vehicle control system is installed on and runs in a host machine that comprises a depth camera and a storage device. The system comprises: a database module, for establishing a 3D humanoid sample database used to recognize the user's body postures and movements, and storing the database in the storage device; an image detection module, for capturing the scene in front of the host, including the user, with the depth camera to obtain a scene image containing Z-direction depth information, detecting the user's 3D humanoid image in the scene image, and comparing the 3D humanoid image against the 3D humanoid sample database to derive the user's body posture and movement information; and a flight control module, for converting that information into the corresponding unmanned aerial vehicle control instruction and sending the instruction to the unmanned aerial vehicle over a wireless communication network to control and adjust its operation.
The unmanned aerial vehicle control method is applied in a host machine that comprises a depth camera and a storage device. The method comprises the steps of: establishing a 3D humanoid sample database used to recognize the user's body postures and movements, and storing the database in the storage device; capturing the scene in front of the host, including the user, with the depth camera to obtain a scene image containing Z-direction depth information; detecting the user's 3D humanoid image in the scene image; comparing the 3D humanoid image against the 3D humanoid sample database to derive the user's body posture and movement information; converting that information into the corresponding unmanned aerial vehicle control instruction; and sending the instruction to the unmanned aerial vehicle over a wireless communication network to control and adjust its operation.
Compared with the prior art, the unmanned aerial vehicle control system and method of the present invention allow the user to control the aircraft's flight state directly with body postures and movements. This not only spares the user from confusing the control functions of the controller's joysticks or searching everywhere for a misplaced controller, but also lets the user operate the aircraft's heading, altitude, and other controls in a more intuitive way.
Description of drawings
Fig. 1 is an architecture diagram of a preferred embodiment of the unmanned aerial vehicle control system of the present invention.
Fig. 2 is a flowchart of a preferred embodiment of the unmanned aerial vehicle control method of the present invention.
Fig. 3 is a schematic diagram of establishing the 3D humanoid sample database.
Fig. 4 is a schematic diagram of capturing a scene image with the depth camera.
Fig. 5 is a schematic diagram of a concrete example of controlling the unmanned aerial vehicle with the user's body postures.
Description of main element symbols
Host machine 1
Unmanned aerial vehicle control system 10
Database module 101
Image detection module 102
Flight control module 103
Depth camera 11
Storage device 12
Microprocessor 13
Unmanned aerial vehicle 2
Wireless communication network 3
The following embodiments further describe the present invention in conjunction with the above drawings.
Embodiment
Referring to Fig. 1, it is an architecture diagram of a preferred embodiment of the unmanned aerial vehicle (UAV) control system 10 of the present invention. In this embodiment, the UAV control system 10 is installed on and runs in a host machine 1 and allows the user to control the flight state of the UAV 2 directly with body postures and movements, for example commanding the UAV 2 to turn left, turn right, pitch down or pitch up, or adjusting its flight speed, sideslip angle, yaw rate, and heading. The host machine 1 includes, but is not limited to, a depth-sensing camera 11, a storage device 12, and a microprocessor 13. The host machine 1 communicates with the UAV 2 over a wireless communication network 3, for example to send control instructions governing the flight state of the UAV 2. The control instructions include, but are not limited to, commands for turning left, turning right, pitching down, pitching up, and for setting flight speed, sideslip angle, yaw rate, and heading.
The depth camera 11 is a time-of-flight (TOF) camera with a 3D imaging capability. It obtains the horizontal (XY-direction) position of each point within its field of view, together with each point's distance from the lens in the depth (Z) direction. The camera illuminates the subject with a reference light beam and computes the camera-to-subject distance from the time difference or phase difference of the returning beam, thereby producing a set of depth distance values, i.e., Z-direction range information.
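The TOF principle just described — distance recovered from the time difference of the returning beam — amounts to half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is illustrative, not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_depth_m(round_trip_seconds: float) -> float:
    """Z-direction distance from the measured round-trip time of the reference beam."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# a pulse returning after 20 ns corresponds to roughly 3 m
print(round(tof_depth_m(20e-9), 2))  # → 3.0
```

Phase-difference TOF cameras recover the same quantity from the phase shift of a modulated beam rather than from a direct time measurement.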
In this embodiment, the UAV control system 10 comprises a database module 101, an image detection module 102, and a flight control module 103. A module, as used in the present invention, is a series of computer program segments that can be executed by the microprocessor 13 of the host machine 1 to perform a fixed function; the modules are stored in the storage device 12 of the host machine 1. The function of each module is described in detail with the flowchart of Fig. 2.
Referring to Fig. 2, it is a flowchart of a preferred embodiment of the unmanned aerial vehicle control method of the present invention. In this embodiment, the method allows the user to control the flight state of the UAV 2 directly with body postures and movements, for example commanding the UAV 2 to turn left, turn right, pitch down or pitch up, or adjusting its flight speed, sideslip angle, yaw rate, and heading.
In step S21, the database module 101 establishes a 3D humanoid sample database used to recognize the user's body postures and movements, and stores the database in the storage device 12. Referring to Fig. 3, the user first captures, with the depth camera 11, the humanoid body postures and movements corresponding to every control instruction of the UAV 2, collecting a large quantity of 3D humanoid image data. The database module 101 then builds and refines the 3D humanoid sample database according to the control instruction corresponding to each posture and movement, to serve as the basis for recognizing the user's postures and movements with 3D humanoid detection technology.
Referring to Fig. 3, it is a schematic diagram of establishing the 3D humanoid sample database. In this embodiment, establishing the database comprises the following steps: (a) capture and collect a large number of humanoid images with the depth camera 11, obtain the distance from the camera lens to each point of the humanoid figure, and classify the images as front, side, or back views; (b) convert the lens-to-point distances along the humanoid contour into pixel values and store them as a humanoid feature array, where the peak pixel value is 255, the minimum is 0, and intermediate values are distributed proportionally within the array; (c) align all collected humanoid feature arrays, gather point-by-point statistics over the aligned arrays, and derive the tolerance range of each point's pixel value using its standard deviation; (d) take the per-point tolerance ranges as the humanoid data templates, divide them into front, side, and back classes as the various 3D humanoid samples, thereby establishing the 3D humanoid sample database.
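The distance-to-pixel conversion and per-point tolerance statistics of steps (b)-(d) can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the nearest contour point maps to 255 and the farthest to 0, and it models the standard-deviation tolerance as a configurable mean ± k·σ band; the function names are invented.

```python
import numpy as np

def distances_to_feature_array(distances):
    """Step (b): map lens-to-point distances onto 0-255 pixel values,
    distributed proportionally (assumed: nearest point -> 255, farthest -> 0)."""
    d = np.asarray(distances, dtype=float)
    lo, hi = d.min(), d.max()
    if hi == lo:  # degenerate flat contour
        return np.full(d.shape, 255, dtype=np.uint8)
    return np.round((hi - d) / (hi - lo) * 255).astype(np.uint8)

def build_template(aligned_feature_arrays, k=2.0):
    """Steps (c)-(d): point-by-point statistics over aligned feature arrays;
    the tolerance range of each point is taken as mean ± k·std."""
    stack = np.stack([np.asarray(a, dtype=float) for a in aligned_feature_arrays])
    mean, std = stack.mean(axis=0), stack.std(axis=0)
    return mean - k * std, mean + k * std  # per-point (lower bound, upper bound)
```

A front-view template would then be one (lower, upper) pair per contour point, stored alongside the side- and back-view templates.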
In step S22, the image detection module 102 uses the depth camera 11 to continuously capture the scene in front of the host, which includes the user, obtaining a scene image with Z-direction depth information. In this embodiment, while capturing the scene the depth camera 11 simultaneously records the distance from its lens to each point of the scene. Referring to Fig. 4, the image detection module 102 continuously captures scene A in front of the host, obtaining scene image B with the user's XY-direction information together with the Z-direction depth information, i.e., the distance from the lens of the depth camera 11 to each point of the scene.
In step S23, the image detection module 102 detects the user's 3D humanoid image in the captured scene image using 3D humanoid detection technology. In this embodiment, the detection comprises the following steps: (a) the image detection module 102 converts the Z-direction depth of each point of the captured scene image, relative to the lens of the depth camera 11, into a pixel value, and stores the pixel values as a scene array; (b) the image detection module 102 compares the scene array against the humanoid data templates in the 3D humanoid sample database to detect the user's 3D humanoid image; (c) if the gap between a point value in a region of the scene and the corresponding template point value is less than 5% (for example, a template value of 255 against a scene value of 250, a gap of 2%), that scene point is judged to fall within the tolerance range of the corresponding template point; (d) if more than 85%-90% of the points in the compared region fall within the tolerance ranges of the corresponding template points, the value distribution of that region is close to the humanoid data template, and the region can be identified as the 3D humanoid image.
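Steps (c)-(d) above amount to a point-wise comparison with two thresholds. A sketch under stated assumptions: the 5% per-point gap and the ~85% coverage come from the paragraph above, while the function and parameter names are illustrative.

```python
import numpy as np

def matches_template(region_array, template_array, point_tol=0.05, coverage=0.85):
    """Return True when a candidate scene region matches a humanoid data template:
    a point matches when it deviates from the template point by less than 5%,
    and the region matches when at least ~85% of its points do."""
    region = np.asarray(region_array, dtype=float)
    template = np.asarray(template_array, dtype=float)
    # per-point relative gap, e.g. |250 - 255| / 255 = 2% (guard against zero template points)
    gap = np.abs(region - template) / np.maximum(template, 1.0)
    return (gap < point_tol).mean() >= coverage
```

In practice the comparison would be slid over candidate regions of the scene array and repeated against the front, side, and back template classes.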
In step S24, the image detection module 102 compares the detected 3D humanoid image against the 3D humanoid sample database stored in the storage device 12 to derive the user's body posture and movement information. In this embodiment, after obtaining the user's 3D humanoid image from the scene, the image detection module 102 calibrates the position of the humanoid image, then compares and analyzes the obtained data against the 3D humanoid sample database in the storage device 12 to confirm the user's body posture and movement information.
In step S25, the flight control module 103 converts the user's body posture and movement information into the corresponding control instruction for the UAV 2 and sends the instruction to the UAV 2 over the wireless communication network 3.
In step S26, the flight control module 103 drives the actuators of the UAV 2 according to the instruction to control and adjust its operation, for example commanding the UAV 2 to turn left, turn right, pitch down, pitch up, or setting its flight speed, sideslip angle, yaw rate, and heading.
Referring to Fig. 5, it is a schematic diagram of a concrete example of the user controlling the flight state of the unmanned aerial vehicle with body postures and movements. In this embodiment, combinations of two-hand swinging postures control the various flight states of the UAV 2. For example, when the user swings both hands to the left, the UAV 2 is commanded to turn left; when the user swings both hands to the right, the UAV 2 is commanded to turn right; when the user moves both hands downward, the UAV 2 is commanded to pitch down; and when the user moves both hands upward, the UAV 2 is commanded to pitch up.
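The Fig. 5 examples map recognized gestures directly onto control instructions; a toy dispatch table makes this concrete. The gesture labels, command names, and the `hold_position` fallback are invented for illustration, not taken from the patent.

```python
# illustrative gesture -> instruction mapping for the Fig. 5 examples
GESTURE_TO_INSTRUCTION = {
    "both_hands_swing_left": "turn_left",
    "both_hands_swing_right": "turn_right",
    "both_hands_move_down": "pitch_down",
    "both_hands_move_up": "pitch_up",
}

def to_instruction(gesture: str) -> str:
    """Translate a recognized posture into a flight instruction; unrecognized
    postures fall back to a hypothetical 'hold_position' command."""
    return GESTURE_TO_INSTRUCTION.get(gesture, "hold_position")

print(to_instruction("both_hands_swing_left"))  # turn_left
```

A real system would send the resulting instruction to the aircraft over the wireless link, as in steps S25-S26 above.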
The above embodiments are intended only to illustrate, not to limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to the preferred embodiments above, those of ordinary skill in the art should appreciate that modifications or equivalent substitutions may be made without departing from the spirit and scope of the technical solution of the present invention.

Claims (10)

1. An unmanned aerial vehicle control system, installed on and running in a host machine that comprises a depth camera and a storage device, characterized in that the unmanned aerial vehicle control system comprises:
a database module, for establishing a 3D humanoid sample database used to recognize the user's body postures and movements, and storing the 3D humanoid sample database in the storage device;
an image detection module, for capturing the scene in front of the host, including the user, with the depth camera to obtain a scene image containing Z-direction depth information, detecting the user's 3D humanoid image in the scene image, and comparing the 3D humanoid image against the 3D humanoid sample database to derive the user's body posture and movement information; and
a flight control module, for converting the user's body posture and movement information into the corresponding unmanned aerial vehicle control instruction, and sending the instruction to the unmanned aerial vehicle over a wireless communication network to control and adjust its operation.
2. The unmanned aerial vehicle control system of claim 1, characterized in that the depth camera is a time-of-flight (TOF) camera with a 3D imaging capability, used to obtain the horizontal XY-direction information of each point within the field of view and each point's Z-direction depth distance from the lens.
3. The unmanned aerial vehicle control system of claim 1, characterized in that establishing the 3D humanoid sample database comprises the steps of:
capturing, with the depth camera, the humanoid body postures and movements corresponding to every control instruction of the unmanned aerial vehicle, to collect a large quantity of 3D humanoid image data; and
building and refining the 3D humanoid sample database according to the control instruction corresponding to each humanoid body posture and movement.
4. The unmanned aerial vehicle control system of claim 1, characterized in that detecting the 3D humanoid image comprises the steps of:
converting the Z-direction depth of each point of the captured scene image, relative to the lens of the depth camera, into a pixel value, and storing the pixel values as a scene array; and
comparing the scene array against the humanoid data templates in the 3D humanoid sample database to detect the user's 3D humanoid image.
5. The unmanned aerial vehicle control system of claim 1, characterized in that the control instructions comprise instructions for commanding the unmanned aerial vehicle to turn left, turn right, pitch down, and pitch up, and for setting its flight speed, sideslip angle, yaw rate, and heading.
6. An unmanned aerial vehicle control method, applied in a host machine that comprises a depth camera and a storage device, characterized in that the method comprises the steps of:
establishing a 3D humanoid sample database used to recognize the user's body postures and movements, and storing the 3D humanoid sample database in the storage device;
capturing the scene in front of the host, including the user, with the depth camera to obtain a scene image containing Z-direction depth information;
detecting the user's 3D humanoid image in the scene image;
comparing the 3D humanoid image against the 3D humanoid sample database to derive the user's body posture and movement information;
converting the user's body posture and movement information into the corresponding unmanned aerial vehicle control instruction; and
sending the control instruction to the unmanned aerial vehicle over a wireless communication network to control and adjust its operation.
7. The unmanned aerial vehicle control method of claim 6, characterized in that the depth camera is a time-of-flight (TOF) camera with a 3D imaging capability, used to obtain the horizontal XY-direction information of each point within the field of view and each point's Z-direction depth distance from the lens.
8. The unmanned aerial vehicle control method of claim 6, characterized in that establishing the 3D humanoid sample database comprises the steps of:
capturing, with the depth camera, the humanoid body postures and movements corresponding to every control instruction of the unmanned aerial vehicle, to collect a large quantity of 3D humanoid image data; and
building and refining the 3D humanoid sample database according to the control instruction corresponding to each humanoid body posture and movement.
9. The unmanned aerial vehicle control method of claim 6, characterized in that detecting the user's 3D humanoid image in the scene image comprises the steps of:
converting the Z-direction depth of each point of the captured scene image, relative to the lens of the depth camera, into a pixel value, and storing the pixel values as a scene array; and
comparing the scene array against the humanoid data templates in the 3D humanoid sample database to detect the user's 3D humanoid image.
10. The unmanned aerial vehicle control method of claim 6, characterized in that the control instructions comprise instructions for commanding the unmanned aerial vehicle to turn left, turn right, pitch down, and pitch up, and for setting its flight speed, sideslip angle, yaw rate, and heading.
CN2012100876543A 2012-03-29 2012-03-29 Unmanned aerial vehicle control system and method thereof Pending CN103365294A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012100876543A CN103365294A (en) 2012-03-29 2012-03-29 Unmanned aerial vehicle control system and method thereof

Publications (1)

Publication Number Publication Date
CN103365294A true CN103365294A (en) 2013-10-23

Family

ID=49366861

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100876543A Pending CN103365294A (en) 2012-03-29 2012-03-29 Unmanned aerial vehicle control system and method thereof

Country Status (1)

Country Link
CN (1) CN103365294A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853623A (en) * 2009-04-03 2010-10-06 鸿富锦精密工业(深圳)有限公司 Image monitoring system and information display system having same
CN101859371A (en) * 2009-04-10 2010-10-13 鸿富锦精密工业(深圳)有限公司 Pick-up device and object identification method thereof
CN102044034A (en) * 2009-10-22 2011-05-04 鸿富锦精密工业(深圳)有限公司 Commodity catalog display system and method
CN102088551A (en) * 2009-12-03 2011-06-08 鸿富锦精密工业(深圳)有限公司 Camera adjustment system and method
CN102117117A (en) * 2010-01-06 2011-07-06 致伸科技股份有限公司 System and method for control through identifying user posture by image extraction device
CN102219051A (en) * 2011-04-29 2011-10-19 北京工业大学 Method for controlling four-rotor aircraft system based on human-computer interaction technology

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853464A (en) * 2014-04-01 2014-06-11 郑州捷安高科股份有限公司 Kinect-based railway hand signal identification method
CN103853464B (en) * 2014-04-01 2017-02-15 郑州捷安高科股份有限公司 Kinect-based railway hand signal identification method
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN106203299A (en) * 2016-06-30 2016-12-07 北京二郎神科技有限公司 The control method of a kind of controllable equipment and device
US10710244B2 (en) 2016-06-30 2020-07-14 Beijing Airlango Technology Co., Ltd. Robot control using gestures
CN106394555A (en) * 2016-08-29 2017-02-15 无锡卓信信息科技股份有限公司 Unmanned automobile obstacle avoidance system and method based on 3D camera
CN108351651A (en) * 2016-09-27 2018-07-31 深圳市大疆创新科技有限公司 A kind of control method, device and aircraft based on image

Similar Documents

Publication Publication Date Title
US11042723B2 (en) Systems and methods for depth map sampling
CN110687902B (en) System and method for controller-free user drone interaction
US10969784B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
CN110494360B (en) System and method for providing autonomous photography and photography
KR102359806B1 (en) Control of the host vehicle based on the detected parking vehicle characteristics
CN105931263B (en) A kind of method for tracking target and electronic equipment
CN110216674B (en) Visual servo obstacle avoidance system of redundant degree of freedom mechanical arm
Monajjemi et al. UAV, come to me: End-to-end, multi-scale situated HRI with an uninstrumented human and a distant UAV
US20170054962A1 (en) Three-dimensional depth perception method and apparatus with an adjustable working range
CN105329238B (en) A kind of autonomous driving vehicle lane-change control method based on monocular vision
EP3568334A1 (en) System, method and non-transitory computer readable storage medium for parking vehicle
US10399229B2 (en) Method of tracking target object
TW201339903A (en) System and method for remotely controlling AUV
CN103365294A (en) Unmanned aerial vehicle control system and method thereof
CN106774436A (en) The control system and method for the rotor wing unmanned aerial vehicle tenacious tracking target of view-based access control model
CN106973221B (en) Unmanned aerial vehicle camera shooting method and system based on aesthetic evaluation
CN110516578A (en) A kind of monitoring system based on recognition of face and target following
CN106020234A (en) Unmanned aerial vehicle flight control method, device and equipment
CN105892668A (en) Equipment control method and device
Juang et al. Real-time indoor surveillance based on smartphone and mobile robot
CN110493521A (en) Automatic Pilot camera control method, device, electronic equipment, storage medium
CN106408593A (en) Video-based vehicle tracking method and device
WO2018121794A1 (en) Control method, electronic device and storage medium
CN108363387A (en) Sensor control method and device
CN114782639A (en) Rapid differential latent AGV dense three-dimensional reconstruction method based on multi-sensor fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20131023