CN207008413U - A flight companion robot control system - Google Patents
A flight companion robot control system
- Publication number
- CN207008413U
- Authority
- CN
- China
- Prior art keywords
- module
- processor
- infrared
- flight
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Toys (AREA)
Abstract
The utility model provides a flight companion robot control system, comprising: a processor, which sends ranging instructions to an ultrasonic module; the ultrasonic module, which emits ultrasonic waves, receives the reflected waves, converts them into a first electrical signal and sends it to the processor; the processor, which controls a first pyroelectric infrared module to track the identified user; the first pyroelectric infrared module, which converts the infrared radiation emitted by the identified user's body into a second electrical signal and sends it to the processor; an inertial measurement module arranged on the base, which sends the collected third electrical signal to the processor; an angle measurement module arranged on the base, which sends the collected fourth electrical signal to the processor; and the processor, which controls the flight companion robot to adjust its flight attitude, avoid obstacles and track the user according to the first, second, third and fourth electrical signals. The flight companion robot can thereby replace most functions of today's mobile phones and keep the user company.
Description
Technical field
The utility model relates to the field of electronic technology, and in particular to a flight companion robot control system.
Background technology
With the popularization of smartphones and the growing richness of their functions, modern people's dependence on their phones keeps increasing. The accompanying negative effect is serious eye strain; it is hard to imagine what state human eyesight will be in ten, twenty or thirty years if this trend continues. How to help people break free of their dependence on phones has therefore become an urgent research topic.
With the expansion of intelligent-robot applications, it is hoped that robots can serve humans in more fields. China's robot industry is currently booming, and its technology in human-computer interaction, big data and deep learning has reached the international forefront. The intelligent-robot industry can be regarded as an important lever for China to catch up with and overtake its peers, so the state keeps increasing investment and policy support in this area.
Most companion robots patented at this stage move on two legs or on wheeled or tracked mechanisms. Such robots are mostly bulky, slow and inflexible, so their operating-environment requirements are high and their cost remains high, which is a real reason why the penetration rate of intelligent robots is still relatively low.
Utility model content
The utility model aims to solve the problems that existing companion robots are bulky and inflexible.
The main purpose of the utility model is to provide a flight companion robot control system.
To achieve the above purpose, the technical solution of the utility model is realized as follows:
In one aspect, the utility model provides a flight companion robot control system, comprising: a processor, which sends ranging instructions to an ultrasonic module arranged on the base; the ultrasonic module, which receives the ranging instruction, generates and emits an ultrasonic wave, receives the reflected wave, converts it into a first electrical signal and sends it to the processor; the processor, which, using information obtained by an image camera module and a first infrared camera module arranged in the spherical cabin, controls a first pyroelectric infrared module arranged on the upper surface of the base to track the identified user; the first pyroelectric infrared module, which captures the infrared radiation emitted by the identified user's body, converts it into a second electrical signal and sends it to the processor; an inertial measurement module arranged on the base, which sends the collected third electrical signal to the processor; an angle measurement module arranged on the base, which sends the collected fourth electrical signal to the processor; and the processor, which controls the flight companion robot to adjust its flight attitude, avoid obstacles and track the user according to the first, second, third and fourth electrical signals.
In addition, the system also includes: an infrared projection module arranged on the bottom of the base, which projects infrared light spots onto the environment below; a second infrared camera module and a second pyroelectric infrared module arranged on the bottom of the base, which respectively collect the distribution of the infrared spots and send the distribution information to the processor; and the processor, which identifies depth information from the distribution information, creates a three-dimensional map of the surroundings from the first, second, third and fourth electrical signals together with the depth information, and navigates accordingly.
In addition, the system also includes: the processor, which sends information to be output to the infrared projection module arranged on the bottom of the base; the infrared projection module, which projects the information onto a preset plane as an interactive projected picture; the second infrared camera module, which captures the diffuse infrared reflections produced by a pointer in the projected picture and sends this diffuse-reflection information to the processor; and the processor, which locates and identifies the diffuse reflections to control the human-computer interaction.
In addition, the system also includes: the processor, which sends information to be output to a voice playback module arranged on the base; the voice playback module, which plays the information; an acoustic acquisition module arranged on the base, which collects the user's speech and sends the speech information to the processor; and the processor, which controls the human-computer interaction according to the speech information.
In addition, the image camera module, the first infrared camera module and/or the second infrared camera module collect text to be translated and send the collected text to the processor; the processor translates the text in real time and outputs the translated text.
In addition, the system also includes: a charging dock, which emits infrared light; the first pyroelectric infrared module, the second pyroelectric infrared module, the first infrared camera module and the second infrared camera module, which each collect the infrared light emitted by the dock and send the collected infrared information to the processor; and the processor, which locates the charging dock from the infrared information collected by these four modules and controls the flight companion robot to fly to the dock and land on it to charge.
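The dock-location step described above can be illustrated with a toy heuristic. The function below is purely hypothetical (the sensor bearings, intensity values and winner-take-all rule are all assumptions for illustration, not the patent's method): it returns the bearing of the infrared sensor reporting the strongest dock-beacon signal.

```python
def docking_heading(ir_readings):
    """Given infrared intensity readings from sensors at known bearings
    (degrees, robot frame), return the bearing of the strongest signal,
    assumed to come from the charging dock's beacon; None if no sensor
    sees the beacon at all."""
    bearing, strongest = max(ir_readings.items(), key=lambda kv: kv[1])
    return bearing if strongest > 0 else None
```

A controller would then yaw toward the returned bearing and re-sample as it approaches, refining the heading until the robot can land on the dock.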
In addition, the system also includes: a fingerprint recognition module arranged on the base, which captures the user's fingerprint image and sends the collected image to the processor; and the processor, which switches the flight companion robot on according to the fingerprint image.
It can be seen from the above technical solutions that the flight companion robot control system provided by the embodiments of the utility model enables the robot, through an intelligent control system and the cooperation of multiple sensors, to provide users with human-computer interaction, communication, aerial photography, audio and video playback, information retrieval, life services, safety monitoring, navigation, translation, games and other services. It can replace most functions of today's mobile phones and thus keep the user company.
Brief description of the drawings
In order to explain the technical solutions of the embodiments of the utility model more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the utility model, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a top view of the flight companion robot provided by an embodiment of the utility model;
Fig. 2 is a front view of the flight companion robot provided by an embodiment of the utility model;
Fig. 3 is a bottom view of the flight companion robot provided by an embodiment of the utility model;
Fig. 4 is a flowchart of the flight companion robot control method provided by an embodiment of the utility model;
Fig. 5 is a structural schematic of the flight companion robot control system provided by an embodiment of the utility model;
Fig. 6 is a structural schematic of the charging dock provided by an embodiment of the utility model.
Embodiment
The technical solutions in the embodiments of the utility model are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by those of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the utility model.
In the description of the utility model, it should be understood that orientation or positional terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings and are used only to simplify the description; they do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation, and are therefore not to be construed as limiting the utility model. In addition, the terms "first" and "second" are used for description only and should not be understood as indicating or implying relative importance, quantity or position.
It should also be noted that, unless otherwise expressly specified and limited, the terms "installed", "connected" and "coupled" are to be understood broadly: the connection may be fixed, detachable or integral; mechanical or electrical; direct, or indirect via an intermediary; or internal between two elements. Those of ordinary skill in the art can understand the specific meanings of these terms in the utility model according to the specific circumstances.
Embodiments of the utility model are described in further detail below with reference to the accompanying drawings.
The flight companion robot provided by an embodiment of the utility model can be arranged in the structure shown in Figs. 1 to 3; components can of course be added or removed as required. The structure of the flight companion robot is only briefly described below; the utility model is not limited to it, and structures with components added or removed as needed also fall within the protection scope of the utility model.
Referring to Figs. 1 to 3, the flight companion robot provided by an embodiment of the utility model includes a spherical cabin 1 and a base 2, which together can form a flying-saucer shape. The spherical cabin 1 can be fixed to the base 2 by snap fasteners, or of course by other means.
An image camera module 101 (for example a high-definition camera) and a first infrared camera module 102 (for example a first infrared camera) can be arranged side by side in the middle of the spherical cabin 1; a first flash 103 can be set between the two cameras, and built-in antennas 104 are distributed along the bottom of the cabin's arc surface.
The top of the base 2 can be arc-shaped and its bottom a flat circular plane. Four ultrasonic modules 201 (for example ultrasonic sensors) can be evenly distributed around the edge of the base 2; eight or another number can of course be used as required, which the utility model does not limit, four being used only as an example. A first pyroelectric infrared module 202 (for example a first pyroelectric infrared sensor) can be set in the middle of the top arc surface of the base 2. The top of the base 2 also carries an inertial measurement module 203, which can include a gyroscope sensor 2031 and an acceleration sensor 2032, and can further carry an angle measurement module 204 (for example an angle sensor), a temperature sensor 205, a voice playback module 206 (for example an audio player) and an acoustic acquisition module 207 (for example an acoustic sensor). A fingerprint recognition module 208 can also be set on the top arc surface of the base 2, and a charging port 209, a USB port 210 and the like can also be provided.
An infrared projection module 211 (for example an infrared projector) can be set at the center of the bottom of the base 2. A second pyroelectric infrared module 212 (for example a second pyroelectric infrared sensor) can be set near the projector lens at the base bottom, a second infrared camera module 213 (for example a second, small infrared camera) can also be set near the projector lens, and a second flash 214 can be set beside the bottom infrared projector.
Four embedded rotors 215 can be evenly distributed around the periphery of the base's bottom plane; eight or another number can of course be used as required, which the utility model does not limit, four being used only as an example. An annular rubber bumper 216 is bonded around the edge of the base 2, and four hemispherical solid-rubber protection pads 217 are set symmetrically on the bottom of the base 2 (again, eight or another number can be used, the utility model taking four only as an example). A carbon-fiber protective mesh 218 outside the rotors 215 prevents the rotors from injuring hands. Two groups of motors 219, two motors per group, are set at the bottom of the base 2 and connected to the four rotors 215, driving the rotors in flight via a motor controller.
The four rotors 215 evenly distributed around the bottom periphery of the base provide the flight companion robot's flying power. The four rotors lie in one horizontal plane and are identical in structure and radius, and the two motor groups 219 sit between the two rotors on each symmetric side of the base bottom, two motors 219 per group. By regulating the motor speeds through the motor controller, the rotor speeds and hence the lift are varied, which controls flight attitude and position. Two diagonally opposite rotors turn counter-clockwise and the other two clockwise, so that when the flight companion robot flies in balance, the gyroscopic effects and aerodynamic torques cancel. By adjusting the rotation direction and speed of the four rotors 215, the robot realizes vertical, pitching, rolling, yawing, fore-aft and lateral motion.
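The speed adjustments just described are conventionally expressed as a quadrotor "mixer" that maps a desired total thrust and three body torques onto the four rotor commands. The sign conventions below are an illustrative assumption for a standard X configuration, not taken from the patent:

```python
def mix(thrust, roll, pitch, yaw):
    """Map desired total thrust and roll/pitch/yaw commands to the four
    rotor commands of an X-configuration quadrotor.  Diagonal rotor
    pairs spin in opposite directions (0/2 counter-clockwise, 1/3
    clockwise), so a yaw command speeds up one pair and slows the
    other while leaving total thrust unchanged."""
    return (
        thrust + roll + pitch - yaw,  # front-left  (CCW)
        thrust - roll + pitch + yaw,  # front-right (CW)
        thrust - roll - pitch - yaw,  # rear-right  (CCW)
        thrust + roll - pitch + yaw,  # rear-left   (CW)
    )
```

Note that the roll, pitch and yaw terms cancel in the sum of the four commands, which is why attitude corrections do not disturb the total lift.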
One graphene-based lithium-ion battery 220 can be set on each side of the bottom infrared projector in the base 2 to supply the flight companion robot with electrical energy; graphene-based lithium-ion batteries store more energy than conventional lithium-ion batteries and charge faster. The shell of the base 2 can be made of carbon-fiber material, and the spherical cabin 1 of polycarbonate.
A Xilinx system-on-chip (SoC) that integrates an FPGA (Field-Programmable Gate Array) with an ARM processor, the Zynq-7020, can be set in the base 2 as the main control unit. The Zynq-7020 comprises a Processing System (PS) and Programmable Logic (PL): the PS is built around a dual-core ARM Cortex-A9 processor, and the PL consists of Xilinx 7-series FPGA fabric, which can be programmed in Verilog. The flight companion robot can use ROS (Robot Operating System); the system can integrate the robot's functional modules for human-computer interaction, sensor data acquisition, visual recognition, communication, photography, wireless networking, Bluetooth, interactive projection, flight control, navigation and translation, and is responsible for human-computer interaction, instruction parsing, behavior control, data upload and similar functions. For compatibility with the ROS communication interfaces, Ubuntu can be used as the operating system, running on the ARM processor at the Zynq PS end.
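ROS keeps functional modules like these decoupled through topic-based publish/subscribe. As a self-contained illustration of that pattern (a minimal in-process stand-in, not ROS itself and not the patent's code; the topic name is an assumption), the modules can exchange messages through a tiny bus:

```python
class MessageBus:
    """Minimal in-process stand-in for ROS-style topic pub/sub: each
    functional module publishes to and subscribes on named topics
    without knowing about the other modules directly."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        # Register a callback to run for every message on this topic.
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs.get(topic, []):
            cb(msg)


# E.g. the sensor-acquisition module publishes a range reading and the
# flight-control module receives it, with no direct coupling between them.
bus = MessageBus()
received = []
bus.subscribe("/ultrasonic/range", received.append)
bus.publish("/ultrasonic/range", 1.72)
```

In real ROS the same decoupling is provided by node handles, `Publisher`/`Subscriber` objects and typed messages.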
The flight companion robot provided by an embodiment of the utility model flies on the rotors 215 distributed around the bottom periphery of the base 2; guided by the signals collected by the various sensors, the processor can perform automatic obstacle avoidance, the various flight attitudes and the various instructed tasks.
The flight companion robot accompanies the user by following in flight, providing photography, communication, information, life-assistant, navigation, translation and other services. It can analyze and judge user instructions in real time using a neural-network system and act promptly on the user's requirements. Moreover, the robot frees the user's hands from over-dependence on the phone: more efficient voice interaction progressively replaces other human-computer interaction modes and raises everyday efficiency, and the simple voice interface lowers the knowledge threshold so that users of every age group and knowledge background can use it. Following the user in flight, the robot offers optimal solutions to the user's various needs at any time, becoming the best assistant and partner in the user's life.
Fig. 4 shows the flowchart of the flight companion robot control method provided by an embodiment of the utility model. Referring to Fig. 4, the method includes:
S401: the processor sends a ranging instruction to the ultrasonic module arranged on the base;
S402: the ultrasonic module receives the ranging instruction, generates an ultrasonic wave and emits it;
S403: the ultrasonic module receives the reflected wave, converts the reflected wave into a first electrical signal and sends it to the processor.
Specifically, when the flight companion robot flies autonomously, the processor can send ranging instructions to the ultrasonic sensors through an integrated circuit. The transmitter of each ultrasonic sensor converts electrical oscillation energy into mechanical vibration through a ceramic transducer and produces ultrasonic waves, and the sensors in the four directions continuously emit them into the air around the robot. The waves are reflected when they meet an obstacle; the receiver picks up the reflected wave, the ceramic transducer converts the mechanical vibration back into electrical energy, and this receiver output is fed back to the processor as the basis for attitude adjustment and obstacle avoidance.
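The ranging arithmetic implied here is the usual round-trip time-of-flight relation: the pulse travels to the obstacle and back, so the distance is half the round trip. A minimal sketch (the 343 m/s speed of sound and the four-sensor layout are illustrative assumptions, not values given in the patent):

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees C

def echo_distance_m(time_of_flight_s):
    """One-way distance to an obstacle from an ultrasonic echo: the
    pulse covers the distance twice, so halve the round trip."""
    return SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0

def nearest_obstacle_m(times_of_flight_s):
    """Closest obstacle seen by the four direction sensors."""
    return min(echo_distance_m(t) for t in times_of_flight_s)
```

The processor would compare `nearest_obstacle_m` against a safety margin when deciding whether to adjust the flight path.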
S404: the processor identifies and locates the user through the image camera module and the first infrared camera module arranged in the spherical cabin, obtaining identification and location information;
S405: according to the identification and location information, the processor controls the first pyroelectric infrared module arranged on the upper surface of the base to track the identified user;
S406: the first pyroelectric infrared module captures the infrared radiation emitted by the identified user's body, converts it into a second electrical signal and sends it to the processor.
Specifically, when the flight companion robot accompanies the user at any time, the two cameras in the spherical cabin first identify and locate the user's face and figure by binocular vision. The processor then instructs the first pyroelectric infrared sensor, according to the identification and location information, to track the identified user using the infrared radiation of the human body itself, which a Fresnel lens focuses onto the sensor's detection element. The first pyroelectric infrared sensor converts the human infrared signal into an electrical output to a signal processing module, which amplifies, filters, delays and compares the sensor's weak output before passing it to the processor, so that the processor, in cooperation with the ultrasonic sensors, realizes tracking of the user. The signal processing module can of course be a separate module or built into the processor; the utility model does not limit this. The dual cameras (image camera and infrared camera) identify and locate the face and figure, the processor locks onto and remembers the located target, and the processor then instructs the first pyroelectric infrared sensor to screen the infrared radiation of the target body, helping the flight companion robot accurately distinguish a moving person from other living and non-living things while resisting electromagnetic and optical interference. The ultrasonic sensors range the target body to ensure the robot always keeps an appropriate distance while tracking in flight. Combining the dual cameras and the first pyroelectric infrared sensor with the ultrasonic sensors for tracking and locating improves the efficiency and accuracy of user identification and enables full-time tracking and companionship with no blind spots.
S407: the inertial measurement module arranged on the base sends the collected third electrical signal to the processor;
S408: the angle measurement module arranged on the base sends the collected fourth electrical signal to the processor;
S409: the processor controls the flight companion robot to adjust its flight attitude, avoid obstacles and track the user according to the first, second, third and fourth electrical signals.
Specifically, the inertial measurement module can combine a three-axis gyroscope and a three-axis acceleration sensor to realize its function; together with the angle sensor, it helps the flight companion robot realize self-positioning and steady control of its flight attitude.
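One common, simple way to fuse these two sensors for attitude, consistent with the steady-control goal described here though not specified by the patent, is a complementary filter: integrate the gyroscope (precise short-term but drifting) and gently pull the estimate toward the accelerometer-derived angle (noisy but drift-free). The blend factor 0.98 is an illustrative assumption:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step of a complementary filter for a single attitude
    angle: alpha weights the gyro integration, (1 - alpha) the
    accelerometer angle, so gyro drift is slowly corrected away."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Run repeatedly with a stationary gyro, the estimate converges to the accelerometer angle, which is exactly the drift-correction behavior wanted for hover.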
It should be explained that steps S401 to S403, steps S404 to S406, step S407 and step S408 can be executed in any order: sequentially, simultaneously, or, for example, with steps S404 to S406 executed before steps S401 to S403, step S407 and step S408. The utility model does not limit this; any execution of the above steps falls within the protection scope of the utility model.
The data signals collected by the various sensors are converted from analog to digital by an ADC (Analog-to-Digital Converter), and the converted signals are input to the processor for analysis, processing and final judgment.
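The ADC step can be illustrated with the ideal code-to-voltage conversion for an N-bit converter. The 12-bit resolution and 3.3 V reference below are illustrative assumptions, not values given in the patent:

```python
def adc_to_volts(raw, bits=12, v_ref=3.3):
    """Convert a raw ADC code to a voltage for an ideal N-bit
    converter: the full-scale code (2^bits - 1) maps to v_ref."""
    full_scale = (1 << bits) - 1
    return raw * v_ref / full_scale
```

The processor would apply this conversion (or the sensor-specific transfer function built on it) before analyzing each channel.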
It can be seen that with the flight companion robot control method provided by the embodiments of the utility model, the robot, through an intelligent control system and the cooperation of multiple sensors, can provide users with human-computer interaction, communication, aerial photography, audio and video playback, information retrieval, life services, safety monitoring, navigation, translation, games and other services, replacing most functions of today's mobile phones. The flight companion robot provided by the utility model is small, agile, easy to carry and much cheaper, and is therefore easier to commercialize and popularize in homes.
As an optional embodiment, the flight companion robot's photography relies mainly on the high-definition camera and first infrared camera mounted side by side in the middle of the spherical cabin and on the second infrared camera at the bottom of the base, assisted by the flashes arranged on the spherical cabin and the base bottom respectively.
As an optional embodiment, the flight companion robot control method provided by an embodiment of the utility model also includes:
the infrared projection module arranged at the bottom of the base projects infrared spots onto the environment below;
the second infrared camera module and the second pyroelectric infrared module arranged at the bottom of the base collect the distribution of the infrared spots and send the distribution information to the processor;
the processor identifies depth information from the distribution information;
the processor creates a three-dimensional map of the surroundings with a visual SLAM algorithm from the first, second, third and fourth electrical signals together with the depth information, and navigates.
Specifically, the image camera module and first infrared camera module in the spherical cabin shoot video parallel to the user's viewing angle and realize binocular recognition. The processor ranges the environment with the ultrasonic sensors, and binocular vision helps the flight companion robot automatically identify and understand the information, position, height and size of different objects on the flight path, achieving autonomous obstacle avoidance in flight, and superimposes these path labels on the self-created map, so that the robot understands the environment around it semantically and the user can also issue some higher-level instructions. The small infrared camera at the bottom of the base assists the bottom infrared projector in interactive projection and in ranging and depth recognition of the environment below the robot. In operation, the infrared projector actively casts a complex infrared spot pattern onto the environment below; the small infrared camera and the second pyroelectric infrared sensor at the parallel position collect the spot distribution, and a simple binocular ranging algorithm extracts the depth information. From the depth information jointly created by the two cameras in the spherical cabin and the infrared camera and infrared projector at the base bottom, the robot obtains the three-dimensional coordinates and parameters of the environment and then uses a visual SLAM algorithm to create a three-dimensional map for navigation.
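The "simple binocular ranging algorithm" is not detailed in the patent, but the textbook relation behind any stereo ranging is depth = focal length x baseline / disparity: points farther away shift less between the two views. A sketch with assumed camera parameters (the focal length and baseline below are illustrative, not from the patent):

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from binocular disparity.  focal_px is
    the focal length in pixels, baseline_m the distance between the two
    cameras, disparity_px the horizontal shift of the point between the
    left and right images; depth is inversely proportional to it."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point at infinity or bad match)")
    return focal_px * baseline_m / disparity_px
```

Doubling the disparity halves the estimated depth, which is why nearby obstacles (large disparity) are resolved much more precisely than distant ones.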
It can be seen that the flight companion robot control method provided by an embodiment of the utility model further enables the dual cameras in the spherical cabin, combined with the infrared camera at the base bottom, to establish the three-dimensional coordinates of the environment using binocular recognition, while the base-bottom camera creates depth information by collecting the infrared spots cast by the infrared projector; a visual SLAM algorithm then builds the three-dimensional map for navigation. This ensures that the flight companion robot can create its own three-dimensional navigation map in complex environments such as homes, offices and shopping malls where satellite positioning and navigation are unavailable, realizing accurate obstacle avoidance and completing the user's instructed tasks.
Therefore, on the premise that the binocular recognition of the cabin cameras extracts facial features and, in cooperation with the first pyroelectric infrared sensor and the ultrasonic sensors, tracks and senses the user in real time, the depth-recognition technology of the base-bottom camera can also extract figure and motion features and build the three-dimensional map, providing the user with real-time three-dimensional road-condition navigation; at the same time, the recording function of the bottom camera can capture motion images of the user. Furthermore, for users with eye diseases, real-time voice guidance can be provided. Using the recording function of the second infrared camera at the base bottom, the robot can offer motion-image recording services and better solutions for shooting films or variety shows. In addition, the second flash at the bottom of the base can compensate for insufficient ambient light.
As an optional embodiment, the flight companion robot control method provided by this embodiment of the utility model further includes:
the processor sends information to be output to the infrared projection module arranged at the bottom of the base;
the infrared projection module projects the information to be output onto a preset plane by interactive projection, forming a projected picture;
the second infrared camera module captures the infrared diffuse-reflection information produced by a pointing object on the projected picture and sends the diffuse-reflection information to the processor;
the processor locates and identifies the diffuse-reflection information to control human-computer interaction.
Specifically, the infrared projector at the bottom of the base can serve as the visual output unit of the flight companion robot, displaying the robot's decision output in front of the user in real time through interactive projection. During operation, the infrared projector projects a picture onto a plane, and the miniature infrared camera and infrared projector at the bottom of the base emit infrared rays covering the entire projection screen, forming an infrared surface. When a pointing object such as a finger or a stylus touches the infrared picture, the infrared light is diffusely reflected at the tip of the pointer; this reflected infrared light is captured by the second infrared camera, and through the processor's positioning and recognition, control of the projected picture is achieved.
On this basis, the flight companion robot provided by this embodiment of the utility model uses the infrared projector to provide the user with real-time news, various chat-tool interfaces, navigation interfaces, translation interfaces, and the like. The user operates through infrared projection interaction, and the projection platform has no special requirements: any smooth surface of sufficient area supports projection operation, which is simple and easy to learn.
Further, the robot's ROS system controls integrated functional modules such as the communication module, navigation module, wireless network module, Bluetooth module, and translation module, so that the flight companion robot provided by this embodiment of the utility model opens the corresponding module according to user instructions and provides services to the user.
As an optional embodiment, the flight companion robot control method provided by this embodiment of the utility model further includes:
the processor sends information to be output to the voice playing module arranged on the base;
the voice playing module plays the information to be output;
the acoustic acquisition module arranged on the base collects the user's voice information and sends the voice information to the processor;
the processor parses the voice information and controls human-computer interaction according to the parsed information.
Specifically, the flight companion robot provided by this embodiment of the utility model has a voice control device, which may include a voice acquisition module, a pre-processing module, a model comparison module, and an output control module. The flight companion robot collects the user's voice through an acoustic sensor, and the voice acquisition module converts the speech signal into a talk-spurt sequence. The pre-processing module filters out secondary information and background noise in the raw speech signal through processes including anti-aliasing filtering, pre-emphasis, analog-to-digital conversion, and automatic gain control. After the speech signal is digitized, its acoustic parameters are analyzed to extract speech feature parameters, which are then vector-quantized. A hidden Markov model (HMM) evaluates the speech feature vector parameters; the model comparison module passes the evaluation result to the middle-layer units of a honeycomb-level neural network system, compares the received speech feature vectors with the speech models in the honeycomb neural network, and judges the function of the current voice command. The result is submitted to the honeycomb decision stage to produce a response, and the output control module performs the response and executes the action according to the voice command.
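The pre-emphasis step named in the pre-processing chain is conventionally the first-order high-pass filter y[n] = x[n] − α·x[n−1]; the sketch below illustrates it with the common coefficient 0.97, which is an assumption, not a value from the patent.

```python
# Pre-emphasis sketch: the first-order filter y[n] = x[n] - a * x[n-1]
# typically applied before extracting speech feature parameters.
# The coefficient and sample values are illustrative.

def pre_emphasis(samples, alpha=0.97):
    out = [samples[0]]                       # first sample passes through
    for n in range(1, len(samples)):
        out.append(samples[n] - alpha * samples[n - 1])
    return out

x = [1.0, 1.0, 1.0, 1.0]   # a flat (DC-like) signal...
print(pre_emphasis(x))      # ...is strongly attenuated after the first sample
```

The filter boosts high frequencies relative to low ones, which flattens the speech spectrum before the acoustic-parameter analysis described above.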
On this basis, the user can open the communication, navigation, wireless network, Bluetooth, translation, and other functions by voice command, and select audio or video reception as required. If the user selects audio reception, the flight companion robot can open the address book by voice, send text messages by voice, navigate by voice, and broadcast news and information at the user's points of interest by voice. Where the user is visually impaired, human-computer interaction is achieved through voice.
As an optional embodiment, the flight companion robot control method provided by this embodiment of the utility model further includes:
the image camera module, the first infrared camera module, and/or the second infrared camera module collect the text information to be translated and send the collected text information to the processor;
the processor translates the text information in real time and outputs the translated text information.
Specifically, the flight companion robot provided by this embodiment of the utility model collects the text to be translated through the second infrared camera at the bottom of the base, and the translation module translates it in real time. According to the user's requirements, the infrared projector can display the translated text in one of three selectable modes: projected over the original text, projected near the original text, or projected onto a blank plane. By translating and projecting in real time, the user is given a barrier-free foreign-language reading experience; of course, the translated text can also be played through the audio player. For a foreign-language interface parallel to the viewing angle of the spherical-cabin cameras, the foreign-language information can be collected by one or both of the spherical-cabin dual cameras and converted to speech or projected translated text as the user requires. With this function, the user can read texts in multiple languages in real time without interpreters or other translation tools, making communication between users of different language contexts smoother and more convenient. If the user selects video reception, the relevant modules present the information on the projection surface through interactive projection technology, and the user can operate and use it in real time.
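The capture-translate-project flow can be outlined as below. The `ocr`, `translate`, and `render` helpers are hypothetical placeholders (the patent names no concrete OCR or translation engine); only the dispatch across the three display modes reflects the text.

```python
# Control-flow sketch of the real-time translation feature. The helper
# callables are hypothetical stand-ins; only the dispatch logic matters.

def translate_and_render(image, mode, ocr, translate, render):
    """mode: 'overlay' (over the original text), 'beside' it, or 'blank' plane."""
    assert mode in ("overlay", "beside", "blank")
    text = ocr(image)                  # text captured by the bottom IR camera
    translated = translate(text)       # real-time translation module
    return render(translated, mode)    # infrared projector output

out = translate_and_render(
    image=b"...",
    mode="overlay",
    ocr=lambda img: "bonjour",
    translate=lambda t: {"bonjour": "hello"}[t],
    render=lambda t, m: f"{m}:{t}",
)
print(out)  # overlay:hello
```

Swapping `render` for an audio-player callable would give the voice-output path mentioned in the same paragraph.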
As an optional embodiment, the flight companion robot control method provided by this embodiment of the utility model further includes:
the charging bracket emits infrared rays;
the first pyroelectric infrared module, the second pyroelectric infrared module, the first infrared camera module, and the second infrared camera module respectively collect the infrared rays emitted by the charging bracket and send the collected infrared information to the processor;
the processor judges whether its own charge is below a preset value and, when it is, locates the charging bracket according to the infrared information collected by the first pyroelectric infrared module, the second pyroelectric infrared module, the first infrared camera module, and the second infrared camera module, and controls the flight companion robot to fly to the charging bracket and alight on it to be charged.
Specifically, the flight companion robot provided by this embodiment of the utility model is charged by the dedicated charging bracket supplied with it. Referring to Fig. 6, the charging bracket can be divided into three parts: a ring seat 301, supporting legs 302, and a base 303, connected to one another by rotating joints 304. When not in use or when carried, the top ring seat 301 can be folded into the supporting legs 302, which in turn fold into the base 303; after the two folds it becomes an oval disc that is easy to carry. The rear of the ring seat 301 carries a support pad 305, with a charging plug 306 at the bottom of the support pad 305; each of the two side legs 302 contains one graphene-based lithium-ion battery 307; and the lower right side of the base 303 has a charging inlet 308. When the flight companion robot enters the conventional charging scenario, the charging bracket automatically emits infrared rays, and the robot can use visual recognition technology and infrared sensors to automatically locate the charging bracket. When its own charge falls below the preset value (for example, 50%), it actively asks the user whether to charge; after an affirmative instruction, the flight companion robot automatically finds and positions the ring seat, lands, and charges. When no external power supply is available, the graphene-based lithium-ion batteries built into the charging bracket can charge the robot. The charging bracket can also serve as a playback stand for interactive projection, providing the user with a more stable projection effect; its angle is adjustable in multiple directions, and it can save the robot's own electric energy.
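The charging decision described above, ask the user once the charge falls below a preset value and dock only on an affirmative reply, can be sketched as a small decision function. The 50% threshold follows the example in the text; everything else is illustrative.

```python
# Sketch of the low-battery charging decision: below the preset threshold
# the robot asks the user, and docks only on an affirmative answer.

def charging_action(charge_pct, user_agrees, threshold_pct=50):
    if charge_pct >= threshold_pct:
        return "continue"            # enough charge, keep working
    if user_agrees:
        return "locate_and_dock"     # find the IR-emitting ring seat and land
    return "continue"                # user declined, keep working for now

print(charging_action(72, user_agrees=False))  # continue
print(charging_action(43, user_agrees=True))   # locate_and_dock
```

The "locate_and_dock" branch would hand off to the infrared-beacon localization performed by the pyroelectric and camera modules.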
As an optional embodiment, the flight companion robot control method provided by this embodiment of the utility model further includes:
the fingerprint identification module arranged on the base collects the user's fingerprint image and sends the collected fingerprint image to the processor;
the processor processes the collected fingerprint image, performs feature matching against the fingerprints in the system fingerprint library, and, once the match is verified, controls the flight companion robot to open its working system.
Specifically, the flight companion robot provided by this embodiment of the utility model has a fingerprint recognition window on the arc surface at the top of the base, through which the user's fingerprint image is collected. The collected image is pre-processed, image feature values are extracted and feature-matched against the fingerprints in the system fingerprint library, and when the match is verified, the robot's working system can be opened.
Further, collection of the user's fingerprint can be set up when the flight companion robot is first configured, and together with the auxiliary face and human-body recognition systems it constitutes a user-information identification device, ensuring the uniqueness of the robot's identification of the user; the robot's future wake-up, start-up, and payment systems can all make use of this device.
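The feature-matching step can be sketched as comparing an extracted feature vector against the enrolled templates in the system fingerprint library. The toy similarity measure and threshold below are illustrative assumptions, not the patent's algorithm.

```python
# Sketch of fingerprint feature matching: compare an extracted feature
# vector with each enrolled template and unlock if any match clears a
# tolerance. Vectors and tolerance are illustrative.

def matches(probe, template, tol=0.1):
    """Mean absolute difference as a toy similarity measure."""
    diff = sum(abs(p - t) for p, t in zip(probe, template)) / len(probe)
    return diff <= tol

def verify(probe, library):
    return any(matches(probe, t) for t in library)

library = [[0.2, 0.8, 0.5], [0.9, 0.1, 0.4]]   # enrolled users
print(verify([0.21, 0.79, 0.52], library))      # True  -> open the system
print(verify([0.5, 0.5, 0.5], library))         # False -> stay locked
```

A production matcher would use minutiae or learned embeddings, but the gate-on-verified-match structure is the same.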
As can be seen, the overall structure of the flight companion robot provided by this embodiment of the utility model resembles a traditional flying saucer: the spherical cabin on top resembles the saucer's cockpit, the saucer-shaped base at the bottom resembles its hull, and flight is achieved by four rotors. Through its intelligent control system and the cooperation of multiple sensors, the flight companion robot can provide the user with services such as human-computer interaction, communication, aerial photography, audio and video playback, information retrieval, life services, safety companionship, navigation, translation, and games, and can replace most functions of today's mobile phone. Through the use of this flight companion robot, people can gradually become accustomed to a human-computer interaction mode built more on voice interaction, which can to some extent change people's living habits and lifestyles, bring more convenience and enjoyment to human life, and make the intelligent future more colorful.
The flight companion robot provided by the utility model can completely replace all functions of existing mobile phones while innovating and developing those functions. Its extensive use of voice interaction can gradually free the user from the state of "eyes glued to the phone", letting people enjoy the convenience of modern technology while protecting their physical health.
In addition, infrared interactive projection technology provides an interaction mode beyond voice: interactive projection can offer the user experiences ranging from projecting onto a desktop to browse web pages and play games, to projecting onto a wall to watch movies, videos, and lectures.
In addition, the flight companion robot can provide camera + GPS visual navigation, sensor + GPS non-visual navigation, and navigation based on self-created maps.
In addition, the flight companion robot provides the user with real-time intertranslation of multiple languages, so that the user is no longer troubled by language barriers in business negotiations or travel.
In addition, the fast-charging and high-capacity storage characteristics of graphene-based lithium-ion batteries guarantee long endurance for the flight companion robot; together with the automatic charging function and the charging bracket, they provide the robot with a continuous power supply and ensure long-term use for the user.
In addition, the flight companion robot provides the user with illumination when walking at night, and when the user confirms an assault, the live view can be transmitted in real time to a PC or a public-security networked system, escorting the user's safety.
Fig. 5 shows a schematic structural diagram of the flight companion robot control system provided by this embodiment of the utility model. The control system is arranged in the flight companion robot shown in Figs. 1 to 3 and uses the above method; accordingly, the structure of the control system is only briefly described below, and for other matters reference is made to the above description of the flight companion robot's structure and to the related description of the control method. Referring to Fig. 5, the flight companion robot control system provided by this embodiment includes:
a processor 501, which sends a ranging instruction to an ultrasonic module 502 arranged on the base;
the ultrasonic module 502, which receives the ranging instruction, generates an outgoing ultrasonic wave and emits it; and which receives the reflected ultrasonic wave, converts it into a first electric signal, and sends it to the processor 501;
the processor 501, which, based on information acquired by an image camera module 503 and a first infrared camera module 504 arranged in the spherical cabin, controls a first pyroelectric infrared module 505 arranged on the upper surface of the base to track the identified user; specifically, the processor 501 can identify and locate the user through the image camera module 503 and the first infrared camera module 504 arranged in the spherical cabin to obtain identification and location information, and according to that information control the first pyroelectric infrared module 505 on the upper surface of the base to track the identified user;
the first pyroelectric infrared module 505, which acquires the human-body infrared signal produced by the identified user, converts it into a second electric signal, and sends it to the processor 501;
an inertial measurement module 506 arranged on the base, which sends a collected third electric signal to the processor 501;
an angle measurement module 507 arranged on the base, which sends a collected fourth electric signal to the processor 501;
the processor 501, which controls the flight companion robot to adjust its flight attitude, avoid obstacles, and track the user according to the first electric signal, the second electric signal, the third electric signal, and the fourth electric signal.
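The processor's fusion of the four signals can be sketched as a priority loop: the IMU and angle readings (third and fourth signals) gate attitude stabilization, the ultrasonic range (first signal) gates obstacle avoidance, and the pyroelectric bearing (second signal) drives user tracking. The thresholds and signal forms are hypothetical simplifications, not values from the patent.

```python
# Sketch of the control decision taken from the four signals. Thresholds
# and units are illustrative; a real controller would run continuously
# and feed an attitude-stabilization loop rather than return strings.

def control_step(ultrasonic_m, user_bearing_deg, imu_ok, tilt_deg):
    if not imu_ok or abs(tilt_deg) > 20:
        return "stabilize_attitude"      # 3rd/4th signals: level the craft first
    if ultrasonic_m < 0.5:
        return "avoid_obstacle"          # 1st signal: obstacle too close
    if abs(user_bearing_deg) > 5:
        return "turn_toward_user"        # 2nd signal: re-center on the user
    return "hold_and_follow"

print(control_step(2.0, 1.0, True, 3.0))   # hold_and_follow
print(control_step(0.3, 1.0, True, 3.0))   # avoid_obstacle
```

The ordering encodes the safety priority implied by the text: attitude first, obstacles second, tracking last.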
As can be seen, with the flight companion robot control system provided by this embodiment of the utility model, the robot can, through its intelligent control system and the cooperation of multiple sensors, provide the user with services such as human-computer interaction, communication, aerial photography, audio and video playback, information retrieval, life services, safety companionship, navigation, translation, and games, and can replace most functions of today's mobile phone. The flight companion robot provided by the utility model is small, agile, easy to carry, and much cheaper to produce, and is therefore easier to commercialize and popularize in households.
As an optional embodiment, the flight companion robot control system provided by this embodiment of the utility model further includes:
an infrared projection module 508 arranged at the bottom of the base, which projects infrared light spots onto the environment below;
a second infrared camera module 509 arranged at the bottom of the base and a second pyroelectric infrared module 510 arranged at the bottom of the base, which respectively collect the distribution information of the infrared light spots and send the distribution information to the processor 501;
the processor 501, which identifies depth information from the distribution information, creates a three-dimensional map of the surroundings according to the first electric signal, the second electric signal, the third electric signal, the fourth electric signal, and the depth information, and navigates; specifically, a visual SLAM algorithm can be used to create the three-dimensional map.
As can be seen, in the flight companion robot control system provided by this embodiment of the utility model, the dual cameras in the spherical cabin, combined with the infrared camera at the bottom of the base, can further establish the three-dimensional coordinates of the environment using binocular recognition technology. The base-bottom camera creates depth information by collecting the infrared light spots projected by the infrared projector, and a visual SLAM algorithm creates a three-dimensional map for navigation. This ensures that the robot can create its own three-dimensional navigation map in complex environments such as homes, offices, and shopping malls where satellite positioning and navigation are unavailable, thereby achieving accurate obstacle avoidance and completing user-instructed tasks.
As an optional embodiment, the flight companion robot control system provided by this embodiment of the utility model further includes:
the processor 501, which sends information to be output to the infrared projection module 508 arranged at the bottom of the base;
the infrared projection module 508, which projects the information to be output onto a preset plane by interactive projection, forming a projected picture;
the second infrared camera module 509, which captures the infrared diffuse-reflection information produced by a pointing object on the projected picture and sends the diffuse-reflection information to the processor 501;
the processor 501, which locates and identifies the diffuse-reflection information to control human-computer interaction.
As can be seen, the flight companion robot provided by this embodiment of the utility model uses the infrared projector to provide the user with real-time news, various chat-tool interfaces, navigation interfaces, translation interfaces, and the like. The user operates through infrared projection interaction, and the projection platform has no special requirements: any smooth surface of sufficient area supports projection operation, which is simple and easy to learn.
As an optional embodiment, the flight companion robot control system provided by this embodiment of the utility model further includes:
the processor 501, which sends information to be output to a voice playing module 511 arranged on the base;
the voice playing module 511, which plays the information to be output;
an acoustic acquisition module 512 arranged on the base, which collects the user's voice information and sends the voice information to the processor 501;
the processor 501, which controls human-computer interaction according to the voice information; specifically, the processor 501 can parse the voice information and control human-computer interaction according to the parsed information.
As can be seen, the user can open the communication, navigation, wireless network, Bluetooth, translation, and other functions by voice command, and select audio or video reception as required. If the user selects audio reception, the flight companion robot can open the address book by voice, send text messages by voice, navigate by voice, and broadcast news and information at the user's points of interest by voice. Where the user is visually impaired, human-computer interaction is achieved through voice.
As an optional embodiment, in the flight companion robot control system provided by this embodiment of the utility model:
the image camera module 503, the first infrared camera module 504, and/or the second infrared camera module 509 collect the text information to be translated and send the collected text information to the processor 501;
the processor 501 translates the text information in real time and outputs the translated text information.
With this function, the user can read texts in multiple languages in real time without interpreters or other translation tools, making communication between users of different language contexts smoother and more convenient. If the user selects video reception, the relevant modules present the information on the projection surface through interactive projection technology, and the user can operate and use it in real time.
As an optional embodiment, the flight companion robot control system provided by this embodiment of the utility model further includes:
a charging bracket 513, which emits infrared rays;
the first pyroelectric infrared module 505, the second pyroelectric infrared module 510, the first infrared camera module 504, and the second infrared camera module 509, which respectively collect the infrared rays emitted by the charging bracket 513 and send the collected infrared information to the processor 501;
the processor 501, which locates the charging bracket 513 according to the infrared information collected by the first pyroelectric infrared module 505, the second pyroelectric infrared module 510, the first infrared camera module 504, and the second infrared camera module 509, and controls the flight companion robot to fly to the charging bracket 513 and alight on it to be charged. Specifically, the processor 501 can judge whether its own charge is below a preset value and, when it is, locate the charging bracket 513 according to the infrared information collected by the first pyroelectric infrared module 505, the second pyroelectric infrared module 510, the first infrared camera module 504, and the second infrared camera module 509, and control the flight companion robot to fly to the charging bracket 513 and alight on it to be charged.
When the flight companion robot enters the conventional charging scenario, the charging bracket automatically emits infrared rays, and the robot can use visual recognition technology and infrared sensors to automatically locate the charging bracket. When its own charge falls below the preset value (for example, 50%), it actively asks the user whether to charge; after an affirmative instruction, the flight companion robot automatically finds and positions the ring seat, lands, and charges. When no external power supply is available, the graphene-based lithium-ion batteries built into the charging bracket can charge the robot. The charging bracket can also serve as a playback stand for interactive projection, providing the user with a more stable projection effect; its angle is adjustable in multiple directions, and it can save the robot's own electric energy.
As an optional embodiment, the flight companion robot control system provided by this embodiment of the utility model further includes:
a fingerprint identification module 514 arranged on the base, which collects the user's fingerprint image and sends the collected fingerprint image to the processor 501;
the processor 501, which controls the flight companion robot to open its working system according to the fingerprint image. Specifically, the processor 501 can process the collected fingerprint image, perform feature matching against the fingerprints in the system fingerprint library, and, once the match is verified, control the flight companion robot to open its working system.
As can be seen, collection of the user's fingerprint can be set up when the flight companion robot is first configured, and together with the auxiliary face and human-body recognition systems it constitutes a user-information identification device, ensuring the uniqueness of the robot's identification of the user; the robot's future wake-up, start-up, and payment systems can all make use of this device.
Any process or method description in a flowchart, or otherwise described herein, may be understood to represent a module, segment, or portion of code comprising one or more executable instructions for implementing the steps of a specific logical function or process, and the scope of the preferred embodiments of the utility model includes other implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the utility model belong.
It should be understood that each part of the utility model may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented with any one or a combination of the following technologies known in the art: a discrete logic circuit with logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those skilled in the art will understand that all or part of the steps carried by the method of the above embodiments can be completed by instructing the relevant hardware through a program, which may be stored in a computer-readable storage medium and which, when executed, includes one of or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the utility model may be integrated into one processing module, or each unit may exist physically on its own, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software function module. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the utility model. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although the embodiments of the utility model have been shown and described above, it should be understood that the above embodiments are exemplary and cannot be construed as limitations on the utility model; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the utility model without departing from its principles and purpose. The scope of the utility model is defined by the appended claims and their equivalents.
Claims (8)
1. A flight companion robot control system, characterized by comprising:
a processor, which sends a ranging instruction to an ultrasonic module arranged on a base;
the ultrasonic module, which receives the ranging instruction, generates an outgoing ultrasonic wave and emits it, receives the reflected ultrasonic wave, converts the reflected wave into a first electrical signal, and sends it to the processor;
the processor, which uses information obtained by an image capture module and a first infrared camera module arranged in a spherical housing to control a first pyroelectric infrared module arranged on the upper surface of the base to track the identified user;
the first pyroelectric infrared module, which acquires the human-body infrared signal produced by the identified user, converts that signal into a second electrical signal, and sends it to the processor;
an inertial measurement module arranged on the base, which sends a collected third electrical signal to the processor;
an angle measurement module arranged on the base, which sends a collected fourth electrical signal to the processor;
the processor, which controls the flight companion robot to adjust its flight attitude, avoid obstacles, and track the user according to the first, second, third, and fourth electrical signals.
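Claim 1 specifies only which signals reach the processor, not how they are processed. Purely as a minimal illustrative sketch (the function names, the speed-of-sound constant, and the complementary-filter choice are assumptions, not part of the patent), the ranging and attitude-fusion steps could look like:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C (assumed)

def ultrasonic_distance(echo_delay_s: float) -> float:
    # The reflected pulse travels to the obstacle and back, so the
    # one-way distance is half the total acoustic path.
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

def fuse_attitude(angle_prev: float, gyro_rate: float,
                  reference_angle: float, dt: float,
                  alpha: float = 0.98) -> float:
    # Complementary filter: blend the integrated gyro rate (akin to the
    # third electrical signal) with an absolute angle reading (akin to
    # the fourth); alpha weights the fast gyro path against the slower
    # absolute reference.
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * reference_angle
```

The utility model leaves the fusion algorithm open; a complementary filter is merely one common way to combine a gyro rate with an absolute angle reference.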
2. The system according to claim 1, characterized by further comprising:
an infrared projection module arranged on the bottom of the base, which projects infrared spots onto the environment below;
a second infrared camera module arranged on the bottom of the base and a second pyroelectric infrared module arranged on the bottom of the base, which respectively collect distribution information of the infrared spots and send the distribution information to the processor;
the processor, which identifies depth information from the distribution information, creates a three-dimensional map of the surroundings according to the first electrical signal, the second electrical signal, the third electrical signal, the fourth electrical signal, and the depth information, and navigates accordingly.
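Claim 2 describes structured-light depth sensing: projected infrared spots shift in the camera image in proportion to depth. A hedged sketch of the two core steps, triangulating depth from a spot's pixel shift and accumulating obstacle points into a map (the baseline, focal length, and the sparse 2-D grid standing in for the claimed three-dimensional map are all illustrative assumptions):

```python
def spot_depth(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    # Structured-light triangulation: an IR spot projected from an
    # emitter offset by baseline_m from the camera appears shifted by
    # disparity_px pixels; depth is inversely proportional to the shift.
    return baseline_m * focal_px / disparity_px

def mark_obstacle(grid: dict, x_m: float, y_m: float, cell_m: float = 0.1) -> dict:
    # Accumulate obstacle observations in a sparse occupancy grid, the
    # simplest stand-in for the map the claimed processor would build.
    key = (int(x_m // cell_m), int(y_m // cell_m))
    grid[key] = grid.get(key, 0) + 1
    return grid
```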
3. The system according to claim 1, characterized by further comprising:
the processor, which sends information to be output to an infrared projection module arranged on the bottom of the base;
the infrared projection module, which projects the information to be output onto a preset plane by interactive projection, forming a projected picture;
a second infrared camera module, which captures the diffuse infrared reflection information produced by a pointing object on the projected picture and sends the diffuse reflection information to the processor;
the processor, which locates and identifies the diffuse reflection information to control human-computer interaction.
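Claim 3 turns the projected picture into a touch surface: a pointing object scatters the projector's infrared light, and the second infrared camera sees a bright blob at the touch location. A simple centroid-based localization sketch (thresholding is an assumption; the patent does not name a detection method):

```python
def locate_touch(frame, threshold=200):
    # frame: 2-D list of infrared intensities from the second infrared
    # camera module. Return the centroid (x, y) of all pixels at or
    # above the threshold, or None when nothing is bright enough.
    points = [(x, y) for y, row in enumerate(frame)
              for x, value in enumerate(row) if value >= threshold]
    if not points:
        return None
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)
```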
4. The system according to claim 1 or 3, characterized by further comprising:
the processor, which sends information to be output to a voice playback module arranged on the base;
the voice playback module, which plays the information to be output;
an acoustic acquisition module arranged on the base, which collects the user's voice information and sends the voice information to the processor;
the processor, which controls human-computer interaction according to the voice information.
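Claim 4 leaves the interpretation of captured speech unspecified. Assuming an upstream recognizer has already produced text, a minimal command-dispatch sketch might look like this (the command vocabulary is invented for illustration and appears nowhere in the patent):

```python
# Illustrative phrase-to-action table; a real system would use a
# speech-recognition engine and a richer grammar.
COMMANDS = {
    "take off": "TAKEOFF",
    "land": "LAND",
    "follow me": "FOLLOW",
}

def interpret(utterance: str) -> str:
    # Match the recognized text against known command phrases.
    text = utterance.strip().lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return "UNKNOWN"
```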
5. The system according to claim 1 or 3, characterized in that:
the image capture module, the first infrared camera module, and/or the second infrared camera module collect text information to be translated and send the collected text information to the processor;
the processor translates the text information in real time and outputs the translated text.
6. The system according to claim 4, characterized in that:
the image capture module, the first infrared camera module, and/or the second infrared camera module collect text information to be translated and send the collected text information to the processor;
the processor translates the text information in real time and outputs the translated text.
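Claims 5 and 6 recite real-time translation of captured text but name no OCR engine or translation model. As a deliberately simplified stand-in for the translation step only, a word-for-word lexicon substitution (the lexicon contents are illustrative):

```python
def translate_tokens(tokens, lexicon):
    # Word-for-word substitution; unknown words pass through unchanged.
    # A production pipeline would run OCR on the camera frames and feed
    # the result to a machine-translation model instead.
    return [lexicon.get(token, token) for token in tokens]
```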
7. The system according to claim 1, characterized by further comprising:
a charging bracket, which emits infrared rays;
the first pyroelectric infrared module, the second pyroelectric infrared module, the first infrared camera module, and the second infrared camera module, which respectively collect the infrared rays emitted by the charging bracket and send the collected infrared information to the processor;
the processor, which locates the charging bracket according to the infrared information collected by the first pyroelectric infrared module, the second pyroelectric infrared module, the first infrared camera module, and the second infrared camera module, and controls the flight companion robot to fly to the charging bracket and dock on it for charging.
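Claim 7 locates the charging bracket from the infrared intensities seen by the robot's several IR sensors. A two-receiver steering sketch (the normalized-difference rule is an assumption; the patent only requires that the collected infrared information locate the bracket):

```python
def beacon_steering(left_ir: float, right_ir: float) -> float:
    # Normalized intensity difference between two IR receivers:
    # 0.0 means the charging bracket's beacon is dead ahead,
    # positive means turn right, negative means turn left.
    total = left_ir + right_ir
    if total == 0:
        return 0.0  # beacon not visible; hold heading
    return (right_ir - left_ir) / total
```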
8. The system according to claim 1, characterized by further comprising:
a fingerprint identification module arranged on the base, which captures the user's fingerprint image and sends the collected fingerprint image to the processor;
the processor, which switches on the flight companion robot control system according to the fingerprint image.
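Claim 8 gates system start-up on a fingerprint match. Real matchers compare minutiae features with a tolerance; the exact-digest comparison below is only a placeholder for that matching step, and every name in it is illustrative:

```python
import hashlib

def unlock(enrolled_digest: str, captured_sample: bytes) -> bool:
    # Compare the captured sample against the enrolled template.
    # An exact SHA-256 match stands in for real fingerprint matching,
    # which tolerates variation between captures.
    return hashlib.sha256(captured_sample).hexdigest() == enrolled_digest
```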
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720767936.6U CN207008413U (en) | 2017-06-28 | 2017-06-28 | Robot control system is accompanied in one kind flight |
Publications (1)
Publication Number | Publication Date |
---|---|
CN207008413U (en) | 2018-02-13
Family
ID=61447951
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201720767936.6U Active CN207008413U (en) | 2017-06-28 | 2017-06-28 | Robot control system is accompanied in one kind flight |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN207008413U (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108930890A (en) * | 2018-07-20 | 2018-12-04 | Liaoning University of Technology | Movable external display screen bracket for a driving assistance system experiment platform and control method thereof |
CN108930890B (en) * | 2018-07-20 | 2023-08-01 | Liaoning University of Technology | Movable external display screen bracket for driving assistance system experiment platform and control method thereof |
CN110647989A (en) * | 2019-09-16 | 2020-01-03 | Changchun Normal University | Graphene defect modification prediction method based on neural network |
CN110647989B (en) * | 2019-09-16 | 2023-04-18 | Changchun Normal University | Graphene defect modification prediction method based on neural network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107168371A (en) | | A flight companion robot control method and system |
US11914370B2 (en) | | System and method for providing easy-to-use release and auto-positioning for drone applications |
US11649052B2 (en) | | System and method for providing autonomous photography and videography |
US10218885B2 (en) | | Throwable cameras and network for operating the same |
US9237317B2 (en) | | Throwable camera and network for operating the same |
JP6077016B2 (en) | | Board assembly used with toy pieces |
CN110119154A (en) | | Aircraft control method, apparatus and device, and aircraft |
EP2744580B1 (en) | | Baseplate assembly for use with toy pieces |
CN205263655U (en) | | System, unmanned aerial vehicle and ground station for automatically generating a panoramic photograph |
JP2021520978A (en) | | Method, apparatus, and computer program for controlling interaction between a virtual object and a thrown object |
CN105300181B (en) | | Photoelectric aiming device capable of giving an advance cue for accurate shooting, and adjustment method thereof |
CN108521812A (en) | | Unmanned aerial vehicle control method, unmanned aerial vehicle, and machine-readable storage medium |
CN105013137B (en) | | Intelligent treadmill system and method of using the same |
US20130286004A1 (en) | | Displaying a collision between real and virtual objects |
CN104932698A (en) | | Handheld interactive device and projection interaction method thereof |
CN106020227A (en) | | Unmanned aerial vehicle control method and apparatus |
CN110825121B (en) | | Control device and unmanned aerial vehicle control method |
CN108351574A (en) | | System, method, and apparatus for setting camera parameters |
CN205652330U (en) | | Flying robot with a projector |
CN207008413U (en) | | A flight companion robot control system |
CN110448912A (en) | | Terminal control method and terminal device |
CN109044753A (en) | | Somatosensory human-computer interaction robot for guiding the blind, and working method thereof |
CN113334397B (en) | | Emotion recognition entity robot device |
CN207282003U (en) | | Intelligent control bracelet for an unmanned aerial vehicle, and UAV system |
CN107219861A (en) | | A flight companion robot control method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||