CN102141797B - Airport terminal service robot and control method thereof - Google Patents

Airport terminal service robot and control method thereof

Info

Publication number
CN102141797B
CN102141797B
Authority
CN
China
Prior art keywords
stage
subsidiary engine
main frame
master control
links
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201010587267A
Other languages
Chinese (zh)
Other versions
CN102141797A (en)
Inventor
高庆吉
董慧芬
牛国臣
王续乔
徐萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Civil Aviation University of China
Original Assignee
Civil Aviation University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Civil Aviation University of China filed Critical Civil Aviation University of China
Priority to CN201010587267A
Publication of CN102141797A
Application granted
Publication of CN102141797B
Expired - Fee Related
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64FGROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00Ground or aircraft-carrier-deck installations
    • B64F1/36Other airport installations
    • B64F1/366Check-in counters

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an airport terminal service robot and a control method thereof. The robot comprises a body, a control system, an ID card reader and a boarding card printer. The body comprises a chassis, a shell, two driving wheels and two driven wheels; the shell is mounted on the surface of the chassis; the two driving wheels and the two driven wheels are mounted on the two sides of the rear part and the front part of the chassis respectively, and a wheel cover is arranged above each wheel; the control system, the ID card reader and the boarding card printer are all installed in an equipment compartment of the shell. The airport terminal service robot can be operated by manual remote control or autonomously, and in a centralized service mode can provide mobile self-service check-in, fixed-route guidance, baggage carrying, information query, multilingual voice communication, mobile monitoring and other functions, and can also interact with passengers for entertainment. It therefore further simplifies check-in procedures, makes travel more convenient for civil aviation passengers, helps improve service efficiency and raises the overall technological level of an airport terminal.

Description

Airport terminal service robot and control method thereof
Technical field
The invention belongs to the field of civil aviation technology, and in particular relates to an airport terminal service robot and a control method thereof.
Background technology
With the rapid development of civil aviation, airport terminals around the country fall far short of demand in terms of function and facilities, and are therefore being renovated one after another. Because each terminal has its own style, and the scale, shape and division of functional areas also differ from one terminal to another, passengers who have just entered a terminal are in most cases unfamiliar with its layout, the location of service counters and so on, and often do not know where and how to handle the relevant formalities. At present most airports therefore still provide guidance and the various business services manually at fixed points. This not only lengthens passengers' waiting time and makes check-in cumbersome, it also increases the workload of airport staff and requires more personnel, so the airport's operational efficiency is low and its operating cost is high.
To address this problem, Yaskawa Electric Co., Ltd. of Japan developed an airport service robot named RoboPorter, which has been tested at Kitakyushu Airport. The robot is a cart-type wheeled mobile robot; it can carry luggage of up to 20 kg to any destination programmed for the passenger, and its maximum speed is 2.16 km/h. In use, the passenger can pick up the telephone handset on the robot to talk to it and query some information about the airport's surroundings. However, the literature does not mention that this robot offers mobile self-service check-in, mobile monitoring or multilingual communication.
In addition, Korea Airports Corporation has disclosed a robot called "Ubiquitous", which has been used at domestic airports such as Gimpo and Incheon. This robot can provide passengers with various airport information services, including flight times, an introduction to means of transport and the location of convenience facilities, as well as lifestyle services such as weather, news, biorhythm and body-fat measurement. The robot can move on its own, greet passengers with sounds and simple gestures, and strike different poses to accompany passengers in souvenir photographs. Although it offers relatively rich information services, its shortcoming is that it cannot help passengers carry luggage, guide passengers, or provide practical services such as mobile self-service check-in, and its services are not well targeted at the particular environment of an airport.
In short, these similar robots all suffer from defects such as a single function, dispersed services, inability to simplify check-in procedures, and weak targeting of and adaptability to the airport environment.
Summary of the invention
To solve the above problems, an object of the present invention is to provide an airport terminal service robot that integrates several functions, such as passenger guidance, baggage carrying, information consultation, mobile self-service check-in, multilingual voice communication and mobile monitoring.
Another object of the present invention is to provide a control method for the above airport terminal service robot.
To achieve the above objects, the airport terminal service robot provided by the invention comprises a body, a control system, an ID card reader and a boarding card printer. The body comprises a chassis, a shell, two driving wheels and two driven wheels. The shell is mounted on the surface of the chassis and is divided into a front part and a rear part; the top surface of the front part forms a luggage placement platform; the interior of the rear part is an equipment compartment; the middle of the rear face is provided with an ID card reader insertion slot and a boarding card printer outlet; and a battery door is provided on the lower part of one side of the body. The two driving wheels and the two driven wheels are mounted on the two sides of the rear part and of the front part of the chassis respectively, and a wheel cover is fitted above each wheel. The control system, the ID card reader and the boarding card printer are installed in the equipment compartment of the shell.
The control system comprises a master control system, a human-machine interaction system, an environment perception and monitoring system, a control and drive system, a communication system and a power supply system, wherein:
The master control system consists of a host computer and an auxiliary computer that communicate with each other over a wired network. The host computer comprises a host main control board, a control and drive module, an audio output module and a PTZ (Pan Tilt Zoom) drive module; the host main control board is connected to the control and drive module, the audio output module and the PTZ drive module through a PC104+ bus. The auxiliary computer comprises an auxiliary main control board, an A/D and IO module, and an image acquisition module; the auxiliary main control board is connected to the A/D and IO module and the image acquisition module through the PC104+ bus.
The human-machine interaction system comprises two parts: the first consists of a loudspeaker and a microphone, and the second consists of a touch screen and an LCD. The loudspeaker is connected to the host main control board through the audio output module. The microphone is connected to the auxiliary main control board through an audio interface (ICH4); it is installed inside the upper rear part of the shell and is fastened to the shell. The touch screen is installed in the upper part of the rear face of the shell and is connected to the auxiliary main control board through a USB (Universal Serial Bus) interface. The LCD is installed behind the touch screen, close against it, and is connected to the auxiliary main control board through the interface on the graphics card that outputs an analog signal (VGA).
The control and drive system consists of a 1st drive motor, a 2nd drive motor, a PTZ control device and a power continuation device. The output shafts of the 1st and 2nd drive motors are each connected to one driving wheel through a reduction gearbox, and are connected through encoders to the control and drive module on the host computer. The PTZ control device is connected to the PTZ drive module on the host computer. The power continuation device is connected to the host main control board.
The environment perception and monitoring system consists of an internal perception system and an external perception system. The internal perception system comprises a 1st and a 2nd photoelectric encoder and a 3D digital compass; the 1st and 2nd photoelectric encoders are integrated on the tail shafts of the 1st and 2nd drive motors respectively and are connected to the host main control board through the control and drive module, and the 3D digital compass is connected to the auxiliary main control board through an RS232 serial port. The external perception system comprises a PTZ camera, two rear visible-light illumination lamps, two bottom visible-light illumination lamps, a 1st laser ranging device and a 2nd laser ranging device. The PTZ camera is installed on the top of the rear face of the shell and is connected to the auxiliary main control board through the image acquisition module. The two rear illumination lamps are arranged on the middle of the rear face of the shell, the two bottom illumination lamps are installed on the wheel covers of the two driving wheels respectively, and all of them are connected to the auxiliary main control board through the A/D and IO module. The 1st laser ranging device is arranged on the lower part of the rear face of the shell, the 2nd laser ranging device is installed at the front end of the luggage placement platform, and each is connected to the auxiliary main control board through an RS232 serial port.
The communication system consists of a network switch and a wireless access point. The switch is connected to both the host computer and the auxiliary computer through the wired network, and is also connected to the wireless access point by a network cable.
The power supply system consists of a battery, a charger and a power board, and supplies power to the other systems in the control system.
The ID card reader is connected to the auxiliary main control board through a USB (Universal Serial Bus) interface; the boarding card printer is connected to the auxiliary main control board through an RS232 serial port.
The environment perception and monitoring system further comprises four anti-collision devices installed at the outer edges of the wheel covers of the two driving wheels and the two driven wheels respectively; the anti-collision devices are connected to the auxiliary main control board through the A/D and IO module.
The environment perception and monitoring system further comprises a smoke and fire detection device, which mainly consists of smoke and temperature detection sensors and their associated circuits and is connected to the auxiliary main control board through the A/D and IO module.
The driving wheels are pneumatic wheels, and the driven wheels are universal (caster) wheels.
The host control method of the airport terminal service robot provided by the invention comprises the following steps, carried out in order:
1) Stage S1, device initialization and self-test: this is the first stage after power-on; in this stage the host computer performs power-up initialization and a self-test.
2) Stage S2, judging whether every part of the host computer is operating normally: in this stage the host computer judges whether the self-test result of stage S1 is normal; if the result is "yes", it enters stage S3, otherwise it enters stage S13 to carry out the corresponding exception handling and then exits the main control loop.
3) Stage S3, judging whether a local control command exists: this stage judges whether a local control command currently exists; if the result is "yes", it enters stage S4, otherwise it enters stage S5 directly.
4) Stage S4, executing the corresponding local control command: in this stage the host computer executes the currently pending local control command.
5) Stage S5, judging whether there is a network message: this stage judges whether a network message currently exists; if the result is "yes", it enters stage S6, otherwise it enters stage S11 directly.
6) Stage S6, obtaining a decision behavior from the sensor information: in this stage the host computer derives the current decision behavior from the sensor information sent by the auxiliary computer.
7) Stage S7, displaying the network message and the decision result on the interface: in this stage the host computer sends the relevant display commands to the auxiliary computer and displays the above network message and decision result on the LCD.
8) Stage S8, fusing the behavior at the current moment to obtain the final behavior command: in this stage the host computer determines the operation command for the concrete action to be executed, based on the current behavior decision and the state information of the previous moment.
9) Stage S9, judging whether the decided command is allowed to be executed: this stage judges whether the operation command determined in stage S8 satisfies the conditions for execution; if the result is "yes", it enters stage S10, otherwise it returns to the entry of stage S6.
10) Stage S10, coordinating each part to execute the command: in this stage the operation command determined in stage S8 is executed.
11) Stage S11, judging whether to exit: in this stage the host computer judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S3 to continue the loop.
12) Stage S12, sending an exit message to the auxiliary computer: in this stage the host computer sends an exit message to the auxiliary computer, i.e. it notifies the auxiliary computer to execute the exit command, and the host computer then exits the main control loop to stop operation.
The auxiliary-computer control method of the airport terminal service robot provided by the invention comprises the following steps, carried out in order:
1) Stage S21, device initialization and self-test: this is the first stage after power-on; in this stage the auxiliary computer performs power-up initialization and a self-test.
2) Stage S22, judging whether every part of the auxiliary computer is operating normally: in this stage the auxiliary computer judges whether the self-test result of stage S21 is normal; if the result is "yes", it enters stage S23, otherwise it enters stage S31 to carry out the corresponding exception handling and then exits the main control loop.
3) Stage S23, acquiring the information of each sensor: in this stage the auxiliary computer reads the information of each sensor in turn.
4) Stage S24, preprocessing the sensor information: in this stage the auxiliary computer applies the relevant filtering and fault-tolerance techniques to preprocess the sensor information.
5) Stage S25, displaying the preprocessed sensor information: in this stage the auxiliary computer displays the sensor data obtained in stage S24 on the LCD.
6) Stage S26, judging whether the network is connected: this stage judges whether the current network connection is normal; if the result is "yes", it enters stage S27, otherwise it returns to the entry of stage S23.
7) Stage S27, packing the sensor information and sending it to the host computer: in this stage the auxiliary computer first packs the sensor data obtained in stage S24 and then sends it to the host computer through the switch.
8) Stage S28, judging whether a host control command has been received: in this stage the auxiliary computer judges whether a control command from the host computer has been received; if the result is "yes", it enters stage S29, otherwise it enters stage S30 directly.
9) Stage S29, executing the host control command: in this stage the auxiliary computer executes the control command received from the host computer in stage S28.
10) Stage S30, judging whether to exit: in this stage the auxiliary computer judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S23 to continue the loop.
The airport terminal service robot provided by the invention can be operated by manual remote control or autonomously. In a centralized service mode it can provide mobile self-service check-in for passengers and, according to a passenger's specific request, can autonomously carry out route guidance, baggage carrying, information query, multilingual voice communication, mobile monitoring and other functions, and can also interact with passengers for entertainment. It can therefore further simplify check-in procedures, make civil aviation travel more convenient, help improve service efficiency, and raise the overall technological level of the terminal.
Description of drawings
Fig. 1 is a perspective view of the external structure of the airport terminal service robot provided by the invention.
Fig. 2 is a block diagram of the control system of the airport terminal service robot provided by the invention.
Fig. 3 is an electrical structure diagram of the control system of the airport terminal service robot provided by the invention.
Fig. 4 is a flow chart of the host control method of the airport terminal service robot provided by the invention.
Fig. 5 is a flow chart of the auxiliary-computer control method of the airport terminal service robot provided by the invention.
Embodiment
The airport terminal service robot provided by the invention and its control method are described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in Figs. 1 to 3, the airport terminal service robot provided by the invention comprises a body, a control system, an ID card reader 3 and a boarding card printer 4. The body comprises a chassis 11, a shell 15, two driving wheels 7 and two driven wheels 12. The shell 15 is mounted on the surface of the chassis 11 and is divided into a front part and a rear part; the top surface of the front part forms a luggage placement platform 14; the interior of the rear part is an equipment compartment; the middle of the rear face is provided with an ID card reader insertion slot 43 and a boarding card printer outlet 44; and a battery door 10 is provided on the lower part of one side of the body. The two driving wheels 7 and the two driven wheels 12 are mounted on the two sides of the rear part and of the front part of the chassis 11 respectively, and a wheel cover is fitted above each wheel. The control system, the ID card reader 3 and the boarding card printer 4 are installed in the equipment compartment of the shell 15.
The control system comprises a master control system 20, a human-machine interaction system 21, an environment perception and monitoring system 22, a control and drive system 23, a communication system 24 and a power supply system 25, wherein:
The master control system 20 is the core of the control system. It consists of a host computer 30 and an auxiliary computer 31 that communicate with each other over a wired network. The host computer 30 comprises a host main control board, a control and drive module, an audio output module and a PTZ drive module; the host main control board is connected to the control and drive module, the audio output module and the PTZ drive module through a PC104+ bus. The auxiliary computer 31 comprises an auxiliary main control board, an A/D and IO module, and an image acquisition module; the auxiliary main control board is connected to the A/D and IO module and the image acquisition module through the PC104+ bus. The robot's perception of the external environment and of its own state, as well as its planning, decision-making and action execution, are all accomplished under the control of the master control system 20. The host computer 30 is mainly used to realize the robot's master scheduling and decision-making, voice and music output, motion control, navigation control, obstacle-avoidance control and other functions; the auxiliary computer 31 is mainly used to accomplish tasks such as the robot's environment perception, human-machine interaction, information fusion and communication. The host computer 30 and the auxiliary computer 31 jointly accomplish the overall functions of the robot.
The human-machine interaction system 21 comprises two parts: the first consists of a loudspeaker 40 and a microphone 35 and is used for human-machine voice interaction, and the second consists of a touch screen 2 and an LCD 34 and is used for human-machine touch-screen interaction. The loudspeaker 40 is connected to the host main control board through the audio output module and is used to output audio signals; the administrator can switch the two-way voice transmission on and off as required. The microphone 35 is connected to the auxiliary main control board through an audio interface (ICH4); it is installed inside the upper rear part of the shell 15 and fastened to the shell 15, and is used for two-way remote transmission of voice information: speech recognition is carried out in the auxiliary computer 31, while voice output is produced by the host computer 30. The touch screen 2 is installed in the upper part of the rear face of the shell 15 and is connected to the auxiliary main control board through a USB (Universal Serial Bus) interface. The LCD 34 is installed behind the touch screen 2, close against it, and is connected to the auxiliary main control board through the interface on the graphics card that outputs an analog signal (VGA).
The control and drive system 23 is used to make the robot move freely over the floor and to adjust its speed and direction of motion in real time. It consists of a 1st drive motor 38, a 2nd drive motor 39, a PTZ control device 41 and a power continuation device 42. The output shafts of the 1st and 2nd drive motors 38, 39 are each connected to one driving wheel 7 through a reduction gearbox, and are also connected to the control and drive module on the host computer 30. The PTZ control device 41 is connected to the PTZ drive module on the host computer 30 and is used for remote control of the pan, tilt and zoom actions of the PTZ camera 1, so as to cooperate with the robot body in carrying out localization and interactive entertainment. The motion control part of the control and drive module uses the dedicated motion control chip LM629 from National Semiconductor as its core control chip. The LM629 and the host main control board form a master-slave motion control system over the PC104+ bus, with the LM629 carrying out the precise real-time calculations needed for high-performance digital motion control. The host main control board interacts with the LM629 through a high-level command set, providing parameters such as speed and distance; the LM629 outputs a pulse-width-modulated signal through its PID controller which, after the drive and control module, drives the 1st drive motor 38 and the 2nd drive motor 39. The power continuation device 42 is connected to the host main control board and is used to control charging of the power supply system 25 or battery replacement, so that the robot can operate continuously.
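The master-slave split between the host main control board and the LM629 can be pictured with a short conceptual sketch: the host supplies high-level parameters such as target distance and speed, while trajectory generation and a PID position loop (performed in hardware by the LM629) turn the tracking error into a PWM command for each drive motor. The Python fragment below only illustrates this idea under those assumptions; the class, the gains and the trapezoidal profile are invented for the example and do not correspond to the LM629's actual command set.

```python
# Conceptual sketch of the host/LM629 division of labour: the host supplies
# target distance and speed, a trapezoidal profile generates the reference
# position, and a PID loop turns the position error into a signed PWM command.

class PositionPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_pos, measured_pos):
        error = target_pos - measured_pos
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-1.0, min(1.0, out))          # interpreted as PWM duty in [-1, 1]

def trapezoidal_profile(distance, v_max, accel, t):
    """Reference position at time t for a symmetric trapezoidal velocity profile."""
    t_acc = v_max / accel
    d_acc = 0.5 * accel * t_acc ** 2
    if 2 * d_acc >= distance:                    # triangular case: v_max never reached
        t_acc = (distance / accel) ** 0.5
        v_max = accel * t_acc
        d_acc = 0.5 * distance
    t_cruise = (distance - 2 * d_acc) / v_max
    if t < t_acc:
        return 0.5 * accel * t ** 2
    if t < t_acc + t_cruise:
        return d_acc + v_max * (t - t_acc)
    t_dec = min(t - t_acc - t_cruise, t_acc)
    return d_acc + v_max * t_cruise + v_max * t_dec - 0.5 * accel * t_dec ** 2
```

In the real system the profile generation and PID arithmetic run inside the chip, so the host only writes speed and distance parameters over the PC104+ bus and is freed from the hard real-time part of the loop.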
The environment perception and monitoring system 22 is the input perception part of the master control system 20; it acquires external information with its sensors and fuses this information for use in behavior planning and decision-making. The system consists of an internal perception system and an external perception system. The internal perception system comprises a 1st and a 2nd photoelectric encoder 45, 46 and a 3D digital compass 36, and is used to monitor the state of the robot so as to adjust and control its behavior. The 1st and 2nd photoelectric encoders 45, 46 are integrated on the tail shafts of the 1st and 2nd drive motors 38, 39 respectively and are connected to the host main control board through the control and drive module; they are used to calculate the mileage travelled by the two driving wheels 7. The 3D digital compass 36 is connected to the auxiliary main control board through an RS232 serial port and is used to determine the heading of the robot. The external perception system comprises a PTZ camera 1, two rear visible-light illumination lamps 5, two bottom visible-light illumination lamps 6, a 1st laser ranging device 8 and a 2nd laser ranging device 13, and is used by the robot to obtain state and feature information about the various targets in the surrounding environment, so that the robot and the environment can interact and the robot can adapt to the environment. The PTZ camera 1 is installed on the top of the rear face of the shell 15 and is connected to the auxiliary main control board through the image acquisition module; it is used to capture images of the surroundings. The two rear illumination lamps 5 are arranged on the middle of the rear face of the shell 15, the two bottom illumination lamps 6 are installed on the wheel covers of the two driving wheels 7 respectively, and all of them are connected to the auxiliary main control board through the A/D and IO module to provide effective illumination. The 1st laser ranging device 8 is arranged on the lower part of the rear face of the shell 15, and the 2nd laser ranging device 13 is installed at the front end of the luggage placement platform 14; each is connected to the auxiliary main control board through an RS232 serial port, and together they form a 360° detection range used to detect obstacles around the robot and to realize automatic obstacle avoidance from this information. By fusing the information from the PTZ camera 1, the photoelectric encoders 45, 46, the 3D digital compass 36 and the laser ranging devices 8, 13, the robot carries out localization and navigation in the indoor environment.
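As an illustration of how the two laser ranging devices, which together cover 360°, could feed a simple reactive obstacle-avoidance rule, consider the sketch below. The thresholds, the sector width and the steering rule are assumptions made for the example; the patent does not specify a particular avoidance algorithm.

```python
# Minimal sketch of reactive obstacle avoidance using the merged 360-degree
# coverage of the two laser rangefinders. Thresholds and the steering rule are
# illustrative assumptions, not values taken from the patent.

def merge_scans(rear_scan, front_scan):
    """Each scan is a list of (angle_deg, range_m) in the robot frame; together
    the two devices cover the full circle around the robot."""
    return rear_scan + front_scan

def avoidance_command(scan, heading_deg=0.0, stop_dist=0.4, slow_dist=1.0):
    """Return (linear_scale, turn_direction) for the current drive heading."""
    # Consider only beams within +/-30 degrees of the direction of travel.
    sector = [(a, r) for a, r in scan
              if abs((a - heading_deg + 180) % 360 - 180) <= 30]
    if not sector:
        return 1.0, 0.0
    nearest_angle, nearest_range = min(sector, key=lambda ar: ar[1])
    if nearest_range < stop_dist:
        return 0.0, 0.0                       # obstacle too close: stop
    if nearest_range < slow_dist:
        rel = (nearest_angle - heading_deg + 180) % 360 - 180
        turn = -1.0 if rel > 0 else 1.0       # turn away from the obstacle side
        return nearest_range / slow_dist, turn
    return 1.0, 0.0                           # path clear: full speed ahead
```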
The communication system 24 is mainly used to realize interaction between robots and between humans and the robot. It consists of a network switch 32 and a wireless access point (AP) 33. The switch 32 is connected to both the host computer 30 and the auxiliary computer 31 through the wired network to realize communication between the host computer 30 and the auxiliary computer 31; this arrangement guarantees the speed and efficiency of the communication. The switch 32 is also connected to the wireless access point 33 by a network cable, so that external computers can communicate with the robot, which also makes it convenient to debug and monitor the robot.
The power supply system 25 provides the robot with a stable and reliable power source and supplies power to the other systems in the control system. It consists of a battery, a charger and a power board. The battery may be a lead-acid battery; its charging is controlled by the power continuation device 42, so that the robot can operate continuously after charging.
The ID card reader 3 is connected to the auxiliary main control board through a USB (Universal Serial Bus) interface and is used to read a user's ID card and obtain the relevant information; the boarding card printer 4 is connected to the auxiliary main control board through an RS232 serial port and is used to print the passenger's boarding card.
The environment perception and monitoring system 22 further comprises four anti-collision devices 9 installed at the outer edges of the wheel covers of the two driving wheels 7 and the two driven wheels 12 respectively. The anti-collision devices 9 are connected to the auxiliary main control board through the A/D and IO module and, together with the PTZ camera 1, take on the task of perceiving and monitoring the operating environment.
The environment perception and monitoring system 22 further comprises a smoke and fire detection device 37, which mainly consists of smoke and temperature detection sensors and their associated circuits. It is connected to the auxiliary main control board through the A/D and IO module and is used to monitor the operating environment and to raise smoke and fire alarms.
The driving wheels 7 are pneumatic wheels, which provides a damping effect; the driven wheels 12 are universal (caster) wheels.
The hardware of the airport terminal robot provided by the invention adopts a two-computer architecture, a host computer system and an auxiliary computer system, and the system software is likewise implemented on the two computers. The host system makes behavior decisions based on the data received from the auxiliary computer so as to control the actions of the robot body, produces voice and music output, sends touch-screen control commands to the auxiliary computer, and together with the auxiliary system accomplishes human-machine interaction. The auxiliary system is mainly used to acquire and preprocess sensor information, and to pack the sensor information and transmit it to the host system over the network. The sensor information handled includes the acquisition and recognition of voice signals; the acquisition and preprocessing of image information, laser ranging information, collision information, touch-screen information, battery voltage information and compass information; ID card reading; and printing of boarding cards by the printer. In addition, the auxiliary system can provide passengers with an information query function: by touching the touch screen a passenger can carry out various information queries and interact with the terminal robot.
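The packing and network transmission of sensor data from the auxiliary system to the host system could, for example, look like the sketch below, here using a plain TCP socket through the switch and a JSON message. The field names, the address and the port are assumptions for illustration; the patent does not specify a message format.

```python
# Illustrative sketch of the auxiliary computer packing preprocessed sensor data
# and sending it to the host over the wired network through the switch.
# All message fields, the address and the port are hypothetical.

import json
import socket
import time

HOST_ADDR = ("192.168.1.10", 9000)   # hypothetical address of the host computer

def pack_sensor_frame(compass_deg, left_ticks, right_ticks,
                      laser_min_m, bump_flags, battery_v):
    frame = {
        "timestamp": time.time(),
        "compass_deg": compass_deg,
        "odometry_ticks": [left_ticks, right_ticks],
        "laser_min_range_m": laser_min_m,
        "bumpers": bump_flags,            # e.g. [False, False, True, False]
        "battery_voltage": battery_v,
    }
    return (json.dumps(frame) + "\n").encode("utf-8")

def send_frame(payload):
    # One newline-terminated JSON frame per connection keeps the sketch simple;
    # a real system would more likely keep a persistent connection.
    with socket.create_connection(HOST_ADDR, timeout=1.0) as sock:
        sock.sendall(payload)

# Example: send_frame(pack_sensor_frame(87.5, 10234, 10198, 1.7, [False] * 4, 24.1))
```

Using the wired switch for this host/auxiliary traffic, as the description notes, keeps latency and bandwidth predictable, while the wireless access point is reserved for external debugging and monitoring.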
Fig. 4 is a flow chart of the host control method of the airport terminal service robot provided by the invention. As shown in Fig. 4, the host control method comprises the following steps, carried out in order (a simplified code sketch of this loop is given after the step list):
1) Stage S1, device initialization and self-test: this is the first stage after power-on; in this stage the host computer 30 performs power-up initialization and a self-test.
2) Stage S2, judging whether every part of the host computer 30 is operating normally: in this stage the host computer 30 judges whether the self-test result of stage S1 is normal; if the result is "yes", it enters stage S3, otherwise it enters stage S13 to carry out the corresponding exception handling and then exits the main control loop.
3) Stage S3, judging whether a local control command exists: this stage judges whether a local control command currently exists; if the result is "yes", it enters stage S4, otherwise it enters stage S5 directly.
4) Stage S4, executing the corresponding local control command: in this stage the host computer 30 executes the currently pending local control command.
5) Stage S5, judging whether there is a network message: this stage judges whether a network message currently exists; if the result is "yes", it enters stage S6, otherwise it enters stage S11 directly.
6) Stage S6, obtaining a decision behavior from the sensor information: in this stage the host computer 30 derives the current decision behavior from the sensor information sent by the auxiliary computer 31.
7) Stage S7, displaying the network message and the decision result on the interface: in this stage the host computer 30 sends the relevant display commands to the auxiliary computer 31 and displays the above network message and decision result on the LCD 34.
8) Stage S8, fusing the behavior at the current moment to obtain the final behavior command: in this stage the host computer 30 determines the operation command for the concrete action to be executed, based on the current behavior decision and the state information of the previous moment.
9) Stage S9, judging whether the decided command is allowed to be executed: this stage judges whether the operation command determined in stage S8 satisfies the conditions for execution; if the result is "yes", it enters stage S10, otherwise it returns to the entry of stage S6.
10) Stage S10, coordinating each part to execute the command: in this stage the operation command determined in stage S8 is executed.
11) Stage S11, judging whether to exit: in this stage the host computer 30 judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S3 to continue the loop.
12) Stage S12, sending an exit message to the auxiliary computer: in this stage the host computer 30 sends an exit message to the auxiliary computer 31, i.e. it notifies the auxiliary computer 31 to execute the exit command, and the host computer 30 then exits the main control loop to stop operation.
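The control flow of Fig. 4 can be summarized with a simple loop sketch. The Python fragment below only illustrates the structure of stages S1 to S13; the helper methods on the robot object are placeholders standing in for the real subsystems, and the way stage S12 is reached from stage S11 reflects one plausible reading of the flow chart rather than a definitive implementation.

```python
# Simplified sketch of the host control loop of Fig. 4 (stages S1-S13).
# Every method called on `robot` is a placeholder for the corresponding
# subsystem; only the control flow mirrors the description.

def host_control_loop(robot):
    robot.initialize()                        # S1: power-up initialization and self-test
    if not robot.self_test_ok():              # S2: check the self-test result
        robot.handle_fault()                  # S13: exception handling, then exit
        return
    while True:
        if robot.local_command_pending():     # S3: is there a local control command?
            robot.execute_local_command()     # S4
        if robot.network_message_pending():   # S5: is there a network message?
            while True:
                behavior = robot.decide_from_sensors()              # S6
                robot.display_message_and_decision(behavior)        # S7
                command = robot.fuse_with_previous_state(behavior)  # S8
                if robot.command_executable(command):               # S9
                    robot.execute_coordinated(command)              # S10
                    break                     # otherwise return to the S6 entry
        if robot.exit_condition():            # S11: should the loop end?
            robot.notify_auxiliary_exit()     # S12: tell the auxiliary computer to exit
            break                             # leave the main control loop
```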
Fig. 5 is a flow chart of the auxiliary-computer control method of the airport terminal service robot provided by the invention. As shown in Fig. 5, the auxiliary-computer control method comprises the following steps, carried out in order (a simplified code sketch of this loop is given after the step list):
1) Stage S21, device initialization and self-test: this is the first stage after power-on; in this stage the auxiliary computer 31 performs power-up initialization and a self-test.
2) Stage S22, judging whether every part of the auxiliary computer 31 is operating normally: in this stage the auxiliary computer 31 judges whether the self-test result of stage S21 is normal; if the result is "yes", it enters stage S23, otherwise it enters stage S31 to carry out the corresponding exception handling and then exits the main control loop.
3) Stage S23, acquiring the information of each sensor: in this stage the auxiliary computer 31 reads the information of each sensor in turn.
4) Stage S24, preprocessing the sensor information: in this stage the auxiliary computer 31 applies the relevant filtering and fault-tolerance techniques to preprocess the sensor information.
5) Stage S25, displaying the preprocessed sensor information: in this stage the auxiliary computer 31 displays the sensor data obtained in stage S24 on the LCD 34.
6) Stage S26, judging whether the network is connected: this stage judges whether the current network connection is normal; if the result is "yes", it enters stage S27, otherwise it returns to the entry of stage S23.
7) Stage S27, packing the sensor information and sending it to the host computer: in this stage the auxiliary computer 31 first packs the sensor data obtained in stage S24 and then sends it to the host computer 30 through the switch 32.
8) Stage S28, judging whether a host control command has been received: in this stage the auxiliary computer 31 judges whether a control command from the host computer 30 has been received; if the result is "yes", it enters stage S29, otherwise it enters stage S30 directly.
9) Stage S29, executing the host control command: in this stage the auxiliary computer 31 executes the control command received from the host computer 30 in stage S28.
10) Stage S30, judging whether to exit: in this stage the auxiliary computer 31 judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S23 to continue the loop.
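The auxiliary-computer flow of Fig. 5 (stages S21 to S31) can be sketched in the same way; again the helper methods are placeholders for the real sensor-acquisition, display and network code.

```python
# Simplified sketch of the auxiliary-computer control loop of Fig. 5
# (stages S21-S31). Every method called on `aux` is a placeholder.

def auxiliary_control_loop(aux):
    aux.initialize()                          # S21: power-up initialization and self-test
    if not aux.self_test_ok():                # S22: check the self-test result
        aux.handle_fault()                    # S31: exception handling, then exit
        return
    while True:
        raw = aux.collect_sensor_data()       # S23: read each sensor in turn
        data = aux.preprocess(raw)            # S24: filtering and fault tolerance
        aux.display(data)                     # S25: show the readings on the LCD
        if not aux.network_connected():       # S26: network normal?
            continue                          # back to the S23 entry
        aux.send_to_host(data)                # S27: pack and send through the switch
        if aux.host_command_received():       # S28: command from the host?
            aux.execute_host_command()        # S29
        if aux.exit_condition():              # S30: should the loop end?
            break                             # leave the main control loop
```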
The host control method and the auxiliary-computer control method of the airport terminal service robot have been described above. The main program modules running in the master control system 20 and their working characteristics are further explained below.
The control software used in the airport terminal service robot system provided by the invention is divided into four main modules: an odometer and compass fusion localization module, a localization module based on visual features, a main system control module and a user interaction module. The first three modules run on the robot's host computer 30, and the fourth runs on the robot's auxiliary computer 31.
The odometer and compass fusion localization module provides real-time pose information for robot control. Through the 1st and 2nd photoelectric encoders 45, 46 and the 3D digital compass 36, this module acquires the mileage of the two driving wheels 7 and the heading of the robot in real time, obtains a robot pose estimate of higher confidence after fusion, and sends this information to the main system control module over the network.
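A minimal sketch of this fusion idea, assuming a differential-drive odometry model and a simple weighted blend of the odometric heading with the absolute compass heading, is given below. The wheel geometry constants and the blending weight are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of odometer/compass fusion: encoder ticks give the distance
# travelled by each drive wheel, the 3D digital compass gives an absolute
# heading, and the two are blended into an updated pose estimate.

import math

WHEEL_RADIUS_M = 0.10       # assumed drive-wheel radius
TICKS_PER_REV = 2000        # assumed encoder resolution
TRACK_WIDTH_M = 0.45        # assumed distance between the two drive wheels

def fuse_pose(pose, d_left_ticks, d_right_ticks, compass_heading_rad, w_compass=0.7):
    """pose = (x, y, theta) in metres/radians; returns the updated pose."""
    x, y, theta = pose
    per_tick = 2 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    d_left = d_left_ticks * per_tick
    d_right = d_right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta_odo = (d_right - d_left) / TRACK_WIDTH_M
    theta_odo = theta + d_theta_odo
    # Blend the odometric heading with the absolute compass heading,
    # wrapping the difference into (-pi, pi] first.
    diff = math.atan2(math.sin(compass_heading_rad - theta_odo),
                      math.cos(compass_heading_rad - theta_odo))
    theta_new = theta_odo + w_compass * diff
    x += d_center * math.cos(theta_new)
    y += d_center * math.sin(theta_new)
    return (x, y, theta_new)
```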
The localization module based on visual features provides the robot with accurate and reliable pose information, and plays an important role when the initial pose is unknown, when the confidence of the pose estimate is low, or when the robot has been unexpectedly displaced (the kidnapped-robot situation). This module uses the SIFT features of landmark information in the known map to compute the robot's position.
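One way such a module could be sketched, assuming the landmark SIFT descriptors are stored with the map and OpenCV is used for feature extraction and matching, is shown below. This is an assumption-laden outline of the general idea, not the patent's actual localization algorithm.

```python
# Illustrative sketch of vision-based relocalisation: SIFT features from the
# PTZ camera image are matched against stored landmark descriptors, and the
# landmark with the most good matches gives a coarse position fix.

import cv2

def best_landmark_match(frame_gray, landmarks, ratio=0.75, min_matches=12):
    """landmarks: dict mapping name -> (descriptors, map_pose)."""
    sift = cv2.SIFT_create()
    _, frame_desc = sift.detectAndCompute(frame_gray, None)
    if frame_desc is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best = None
    for name, (lm_desc, map_pose) in landmarks.items():
        pairs = matcher.knnMatch(lm_desc, frame_desc, k=2)
        # Lowe's ratio test to keep only distinctive matches.
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) >= min_matches and (best is None or len(good) > best[2]):
            best = (name, map_pose, len(good))
    return best   # e.g. ("gate_sign_A", (x, y, theta), 37) or None
```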
The main system control module mainly realizes functions such as map representation and construction, task scheduling, and path-tracking control of the robot along fixed routes.
The user interaction module mainly realizes the interaction between the software system and passengers: a passenger can issue guidance commands to the robot, and the module can also display the current status information of the robot.
When a passenger needs the airport terminal service robot provided by the invention to carry luggage, the passenger can place the luggage on the luggage placement platform 14, and the robot can then transport it to any position in the terminal.
When a passenger needs guidance, the passenger first touches the touch screen 2 with a finger, then selects the guidance mode and a preset destination point from the service modes on the LCD 34, and then taps the start-guidance icon, at which point route guidance begins. Following the on-screen guidance on the LCD 34 and the voice prompts from the loudspeaker 40, the passenger can follow the airport terminal service robot to the required destination point.
When a passenger needs self-service check-in, the passenger first taps the touch screen 2 and then selects the self-service check-in mode from the service modes on the LCD 34. The master control system 20 then prompts the passenger, via the LCD 34, to insert his or her ID card into the ID card reader 3 through the ID card reader insertion slot 43. After it is confirmed that the ID card has been inserted, the ID card reader 3, under the control of the master control system 20, scans the ID card image and the data in its chip in one pass, after which the ID card is withdrawn from the insertion slot 43. The master control system 20 then connects to the remote database and prompts the passenger to operate according to the procedure shown on the LCD 34 so as to complete the relevant check-in transactions. Finally, under the control of the master control system 20, the boarding card printer 4 immediately prints the boarding card and delivers it to the passenger through the boarding card outlet 44, thereby completing the self-service check-in process.
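The check-in sequence described above can be outlined as a short sketch. Every object and method here (ID reader, remote database, printer, LCD) is hypothetical and stands in for the corresponding device driver or departure-control interface, which the patent does not specify.

```python
# Sketch of the self-service check-in sequence: read the ID card, look the
# passenger up in a remote departure-control database, then print the boarding
# card. All objects and methods are hypothetical placeholders.

def self_service_check_in(id_reader, remote_db, printer, lcd):
    lcd.show("Please insert your ID card")
    id_info = id_reader.scan()                 # image and chip data in one pass
    lcd.show("Please remove your ID card")
    booking = remote_db.find_booking(id_info.name, id_info.id_number)
    if booking is None:
        lcd.show("No booking found for this ID card")
        return False
    seat = remote_db.assign_seat(booking)      # passenger follows the LCD prompts
    printer.print_boarding_card(booking.flight, booking.name, seat)
    lcd.show("Check-in complete, please take your boarding card")
    return True
```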
When a passenger needs an information query, the passenger first taps the touch screen 2 and then selects the information query mode from the service modes on the LCD 34. The LCD 34 then displays the query home page, which contains four main categories of functions: airport information, transport guide, tourism and accommodation, and interactive entertainment. The passenger can tap a sub-module with a finger to enter the corresponding sub-page and view the related content.
In addition, because the LCD 34 of this airport terminal service robot provides a multilingual display interface, it is suitable for passengers from countries speaking different languages and is therefore convenient to use.

Claims (6)

1. An airport terminal service robot, characterized in that: said airport terminal service robot comprises a body, a control system, an ID card reader (3) and a boarding card printer (4); said body comprises a chassis (11), a shell (15), two driving wheels (7) and two driven wheels (12); the shell (15) is mounted on the surface of the chassis (11) and is divided into a front part and a rear part; the top surface of the front part forms a luggage placement platform (14); the interior of the rear part is an equipment compartment; the middle of the rear face is provided with an ID card reader insertion slot (43) and a boarding card printer outlet (44); a battery door (10) is provided on the lower part of one side of the body; the two driving wheels (7) and the two driven wheels (12) are mounted on the two sides of the rear part and of the front part of the chassis (11) respectively, and a wheel cover is fitted above each wheel; the control system (23), the ID card reader (3) and the boarding card printer (4) are installed in the equipment compartment of the shell (15);
said control system comprises a master control system (20), a human-machine interaction system (21), an environment perception and monitoring system (22), a control and drive system (23), a communication system (24) and a power supply system (25); wherein:
the master control system (20) consists of a host computer (30) and an auxiliary computer (31) that communicate with each other over a wired network; the host computer (30) comprises a host main control board, a control and drive module, an audio output module and a PTZ drive module, and the host main control board is connected to the control and drive module, the audio output module and the PTZ drive module through a PC104+ bus; the auxiliary computer (31) comprises an auxiliary main control board, an A/D and IO module, and an image acquisition module, and the auxiliary main control board is connected to the A/D and IO module and the image acquisition module through the PC104+ bus;
the human-machine interaction system (21) comprises two parts, the first consisting of a loudspeaker (40) and a microphone (35) and the second consisting of a touch screen (2) and an LCD (34); the loudspeaker (40) is connected to the host main control board through the audio output module; the microphone (35) is connected to the auxiliary main control board through an audio interface, is installed inside the upper rear part of the shell (15) and is fastened to the shell (15); the touch screen (2) is installed in the upper part of the rear face of the shell (15) and is connected to the auxiliary main control board through a USB interface; the LCD (34) is installed behind the touch screen (2), close against it, and is connected to the auxiliary main control board through the interface on the graphics card that outputs an analog signal;
the control and drive system (23) consists of a 1st drive motor (38), a 2nd drive motor (39), a PTZ control device (41) and a power continuation device (42); the output shafts of the 1st drive motor (38) and the 2nd drive motor (39) are each connected to one driving wheel (7) through a reduction gearbox, and the output shafts of the 1st drive motor (38) and the 2nd drive motor (39) are connected through encoders to the control and drive module on the host computer (30); the PTZ control device (41) is connected to the PTZ drive module on the host computer (30); the power continuation device (42) is connected to the host main control board;
the environment perception and monitoring system (22) consists of an internal perception system and an external perception system; the internal perception system comprises 1st and 2nd photoelectric encoders (45, 46) and a 3D digital compass (36); the 1st and 2nd photoelectric encoders (45, 46) are integrated on the tail shafts of the 1st and 2nd drive motors (38, 39) respectively and are connected to the host main control board through the control and drive module; the 3D digital compass (36) is connected to the auxiliary main control board through an RS232 serial port; the external perception system comprises a PTZ camera (1), two rear visible-light illumination lamps (5), two bottom visible-light illumination lamps (6), a 1st laser ranging device (8) and a 2nd laser ranging device (13); the PTZ camera (1) is installed on the top of the rear face of the shell (15) and is connected to the auxiliary main control board through the image acquisition module; the two rear visible-light illumination lamps (5) are arranged on the middle of the rear face of the shell (15), the two bottom visible-light illumination lamps (6) are installed on the wheel covers of the two driving wheels (7) respectively, and all of them are connected to the auxiliary main control board through the A/D and IO module; the 1st laser ranging device (8) is arranged on the lower part of the rear face of the shell (15), the 2nd laser ranging device (13) is installed at the front end of the luggage placement platform (14), and each is connected to the auxiliary main control board through an RS232 serial port;
the communication system (24) consists of a network switch (32) and a wireless access point (33); the switch (32) is connected to both the host computer (30) and the auxiliary computer (31) through the wired network, and the switch (32) is also connected to the wireless access point (33) by a network cable;
the power supply system (25) consists of a battery, a charger and a power board, and supplies power to the other systems in the control system;
said ID card reader (3) is connected to the auxiliary main control board through a USB (Universal Serial Bus) interface; the boarding card printer (4) is connected to the auxiliary main control board through an RS232 serial port.
2. The airport terminal service robot according to claim 1, characterized in that: said environment perception and monitoring system (22) further comprises four anti-collision devices (9) installed at the outer edges of the wheel covers of the two driving wheels (7) and the two driven wheels (12) respectively, and the anti-collision devices (9) are connected to the auxiliary main control board through the A/D and IO module.
3. The airport terminal service robot according to claim 1, characterized in that: said environment perception and monitoring system (22) further comprises a smoke and fire detection device (37), which mainly consists of smoke and temperature detection sensors and their associated circuits and is connected to the auxiliary main control board through the A/D and IO module.
4. The airport terminal service robot according to claim 1, characterized in that: said driving wheels (7) are pneumatic wheels, and said driven wheels (12) are universal (caster) wheels.
5. A host control method for the airport terminal service robot according to claim 1, characterized in that: said host control method comprises the following steps, carried out in order:
1) stage S1, device initialization and self-test: this is the first stage after power-on; in this stage the host computer (30) performs power-up initialization and a self-test;
2) stage S2, judging whether every part of the host computer is operating normally: in this stage the host computer (30) judges whether the self-test result of stage S1 is normal; if the result is "yes", it enters stage S3, otherwise it enters stage S13 to carry out the corresponding exception handling and then exits the main control loop;
3) stage S3, judging whether a local control command exists: this stage judges whether a local control command currently exists; if the result is "yes", it enters stage S4, otherwise it enters stage S5 directly;
4) stage S4, executing the corresponding local control command: in this stage the host computer (30) executes the currently pending local control command;
5) stage S5, judging whether there is a network message: this stage judges whether a network message currently exists; if the result is "yes", it enters stage S6, otherwise it enters stage S11 directly;
6) stage S6, obtaining a decision behavior from the sensor information: in this stage the host computer (30) derives the current decision behavior from the sensor information sent by the auxiliary computer (31);
7) stage S7, displaying the network message and the decision result on the interface: in this stage the host computer (30) sends the relevant display commands to the auxiliary computer (31) and displays the above network message and decision result on the LCD (34);
8) stage S8, fusing the behavior at the current moment to obtain the final behavior command: in this stage the host computer (30) determines the operation command for the concrete action to be executed, based on the current behavior decision and the state information of the previous moment;
9) stage S9, judging whether the decided command is allowed to be executed: this stage judges whether the operation command determined in stage S8 satisfies the conditions for execution; if the result is "yes", it enters stage S10, otherwise it returns to the entry of stage S6;
10) stage S10, coordinating each part to execute the command: in this stage the operation command determined in stage S8 is executed;
11) stage S11, judging whether to exit: in this stage the host computer (30) judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S3 to continue the loop;
12) stage S12, sending an exit message to the auxiliary computer: in this stage the host computer (30) sends an exit message to the auxiliary computer (31), i.e. it notifies the auxiliary computer (31) to execute the exit command, and the host computer (30) then exits the main control loop to stop operation.
6. An auxiliary-computer control method for the airport terminal service robot according to claim 1, characterized in that: said auxiliary-computer control method comprises the following steps, carried out in order:
1) stage S21, device initialization and self-test: this is the first stage after power-on; in this stage the auxiliary computer (31) performs power-up initialization and a self-test;
2) stage S22, judging whether every part of the auxiliary computer is operating normally: in this stage the auxiliary computer (31) judges whether the self-test result of stage S21 is normal; if the result is "yes", it enters stage S23, otherwise it enters stage S31 to carry out the corresponding exception handling and then exits the main control loop;
3) stage S23, acquiring the information of each sensor: in this stage the auxiliary computer (31) reads the information of each sensor in turn;
4) stage S24, preprocessing the sensor information: in this stage the auxiliary computer (31) applies the relevant filtering and fault-tolerance techniques to preprocess the sensor information;
5) stage S25, displaying the preprocessed sensor information: in this stage the auxiliary computer (31) displays the sensor data obtained in stage S24 on the LCD (34);
6) stage S26, judging whether the network is connected: this stage judges whether the current network connection is normal; if the result is "yes", it enters stage S27, otherwise it returns to the entry of stage S23;
7) stage S27, packing the sensor information and sending it to the host computer: in this stage the auxiliary computer (31) first packs the sensor data obtained in stage S24 and then sends it to the host computer (30) through the switch (32);
8) stage S28, judging whether a host control command has been received: in this stage the auxiliary computer (31) judges whether a control command from the host computer (30) has been received; if the result is "yes", it enters stage S29, otherwise it enters stage S30 directly;
9) stage S29, executing the host control command: in this stage the auxiliary computer (31) executes the control command received from the host computer (30) in stage S28;
10) stage S30, judging whether to exit: in this stage the auxiliary computer (31) judges whether the exit condition is satisfied; if the result is "yes", the main control loop is exited to stop operation, otherwise control returns to the entry of stage S23 to continue the loop.
CN201010587267A 2010-12-15 2010-12-15 Airport terminal service robot and control method thereof Expired - Fee Related CN102141797B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010587267A CN102141797B (en) 2010-12-15 2010-12-15 Airport terminal service robot and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010587267A CN102141797B (en) 2010-12-15 2010-12-15 Airport terminal service robot and control method thereof

Publications (2)

Publication Number Publication Date
CN102141797A CN102141797A (en) 2011-08-03
CN102141797B true CN102141797B (en) 2012-09-26

Family

ID=44409379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010587267A Expired - Fee Related CN102141797B (en) 2010-12-15 2010-12-15 Airport terminal service robot and control method thereof

Country Status (1)

Country Link
CN (1) CN102141797B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103240749A (en) * 2013-05-10 2013-08-14 广州博斯特智能科技有限公司 Service robot
CN103792905A (en) * 2013-09-16 2014-05-14 弗徕威数码科技(上海)有限公司 Multi-mode intelligent commercial service robot for store management
US9412110B2 (en) * 2013-11-12 2016-08-09 Globalfoundries Inc. Mobile image acquisition
CN103753551A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Station carrying service robot
CN103754286A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Airport carrying service robot
CN103926924A (en) * 2014-04-15 2014-07-16 哈尔滨工程大学 Method for controlling ice and snow robot
CN104217519A (en) * 2014-08-15 2014-12-17 国家电网公司 Fire alarm robot for transformer substations and inflammable places
NL2013474B1 (en) 2014-09-15 2016-09-28 Koninklijke Luchtvaart Mij N V Mobile airline kiosk for use at an airport, system for use with such a mobile airline kiosk and method for issuing aircraft boarding sequence numbers using such a mobile airline kiosk.
CN104375417B (en) * 2014-11-05 2017-11-07 济南大学 A kind of Waiting Lounge intellect service robot
CN104505091B (en) * 2014-12-26 2018-08-21 湖南华凯文化创意股份有限公司 Man machine language's exchange method and system
CN105077950B (en) * 2015-09-18 2017-08-15 中新智人(深圳)科技有限公司 A kind of trailing type luggage storage robot
CN105549588A (en) * 2015-12-10 2016-05-04 上海电机学院 Multifunctional guide robot
GB201608205D0 (en) * 2016-05-10 2016-06-22 Sita Ypenburg Bv And Bluebotics Sa Item handling system, method and apparatus therefor
CN106142100B (en) * 2016-08-01 2018-10-19 美的机器人产业发展有限公司 Service robot
CN106364585A (en) * 2016-11-28 2017-02-01 深圳哈乐派科技有限公司 Robot foot and robot
CN106493741A (en) * 2016-11-28 2017-03-15 广西乐美趣智能科技有限公司 A kind of hotel service Multifunctional intelligent robot
CN106774333B (en) * 2016-12-30 2020-05-29 中国民航信息网络股份有限公司 Airport service robot and working method thereof
CN110249278B (en) * 2017-04-26 2022-06-17 深圳市元征科技股份有限公司 Luggage van and driving method thereof
CN107627310A (en) * 2017-10-16 2018-01-26 华勤通讯技术有限公司 A kind of service robot
CN108897314A (en) * 2018-05-30 2018-11-27 苏州工业园区职业技术学院 A kind of intelligent vehicle control based on MC9S12DG128
CN109191704B (en) * 2018-07-25 2023-09-15 云南中商正晓农业科技有限公司 Using method applied to aviation case sharing system
CN114035569B (en) * 2021-11-09 2023-06-27 中国民航大学 Navigation station building manned robot path expanding and passing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060111814A1 (en) * 2004-11-19 2006-05-25 Shuji Hachitani Mobile robot

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003241833A (en) * 2002-02-18 2003-08-29 Hitachi Ltd Information distribution service by mobile robot and information gathering system
WO2007135736A1 (en) * 2006-05-24 2007-11-29 Fujitsu Limited Mobile robot and method of controlling the same
KR100904191B1 (en) * 2008-05-29 2009-06-22 (주)다사로봇 Guidance robot

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11774961B2 (en) 2017-05-19 2023-10-03 Sita Information Networking Computing Usa, Inc. System and apparatus for resource management

Also Published As

Publication number Publication date
CN102141797A (en) 2011-08-03

Similar Documents

Publication Publication Date Title
CN102141797B (en) Airport terminal service robot and control method thereof
US11541769B2 (en) Robot system and control method of the same
CN109703607B (en) Intelligent luggage van
CN1331641C (en) Security ensuring and patrolling robot
CN103786061B (en) Vehicular robot device and system
CN106182027B (en) A kind of open service robot system
CN107851394A (en) Drive assistance device, drive assist system, driving assistance method and automatic driving vehicle
CN106741028A (en) A kind of airport Intelligent baggage car
CN107851395A (en) Drive assistance device, drive assist system, driving assistance method and automatic driving vehicle
CN103697900A (en) Method for early warning on danger through augmented reality by vehicle-mounted emotional robot
CN106697322A (en) Automatic abutting system and method for boarding bridge
WO2021141723A1 (en) Directing secondary delivery vehicles using primary delivery vehicles
KR20190096871A (en) Artificial intelligence server for controlling plurality of robots and method for the same
CN112130570A (en) Blind guiding robot of optimal output feedback controller based on reinforcement learning
CN107705379A (en) A kind of accessible highway charge card access device
KR20210030155A (en) Robot and controlling method thereof
CN110796851A (en) Shared driver driving system, bicycle driving method and driver scheduling method
US20190389067A1 (en) Method and apparatus for providing food to user
KR20190107616A (en) Artificial intelligence apparatus and method for generating named entity table
US11074814B2 (en) Portable apparatus for providing notification
CN210198395U (en) Unmanned aerial vehicle and unmanned vehicle cooperative navigation system
WO2022143181A1 (en) Information processing method and apparatus, and information processing system
KR20190095190A (en) Artificial intelligence device for providing voice recognition service and operating mewthod thereof
CN216388538U (en) Unmanned automatic driving vehicle for education and training
CN115359222A (en) Unmanned interaction control method and system based on augmented reality

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120926

Termination date: 20151215

EXPY Termination of patent right or utility model