CN113359689A - New man-machine cooperative intelligent navigation technology in unstructured environment - Google Patents


Info

Publication number
CN113359689A
Authority
CN
China
Prior art keywords
man
mobile robot
navigation
intelligent navigation
electroencephalogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110622633.6A
Other languages
Chinese (zh)
Inventor
谢松云
张晓伟
刘祥惠
付海洋
赵梓华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN202110622633.6A
Publication of CN113359689A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device

Abstract

The invention discloses a novel man-machine cooperative intelligent navigation technology for unstructured environments, in the field of brain-computer interaction. By collecting an operator's electroencephalogram (EEG) signals, a multi-command instruction set is constructed under three paradigms (motor imagery, steady-state visual evoked potentials, and blinking), enabling the mobile robot to start, build a two-dimensional map, save the map, switch modes, navigate autonomously, and stop. EEG signals also correct the robot's path deviation during autonomous navigation. A rapid EEG/eye-movement detection technique detects abnormal events in the unstructured environment, and detection and analysis of human brain fatigue is built into the man-machine cooperative intelligent navigation.

Description

New man-machine cooperative intelligent navigation technology in unstructured environment
Technical Field
The invention belongs to the interdisciplinary research field of brain science, artificial intelligence, robotics, and intelligent control.
Background
Mobile robot navigation technology is now widely applied in scenarios such as autonomous driving, warehouse logistics, and service guidance. Owing to environmental changes, limited sensor precision, and similar factors, current mobile robots still face many problems in autonomous navigation: how to construct a high-precision map, how to achieve high-precision localization, sensor failure caused by environmental influences, and so on. The invention proposes a new man-machine cooperative intelligent navigation technology for unstructured environments: through the mutual cooperation and combination of human brain intelligence and artificial intelligence, the agent becomes a strong artificial intelligent agent (man-machine cooperation) with perception, decision-making, and reasoning, completing complex tasks such as real-environment map construction, cooperative navigation, and decision-making under abnormal conditions in unstructured environments.
With the development of brain-computer interface (BCI) technology in recent years, research on controlling mobile robots with human brain intelligence has advanced rapidly, and several research teams have realized brain-controlled mobile robots. For example, in 2012 Escolano and colleagues at the University of Zaragoza, Spain, designed a teleoperation control system based on a brain-computer interface; in 2017 the team of Li Yuanqing at South China University of Technology developed a brain-controlled wheelchair and a brain-controlled web browser; and in 2018 Brice and colleagues at the National University of Singapore developed a human-machine interaction intelligent wheelchair. However, most of these teams adopt the P300 paradigm, which suffers from the following problems:
(1) In existing mobile robots, autonomous navigation offers insufficient interaction with the human brain, a low information transfer rate, and poor real-time performance.
(2) Existing navigation technology based on Simultaneous Localization and Mapping (SLAM) is affected by the surrounding environment and by sensor accuracy; deviations accumulate during navigation, so the actual path diverges from the planned path and the mobile robot's autonomous navigation fails.
(3) Controlling a mobile robot through a brain-computer interface for long periods easily imposes a heavy fatigue burden on the brain, degrading task performance.
(4) Although relatively mature obstacle-avoidance navigation algorithms exist for autonomous navigation tasks, the mobile robot lacks human-like cognition, decision-making, and reasoning when facing a complex and changeable environment.
The invention aims to solve problems such as insufficient human-machine interaction in the mobile robot's autonomous navigation, the low transfer rate of brain-computer interface navigation technology, and the inability to recognize and handle abnormal events.
Disclosure of Invention
The invention provides a novel method for man-machine cooperative intelligent navigation in unstructured environments, combining the Steady-State Visual Evoked Potential (SSVEP) paradigm and Motor Imagery (MI) in a multi-modal scheme. The robot is controlled to build a two-dimensional map of the unstructured environment, and mode switching is achieved through consecutive blinks. When the accumulated navigation error grows too large, the robot's position is adjusted using EEG signals. An EEG/eye-movement abnormality detection model is established, using the brain's subconscious responses and the corresponding eye-movement signals to detect abnormal conditions rapidly and keep the mobile robot's navigation continuous. Brain fatigue is analyzed and detected from EEG and eye-movement information collected in real time, preventing over-fatigue and maintaining working efficiency.
The invention is realized by the following technical scheme:
the hardware of the man-machine cooperative intelligent navigation system in the unstructured environment comprises an electroencephalogram signal acquisition and transmission device, an eye movement signal acquisition and transmission device, a mobile robot platform and a remote display operation platform. The system hardware block diagram is shown in fig. 1.
(1) Electroencephalogram signal acquisition and transmission device
Used to collect steady-state visual evoked potentials, conscious-blink EEG signals, and motor imagery EEG signals.
(2) Eye movement signal acquisition and transmission device
Used to collect the operator's eye-movement signals throughout the navigation process in order to estimate attention concentration.
(3) Mobile robot platform
(4) Remote display operation platform
Second, the new man-machine cooperative intelligent navigation technology in an unstructured environment comprises the following:
(1) Man-machine cooperative intelligent navigation: two-dimensional mapping of the unstructured environment is performed via SSVEP. Mode switching is controlled by the EEG signals generated by blinking; autonomous navigation is carried out in the unstructured environment, and deviations arising during navigation are corrected in time through EEG. EEG, blinking, and SLAM technology are thus combined to achieve intelligent navigation in the unstructured environment.
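The blink-driven mode switch above relies on counting a run of deliberate blinks in the frontal EEG. A minimal sketch follows; the amplitude range, interval range, and blink count are assumed placeholders for illustration, not values disclosed by the patent:

```python
import numpy as np

def detect_conscious_blinks(fp2, fs=250, amp_range=(80.0, 400.0),
                            interval_range=(0.2, 1.0), n_required=3):
    """Detect a run of deliberate blinks in a single-channel Fp2 EEG trace.

    Peaks whose amplitude falls inside amp_range (microvolts) are treated
    as blink artifacts; a mode switch is signalled when n_required
    consecutive peaks are separated by intervals (seconds) inside
    interval_range. All thresholds here are illustrative assumptions.
    """
    lo, hi = amp_range
    # simple local-maximum peak picking constrained to the amplitude window
    peaks = [i for i in range(1, len(fp2) - 1)
             if fp2[i] > fp2[i - 1] and fp2[i] >= fp2[i + 1]
             and lo <= fp2[i] <= hi]
    # look for n_required consecutive peaks with valid spacing
    for k in range(len(peaks) - n_required + 1):
        gaps = np.diff(peaks[k:k + n_required]) / fs
        if np.all((gaps >= interval_range[0]) & (gaps <= interval_range[1])):
            return True
    return False
```

In a live system the same test would run on a sliding window of the Fp2 stream, toggling between mapping and autonomous-navigation modes when it fires.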
(2) Multi-modal EEG feature extraction: frequency-domain features of the SSVEP signals are extracted with Canonical Correlation Analysis (CCA); Event-Related Synchronization (ERS) / Event-Related Desynchronization (ERD) features induced by imagined movement are extracted with the Common Spatial Pattern (CSP) method; and whether a blink is deliberate is detected using the peak-height range and peak-interval range of the EEG collected at the forehead electrode Fp2 as judgment conditions.
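The CCA step above scores each candidate stimulus frequency by the canonical correlation between the EEG segment and a bank of sine/cosine references. A minimal numpy sketch, with candidate frequencies, harmonic count, and window length as illustrative assumptions:

```python
import numpy as np

def canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    # canonical correlations are the singular values of Qx^T Qy
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[0]

def ssvep_frequency(eeg, fs, candidates, n_harmonics=2):
    """Pick the stimulus frequency whose sine/cosine reference set
    correlates best with a multi-channel EEG segment (samples x channels)."""
    t = np.arange(eeg.shape[0]) / fs
    best_f, best_r = None, -1.0
    for f in candidates:
        ref = np.column_stack(
            [fn(2 * np.pi * f * h * t)
             for h in range(1, n_harmonics + 1)
             for fn in (np.sin, np.cos)])
        r = canonical_corr(eeg, ref)
        if r > best_r:
            best_f, best_r = f, r
    return best_f
```

Each recognized frequency maps to one command in the instruction set (e.g. forward, turn, save map), which is why SSVEP supports the multi-command control described here.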
(3) Abnormal-condition EEG/eye-movement detection: EEG signals and eye-movement information elicited by abnormal states such as obstacles are collected, preprocessed, and classified, ensuring that the mobile robot can complete its task under abnormal conditions; control of the mobile robot is taken over through EEG signals, and suggestions are made for the next decision.
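The decision-layer step can be sketched as a weighted fusion of per-modality anomaly scores. The normalization, weights, and threshold below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def anomaly_scores(eeg_power, baseline_mean, baseline_std,
                   saccade_rate, normal_saccade_rate):
    """Normalize raw EEG band power and eye-movement saccade rate into
    [0, 1] anomaly scores (illustrative normalization, not from the patent)."""
    z = (eeg_power - baseline_mean) / baseline_std
    eeg_score = 1.0 / (1.0 + np.exp(-z))          # logistic squashing of z-score
    eye_score = min(1.0, saccade_rate / (2.0 * normal_saccade_rate))
    return eeg_score, eye_score

def fuse_decision(eeg_score, eye_score, w_eeg=0.6, w_eye=0.4, threshold=0.7):
    """Decision-layer fusion: weighted vote over the two modality scores.
    When the fused score crosses the threshold, control is handed to the
    operator and the event is flagged for the next decision."""
    fused = w_eeg * eeg_score + w_eye * eye_score
    return fused >= threshold
```

A trained classifier would normally replace the fixed weights; the fixed-weight vote only illustrates where the two modalities meet at the decision layer.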
(4) Human brain fatigue detection: EEG and eye-movement signals are collected in real time during navigation; the CSP algorithm extracts EEG features, which are fused with eye-movement features. Indices such as brain fatigue degree and subjective sleepiness are analyzed to evaluate the fatigue state and issue corresponding prompts and warnings.
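The CSP extraction named above can be sketched with the standard whitening formulation: spatial filters that maximize variance for one class of trials while minimizing it for the other, followed by log-variance features. Trial shapes, filter counts, and class labels are assumptions for illustration:

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Common Spatial Pattern filters from two classes of EEG trials.

    trials_*: iterable of (n_channels, n_samples) arrays.
    Returns a (2*n_pairs, n_channels) filter matrix whose rows maximize
    variance for one class while minimizing it for the other.
    """
    def mean_cov(trials):
        covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)

    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # whitening transform built from the composite covariance Ca + Cb
    evals, evecs = np.linalg.eigh(Ca + Cb)
    P = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    # eigendecomposition of the whitened class-A covariance
    vals, vecs = np.linalg.eigh(P @ Ca @ P.T)
    order = np.argsort(vals)
    pick = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return (P.T @ vecs[:, pick]).T   # rows are spatial filters

def log_var_features(trial, W):
    """Log-variance features of a spatially filtered trial (channels x samples)."""
    z = W @ trial
    v = z.var(axis=1)
    return np.log(v / v.sum())
```

For fatigue detection the two classes would be, e.g., alert versus fatigued calibration segments, with the resulting log-variance features fused with eye-movement indices at the decision layer.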
(5) Distributed control: the system adopts distributed control, as shown in FIG. 2, organized as a three-level model of control station, operator station, and processing station. The control station is mounted on the mobile robot, with various sensors below and a wireless router above; the operator station provides image display and EEG/eye-electrical signal acquisition; the processing station is mainly responsible for EEG and eye-movement signal processing, instruction-set construction, and related functions. All parts are organically connected: centrally managed, dispersedly controlled, and cooperatively navigating.
The invention performs robot navigation through man-machine cooperation and has the following innovations and advantages:
(1) A man-machine cooperative navigation system is constructed through multi-modal EEG acquisition, so that the surrounding environment can be mapped and perceived. Through blink-based switching of working modes, autonomous navigation and abnormal-event perception and decision-making are achieved in the navigation mode; meanwhile, the operator's brain fatigue can be detected to prevent over-fatigue.
(2) The invention addresses sensor failure of the mobile robot during autonomous navigation: a rapid detection technique combining EEG and eye movement exploits the human's fast response to abnormal events for early prediction and control, while a robot with failed sensors is taken over through brain-computer interface technology.
(3) For errors produced by sensors in autonomous navigation, the mobile robot's position is adjusted through brain-computer interface control, eliminating the deviation between the navigated path and the preset path and preventing collisions.
Drawings
FIG. 1 is a hardware block diagram of a man-machine cooperative intelligent navigation system;
FIG. 2 is a block diagram of a distributed architecture system;
FIG. 3 is a block diagram of a specific implementation.
Detailed Description
The invention is further illustrated below with reference to FIG. 3:
After the mobile robot starts, the operator observes the real-time state of the unstructured environment. Using video overlay technology, the steady-state visually evoked stimulus blocks are superimposed on the returned video; the operator fine-tunes the robot's travel with SSVEP, controls the mobile robot to build a two-dimensional map, saves the two-dimensional map with a brain-control command, and finishes mapping.
The navigation mode is switched by blinking consecutively. In autonomous navigation mode, the operator monitors the deviation between the environment and the preset path; when the deviation exceeds a set threshold, the computer issues a prompt, the mobile robot's position is adjusted with a brain-control command, and autonomous navigation then resumes.
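The threshold test above can be sketched as follows; the waypoint-distance metric and the 0.3 m threshold are assumptions for illustration (the patent does not disclose a numeric threshold):

```python
import numpy as np

def path_deviation(pose, planned_path):
    """Shortest distance from the robot's current (x, y) pose to the
    planned path, taken as the minimum distance to any path waypoint."""
    diffs = np.asarray(planned_path) - np.asarray(pose)
    return float(np.min(np.hypot(diffs[:, 0], diffs[:, 1])))

def check_deviation(pose, planned_path, threshold=0.3):
    """Return (needs_correction, deviation). When the deviation exceeds
    the threshold (meters; assumed placeholder), the operator is prompted
    to issue a brain-controlled position correction."""
    d = path_deviation(pose, planned_path)
    return d > threshold, d
```

A denser path representation (segment projection rather than nearest waypoint) would tighten the metric, but the threshold-and-prompt logic is the same.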
Once an external sensor of the mobile robot fails, the robot returns an error message and the operator's brain takes over control of the mobile robot to finish the task. For suddenly appearing obstacles, the EEG signals of the brain's instinctive response and the eye-movement signals elicited by the abnormal state are collected; after data fusion, the abnormal state is judged by the pre-established EEG/eye-movement detection model, and instructive opinions are given for the next decision.
By collecting biological signals such as EEG and eye movement and performing data processing, feature extraction, and decision-level fusion, the operator's brain fatigue state is identified, and the operator is prompted to rest and stay focused.

Claims (6)

1. A new man-machine cooperative intelligent navigation technology in an unstructured environment, characterized in that:
multi-modal human brain perception is combined with artificial-intelligence SLAM to realize cooperative intelligent navigation, the brain and the machine cooperating to complete the mobile robot's two-dimensional mapping and autonomous navigation in an unstructured environment; and an abnormal-condition EEG/eye-movement detection model is established to realize the mobile robot's perception of and decision-making about abnormal conditions in the unstructured environment.
2. The new man-machine cooperative intelligent navigation technology in an unstructured environment of claim 1, characterized in that:
the robot's start, stop, and map saving are controlled with motor imagery EEG signals, and the robot's motion is cooperatively controlled with the EEG control signals evoked by steady-state visual stimulation together with SLAM technology; the mobile robot's working modes are switched with consecutive blink signals.
3. The new man-machine cooperative intelligent navigation technology in an unstructured environment of claim 1, characterized in that:
the deviation between the mobile robot's radar point cloud and the static map is detected, and the mobile robot's position is adjusted with EEG signals to reduce the path deviation produced during navigation.
4. The new man-machine cooperative intelligent navigation technology in an unstructured environment of claim 1, characterized in that:
when an abnormal condition that the computer cannot resolve is encountered in the unstructured environment, the human brain takes over control of the mobile robot to perform tasks such as obstacle avoidance and recognition.
5. The new man-machine cooperative intelligent navigation technology in an unstructured environment of claim 1, characterized in that:
information such as EEG and eye movement evoked by abnormal conditions is collected, decision-level data fusion modeling is performed, abnormal conditions are detected during man-machine cooperative intelligent navigation, and the mobile robot's cognition of and decision-making in the unstructured environment are improved.
6. The new man-machine cooperative intelligent navigation technology in an unstructured environment of claim 1, characterized in that:
based on the EEG and eye-movement data collected in real time, the human brain fatigue state is detected and analyzed during man-machine cooperative intelligent navigation.
CN202110622633.6A 2021-06-04 2021-06-04 New man-machine cooperative intelligent navigation technology in unstructured environment Pending CN113359689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110622633.6A CN113359689A (en) 2021-06-04 2021-06-04 New man-machine cooperative intelligent navigation technology in unstructured environment

Publications (1)

Publication Number Publication Date
CN113359689A true CN113359689A (en) 2021-09-07

Family

ID=77532035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110622633.6A Pending CN113359689A (en) 2021-06-04 2021-06-04 New man-machine cooperative intelligent navigation technology in unstructured environment

Country Status (1)

Country Link
CN (1) CN113359689A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190027617A (en) * 2017-09-07 2019-03-15 고려대학교 산학협력단 Brain-computer interface apparatus and brain-computer interfacing method for manipulating robot arm apparatus
CN109634407A * 2018-11-08 2019-04-16 中国运载火箭技术研究院 Control method based on synchronous acquisition and fusion of multi-modal human-machine sensing information
CN110955251A (en) * 2019-12-25 2020-04-03 华侨大学 Petri network-based mobile robot brain-computer cooperative control method and system
CN111638724A * 2020-05-07 2020-09-08 西北工业大学 Novel brain-computer cooperative intelligent control method for unmanned aerial vehicle swarms
CN111890389A (en) * 2020-06-22 2020-11-06 东南大学 Multi-mobile robot cooperative control system based on multi-modal interactive interface

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIE SONGYUN: "Design of a video feedback SSVEP-BCI system for car control based on improved MUSIC method", 2018 6th International Conference on Brain-Computer Interface (BCI) *
XIE SONGYUN: "Brain-control system for robots based on mental tasks" (基于意识任务的机器人脑控系统), Journal of Northwestern Polytechnical University (西北工业大学学报) *

Similar Documents

Publication Publication Date Title
CN104083258B Method for controlling an intelligent wheelchair based on a brain-computer interface and autonomous driving technology
CN107097227B (en) human-computer cooperation robot system
Escolano et al. A telepresence mobile robot controlled with a noninvasive brain–computer interface
Barea et al. Wheelchair guidance strategies using EOG
CN112356841B (en) Vehicle control method and device based on brain-computer interaction
CN102980454B (en) Explosive ordnance disposal (EOD) method of robot EOD system based on brain and machine combination
CN112051780B (en) Brain-computer interface-based mobile robot formation control system and method
CN106708251A (en) Eyeball tracking technology-based intelligent glasses control method
CN104021370A (en) Driver state monitoring method based on vision information fusion and driver state monitoring system based on vision information fusion
CN102981625A (en) Eye movement remote control method and system
CN103116279A Fuzzy discrete-event shared control method for a brain-controlled robotic system
CN105759650A (en) Method used for intelligent robot system to achieve real-time face tracking
CN102895093A (en) Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor
CN106774325A (en) Robot is followed based on ultrasonic wave, bluetooth and vision
CN106377228A (en) Monitoring and hierarchical-control method for state of unmanned aerial vehicle operator based on Kinect
Escolano et al. Human brain-teleoperated robot between remote places
CN113778113B (en) Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals
CN113359689A (en) New man-machine cooperative intelligent navigation technology in unstructured environment
Pizziol et al. Towards human operator “state” assessment
Raković et al. The Gaze Dialogue Model: Nonverbal Communication in HHI and HRI
EP4002328A1 Artificial assistance method, related device and system
CN113084776B (en) Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion
Huo et al. A BCI-Based Motion Control System for Heterogeneous Robot Swarm
Garrote et al. Reinforcement learning motion planning for an EOG-centered robot assisted navigation in a virtual environment
Yuan et al. Brain teleoperation of a mobile robot using deep learning technique

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination