CN109262656A - Animal robot stimulation parameter measurement system and method based on machine vision - Google Patents

Animal robot stimulation parameter measurement system and method based on machine vision

Info

Publication number
CN109262656A
CN109262656A CN201811283293.3A CN201811283293A CN109262656A
Authority
CN
China
Prior art keywords
stimulation
robot
stimulus
animal
animal robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811283293.3A
Other languages
Chinese (zh)
Other versions
CN109262656B (en)
Inventor
杨俊卿
隋美娥
槐瑞托
汪慧
孙博
李玉霞
苏涛
杨瑞东
苏学成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN201811283293.3A priority Critical patent/CN109262656B/en
Priority to PCT/CN2018/123659 priority patent/WO2020087717A1/en
Publication of CN109262656A publication Critical patent/CN109262656A/en
Application granted granted Critical
Publication of CN109262656B publication Critical patent/CN109262656B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095Means or methods for testing manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Toys (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a machine-vision-based system and method for measuring the stimulation parameters of animal robots, belonging to the field of animal robot control. The system uses machine vision technology to obtain the motion trajectory and turning direction of an animal robot, enabling more accurate and quantitative analysis of its motor behavior. It further quantifies the correspondence between stimulation parameters and the controlled actions of the animal robot, thereby determining the optimal stimulation parameters suited to a particular animal individual and providing essential prior information for applied animal robot research. By replacing manual judgment with machine vision, the proposed method guarantees a standardized experimental procedure and objective experimental data, yielding objective and consistent stimulation parameter measurements. At the same time, it greatly reduces the low-value labor previously required by manual stimulation parameter measurement experiments and improves the efficiency of animal robot research.

Description

Animal robot stimulation parameter measurement system and method based on machine vision
Technical field
The invention belongs to the field of animal robot control, and in particular relates to a machine-vision-based animal robot stimulation parameter measurement system and method.
Background technique
An animal robot exploits the motor functions of an animal: by stimulating the animal's sensory afferent nerves, certain behaviors of the animal can be artificially controlled or guided; such animals are also called "cyborgs". Animal robots use brain-computer interface (Brain Computer Interface, BCI) technology to realize direct information interaction between an external controller and the biological brain, achieving precise control of the animal robot's behavior. Compared with traditional mechanical robots, an animal robot uses a living organism as its carrier, which greatly simplifies modules such as motion control in robot design. When facing emergencies or complex geographical environments, an animal robot can react quickly and effectively by biological instinct, giving it higher flexibility, environmental adaptability, and concealment. Moreover, an animal robot relies on its own foraging behavior and does not need external equipment to supply the energy required for movement, which greatly reduces energy consumption and extends endurance.
In short, an animal robot uses a living animal as the body and controls its cranial nerves or muscles with coded electrical signals to realize intelligent control of that body. However, because of individual differences among animals and positioning errors during surgery, the electrical stimulation parameters required to control each animal robot differ. Therefore, obtaining experimental data for every stimulation channel of each animal robot through stimulation tests, and then selecting suitable stimulation parameters, is an important link in animal robot research; we call this experimental link the "stimulation parameter measurement experiment". This experiment is essential to animal robot research and provides prior information for later applied studies.
At present, stimulation parameter measurement experiments rely mainly on manual operation: the experimenter must draw on personal experience, continuously adjust the stimulation parameters while observing the animal's motor behavior, record the stimulation parameter data during the experiment, and then determine the most suitable parameters. This working method has many shortcomings. First, the experiment varies from person to person: because different experimenters differ in experience and understanding, and because of the experimenter's subjectivity, consistent records cannot be obtained from identical experimental procedures. For example, when recording the rotation angle of an animal robot under electrical stimulation with specific parameters, judging the angle by eye inevitably introduces deviation. Such inconsistent, human-dependent records adversely affect the analysis of the correspondence between stimulation parameters and rotation angle, and ultimately prevent the objective law relating the two from being obtained. Second, physiological limits such as the operator's attention and stamina make sustained, high-standard engagement in the experiment impossible, so key test steps are often missed; although a missed test can be supplemented later, doing so destroys the consistency of the experimental procedure, and without a standardized procedure consistent results cannot be obtained. Third, manually operated stimulation parameter measurement experiments are time-consuming and also increase labor and financial costs.
Summary of the invention
In view of the above technical problems in the prior art, the invention proposes a machine-vision-based animal robot stimulation parameter measurement system and method that is rationally designed, overcomes the deficiencies of the prior art, and achieves good results.
To achieve the above goals, the present invention adopts the following technical scheme:
A machine-vision-based animal robot stimulation parameter measurement system comprises a measurement and control (TT&C) system and a stimulator. The TT&C system mainly consists of an industrial camera, a PC, and wireless communication module A; the stimulator mainly consists of a microprocessor, a multi-channel coded-signal generator, wireless communication module B, and a basic function circuit.
The industrial camera is connected to the PC through a USB interface and is configured to capture the motion state of the animal robot during the stimulation test and send the video data to the PC.
Wireless communication module A is connected to the PC through a serial port and is configured to transmit data from the PC wirelessly.
The PC, using a data analysis system developed on the OpenCV library, analyzes and processes the transmitted stimulus signals and the captured video frames; by detecting the feature points of the rat robot, it analyzes the turning state and calculates the rotation angle of the rat robot. Combining each rotation with its corresponding stimulus signal, it filters out the video recordings in which the rat robot is effectively controlled and automatically saves these controllable recordings. Finally, it analyzes and compares stimulus signal intensity against rotation angle to obtain the correspondence between the rotation angle of the tested rat robot and the stimulus intensity, and calculates the controllability sensitivity.
Wireless communication module B is connected to the microprocessor through a serial port and is configured to wirelessly receive data from the PC.
The microprocessor is connected to wireless communication module B through a serial port and, at the same time, controls the working state of the multi-channel coded-signal generator through its I/O ports. Based on the information received by wireless communication module B, the microprocessor generates the desired coded electrical stimulation signal and applies it to the target brain area of the animal robot, causing the animal to produce the desired action.
The multi-channel coded-signal generator is configured to process and convert the original signal from the microprocessor, and to realize stimulation-channel selection, start, and stop under the control of the microprocessor, thereby producing a coded stimulus signal whose frequency, amplitude, and duration are all adjustable, and applying this coded stimulus signal to the target brain area.
The basic function circuit is configured to supply power to the microprocessor and wireless communication module B.
The PC-based TT&C system generates stimulation parameters and control commands according to set rules and sends them to the stimulator through wireless communication module A. At the same time, the PC-based TT&C system synchronously records and saves the stimulation parameter information during the experiment and the video files reflecting the controlled actions of the animal robot. Finally, based on the video files and the corresponding stimulation parameters, the correspondence between controlled actions and stimulation parameters is derived.
Preferably, a marker block is provided on the stimulator; the marker block is configured to standardize the measurement of the animal robot's steering angle during the stimulation test. In the actual steering angle measurement, the average of the steering angles of the three marker lines is taken as the final steering angle. The marker block consists of three lines of different colors, each marked with an arrow identifying the direction of that line; the three lines intersect at one point and are evenly distributed in the plane, 120 degrees apart from one another.
Preferably, wireless communication modules A and B use the NRF9E5 wireless communication chip; the industrial camera is a 5-megapixel industrial camera with a USB 3.0 data interface; the microprocessor of the stimulator is a C8051F410 chip; and the coded-signal generator is composed of 4 groups of symmetrical triodes and 2 MAX309 chips.
In addition, the invention also proposes a machine-vision-based animal robot stimulation parameter measurement method. The method uses the machine-vision-based animal robot stimulation parameter measurement system described above and proceeds according to the following steps:
Step 1: Mount the stimulator with the marker block on the back of the animal robot, connect its output to the pre-implanted electrode interface socket, and start the TT&C system.
Step 2: Place the animal robot on a plane with a solid-color background so that the feature lines in the marker block can be located and segmented.
Step 3: Set the parameters. The initial parameters are: current amplitude 50 uA, pulse width 2, pulse number 5, pulse frequency 90 Hz. The maximum parameter values are: current amplitude 130 uA, pulse width 9, pulse number 30, pulse frequency 130 Hz.
Step 4: After the TT&C system completes initialization, it first analyzes the real-time image information from the industrial camera to judge the current position of the animal robot and selects the stimulus type. The parameter variation of every stimulus type follows the rule that the stimulus intensity gradually increases from weak to strong.
Step 5: After the stimulus type is determined, the TT&C system, according to the progress of that stimulus type and following the rule of gradually increasing stimulus intensity, sends the parameter data together with the stimulus type command to the stimulator through wireless communication module A.
Step 6: After power-on, the stimulator's microprocessor completes initialization and enters a waiting state. After receiving the parameter data and the stimulation command from the TT&C system through wireless communication module B, the microprocessor controls the working state of the multi-channel coded-signal generator according to the received information, causing it to produce the corresponding coded electrical stimulation signal, which is applied to the corresponding brain nucleus of the animal robot to control its motor behavior. The stimulation signal is repeated 10 times within 5 seconds. While generating the stimulation, the stimulator sends a stimulation-start marker to the TT&C system through wireless communication module B; upon receiving the start marker, the TT&C system starts the recording function and begins to record video, and stores the video files by category in three different folders. After this stimulation ends, the stimulator sends an end marker to the TT&C system and returns to the waiting state; upon receiving the end marker, the TT&C system stops recording after a delay of 2 seconds.
Step 7: After all experiments are completed, the PC analyzes the video data and draws the animal robot's motion trajectory from the video files. The TT&C system reads each frame of digital image in the video file, locates the intersection point of the three lines according to the specific colors in the marker block, and calculates the coordinates of that intersection point in the image; finally, connecting the intersection coordinates in successive frames yields the motion trajectory of the animal robot. Meanwhile, the first and last frames are extracted from the video file. The first frame is processed by extracting the three marker lines in the image according to color information and calculating the direction angle of each line in the image plane; these angles are denoted A, B, and C. The last frame is processed in the same way, with direction angles denoted a, b, and c. The steering angle of the animal robot is then ((a-A)+(b-B)+(c-C))/3.
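The steering-angle computation in Step 7 can be sketched directly: given the three marker-line direction angles in the first frame (A, B, C) and in the last frame (a, b, c), the steering angle is the mean of the three per-line differences. The wrap-around normalization below (mapping differences into (-180, 180]) is an assumption added for robustness; the patent states only the plain average.

```python
# Steering angle per Step 7: mean of the three marker-line angle changes.
# Wrap-around normalization into (-180, 180] is an added assumption so that
# a line crossing the 0/360 boundary does not distort the average.

def wrap_deg(d):
    """Map an angle difference into the interval (-180, 180]."""
    d = d % 360.0
    return d - 360.0 if d > 180.0 else d

def steering_angle(first, last):
    """first = (A, B, C), last = (a, b, c): direction angles in degrees."""
    return sum(wrap_deg(l - f) for f, l in zip(first, last)) / 3.0

# A positive result means a left turn, a negative one a right turn
# (sign convention taken from the embodiment described later).
print(steering_angle((10.0, 130.0, 250.0), (40.0, 160.0, 280.0)))  # 30.0
```

Averaging over three lines 120 degrees apart also gives some robustness: if one line is poorly segmented in a frame, the other two still constrain the estimate.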
Step 8: After the above processing, image processing techniques are used to stitch together the trajectory images of like stimulations with identical stimulation parameters, and the steering angle is displayed on each corresponding trajectory plot to form a single image. From this image, the consistency of the control results and the stability of the control effect under identical stimulation parameters are examined, and finally the parameters with stable results are selected.
Preferably, in Step 4, the stimulus types are divided into three kinds: left-turn, right-turn, and forward. The TT&C system chooses the stimulus type according to the animal's position during the experiment and the progress of each type's tests: if the right side of the animal robot is close to the edge of the arena, the left-turn stimulus type is selected; if the left side of the animal robot is close to the edge of the arena, the right-turn stimulus type is selected; if the animal robot is not at the edge of the arena, the TT&C system compares the progress of the left-turn, right-turn, and forward stimulus types and selects the one with slower progress to carry out the stimulation test, until all three stimulus types are completed.
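The position-based selection rule above can be summarized in a small helper. The coordinate convention (horizontal position increasing to the right) and the edge-margin threshold are illustrative assumptions; the patent describes the rule only qualitatively.

```python
# Hedged sketch of the stimulus-type selection rule: turn the animal away
# from whichever arena edge it is near; otherwise pick the stimulus type
# whose test sequence has made the least progress. The margin value and
# coordinate convention are assumptions for illustration.

LEFT, RIGHT, FORWARD = 1, 2, 3  # type codes from the patent's naming scheme

def select_stimulus(x, arena_width, progress, margin=50):
    """x: horizontal position of the robot; progress: dict type -> trials done."""
    if x >= arena_width - margin:   # near the right edge -> left-turn stimulus
        return LEFT
    if x <= margin:                 # near the left edge -> right-turn stimulus
        return RIGHT
    # Not near an edge: choose the least-advanced type (slowest progress).
    return min(progress, key=progress.get)

progress = {LEFT: 4, RIGHT: 6, FORWARD: 2}
print(select_stimulus(300, 640, progress))  # 3 (forward is least advanced)
print(select_stimulus(620, 640, progress))  # 1 (near right edge -> left turn)
```

This keeps the animal inside the arena while balancing test coverage across the three stimulus types.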
Preferably, in Step 4, the rule for gradually increasing the stimulus intensity is: in the order amplitude, width, number, and frequency, each parameter change alternately increases one of these 4 parameters. The increments are as follows: current amplitude increment 5 uA, pulse width increment 1, pulse number increment 5, pulse frequency increment 10 Hz, until all 4 variables related to stimulus intensity reach the set maxima. When the parameters of a certain stimulus type reach their maxima, one test cycle of that type is complete, and the next test cycle then begins.
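The round-robin increment schedule in this clause can be sketched as a generator: starting from the initial values in Step 3, each step bumps one parameter in the fixed order amplitude → width → number → frequency until all four reach their maxima. The behavior when parameters hit their caps at different times (here: skip the maxed-out parameter) is an assumption; the patent states only the order and the increments.

```python
# Hedged sketch of the stimulus-intensity sweep: parameters are increased
# one at a time, cycling through amplitude, width, number, frequency, with
# the increments and limits given in the patent. A parameter already at its
# maximum is skipped; this skip rule is an assumption.

INIT = {"amplitude": 50, "width": 2, "number": 5, "frequency": 90}
MAX = {"amplitude": 130, "width": 9, "number": 30, "frequency": 130}
STEP = {"amplitude": 5, "width": 1, "number": 5, "frequency": 10}
ORDER = ["amplitude", "width", "number", "frequency"]

def sweep():
    """Yield successive parameter sets until every parameter is maxed out."""
    params = dict(INIT)
    yield dict(params)
    while any(params[k] < MAX[k] for k in ORDER):
        for k in ORDER:
            if params[k] < MAX[k]:
                params[k] = min(params[k] + STEP[k], MAX[k])
                yield dict(params)

steps = list(sweep())
print(steps[1])   # first change: amplitude 50 -> 55
print(steps[-1])  # all parameters at their maxima
```

With these increments the amplitude needs 16 steps, the width 7, the number 5, and the frequency 4, so one full cycle visits 33 parameter sets including the initial one.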
Preferably, in Step 6, the three different folders are named in the format "stimulus type code + cycle number + stimulation parameters". The stimulus type codes are: left turn 1, right turn 2, forward 3. The cycle number starts from 1 and is incremented by 1 after each completed cycle. The stimulation parameters are concatenated in the order amplitude, pulse width, pulse number, and stimulation signal frequency.
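The naming scheme can be sketched as an encoder. The zero-padding of every field after the one-digit type code to three digits follows the worked example given in the embodiment (file name 1002100005020090); the matching decoder is an added convenience, not described in the patent.

```python
# Encode "stimulus type code + cycle number + stimulation parameters" into
# a file name. Each field after the one-digit type code is zero-padded to
# three digits, following the embodiment's worked example.

def encode_name(type_code, cycle, amplitude, width, number, frequency):
    fields = [cycle, amplitude, width, number, frequency]
    return str(type_code) + "".join(f"{v:03d}" for v in fields)

def decode_name(name):
    """Inverse of encode_name; an added convenience, not in the patent."""
    t = int(name[0])
    vals = [int(name[i:i + 3]) for i in range(1, 16, 3)]
    return (t, *vals)

# Left turn (code 1), 2nd cycle, 100 uA, width 5, 20 pulses, 90 Hz:
print(encode_name(1, 2, 100, 5, 20, 90))  # 1002100005020090
print(decode_name("1002100005020090"))    # (1, 2, 100, 5, 20, 90)
```

Because the file name fully determines the stimulus, the analysis stage can later recover the stimulation parameters from the video file name alone, as the embodiment describes.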
Advantageous effects brought by the present invention:
Present system obtains the behavioural characteristic of animal robot by machine vision technique, can move to animal robot More accurate and analysis and judgement of quantification are made, and automatically analyzes out the suitable of each stimulation channels in conjunction with the method for the present invention Suitable stimulation parameter range, effectively overcomes previous deficiency, not only ensure that property and the visitor of experimental data of experiment flow standard The property seen, obtains objective consistent stimulation parameter measurement result, and significantly reduce as brought by stimulation parameter measurement experiment Low value labour, improves the Efficiency of animal robot.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of the machine-vision-based animal robot experimental data acquisition system.
Fig. 2 is a working principle schematic diagram of the machine-vision-based animal robot experimental data acquisition system.
Fig. 3 is the TT&C system workflow diagram.
Fig. 4 is the stimulator workflow diagram.
Fig. 5 is a schematic diagram of marker line extraction and angle calculation.
Fig. 6 is a schematic diagram of the machine-vision-based stimulation parameter measurement experiment results.
Specific embodiment
The invention is described in further detail below with reference to the accompanying drawings and specific embodiments:
A machine-vision-based animal robot stimulation parameter measurement system, as shown in Fig. 1, comprises a TT&C system and a stimulator. The TT&C system mainly consists of an industrial camera, a PC, and wireless communication module A; the stimulator mainly consists of a microprocessor, a multi-channel coded-signal generator, wireless communication module B, and a basic function circuit.
The industrial camera is connected to the PC through a USB interface and is configured to capture the motion state of the animal robot during the stimulation test and send the video data to the PC.
Wireless communication module A is connected to the PC through a serial port and is configured to transmit data from the PC wirelessly.
The PC, using a data analysis system developed on the OpenCV library, analyzes and processes the transmitted stimulus signals and the captured video frames; by detecting the feature points of the rat robot, it analyzes the turning state and calculates the rotation angle of the rat robot. Combining each rotation with its corresponding stimulus signal, it filters out the video recordings in which the rat robot is effectively controlled and automatically saves these controllable recordings. Finally, it analyzes and compares stimulus signal intensity against rotation angle to obtain the correspondence between the rotation angle of the tested rat robot and the stimulus intensity, and calculates the controllability sensitivity.
Wireless communication module B is connected to the microprocessor through a serial port and is configured to wirelessly receive data from the PC.
The microprocessor is connected to wireless communication module B through a serial port and, at the same time, controls the working state of the multi-channel coded-signal generator through its I/O ports. Based on the information received by wireless communication module B, the microprocessor generates the desired coded electrical stimulation signal and applies it to the target brain area of the animal robot, causing the animal to produce the desired action.
The multi-channel coded-signal generator is configured to process and convert the original signal from the microprocessor, and to realize stimulation-channel selection, start, and stop under the control of the microprocessor, thereby producing a coded stimulus signal whose frequency, amplitude, and duration are all adjustable, and applying this coded stimulus signal to the target brain area.
The basic function circuit is configured to supply power to the microprocessor and wireless communication module B.
The PC-based TT&C system generates stimulation parameters and control commands according to set rules and sends them to the stimulator through wireless communication module A. At the same time, the PC-based TT&C system synchronously records and saves the stimulation parameter information during the experiment and the video files reflecting the controlled actions of the animal robot. Finally, based on the video files and the corresponding stimulation parameters, the correspondence between controlled actions and stimulation parameters is derived.
The embodiments of the invention are illustrated using a robot rat as an example; the working principle of the machine-vision-based animal robot experimental data acquisition system is shown in Fig. 2. During the experiment, a marker plate bearing red, green, and blue marker lines is mounted on the surface of the stimulator, and the two are fixed together on the back of the rat. The output of the stimulator is connected to the pre-implanted electrode interface socket. The industrial camera is fixed directly above the experimental site, and the PC reads the video data from the industrial camera through the USB interface. The TT&C system is responsible for analyzing the image information to judge the rat's position in the experimental site, interacting with the stimulator by wireless communication to control the stimulator's working state, and receiving the stimulator's feedback data.
The TT&C system workflow is shown in Fig. 3. The TT&C system first analyzes the real-time image from the industrial camera to determine the rat's position in the arena: if the rat's left side is close to the edge of the arena, the TT&C system selects the right-turn stimulus type; if the rat's right side is close to the edge, it selects the left-turn stimulus type; if the rat is in the center of the arena, the TT&C system selects, according to the progress of the three stimulus types, the type with slower progress for the stimulation test. After the stimulus type is determined, the TT&C system derives from the progress of that type the stimulation parameters needed for this test, and sends the parameter data together with the stimulus type command to the stimulator through wireless communication module A. Meanwhile, it waits for the stimulator's feedback: after receiving the start signal, it starts the recording function and begins recording the robot rat experiment video; after receiving the end marker from the stimulator, it stops recording after a 2-second delay and stores the just-recorded experiment video, by category, in the specified folder. The video file is named in the format "stimulus type code + cycle number + stimulation parameters". For example, suppose the current stimulus is a left-turn stimulation in the second cycle, with parameters: current amplitude 100 uA, pulse width 5, pulse number 20, pulse frequency 90 Hz. Apart from the stimulus type code, which remains a single digit, every other field is zero-padded to 3 digits: cycle number 002, current amplitude 100, pulse width 005, pulse number 020, pulse frequency 090. Combining these, the file name of that video is 1002100005020090. The system then enters the next cycle.
The stimulator workflow is shown in Fig. 4. After power-on, the microprocessor C8051F410 completes initialization and the stimulator enters a waiting state. After the parameters and commands from the TT&C system are received through wireless communication module B, the microprocessor C8051F410 controls the working state of the coded-signal generator according to the received information, causing it to produce the corresponding coded electrical stimulation signal, which is applied to the corresponding brain nucleus of the rat to control the rat's motor behavior. The stimulation signal is repeated 10 times within 5 seconds. While generating the stimulation, the stimulator sends a start marker to the TT&C system through the wireless communication chip; after this stimulation ends, it sends an end marker, then returns to the waiting state and repeats the above process.
After the experiment, the TT&C system analyzes the recorded video files one by one. From the name of each video file, the TT&C system obtains the stimulation parameters and stimulus type corresponding to that file; it then reads each frame of image data in the video, extracts the marker point O according to the preset color (brown), as shown in Fig. 5, and calculates the coordinates of that point. Finally, all the point coordinates are connected and displayed on one image to obtain the motion trajectory of the robot rat. In addition, the first and last frames of the video are read and the marker block in each image is located and segmented. By the method shown in Fig. 5, the three lines in the marker block are extracted according to color information, and the coordinates of the two endpoints of each line in the image are calculated. With the horizontal rightward direction as the reference, the endpoint coordinates of each line segment give the direction angle of the red marker line CD as θ1, the direction angle of the green marker line AB as 360-θ2, and the direction angle of the blue marker line EF as 180-θ3. Suppose Fig. 5 shows the marker block in the first frame, with the three angles denoted A, B, and C; in the same way, the direction angles of the three lines in the marker block of the last frame are obtained and denoted a, b, and c. The steering angle of the robot rat is then an = ((a-A)+(b-B)+(c-C))/3. If an is greater than zero, the robot rat has turned left by an degrees; if an is less than zero, the robot rat has turned right by an degrees.
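The per-line direction angle used above can be computed from the two endpoint coordinates with atan2, referenced to the horizontal rightward direction. The image-coordinate convention (y axis pointing downward, hence the sign flip) is an assumption about the setup; the patent's θ1/θ2/θ3 construction suggests but does not fully specify it.

```python
import math

# Direction angle of a marker line from its two endpoints, measured from
# the horizontal rightward direction, in [0, 360). The y axis of image
# coordinates is assumed to point downward, hence the negated dy.

def direction_angle(p1, p2):
    (x1, y1), (x2, y2) = p1, p2
    ang = math.degrees(math.atan2(-(y2 - y1), x2 - x1))  # flip image y axis
    return ang % 360.0

# A line from (0, 0) to (1, -1) in image coordinates points up-right: 45 deg.
print(direction_angle((0, 0), (1, -1)))  # 45.0
```

Note that the endpoint order matters: the arrows printed on the marker lines exist precisely so the segmentation step can orient each line consistently between the first and last frames.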
After the above processing, the trajectory images with the same stimulus type and identical stimulation parameters are found according to the video file names and stitched together, and the steering angle is displayed on each corresponding trajectory plot to form one large image, as shown in Fig. 6, where a1, a2, ..., an represent the steering angles corresponding to the respective trajectories. From this image, the consistency of the control results and the stability of the control effect under identical stimulation parameters can be examined; finally, the parameters with a stable control effect are selected and stored in a computer file as the effective stimulation parameters of the tested robot rat, to facilitate later in-depth applied research on animal robots.
The machine-vision-based animal robot experimental data acquisition method and system use machine vision technology to obtain the motion trajectory and turning direction of the animal robot, and can analyze and judge the animal robot's motor behavior more accurately and quantitatively. They further quantify the correspondence between stimulation parameters and the animal robot's controlled actions, thereby determining the optimal stimulation parameters for a particular animal individual and providing essential prior information for applied animal robot research. Machine vision replaces the human eye: the method not only guarantees a standardized experimental procedure and objective experimental data, but also obtains objective and consistent stimulation parameter measurements. Meanwhile, it significantly reduces the low-value labor previously caused by manual stimulation parameter measurement experiments and improves the efficiency of animal robot research.
Of course, the above description is not a limitation of the invention, and the invention is not limited to the above examples; variations, modifications, additions, or substitutions made by those skilled in the art within the essential scope of the invention shall also belong to the protection scope of the invention.

Claims (7)

1. A machine-vision-based animal-robot stimulation-parameter measurement system, characterized by comprising a measurement-and-control system and a stimulator; the measurement-and-control system mainly consists of an industrial camera, a PC and a wireless communication module A; the stimulator mainly consists of a microprocessor, a multi-channel coded-signal generator, a wireless communication module B and a basic functional circuit;
the industrial camera is connected to the PC via a USB interface and is configured to capture the motion state of the animal robot during the stimulation test and to send the video data to the PC;
the wireless communication module A is connected to the PC via a serial port and is configured to transmit data from the PC wirelessly;
the PC, using a data-analysis system developed on the OpenCV library, analyses and processes the transmitted stimulus signals and the captured images: it analyses the posture of the robot rat by detecting its feature points and calculates its rotation angle; combining each rotation with its corresponding stimulus signal, it filters out the video frames in which the robot rat is effectively controlled and automatically saves these controlled recordings; finally, it analyses and compares the stimulus-signal intensity against the rotation angle to obtain the correspondence between the rotation angle and the stimulus intensity of the tested robot rat, and calculates the controllability sensitivity;
the wireless communication module B is connected to the microprocessor via a serial port and is configured to receive data wirelessly from the PC;
the microprocessor is connected to the wireless communication module B via a serial port and, at the same time, controls the working state of the multi-channel coded-signal generator through its I/O ports; based on the information received by the wireless communication module B, the microprocessor generates the desired coded electrical stimulation signal and applies it to the target brain area of the animal robot, making the animal produce the desired movement;
the multi-channel coded-signal generator is configured to process and convert the original signal of the microprocessor and, under the control of the microprocessor, to implement channel selection and start/stop functions, thereby generating a coded stimulus signal whose frequency, amplitude and duration are all adjustable, and applying this coded stimulus signal to the target brain area;
the basic functional circuit is configured to supply power to the microprocessor and the wireless communication module B;
the PC-based measurement-and-control system generates stimulation parameters and control commands according to a preset rule and sends them to the stimulator through the wireless communication module A; at the same time, it synchronously records and saves the stimulation-parameter information of the experiment and the video files reflecting the controlled behaviour of the animal robot; finally, based on the video files and the corresponding stimulation parameters, it analyses the correspondence between the controlled behaviour and the stimulation parameters.
2. The machine-vision-based animal-robot stimulation-parameter measurement system according to claim 1, characterized in that: a home block is provided on the stimulator and is configured to standardize the measurement of the animal robot's steering angle during the stimulation test; in an actual steering-angle measurement, the average of the steering angles of the three marker lines is taken as the final steering angle; the home block consists of three lines of different colours, each line carrying an arrow that marks its direction; the three lines intersect at one point and are evenly distributed in the plane, 120 degrees apart from one another.
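The averaging in claim 2 can be sketched as follows. This is a minimal sketch assuming the three marker-line directions have already been measured in degrees in the first and last frames; the wrapping of each difference into (-180°, 180°] is an assumption added here so a turn across the 0°/360° boundary is not mis-read, since the claim itself does not spell this step out.

```python
def steering_angle(first, last):
    """Final steering angle as the average of the three marker-line
    steering angles.

    `first` and `last` are the (A, B, C) / (a, b, c) direction angles in
    degrees from the first and last video frames. Each difference is
    wrapped to (-180, 180] before averaging.
    """
    def wrap(d):
        return (d + 180.0) % 360.0 - 180.0
    diffs = [wrap(after - before) for before, after in zip(first, last)]
    return sum(diffs) / len(diffs)
```

With the wrap in place, a line that rotates from 350° to 10° correctly contributes +20° rather than -340°.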
3. The machine-vision-based animal-robot stimulation-parameter measurement system according to claim 1, characterized in that: the wireless communication modules A and B use the wireless communication chip nRF9E5; the industrial camera is a 5-megapixel industrial camera with a USB 3.0 data interface; the microprocessor of the stimulator is a C8051F410 chip; the coded-signal generator consists of 4 groups of symmetrical transistors and 2 MAX309 chips.
4. A machine-vision-based animal-robot stimulation-parameter measurement method, characterized in that it uses the machine-vision-based animal-robot stimulation-parameter measurement system according to claim 1 and proceeds as follows:
Step 1: mount the stimulator carrying the home block on the back of the animal robot, connect its output to the previously implanted electrode socket, and start the measurement-and-control system;
Step 2: place the animal robot on a plane with a solid-colour background so that the characteristic lines of the home block can be located and segmented;
Step 3: set the parameters; the initial parameters are: current amplitude 50 uA, pulse width 2, pulse number 5, pulse frequency 90 Hz; the maximum parameters are: current amplitude 130 uA, pulse width 9, pulse number 30, pulse frequency 130 Hz;
Step 4: after the measurement-and-control system has initialized, it first analyses the real-time image from the industrial camera to determine the current position of the animal robot and selects the stimulus type; the parameter variation of every stimulus type follows the rule of gradually increasing the stimulus intensity from weak to strong;
Step 5: once the stimulus type is determined, the measurement-and-control system, according to the progress of that stimulus type and following the rule of gradually increasing stimulus intensity, sends the parameter data together with the stimulus-type command to the stimulator through the wireless communication module A;
Step 6: after power-up, the microprocessor of the stimulator completes its initialization and enters a waiting state; when the parameter data and the stimulation command from the measurement-and-control system are received through the wireless communication module B, the microprocessor, according to the received information, controls the working state of the multi-channel coded-signal generator so that it generates the corresponding coded electrical stimulus, which is applied to the corresponding brain nucleus of the animal robot to control its motor behaviour; this electrical stimulus is repeated 10 times within 5 seconds; while generating the stimulation, the stimulator sends a stimulation-start marker to the measurement-and-control system through the wireless communication module B; on receiving the start marker, the measurement-and-control system activates its recording function and starts video recording, sorting the video files into three different folders; after this stimulation, the stimulator sends an end-marker signal to the measurement-and-control system and then returns to the waiting state; on receiving the end-marker signal, the measurement-and-control system stops recording after a 2-second delay;
Step 7: after all experiments are finished, the video data are analysed by the PC and the motion trajectory of the animal robot is drawn from the video files; the measurement-and-control system reads each digital image frame of a video file, locates the intersection of the three lines according to the specific colours of the home block, calculates the image coordinates of that intersection, and finally connects the intersection coordinates of all frames to obtain the motion trajectory of the animal robot; at the same time, the first and last frames are extracted from the video file; the first frame is processed by extracting the three marker lines according to their colour information and computing the direction angle of each line on the image plane, denoted A, B and C respectively; the last frame is processed in the same way, with the direction angles denoted a, b and c; the steering angle of the animal robot is then ((a-A)+(b-B)+(c-C))/3;
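The per-line direction measurement in step 7 can be sketched as follows. This is a minimal sketch assuming the pixel coordinates of one colour-segmented marker line are already available (e.g. from an HSV threshold); the moment-based axis fit is one standard way to obtain the orientation, not necessarily the patent's implementation, and it returns an orientation in [0°, 180°), ignoring the arrow mark that would resolve the remaining 180° ambiguity.

```python
import math

def line_direction_deg(xs, ys):
    """Orientation of a colour-segmented marker line on the image plane.

    xs, ys: pixel coordinates of the segmented line. The orientation of
    the best-fit axis through the point cloud follows from the
    second-order central moments: theta = 0.5 * atan2(2*Sxy, Sxx - Syy).
    Returns degrees in [0, 180).
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return math.degrees(0.5 * math.atan2(2.0 * sxy, sxx - syy)) % 180.0
```

Running this on the same line in the first and last frames gives the A/a, B/b, C/c pairs whose averaged differences form the steering angle.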
Step 8: after the above processing, image-processing techniques are used to stitch together the trajectory images that share the same stimulus type and identical stimulation parameters, with the steering angle displayed on each corresponding trajectory plot to form one new image; from this image, the consistency of the control results and the stability of the control effect under identical stimulation parameters are checked, and the parameters showing a stable result are finally selected.
5. The machine-vision-based animal-robot stimulation-parameter measurement method according to claim 4, characterized in that: in step 4 the stimulus types are of three kinds: left-turn, right-turn and forward; the measurement-and-control system chooses the stimulus type according to the position of the animal during the experiment and the test progress of each type: if the right side of the animal robot is close to the arena edge, the left-turn stimulus type is selected; if the left side of the animal robot is close to the arena edge, the right-turn stimulus type is selected; if the animal robot is not at the arena edge, the measurement-and-control system compares the test progress of the left-turn, right-turn and forward stimulus types and selects the type whose progress is slower to carry out its stimulation test, until all three stimulus types are completed.
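The selection logic of claim 5 can be sketched as a small decision function. This is an illustrative sketch only: the coordinate convention (larger x meaning the robot's right side is near the edge), the threshold parameters and the progress bookkeeping are all assumptions not fixed by the claim.

```python
def choose_stimulus_type(pos_x, left_edge, right_edge, progress):
    """Pick the next stimulus type from the robot's position and the
    per-type test progress.

    pos_x: robot coordinate along the arena's width (assumed axis);
    left_edge / right_edge: edge thresholds (assumed parameters);
    progress: dict mapping "left"/"right"/"forward" to completed-trial
    counts.
    """
    if pos_x >= right_edge:   # right side near the edge -> turn left
        return "left"
    if pos_x <= left_edge:    # left side near the edge -> turn right
        return "right"
    # Away from the edges: run whichever type's progress lags behind.
    return min(progress, key=progress.get)
```

The lagging-progress rule keeps the three test sequences advancing together until all are complete.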
6. The machine-vision-based animal-robot stimulation-parameter measurement method according to claim 4, characterized in that: in step 4 the rule of gradually increasing stimulus intensity is: in the order amplitude, width, number, frequency, each parameter change increments one of these four variables in turn; the increments are: current-amplitude increment 5 uA, pulse-width increment 1, pulse-number increment 5, pulse-frequency increment 10 Hz; this continues until all four variables related to the stimulus intensity reach their set maxima; when the parameters of a given stimulus type reach their maxima, one test cycle of that type is completed, and the next test cycle then begins.
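The sweep rule of claim 6, combined with the initial and maximum values from step 3 of claim 4, can be sketched as a generator. One assumption is added: a parameter that has already reached its maximum is skipped on its turn, which the claim implies but does not state.

```python
def parameter_sweep(start=(50, 2, 5, 90), stop=(130, 9, 30, 130),
                    step=(5, 1, 5, 10)):
    """Yield (amplitude_uA, pulse_width, pulse_number, freq_Hz) tuples.

    One parameter is incremented per stimulation, cycling through
    amplitude, width, number, frequency in that order; a parameter at
    its maximum is skipped. The sweep ends when all four parameters
    have reached their maxima.
    """
    cur = list(start)
    yield tuple(cur)
    i = 0
    while cur != list(stop):
        if cur[i] < stop[i]:
            cur[i] = min(cur[i] + step[i], stop[i])
            yield tuple(cur)
        i = (i + 1) % 4
```

With these values the sweep produces 33 parameter sets per cycle: the initial set plus 16 amplitude, 7 width, 5 number and 4 frequency increments.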
7. The machine-vision-based animal-robot stimulation-parameter measurement method according to claim 4, characterized in that: in step 6 the three different folders are named in the format stimulus-type code + cycle index + stimulation parameters; the stimulus-type codes are: left turn 1, right turn 2, forward 3; the cycle index starts from 1 and is incremented by 1 after each completed cycle; the stimulation parameters are concatenated in the order amplitude, pulse width, pulse number and stimulus-signal frequency.
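The naming scheme of claim 7 can be sketched as a build/parse pair. The claim fixes the field order but not a separator; an underscore is assumed here purely for readability, and the type-name keys are illustrative.

```python
TYPE_CODES = {"left": 1, "right": 2, "forward": 3}

def video_filename(stim_type, cycle, amp_uA, width, count, freq_Hz):
    """Build a name: type code + cycle index + the four stimulation
    parameters in amplitude, width, number, frequency order."""
    return "{}_{}_{}_{}_{}_{}".format(
        TYPE_CODES[stim_type], cycle, amp_uA, width, count, freq_Hz)

def parse_filename(name):
    """Recover (type_code, cycle, amp, width, count, freq) from a name."""
    return tuple(int(field) for field in name.split("_"))
```

A round trip keeps the fields intact, which is what later lets trajectories with identical parameters be found and stitched together.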
CN201811283293.3A 2018-10-31 2018-10-31 A kind of animal robot stimulation parameter measurement system and method based on machine vision Active CN109262656B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811283293.3A CN109262656B (en) 2018-10-31 2018-10-31 A kind of animal robot stimulation parameter measurement system and method based on machine vision
PCT/CN2018/123659 WO2020087717A1 (en) 2018-10-31 2018-12-25 System and method for measuring stimulation parameter of animal robot based on machine vision

Publications (2)

Publication Number Publication Date
CN109262656A true CN109262656A (en) 2019-01-25
CN109262656B CN109262656B (en) 2019-05-28

Family

ID=65190945

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811283293.3A Active CN109262656B (en) 2018-10-31 2018-10-31 A kind of animal robot stimulation parameter measurement system and method based on machine vision

Country Status (2)

Country Link
CN (1) CN109262656B (en)
WO (1) WO2020087717A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109675171A (en) * 2019-03-04 2019-04-26 中国科学院深圳先进技术研究院 Animal stimulating method, device, equipment and storage medium
CN110275532A (en) * 2019-06-21 2019-09-24 珠海格力智能装备有限公司 Control method and device, the control method and device of visual apparatus of robot
CN113384234A (en) * 2021-07-08 2021-09-14 中山大学 Animal three-dimensional vision measuring device and method
CN113627256A (en) * 2021-07-09 2021-11-09 武汉大学 Method and system for detecting counterfeit video based on blink synchronization and binocular movement detection
CN114532242A (en) * 2022-02-16 2022-05-27 深圳市元疆科技有限公司 Experimental box for studying behavior of small animals
CN115056235A (en) * 2022-05-27 2022-09-16 浙江大学 Rat search and rescue robot based on multi-mode fusion positioning and search and rescue method
CN115607143A (en) * 2022-11-10 2023-01-17 大连理工大学 Brain-computer interface behavior regulation and evaluation method based on wireless real-time attitude detection
CN117310508A (en) * 2023-11-30 2023-12-29 山东科技大学 Method for rapidly and accurately measuring electric variable of lithium battery

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1848951A (en) * 2006-03-09 2006-10-18 西安交通大学 Integrated vision monitoring multi-mode wireless computer interactive apparatus
WO2013057544A1 (en) * 2011-10-21 2013-04-25 Commissariat A L'energie Atomique Et Aux Energies Alternatives A method of calibrating and operating a direct neural interface system
CN104199446A (en) * 2014-09-18 2014-12-10 山东科技大学 Robot bird flying controllability evaluation system and evaluation method
CN107351080A (en) * 2017-06-16 2017-11-17 浙江大学 A kind of hybrid intelligent research system and control method based on array of camera units

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5857433A (en) * 1996-07-22 1999-01-12 John C. Files Animal training and tracking device having global positioning satellite unit
AU2003210924A1 (en) * 2002-02-08 2003-09-02 John K. Chapin Method and apparatus for guiding movement of a freely roaming animal through brain stimulation
CN100467087C (en) * 2005-09-30 2009-03-11 东北大学 Cranial nerve electrostimulating device capable of remotely controlling exercise behevior
CN100515187C (en) * 2007-03-23 2009-07-22 浙江大学 Cranial nerve electric stimulating/collecting device of BCI animal experiment system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG Junqing et al.: "Research on animal robots based on a novel multi-channel brain-nerve stimulation remote-control system", Progress in Natural Science *

Also Published As

Publication number Publication date
WO2020087717A1 (en) 2020-05-07
CN109262656B (en) 2019-05-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant