CN217039972U - Garbage cleaning robot for autonomous outdoor operation - Google Patents

Garbage cleaning robot for autonomous outdoor operation

Info

Publication number
CN217039972U
CN217039972U (application CN202121345500.0U)
Authority
CN
China
Prior art keywords
steering
intelligent trolley
power
gears
trolley
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202121345500.0U
Other languages
Chinese (zh)
Inventor
谢子晖
蒋玉莲
潘爽
林光涛
杨宪傲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest Minzu University
Original Assignee
Southwest Minzu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest Minzu University
Priority to CN202121345500.0U
Application granted
Publication of CN217039972U
Legal status: Expired - Fee Related

Landscapes

  • Manipulator (AREA)

Abstract

The utility model discloses a garbage cleaning robot that works autonomously outdoors, belonging to the technical field of intelligent robots. The garbage cleaning robot for autonomous outdoor operation is characterized in that it comprises an intelligent trolley; an image processing device is mounted in the middle of the top of the intelligent trolley, a radar module is mounted at the front end of the top, a manipulator is mounted at the rear end of the top, and a dust suction device is arranged at the bottom of the intelligent trolley. The utility model effectively solves the problems of insufficient intelligence and limited application scenarios of traditional garbage cleaning devices.

Description

Garbage cleaning robot for autonomous outdoor operation
Technical Field
The utility model relates to the technical field of intelligent robots, and in particular to a garbage cleaning robot that works autonomously outdoors.
Background
Continuous progress in technology has produced many intelligent robot products. With the gradual decline of China's demographic dividend and the rising automation level of computing and industry, there is an urgent need for automated equipment to replace traditional manual labor [1]. Robots for intelligent garbage cleaning are a major application hotspot. The garbage cleaning machines currently on the market are effective in certain settings but still have many shortcomings. First, their usage scenarios are limited: most garbage cleaning machines work well only in the specific settings they were designed for, while in residential communities, parks, squares, schools and similar places, domestic cleaning robots are too small and road sweepers are too large, so neither is suitable. Second, the degree of intelligence is insufficient: garbage cleaning products on the market cannot realize automatic control well, their functions are incomplete, and they are difficult to adapt to diverse garbage cleaning scenarios.
On this basis, this design proposes an intelligent mobile garbage cleaning robot that can plan a path, traverse the area to be cleaned, identify garbage and grab it. It is suitable for garbage cleaning in enclosed places such as indoor areas, scenic spots, campuses and residential communities, features a high degree of automation, low cost and high efficiency, can greatly reduce cleaning costs, and has good prospects for popularization and application.
SUMMARY OF THE UTILITY MODEL
1. Technical problem to be solved by the utility model
An object of the utility model is to provide a garbage cleaning robot for autonomous outdoor operation, so as to solve the problem raised in the background above:
traditional garbage cleaning devices are insufficiently intelligent and limited in their usage scenarios.
2. Technical solution
In order to achieve the above object, the utility model provides the following technical solution:
A garbage cleaning robot for autonomous outdoor operation, characterized in that it comprises an intelligent trolley; an image processing device is mounted in the middle of the top of the intelligent trolley, a radar module is mounted at the front end of the top of the intelligent trolley, a manipulator is mounted at the rear end of the top of the intelligent trolley, and a dust suction device is arranged at the bottom of the intelligent trolley.
Preferably, the intelligent trolley comprises two support plates connected by bolts. Steering blocks are movably arranged on both sides of the front end inside the support plates; a steering wheel is fixedly connected to the outer side of each steering block, and a steering gear is fixedly connected to the inner side of each steering block. A driving gear is meshed with the inner side of the steering gears, the top of the driving gear is fixedly connected to the output end of a steering motor, and the top of the steering motor is threaded into the bottom of the upper support plate. Two power wheels are connected to the two sides of the rear end of the support plates, a power gear is coaxially connected to the middle of each power wheel, and a power motor is mounted inside the support plates; the output end of the power motor is fixedly connected to a driving gear, and that driving gear is meshed with the power gear.
Preferably, the image processing device comprises a support rod, the top of which is connected to a connecting disc. A threaded column is arranged at the front end of the connecting disc, a fixing knob is sleeved on the outside of the threaded column, and a connecting rod is clamped between the fixing knob and the connecting disc. A pan-tilt head is arranged at one end of the connecting rod, and an industrial camera is threaded onto the bottom of the pan-tilt head.
3. Advantageous effects
(1) The robot introduces wireless communication, with high transmission efficiency and timely response. The single-chip microcomputer is connected to a WIFI image transmission module through a serial port and can be accessed from a mobile phone APP to display video and control the robot's movement.
(2) The manipulator cooperates with the vacuum cleaner, which improves working efficiency while saving energy: the robot identifies garbage through the image recognition module and applies two different cleaning modes according to the type of garbage, greatly improving the cleaning success rate.
(3) Image vision is combined with a grid map, solving the problems of automatically picking up garbage near the robot and avoiding obstacles. Image-vision positioning is combined with a high-quality laser-built 2D map of the environment, so garbage is located accurately and obstacles are avoided in time.
Drawings
Fig. 1 is a schematic view of the overall structure of the utility model;
Fig. 2 is an internal view of the intelligent trolley of the utility model;
Fig. 3 is a schematic diagram of the image processing device of the utility model;
Fig. 4 is a flowchart of the image recognition of the utility model;
Fig. 5 is an ultrasonic signal diagram of the radar module of the utility model;
Fig. 6 is a structure diagram of the STM32 master control of the utility model;
Fig. 7 is a schematic diagram of the Raspberry Pi expansion board of the utility model.
The reference numbers in the figures denote:
1. intelligent trolley; 101. support plate; 102. steering wheel; 103. steering block; 104. steering gear; 105. driving gear; 106. steering motor; 107. power wheel; 108. power gear; 109. power motor; 2. image processing device; 201. support rod; 202. connecting disc; 203. threaded column; 204. fixing knob; 205. connecting rod; 206. pan-tilt head; 207. industrial camera; 3. manipulator; 4. radar module.
Detailed Description
The technical solutions in the embodiments of the utility model will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the utility model.
Example 1: Motion device design
As shown in figs. 1-2, the intelligent trolley 1 comprises two support plates 101 connected by bolts. Steering blocks 103 are movably arranged on both sides of the front end inside the support plates 101; a steering wheel 102 is fixedly connected to the outer side of each steering block 103, and a steering gear 104 is fixedly connected to its inner side. A driving gear 105 is meshed with the inner side of the steering gears 104; the top of the driving gear 105 is fixedly connected to the output end of a steering motor 106, and the top of the steering motor 106 is threaded into the bottom of the upper support plate 101. Two power wheels 107 are connected to the two sides of the rear end of the support plates 101, a power gear 108 is coaxially connected to the middle of each power wheel 107, and a power motor 109 is mounted inside the support plates 101; the output end of the power motor 109 is fixedly connected to a driving gear 105, which is meshed with the power gear 108.
The base of the garbage cleaning robot is the intelligent trolley. The trolley is driven by a two-channel direct-current motor drive: the front wheels carry the steering gear of the steering mechanism, the rear wheels are powered by the two motors, and steering is achieved by combining the rear-wheel speed differential with the steering gear. The wheels are rubber wheels so as to adapt to a variety of complex outdoor terrain.
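As an illustration of how the rear-wheel differential can be matched to the front steering angle, a simple bicycle-model sketch is given below; the wheelbase and track values are assumptions, not dimensions taken from the actual trolley:

```python
import math

def rear_wheel_speeds(v, steer_angle_rad, wheelbase=0.30, track=0.25):
    """Split a forward speed v (m/s) into left/right rear-wheel speeds so the
    rear-wheel differential matches the front steering angle (bicycle-model
    approximation; wheelbase and track are illustrative values in metres)."""
    if abs(steer_angle_rad) < 1e-6:
        return v, v                               # straight line: both wheels equal
    r = wheelbase / math.tan(steer_angle_rad)     # turning radius at the rear axle
    v_left = v * (r - track / 2) / r
    v_right = v * (r + track / 2) / r
    return v_left, v_right

print(rear_wheel_speeds(0.5, math.radians(15)))   # e.g. a gentle left turn
```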
The on-board control algorithm is as follows:
In actual operation, environmental influences mean that different road surfaces present different resistance to the trolley. To keep the trolley working within the set speed range, a PID double-closed-loop automatic control system is designed. The actual running speed of the trolley is measured by the encoder and compared with the set speed to obtain an error; a regulation value is obtained through the PID operation and finally converted into a two-channel PWM signal to adjust the motor speed. The algorithm is given below:
Speed_Con = Spe_P * error + Spe_D * (lasterror - error) + Spe_I * errorsum
where Spe_P is the proportional parameter, Spe_D is the differential parameter, Spe_I is the integral parameter, error is the current error, lasterror is the previous error, and errorsum is the sum of the errors.
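A minimal sketch of the speed loop in this form, assuming hypothetical encoder-reading and PWM-output interfaces outside the class; the gains shown are placeholders, not the tuned values of the actual trolley:

```python
# Speed-loop PID following Speed_Con = Spe_P*error + Spe_D*(lasterror - error) + Spe_I*errorsum.
class SpeedPID:
    def __init__(self, spe_p, spe_i, spe_d, pwm_limit=1000):
        self.spe_p, self.spe_i, self.spe_d = spe_p, spe_i, spe_d
        self.lasterror = 0.0
        self.errorsum = 0.0
        self.pwm_limit = pwm_limit

    def update(self, set_speed, actual_speed):
        error = set_speed - actual_speed          # set speed vs. encoder speed
        self.errorsum += error
        speed_con = (self.spe_p * error
                     + self.spe_d * (self.lasterror - error)
                     + self.spe_i * self.errorsum)
        self.lasterror = error
        # clamp to the PWM range accepted by the motor driver
        return max(-self.pwm_limit, min(self.pwm_limit, speed_con))

# usage: one controller per rear wheel produces the two-channel PWM signal
left_pid = SpeedPID(spe_p=12.0, spe_i=0.5, spe_d=3.0)
right_pid = SpeedPID(spe_p=12.0, spe_i=0.5, spe_d=3.0)
```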
Example 2: pattern recognition design
As shown in fig. 3, the image processing device 2 comprises a support rod 201; a connecting disc 202 is connected to the top of the support rod 201, a threaded column 203 is arranged at the front end of the connecting disc 202, a fixing knob 204 is sleeved on the outside of the threaded column 203, and a connecting rod 205 is clamped between the fixing knob 204 and the connecting disc 202. A pan-tilt head 206 is arranged at one end of the connecting rod 205, and an industrial camera 207 is threaded onto the bottom of the pan-tilt head 206.
In the robot design, the image recognition function is used to identify the type of object in the visible area, extract object feature parameters and acquire position information. The image recognition module plays the core role in the design and consists of a high-definition camera, a software part and a database.
Information about the visible area captured by the camera can be transmitted to the terminal in real time. When a suspected piece of garbage appears in the visible area, the image recognition module extracts its image information and compares it with the database; when garbage with the same feature values exists in the database, the object is regarded as garbage, and an instruction can be issued to grab it with the manipulator and place it at a designated position. In this design, the Raspberry Pi processes the acquired images in digital-image form. Because the design needs to store a large number of sample parameters and digital images are easier to store for long periods, the digital-image approach is adopted.
Stable operation of the image recognition module is based on extensive training of the manipulator. By extracting and storing the feature parameters of training samples, the image recognition system gains the ability to recognize objects with such features, and the extracted parameters can be displayed on a computer through the Raspberry Pi. A flowchart of the specific implementation process is shown in fig. 4.
The image processing method comprises the following steps:
and S1, filtering and smoothing the image, wherein Gaussian filtering and CVsmooth functions are mainly selected. The aim is to eliminate noise interference of the image. Each image comprises a plurality of wave bands, and the image with the highest noise ratio is selected as a processing object according to the requirement of the situation, so that the experimental result is more ideal. The gaussian filtering is used for a prototype based on gaussian functions:
One-dimensional Gaussian function:
G(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right)
Two-dimensional Gaussian function:
G(x, y) = \frac{1}{2\pi\sigma^{2}} \exp\!\left(-\frac{x^{2}+y^{2}}{2\sigma^{2}}\right)
The advantages of Gaussian filtering are that it smooths image edges without the over- or under-weighting of other filters; it replaces the pixel value at a point with a weighted mean of a neighborhood of pixels, similar to formula 3-8; it is invariant to image rotation; and the sigma parameter can be adjusted to select the degree of smoothing, which makes it convenient for a variety of scenes.
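A minimal OpenCV sketch of this smoothing step; cv2.GaussianBlur is used here in place of the legacy cvSmooth call, and the file name, kernel size and sigma are illustrative:

```python
import cv2

# Step S1: Gaussian filtering to suppress noise before binarization.
# "frame.jpg" is an illustrative file name; sigma controls the smoothing degree
# and can be tuned per scene, as noted in the text.
img = cv2.imread("frame.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
smoothed = cv2.GaussianBlur(gray, (5, 5), 1.5)
```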
S2, image binarization, also called monochrome processing of the image. In the experiment, a fixed-threshold method is used to process the image, that is, a gray value is set manually. When the gray value of a pixel in the image is smaller than the set value, the pixel is marked as black (or white), and vice versa.
During image processing, because the scene around the captured image is varied, there are many interference signals and the pixels are scattered, so the result of binarization may or may not be distinct. The method first detects the RGB values of each pixel in the image and then selects an appropriate threshold range for binarization.
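A minimal OpenCV sketch of the fixed-threshold binarization; the smoothing step is repeated so the snippet stands alone, and the threshold value 127 and the file name are illustrative, not values from the original experiment:

```python
import cv2

# Step S2: fixed-threshold binarization of the smoothed grayscale image.
# Pixels below the fixed gray value map to black (0), pixels at or above it to white (255).
gray = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2GRAY)
smoothed = cv2.GaussianBlur(gray, (5, 5), 1.5)
_, binary = cv2.threshold(smoothed, 127, 255, cv2.THRESH_BINARY)
```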
S3, extracting the feature vector. Each object image is distinguished from other objects by its specific feature vector, and different images are classified using these feature vectors. In this experiment, the processed image is divided into several small regions; when the number of white pixels in a small region is greater than the number of black pixels (i.e., white pixels make up at least half of the pixels in the region), the matrix element corresponding to that region is set to 1, otherwise it is set to 0.
After this feature extraction, the extracted image features take the form of a matrix. The matrix values serve as the input vector for pattern recognition.
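A sketch of this block-wise feature extraction, assuming the binarized image from step S2 and an illustrative 8 x 8 grid of regions:

```python
import numpy as np

def block_feature_matrix(binary_img, rows=8, cols=8):
    """Step S3: split the binarized image into rows x cols blocks and set the
    corresponding matrix element to 1 when white pixels make up at least half
    of the block, otherwise 0. The 8x8 grid size is an illustrative choice."""
    h, w = binary_img.shape
    bh, bw = h // rows, w // cols
    feat = np.zeros((rows, cols), dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            block = binary_img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            if np.count_nonzero(block == 255) * 2 >= block.size:
                feat[i, j] = 1
    return feat  # flattened, this matrix is the input vector for pattern recognition
```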
After the above image processing, a matrix is obtained for each image. Because computer vision differs from the human eye, any object recognized by a computer is read and matched in matrix form, that is, converted into a computer-readable representation. In order to perform feature extraction stably and obtain a good recognition effect, some transformations, i.e. pre-processing, must be applied to the object image during the experiment, so as to obtain clearer images that the computer can recognize more easily.
Example 3: gripping device design
For slightly larger pieces of garbage on site, a grabbing device is designed for cleaning. The grabbing device of the cleaning robot is a manipulator with five degrees of freedom plus a gripper, which allows the arm to grab, lift and place objects so that the whole picking action can be completed. The manipulator includes the gripper and is made up of 6 servos and joint links; each servo provides a different degree of freedom, and the whole manipulator can perform different actions when driven through the joint links. The arm's actions are controlled by computer software. In this design, the servo parameters are modified manually, or the parameters of the current action are displayed, through the interaction between the Raspberry Pi and the computer. In the training stage, the parameters for picking up the current type of article are obtained by manually calibrating the manipulator's pose, and these parameters are stored in a database; the corresponding action can then be called according to the image recognition result. In order to analyze the motion of each joint link and servo of the manipulator relative to the environment, each joint link is treated as a rigid body. A coordinate system can be fixed to a given servo, and expressing that coordinate system in space gives the pose of the servo relative to the reference coordinate system.
The coordinates of a point P in space relative to the reference coordinate system can be transformed into another spatial coordinate system. P is expressed in vector form with an added scale factor w, which can be written as follows.
P = \begin{bmatrix} P_x & P_y & P_z & w \end{bmatrix}^{T}, \quad \text{where } P_x = x \cdot w, \; P_y = y \cdot w, \; P_z = z \cdot w
Here w can be any number: when w = 1 the components of the vector P are unchanged, and when w = 0 they become infinite. The manipulator in this design has six servos, each of which changes its degree of freedom according to this principle; these six scale factors determine the action of the manipulator.
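A short numpy illustration of the homogeneous representation and a rigid transform between a servo frame and the reference frame; the rotation and translation values are illustrative, not taken from the actual manipulator:

```python
import numpy as np

# Homogeneous representation of a spatial point (x, y, z) with scale factor w,
# and a 4x4 transform built from an example rotation R and translation t.
def to_homogeneous(x, y, z, w=1.0):
    return np.array([x * w, y * w, z * w, w])

R = np.eye(3)                      # example rotation of the servo frame
t = np.array([0.10, 0.0, 0.25])    # example translation (metres)
T = np.eye(4)
T[:3, :3], T[:3, 3] = R, t         # homogeneous transform servo -> reference

p_ref = T @ to_homogeneous(0.05, 0.0, 0.0)   # point expressed in the reference frame
print(p_ref[:3] / p_ref[3])                  # divide by w to recover Cartesian coordinates
```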
Example 4: dust suction device design
Considering the application of this design, a manipulator alone is not sufficient as a cleaning device, so a dust suction device is designed for cleaning fine garbage. The dust suction device can suck up garbage in the visible area that cannot be identified, or that is too small to be grabbed, so as to achieve a better cleaning effect.
In this design, the dust suction device is located in the chassis of the trolley and uses a disc-type vacuum cleaner. The vacuum cleaner is placed in a recess in the trolley chassis so that it does not affect the trolley's movement. The filter section of the vacuum cleaner uses a cyclone filter: air carrying dust into the cleaner forms a cyclone inside it, and because the dust particles have mass, they collide with the container wall under centrifugal force and fall into the dust collection box, so that dust and air are better separated.
Example 5: radar module
The radar transmitter generates a radio-frequency signal, which is radiated into space through the radar transmitting antenna. When the electromagnetic wave meets a target and is reflected, the echo signal reaches the receiver through the radar receiving antenna; after processing by the receiver it is sent to the signal processor, and the target parameters are obtained: distance, orientation, velocity, shape, and so on.
Radar equation:
R_{\max} = \left[ \frac{P_t\, G^{2}\, \lambda^{2}\, \sigma}{(4\pi)^{3}\, S_{\min}} \right]^{1/4}
Signals transmitted by radar can generally be divided into continuous-wave signals and pulse signals. A pulse signal is used here; the signal is shown in fig. 5.
Lidar has the advantages of good directivity, wide detection range, high precision and strong coherence, and shows remarkable advantages in ranging performance. In this design, the radar module uses a Slamtec A1 lidar with a measuring radius of up to 12 m and a measuring frequency of 8000 samples per second. When used for SLAM mapping, it scans the surrounding environment through 360 degrees, which strengthens the robot's adaptability to its working environment. It uses infrared laser ranging: the infrared laser is emitted all around, the time for an obstacle to reflect it back is recorded, the distance between the robot and the obstacle is calculated from this time, and the scan then yields a point cloud map.
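A small sketch of the time-of-flight ranging and the polar-to-Cartesian conversion that turns a 360-degree scan into a point cloud; the scan values are made up, and a real sensor driver reports angle/distance pairs directly:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def echo_distance(round_trip_time_s):
    """Distance to an obstacle from the recorded echo round-trip time."""
    return C * round_trip_time_s / 2.0

def scan_to_points(scan):
    """Convert one 360-degree scan, given as (angle_deg, distance_m) pairs,
    into Cartesian point-cloud coordinates in the robot frame."""
    return [(d * math.cos(math.radians(a)), d * math.sin(math.radians(a)))
            for a, d in scan]

# illustrative scan with three returns
points = scan_to_points([(0.0, 1.2), (90.0, 0.8), (180.0, 2.5)])
```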
Obstacle avoidance and navigation design:
The grid map represents the probability that an obstacle exists at each point in two-dimensional plane coordinates, and has the advantages of fast construction, small storage space and easy extension. This design adopts a forward obstacle-avoidance algorithm based on a grid map. The grid map divides the route the robot will travel into squares of unit distance, and the camera locates each square on a particular pixel of the image, so the distance to each region block is continuously calculated; the closer the region block, the smaller the pixel value, until the next grid cell is entered. In the grid map, the pixel value of an area containing an obstacle can be specified, and for the pixels of all nearby obstacle-free areas, the distance to the obstacle area can be read from their pixel values, which achieves obstacle avoidance. The grid map is built with the Gmapping algorithm: mapping starts from an arbitrarily set initial coordinate; after the grid map is built, the trolley is moved, the map newly built at the previous moment is matched against the newly acquired laser data to determine the trolley's position, and the map is then completed according to the position of the laser sensor relative to the robot. In addition, path planning based on the AMCL method is adopted: the robot's pose is tracked in a known map using a particle filter. The core idea is to estimate the probability distribution of the robot's pose with a set of sampled particles, where the particle weights reflect the reliability of the pose estimate.
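A minimal occupancy-grid sketch of the forward obstacle check described above; the grid size, resolution and occupancy threshold are illustrative assumptions, and the real system builds its map with Gmapping and localizes with AMCL rather than with this toy structure:

```python
import numpy as np

class GridMap:
    """Each cell stores an obstacle probability; cells ahead of the robot are
    checked against a threshold before the next grid cell is entered."""
    def __init__(self, size=100, resolution=0.05, occupied_thresh=0.65):
        self.grid = np.full((size, size), 0.5)   # unknown cells start at 0.5
        self.resolution = resolution             # metres per cell
        self.occupied_thresh = occupied_thresh

    def mark_obstacle(self, x_m, y_m, p=0.9):
        i, j = int(y_m / self.resolution), int(x_m / self.resolution)
        self.grid[i, j] = p

    def path_clear(self, cells_ahead, row, col):
        """True if the next `cells_ahead` cells straight ahead of (row, col)
        are all below the occupancy threshold."""
        ahead = self.grid[row, col + 1:col + 1 + cells_ahead]
        return bool(np.all(ahead < self.occupied_thresh))

m = GridMap()
m.mark_obstacle(1.0, 0.5)                       # obstacle 1.0 m ahead, 0.5 m left
print(m.path_clear(cells_ahead=10, row=10, col=10))
```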
Example 6: control system hardware design
The robot's power supply is divided into two parts: the bottom motor vehicle and the Raspberry Pi mainboard. To increase working time, a 22.2 V, 5000 mAh battery with 15 A continuous discharge is used to power both parts at the same time. Because the two parts require different voltages, a step-down module is designed: 22.2 V powers the motor vehicle, and the voltage is stepped down to 7 V through the TPSDDAR chip to supply the Raspberry Pi mainboard.
The main control module is the STM32: the STM32 serves as the robot's main control chip, and the Raspberry Pi is the secondary control chip. The STM32 is a low-power, high-performance, highly integrated processor with rich interfaces; it adopts the ARM Cortex-M series ARMv7-M architecture, and its maximum operating speed can be up to 35% faster than the ARM7TDMI [4]. Fig. 6 shows the STM32 main control structure. The robot takes the STM32 as the main control chip; through the STM32 it controls the working mode of the manipulator servos, performs angle conversion and rotation-speed conversion, and feeds signals to the Raspberry Pi. The control end feeds the signals of the ultrasonic sensor and the photoelectric sensor back to the main controller to control motor operation, forming a closed-loop negative-feedback system. In addition, the single-chip microcomputer is connected to a WIFI image transmission module through a serial port and can be accessed from a mobile phone APP to display video and control the robot's movement.
At the Raspberry Pi control end, the instruction transmitted from the PC end is received through the Bluetooth module, the character string is parsed to recover the desired information, the target direction is calculated by comparing the current coordinates with the end-point coordinates, and the motor drive module is controlled so as to control the robot's heading and speed. For direction acquisition, a geomagnetic sensor and an acceleration sensor are used together. The following classes are involved: Sensor, the sensor class; SensorManager, the sensor management class; SensorEvent, the sensor event class; and SensorEventListener, the sensor event listening interface. When the geomagnetic and acceleration sensor data have been acquired, the rotation matrix is computed with the getRotationMatrix method of SensorManager and stored in the method's first parameter. The orientation is then read with the getOrientation() method of SensorManager and stored in a float array; the first value of the array is the heading angle.
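For reference, a numpy sketch of the computation that getRotationMatrix and getOrientation perform; this illustrates the underlying math in the design's Python environment rather than the Android API itself, and the sample sensor readings are made up:

```python
import numpy as np

def heading_from_sensors(accel, mag):
    """Build an East-North-Up rotation matrix from the gravity and geomagnetic
    vectors (as getRotationMatrix does), then return the azimuth/heading angle
    in degrees (as the first element of getOrientation's result)."""
    g = np.asarray(accel, dtype=float)            # gravity in the device frame
    m = np.asarray(mag, dtype=float)              # magnetic field in the device frame
    east = np.cross(m, g); east /= np.linalg.norm(east)
    north = np.cross(g, east); north /= np.linalg.norm(north)
    up = g / np.linalg.norm(g)
    R = np.vstack([east, north, up])              # rows: E, N, U axes in device coords
    return np.degrees(np.arctan2(R[0, 1], R[1, 1]))

print(heading_from_sensors([0.0, 0.0, 9.8], [20.0, 5.0, -40.0]))
```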
In the Raspberry Pi system, the video processing module on the server side is implemented in software. The system uses the Python language, processes video data based on OpenCV, and performs target tracking with a particle filter algorithm that fuses color, gray-scale and edge features; the result is transmitted through the Raspberry Pi to the STM32 main control chip for data processing.
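A compact sketch of the particle-filter tracking idea, weighting particles by hue-histogram similarity; the gray-scale and edge features of the multi-feature fusion are omitted for brevity, and the particle count, window size and histogram settings are illustrative assumptions:

```python
import cv2
import numpy as np

def hue_hist(img, cx, cy, half=20):
    """Normalized 16-bin hue histogram of a (2*half)-pixel window around (cx, cy)."""
    patch = img[max(0, cy - half):cy + half, max(0, cx - half):cx + half]
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    h = cv2.calcHist([hsv], [0], None, [16], [0, 180])
    return cv2.normalize(h, h).flatten()

def track_step(frame, particles, target_hist, motion_std=8.0):
    """Diffuse particles, weight them by histogram similarity, and resample."""
    h, w = frame.shape[:2]
    particles = particles + np.random.normal(0, motion_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 20, w - 21)
    particles[:, 1] = np.clip(particles[:, 1], 20, h - 21)
    weights = np.array([
        cv2.compareHist(hue_hist(frame, int(x), int(y)), target_hist,
                        cv2.HISTCMP_CORREL)
        for x, y in particles])
    weights = np.clip(weights, 1e-6, None)
    weights /= weights.sum()
    estimate = weights @ particles                 # weighted mean target position
    idx = np.random.choice(len(particles), len(particles), p=weights)
    return particles[idx], estimate
```

In use, the target histogram would be taken from the first frame, e.g. target_hist = hue_hist(first_frame, x0, y0), with particles initialized as a float N x 2 array of candidate positions around (x0, y0).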
The above description covers only preferred embodiments of the utility model, but the protection scope of the utility model is not limited thereto; any substitution or modification of the technical solution and its inventive concept made by a person skilled in the art within the technical scope disclosed by the utility model falls within the protection scope of the utility model.

Claims (3)

1. A garbage cleaning robot for autonomous outdoor operation, characterized in that it comprises an intelligent trolley (1), wherein an image processing device (2) is mounted in the middle of the top of the intelligent trolley (1), a radar module (4) is mounted at the front end of the top of the intelligent trolley (1), a manipulator (3) is mounted at the rear end of the top of the intelligent trolley (1), and a dust suction device is arranged at the bottom of the intelligent trolley (1).
2. The garbage cleaning robot for autonomous outdoor operation according to claim 1, characterized in that: the intelligent trolley (1) comprises two support plates (101) connected by bolts; steering blocks (103) are movably arranged on both sides of the front end inside the support plates (101), a steering wheel (102) is fixedly connected to the outer side of each steering block (103), and a steering gear (104) is fixedly connected to the inner side of each steering block (103); a driving gear (105) is meshed with the inner side of the steering gears (104), the top of the driving gear (105) is fixedly connected to the output end of a steering motor (106), and the top of the steering motor (106) is threaded into the bottom of the upper support plate (101); power wheels (107) are connected to the two sides of the rear end of the support plates (101), a power gear (108) is coaxially connected to the middle of each of the two power wheels (107), and a power motor (109) is mounted inside the support plates (101); the output end of the power motor (109) is fixedly connected to a driving gear (105), and the driving gear (105) is meshed with the power gear (108).
3. The garbage cleaning robot for autonomous outdoor operation according to claim 1, characterized in that: the image processing device (2) comprises a support rod (201), the top of the support rod (201) is connected to a connecting disc (202), a threaded column (203) is arranged at the front end of the connecting disc (202), a fixing knob (204) is sleeved on the outside of the threaded column (203), a connecting rod (205) is clamped between the fixing knob (204) and the connecting disc (202), a pan-tilt head (206) is arranged at one end of the connecting rod (205), and an industrial camera (207) is threaded onto the bottom of the pan-tilt head (206).
CN202121345500.0U 2021-06-16 2021-06-16 Garbage cleaning robot for autonomous outdoor operation Expired - Fee Related CN217039972U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202121345500.0U CN217039972U (en) 2021-06-16 2021-06-16 Garbage cleaning robot for autonomous outdoor operation

Publications (1)

Publication Number Publication Date
CN217039972U true CN217039972U (en) 2022-07-26

Family

ID=82463912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121345500.0U Expired - Fee Related CN217039972U (en) 2021-06-16 2021-06-16 Garbage cleaning robot for autonomous outdoor operation

Country Status (1)

Country Link
CN (1) CN217039972U (en)

Similar Documents

Publication Publication Date Title
CN111243017B (en) Intelligent robot grabbing method based on 3D vision
CN110355765B (en) Automatic following obstacle avoidance method based on visual identification and robot
CN114474061B (en) Cloud service-based multi-sensor fusion positioning navigation system and method for robot
CN106863259B (en) Wheeled many manipulators intelligence ball robot
CN110928301A (en) Method, device and medium for detecting tiny obstacles
CN110344621A (en) A kind of wheel points cloud detection method of optic towards intelligent garage
CN111746728B (en) Novel overwater cleaning robot based on reinforcement learning and control method
CN110040394A (en) A kind of interactive intelligent rubbish robot and its implementation
CN114005021B (en) Laser vision fusion based unmanned inspection system and method for aquaculture workshop
CN112720408B (en) Visual navigation control method for all-terrain robot
CN112330746A (en) Mobile chassis obstacle detection method based on TX2
US20240051146A1 (en) Autonomous solar installation using artificial intelligence
CN108871409A (en) A kind of fault detection method and system
Chang et al. Design of mobile garbage collection robot based on visual recognition
CN118385157A (en) Visual classified garbage automatic sorting system based on deep learning and self-adaptive grabbing
CN217039972U (en) Garbage cleaning robot for autonomous outdoor operation
CN110421563A (en) A kind of industrial robot builds figure positioning system and robot
CN115690343A (en) Robot laser radar scanning and mapping method based on visual following
CN112731918B (en) Ground unmanned platform autonomous following system based on deep learning detection tracking
CN114578817A (en) Control method of intelligent carrier based on multi-sensor detection and multi-data fusion
Wang et al. An embedded vision system for robotic fish navigation
CN214348036U (en) Intelligent garbage sorting robot
Pan et al. Intelligent Robot for Cleaning Garbage Based on OpenCV
Guo et al. Research on Mobile Robot Vision Navigation Algorithm
Elleuch et al. New ground plane segmentation method for electronic cane

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220726