CN205219101U - Service robot of family - Google Patents


Info

Publication number
CN205219101U
CN205219101U (application CN201520837794.7U)
Authority
CN
China
Prior art keywords
robot
camera
home
mechanical arm
joint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201520837794.7U
Other languages
Chinese (zh)
Inventor
宫兆涛
杨小伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robot4u Technology (beijing) Co Ltd
Original Assignee
Robot4u Technology (beijing) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robot4u Technology (beijing) Co Ltd filed Critical Robot4u Technology (beijing) Co Ltd
Priority to CN201520837794.7U priority Critical patent/CN205219101U/en
Application granted granted Critical
Publication of CN205219101U publication Critical patent/CN205219101U/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The utility model relates to the field of robotics, and specifically to a home service robot, comprising: a vision system for acquiring image information of indoor objects, including a first camera mounted on the robot's head and a second camera mounted on the robot's hand, where the distance at which the first camera can capture an object is greater than that of the second camera; a target identification system, which matches the object image information acquired by the vision system against pre-stored target object image information; an indoor positioning and navigation system, which builds a map of the designated home environment, determines the position of the object acquired by the vision system, and navigates the robot's walking; and a target object grasping system, which computes the rotation angle of each arm joint from the object's position and controls each joint servo of the mechanical arm to rotate by the corresponding angle to grasp the object. The device can build a map of complex terrain in real time, precisely locate objects, and reliably detect and avoid obstacles.

Description

A home service robot
Technical field
The utility model relates to the field of robotics, and specifically to a home service robot.
Background technology
In recent years, with the rapid development of robot technology, robots have begun to be widely applied in settings such as catering, shopping malls, exhibitions, science and technology venues, conference centers, and opening ceremonies, replacing human labor to a certain extent. In particular, as people's living standards improve, home service robots are gradually entering ordinary households to help people complete daily housework.
Existing domestic robots generally have positioning/navigation and object recognition functions. Object recognition means that the robot collects external sound, light, electrical, and image information through its vision system, extracts the characteristic information of the corresponding object, and identifies the object's type. The vision system is, in effect, the robot's eyes and ears, and the sole input channel on which the robot's action output depends. The Robotic Industries Association of the United States defines machine vision as: "Machine vision is a device that automatically receives and processes the image of a real object through optical devices and non-contact sensors, in order to obtain needed information or to control motion." D. Marr's computational theory of vision, and the primal-sketch and three-dimensional models developed from it, are widely used in object recognition for service robots. A three-dimensional model is an object-centered representation of 3D shape used for processing and recognition. In a home environment, object recognition by a service robot is recognition of this three-dimensional shape representation. Unlike humans, who process things through the brain and directly form a stereoscopic impression of them, a computer only obtains the 2D projection produced by a camera, which makes image recognition considerably harder. Researchers at home and abroad have studied three-dimensional object recognition extensively and proposed many related theories and methods. Yasushi Sumi et al. proposed matching the three-dimensional pose of object contours to recognize objects of arbitrary shape; Michael Boshra et al. recognized partially occluded objects by multi-stage matching; researchers such as Chen Tuo applied geometric invariance to three-dimensional object recognition; other researchers marked objects with visual tags (QR codes) and recognized articles by recognizing the tags; researchers such as Zhang Lin used radio-frequency identification (RFID), converting the recognition of various indoor articles into the recognition of tags. In a home environment, the objects a robot must grasp are diverse, and building a model library for every object in order to recognize it is impractical for object grasping and for the motion planning of the robot's mechanical arm. Before controlling the manipulator to grasp an object, the robot must first locate the object precisely. However, each sensor used in the prior art has its own strengths, weaknesses, and range of applicability; no single sensor can by itself accomplish the real-time map-building task for complex terrain, precisely locate objects, or reliably detect and avoid obstacles.
Based on the above, a new home service robot is urgently needed to solve the problems of the prior art: the difficulty of building real-time maps of complex terrain, and the difficulty of precisely locating objects and reliably detecting and avoiding obstacles.
Utility model content
To address the defects of the prior art, the utility model provides a home service robot that effectively solves the problems that existing robots find it difficult to build real-time maps of complex terrain, to precisely locate objects, and to reliably detect and avoid obstacles.
The technical solution adopted in the utility model is as follows:
A home service robot, comprising:
a vision system for acquiring image information of indoor objects, comprising a first camera mounted on the robot's head and a second camera mounted on the robot's hand, wherein the distance at which the first camera can capture an object is greater than the distance at which the second camera can;
a target identification system, which matches the object image information acquired by the vision system against pre-stored target object image information;
an indoor positioning and navigation system, which builds a map of the designated home environment, determines the position of the object acquired by the vision system, and navigates the robot's walking; the indoor positioning and navigation system comprises a starry-sky positioning module, which uses an infrared camera equipped with infrared emitters; passive infrared tags are affixed to the ceiling, the infrared camera reads the relative position of a passive infrared tag within the camera image, and the robot's current position is derived from it; and
a target object grasping system, which computes the rotation angle of each joint of the mechanical arm from the object position determined by the indoor positioning and navigation system, and controls each joint servo of the arm to rotate by the corresponding angle to grasp the object.
Preferably, the home service robot further comprises a TTS module, the TTS module being a speech synthesis module for converting text information into a waveform signal and feeding it to a loudspeaker.
Preferably, the home service robot further comprises a power supply control module comprising an overcurrent protection submodule, an overvoltage protection submodule, a charge management submodule, and a power management submodule.
Preferably, the home service robot further comprises a motor control module that controls two DC motors for synchronized two-wheel drive.
Preferably, the target object grasping system comprises a mechanical arm formed by four links and an end effector connected in series, adjacent links, and the last link and the end effector, being connected by joints, and each joint being driven by a servo.
Preferably, the mechanical arm has four degrees of freedom; by adjusting the control quantity of each joint's servo, the end effector picks up the target object and moves it to a designated position.
Preferably, the lengths of the four links and the end effector are L1 = 49 mm, L2 = 150 mm, L3 = 190 mm, L4 = 85 mm, and L5 = 20 mm, respectively.
Preferably, the length of the mechanical arm is 52 cm, the range of motion of each joint is 0-180° with a control accuracy of 1°, and the end-effector opening size H is less than 7 cm.
The home service robot provided by the utility model has the following advantages:
The home service robot of this scheme comprises a vision system, a target identification system, an indoor positioning and navigation system, and a target object grasping system. The vision system acquires image information of indoor objects and comprises a first camera mounted on the robot's head and a second camera mounted on the robot's hand, the first camera capturing objects at a greater distance than the second. The target identification system matches the object image information acquired by the vision system against pre-stored target object image information. The indoor positioning and navigation system builds a map of the designated home environment, determines the position of the object acquired by the vision system, and navigates the robot's walking. The target object grasping system computes the rotation angle of each arm joint from the object position determined by the indoor positioning and navigation system, and controls each joint servo to rotate by the corresponding angle to grasp the object. This home service robot therefore effectively solves the problems that existing robots find it difficult to build real-time maps of complex terrain, to precisely locate objects, and to reliably detect and avoid obstacles.
Accompanying drawing explanation
Fig. 1 is a structural block diagram of the software system of the home service robot provided by a specific embodiment of the utility model;
Fig. 2 is a structural block diagram of the bottom-level hardware system of the home service robot provided by a specific embodiment of the utility model;
Fig. 3 is a schematic structural diagram of the mechanical arm provided by a specific embodiment of the utility model.
Wherein:
01 - vision system; 02 - target identification system; 03 - indoor positioning and navigation system; 04 - target object grasping system;
1 - power management module; 2 - motor control module; 3 - servo control module; 4 - digital I/O sensor acquisition module; 5 - monocular vision module; 6 - expression control module; 7 - remote controller control module; 8 - TTS module; 9 - sonar sensor control module; 10 - laser sensor control module; 11 - ultrasonic module; 12 - starry-sky positioning module;
13 - link; 14 - joint; 15 - end effector; 16 - servo.
Detailed description of the invention
The utility model is described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the home service robot provided by this application comprises:
a vision system 01 for acquiring image information of indoor objects, comprising a first camera mounted on the robot's head and a second camera mounted on the robot's hand, wherein the distance at which the first camera can capture an object is greater than the distance at which the second camera can;
a target identification system 02, which matches the object image information acquired by the vision system 01 against pre-stored target object image information;
an indoor positioning and navigation system 03, which builds a map of the designated home environment, determines the position of the object acquired by the vision system, and navigates the robot's walking; and
a target object grasping system 04, which computes the rotation angle of each joint of the mechanical arm from the object position determined by the indoor positioning and navigation system 03, and controls each joint servo of the arm to rotate by the corresponding angle to grasp the object.
In the prior art, the vision system uses only a single camera mounted on the robot's head. A camera in that position can clearly see distant objects, but its visual information alone is not enough for the manipulator gripper to avoid obstacles and grasp. In this application, in addition to the head camera, a second camera is mounted on the robot's hand so that more accurate object position information can be obtained. Combining the two cameras effectively improves the efficiency of object recognition and grasping.
In this application, target object recognition is based on the SURF (Speeded-Up Robust Features) algorithm, a fast and robust feature-extraction algorithm. To recognize an object, the robot matches the image captured by the camera against the pre-stored target object image. Because feature extraction becomes harder at long range, this application combines SURF with a colour algorithm: a colour-recognition algorithm searches for the object at long range, and the SURF feature-recognition algorithm confirms the object at close range.
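The two-stage strategy described above (cheap colour search at long range, feature confirmation at close range) can be sketched as follows. This is an illustrative outline only: the hue representation, the thresholds, and the stand-in for SURF matching are all assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the two-stage recognition strategy: a cheap
# colour check at long range, then a feature-match confirmation (SURF
# in the original; any local-feature matcher works) at close range.
# All names and thresholds below are illustrative assumptions.

def color_candidate(pixels, target_hue, tol=15, min_fraction=0.05):
    """Long-range stage: does the image contain enough pixels whose
    hue is close to the target object's colour?"""
    hits = sum(1 for h in pixels if abs(h - target_hue) <= tol)
    return hits / max(len(pixels), 1) >= min_fraction

def feature_confirm(match_count, min_matches=10):
    """Close-range stage: stand-in for SURF matching -- confirm the
    object when enough local-feature correspondences are found."""
    return match_count >= min_matches

def recognize(pixels, target_hue, match_count):
    # Only run the expensive feature stage if the colour stage passes.
    return color_candidate(pixels, target_hue) and feature_confirm(match_count)
```

In practice the colour stage would run over an HSV-converted camera frame and the confirmation stage over descriptor correspondences between the live image and the pre-stored target image.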
Before the robot can recognize and grasp an object, it must first achieve autonomous indoor navigation. To realize autonomous, real-time navigation, this application uses the StarGazer starry-sky positioning and navigation system to build a map of the designated home environment, thereby solving the positioning and navigation problem of the service robot while it searches for objects.
To grasp an object, the target must first be located precisely. This application proposes a scheme in which, based on the pinhole camera model of monocular vision, feature matching across different viewing angles yields the desired end-effector position; the rotation angle of each joint is then obtained by solving the kinematics equations of the service robot's mechanical arm, and finally the system controls each joint servo to rotate by the corresponding angle to grasp the object.
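The core of the joint-angle computation mentioned above is solving an inverse-kinematics equation for the arm. Below is a minimal sketch for a two-link planar section of the arm, using the link lengths L2 = 150 mm and L3 = 190 mm given in the embodiment; the full 4-DOF solution would add base rotation and wrist pitch, which are omitted here.

```python
import math

# Minimal inverse-kinematics sketch for a 2-link planar section of the
# arm (L2 = 150 mm, L3 = 190 mm, from the description). The elbow angle
# follows from the law of cosines; the shoulder angle from atan2.

L2, L3 = 150.0, 190.0  # link lengths in mm

def ik_2link(x, y):
    """Return (shoulder, elbow) angles in radians reaching (x, y),
    elbow-down solution. Raises ValueError if out of reach."""
    d2 = x * x + y * y
    c_elbow = (d2 - L2 * L2 - L3 * L3) / (2 * L2 * L3)
    if not -1.0 <= c_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L3 * math.sin(elbow),
                                             L2 + L3 * math.cos(elbow))
    return shoulder, elbow

def fk_2link(shoulder, elbow):
    """Forward kinematics, used here to verify the IK solution."""
    x = L2 * math.cos(shoulder) + L3 * math.cos(shoulder + elbow)
    y = L2 * math.sin(shoulder) + L3 * math.sin(shoulder + elbow)
    return x, y
```

Round-tripping a target through `ik_2link` and `fk_2link` returns the original point, which is a convenient self-check for this kind of solver.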
The above home service robot effectively solves the problems that existing robots find it difficult to build real-time maps of complex terrain, to precisely locate objects, and to reliably detect and avoid obstacles.
Specifically, in hardware terms the service robot provided by this application can be divided into three parts: the bottom-level hardware system, the middle control machine, and the host computer. The bottom-level hardware system controls the sensor modules, power module, motor module, mechanical arm, and so on. The middle control machine encapsulates all modules of the bottom-level hardware system and provides a service communication interface to the host computer; it uses a Samsung S3C2440 ARM920T CPU with a 400 MHz main frequency running embedded Linux, needs no display, and runs purely as a service system. The host computer encapsulates all service modules of the platform and provides a secondary development interface; it uses an AAEON 9458T mainboard with 1 GB of memory and a 4 GB CF-card hard disk, an 8.4-inch LCD with touch support, and embedded Windows XP as the operating system.
As shown in Fig. 2, the service robot bottom-level hardware system provided by this application mainly comprises the following parts: a power management module 1, a motor control module 2, a servo control module 3, a digital I/O sensor acquisition module 4, a monocular vision module 5, an expression control module 6, a remote controller control module 7, a TTS module 8, a sonar sensor control module 9, a laser sensor control module 10, an ultrasonic module 11, and a starry-sky positioning module 12.
The power management module 1 supplies power to all electronic components at 3.3 V, 5 V, 7 V, 12 V, and 24 V; it also comprises an overcurrent protection submodule, an overvoltage protection submodule, a charge management submodule, and a power management submodule.
The motor control module 2 mainly controls two DC motors, implementing synchronized two-wheel drive, encoder (code-disc) reading, and high-precision walking of the robot.
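A hedged sketch of how such a differential-drive base turns encoder (code-disc) readings into a pose estimate; the wheel-base and ticks-per-metre constants below are illustrative assumptions, not values from the patent.

```python
import math

# Differential-drive odometry sketch: update the robot pose from one
# pair of left/right wheel-encoder readings. Constants are assumed.

WHEEL_BASE = 0.35       # distance between the two driven wheels, m (assumed)
TICKS_PER_M = 2000.0    # encoder ticks per metre of wheel travel (assumed)

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Return the new pose (x, y, theta) after one encoder interval."""
    dl = left_ticks / TICKS_PER_M
    dr = right_ticks / TICKS_PER_M
    d = (dl + dr) / 2.0               # forward travel of the robot centre
    dtheta = (dr - dl) / WHEEL_BASE   # change in heading
    # Midpoint integration: advance along the average heading.
    return (x + d * math.cos(theta + dtheta / 2.0),
            y + d * math.sin(theta + dtheta / 2.0),
            theta + dtheta)
```

Equal tick counts drive the robot straight; opposite counts turn it in place, which matches the "synchronized two-wheel drive" behaviour described above.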
The digital I/O sensor acquisition module 4 collects data from the collision sensors, touch sensors, and line-tracking sensor board, achieving high-frequency, fast-response acquisition.
The TTS module 8 is a speech synthesis module; it converts text information into a waveform signal that can be output directly to a loudspeaker.
The remote controller control module 7 uses a 433 MHz radio-frequency module to control the robot by remote control.
The servo control module 3 mainly controls the mechanical arm and the camera pan-tilt head.
The sensor control modules control the on-board sensors; specifically, the sonar sensor control module 9 controls the on-board sonar sensors, and the laser sensor control module 10 controls the on-board laser sensor.
In this embodiment, preferably, the indoor positioning and navigation system comprises the starry-sky positioning module 12, which uses an infrared camera equipped with infrared emitters and can work normally around the clock, unconstrained by lighting conditions. The technique relies on special passive infrared tags affixed to the ceiling: the camera reads the relative position of a tag within the camera image, from which the robot's current position is derived.
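The ceiling-tag localisation idea can be sketched as follows. The geometry is deliberately simplified (a single tag, a fixed metres-per-pixel scale) and all numeric constants are assumptions; the real StarGazer-style module handles multiple tags and camera calibration.

```python
import math

# Illustrative sketch of ceiling-landmark localisation: an upward-facing
# infrared camera sees a passive tag of known world position; the tag's
# offset and rotation in the image give the robot's pose. Assumed scale.

M_PER_PIXEL = 0.004  # metres on the ceiling per image pixel (assumed)

def locate(tag_world_xy, tag_px_offset, tag_angle_rad):
    """Robot pose (x, y, heading) from one detected tag.
    tag_px_offset: tag position relative to the image centre, pixels.
    tag_angle_rad: tag orientation in the image = -robot heading."""
    heading = -tag_angle_rad
    # Rotate the image-frame offset into the world frame, then subtract
    # it from the tag's known world position.
    ox = tag_px_offset[0] * M_PER_PIXEL
    oy = tag_px_offset[1] * M_PER_PIXEL
    wx = ox * math.cos(heading) - oy * math.sin(heading)
    wy = ox * math.sin(heading) + oy * math.cos(heading)
    return (tag_world_xy[0] - wx, tag_world_xy[1] - wy, heading)
```

When the tag sits at the image centre the robot is directly beneath it; any pixel offset shifts the estimated robot position by the corresponding scaled amount.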
In this embodiment, the vision system 01 is the robot's eyes. The 316EW service robot is equipped with a 1.3-megapixel CCD camera mounted on a pan-tilt head that enlarges the robot's field of view. The pan-tilt head is controlled by the servo control module 3 and has two degrees of freedom: pitch and rotation.
In this embodiment, the target object grasping system comprises a mechanical arm; the arm of the 316EW service robot is shown in Fig. 3. The arm is a complex dynamic system composed of joints 14, links 13, and an end effector 15, which completes a grasping task by controlling the rotation of each joint so that the end effector 15 reaches the expected pose. The grasping process can be decomposed into:
(1) the end effector 15 reaches the target location;
(2) the target is grasped. The arm control mode provided by this application is positional: the path traversed by the arm is not planned, and only accurate control of the position of the end effector 15 is required.
In this embodiment, preferably, the robot's mechanical arm has five joints; however, because the end effector 15 only needs to open and close to grip objects, it is not counted as a degree of freedom, so the arm of this application is a four-degree-of-freedom mechanical arm. By adjusting the control quantity of each joint servo, the end effector 15 of the arm accurately picks up the target object and moves it to the designated position.
In this embodiment, preferably, the physical structure of the mechanical arm is shown in Fig. 3: the arm is formed by four links 13 and an end effector 15 connected in series; adjacent links 13, and the last link 13 and the end effector 15, are connected by joints 14, and each joint 14 is driven by a servo 16.
In this embodiment, preferably, the mechanical arm has four degrees of freedom; by adjusting the control quantity of each joint's servo 16, the end effector 15 picks up the target object and moves it to the designated position.
The structure of the arm itself plays a decisive role in the motion trajectory of the end effector 15. In this embodiment, preferably, the length parameters of the four links 13 and the end effector 15 are: L1 = 49 mm, L2 = 150 mm, L3 = 190 mm, L4 = 85 mm, L5 = 20 mm, where L1 through L5 are the lengths of the four links 13 and the end effector 15, respectively. The values of L1 through L5 are not limited to these and may take other values.
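As a worked illustration of how these link lengths determine the positions the arm can reach, here is a planar forward-kinematics sketch over the whole chain. It takes one cumulative joint angle per link and ignores the out-of-plane base rotation; it is a geometric sketch, not the patent's control code.

```python
import math

# Planar forward kinematics over the whole link chain, using the link
# lengths given above (L1..L5, in mm). Angles are cumulative per joint,
# commanded in degrees as the 1-degree-resolution servos would be.

LINKS = [49.0, 150.0, 190.0, 85.0, 20.0]  # mm, from the description

def end_effector_xy(joint_angles_deg):
    """Position of the end-effector tip for one planar joint angle per
    link (degrees). Each angle is relative to the previous link."""
    x = y = 0.0
    theta = 0.0
    for length, a in zip(LINKS, joint_angles_deg):
        theta += math.radians(a)
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```

With all joints at zero the tip sits at L1 + L2 + L3 + L4 + L5 = 494 mm from the base, consistent with the stated overall arm length of about 52 cm.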
In this embodiment, preferably, the mechanical arm is controlled by five servos 16; the length of the arm is 52 cm, the range of motion of each joint 14 is 0-180° with a control accuracy of 1°, and the end-effector 15 opening size H is less than 7 cm.
The above are only preferred embodiments of the utility model. It should be pointed out that those skilled in the art may make several improvements and modifications without departing from the principle of the utility model, and such improvements and modifications should also be regarded as falling within the protection scope of the utility model.

Claims (8)

1. A home service robot, characterized by comprising:
a vision system for acquiring image information of indoor objects, comprising a first camera mounted on the robot's head and a second camera mounted on the robot's hand, wherein the distance at which the first camera can capture an object is greater than the distance at which the second camera can;
a target identification system, which matches the object image information acquired by the vision system against pre-stored target object image information;
an indoor positioning and navigation system, which builds a map of the designated home environment, determines the position of the object acquired by the vision system, and navigates the robot's walking, the indoor positioning and navigation system comprising a starry-sky positioning module that uses an infrared camera equipped with infrared emitters, passive infrared tags being affixed to the ceiling, the infrared camera reading the relative position of a passive infrared tag within the camera image, from which the robot's current position is derived; and
a target object grasping system, which computes the rotation angle of each joint of the mechanical arm from the object position determined by the indoor positioning and navigation system, and controls each joint servo of the arm to rotate by the corresponding angle to grasp the object.
2. The home service robot according to claim 1, characterized in that it further comprises a TTS module, the TTS module being a speech synthesis module for converting text information into a waveform signal and feeding it to a loudspeaker.
3. The home service robot according to claim 1, characterized in that it further comprises a power supply control module comprising an overcurrent protection submodule, an overvoltage protection submodule, a charge management submodule, and a power management submodule.
4. The home service robot according to claim 1, characterized in that it further comprises a motor control module that controls two DC motors for synchronized two-wheel drive.
5. The home service robot according to claim 1, characterized in that the target object grasping system comprises a mechanical arm formed by four links and an end effector connected in series, adjacent links, and the last link and the end effector, being connected by joints, each joint being driven by a servo.
6. The home service robot according to claim 5, characterized in that the mechanical arm has four degrees of freedom, and by adjusting the control quantity of each joint's servo the end effector picks up the target object and moves it to a designated position.
7. The home service robot according to claim 5, characterized in that the lengths of the four links and the end effector are L1 = 49 mm, L2 = 150 mm, L3 = 190 mm, L4 = 85 mm, and L5 = 20 mm, respectively.
8. The home service robot according to claim 5, characterized in that the length of the mechanical arm is 52 cm, the range of motion of each joint is 0-180° with a control accuracy of 1°, and the end-effector opening size H is less than 7 cm.
CN201520837794.7U 2015-10-27 2015-10-27 Service robot of family Expired - Fee Related CN205219101U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201520837794.7U CN205219101U (en) 2015-10-27 2015-10-27 Service robot of family

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520837794.7U CN205219101U (en) 2015-10-27 2015-10-27 Service robot of family

Publications (1)

Publication Number Publication Date
CN205219101U true CN205219101U (en) 2016-05-11

Family

ID=55894193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520837794.7U Expired - Fee Related CN205219101U (en) 2015-10-27 2015-10-27 Service robot of family

Country Status (1)

Country Link
CN (1) CN205219101U (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106020201A (en) * 2016-07-13 2016-10-12 广东奥讯智能设备技术有限公司 Mobile robot 3D navigation and positioning system and navigation and positioning method
CN106125724A (en) * 2016-06-13 2016-11-16 华讯方舟科技有限公司 A kind of method and system of robot autonomous charging
CN106156799A (en) * 2016-07-25 2016-11-23 北京光年无限科技有限公司 The object identification method of intelligent robot and device
CN106441238A (en) * 2016-06-01 2017-02-22 昆山塔米机器人有限公司 Positioning device and positioning navigation algorithm of robot based on infrared visual technology
CN106679661A (en) * 2017-03-24 2017-05-17 山东大学 Simultaneous localization and mapping system and method assisted by search and rescue robot arms
CN107234619A (en) * 2017-06-02 2017-10-10 南京金快快无人机有限公司 A kind of service robot grasp system positioned based on active vision
CN107297748A (en) * 2017-07-27 2017-10-27 南京理工大学北方研究院 A kind of dining room service robot system and application
CN107423412A (en) * 2017-07-28 2017-12-01 中南大学 A kind of method of the carrying robot Intelligent Recognition floor based on meteorological sensing time series pattern
JP2018019471A (en) * 2016-07-26 2018-02-01 セイコーエプソン株式会社 Robot and motor
CN108161889A (en) * 2018-02-08 2018-06-15 北京华航唯实机器人科技股份有限公司 A kind of industrial robot based on AGV
CN109029423A (en) * 2018-08-10 2018-12-18 国网上海市电力公司 Substation's indoor mobile robot navigation positioning system and its navigation locating method
CN109324532A (en) * 2017-07-31 2019-02-12 广州维绅科技有限公司 Merchandising machine people and its control case assembly
CN109961074A (en) * 2017-12-22 2019-07-02 深圳市优必选科技有限公司 A kind of method, robot and computer readable storage medium for searching article
CN109986580A (en) * 2019-04-29 2019-07-09 宁波大学 A kind of bionical domestic movable assistant robot
CN110033584A (en) * 2018-01-11 2019-07-19 丰田自动车株式会社 Server, control method and computer-readable recording medium
CN111376251A (en) * 2018-12-28 2020-07-07 西安光启未来技术研究院 Logistics robot
TWI723715B (en) * 2019-12-30 2021-04-01 群邁通訊股份有限公司 Computer device and method for controlling mechanical arm to gripping and placing objects
CN113119099A (en) * 2019-12-30 2021-07-16 深圳富泰宏精密工业有限公司 Computer device and method for controlling mechanical arm to clamp and place object
CN114654482A (en) * 2022-04-26 2022-06-24 北京市商汤科技开发有限公司 Control method for mobile robot, device, equipment and storage medium


Similar Documents

Publication Publication Date Title
CN205219101U (en) Service robot of family
CN110480634B (en) Arm guide motion control method for mechanical arm motion control
CN100487724C (en) Quick target identification and positioning system and method
CN102999152B (en) A kind of gesture motion recognition methods and system
CN110216674B (en) Visual servo obstacle avoidance system of redundant degree of freedom mechanical arm
CN105912980A (en) Unmanned plane and unmanned plane system
CN107214700A (en) A kind of robot autonomous patrol method
CN101441769A (en) Real time vision positioning method of monocular camera
WO2019001237A1 (en) Mobile electronic device, and method in mobile electronic device
Momeni-k et al. Height estimation from a single camera view
Pradeep et al. A wearable system for the visually impaired
CN104952104A (en) Three-dimensional human body gesture estimating method and device thereof
CN107543531A (en) A kind of Robot visual location system
CN104898675A (en) Robot intelligent navigation control method
CN113848931A (en) Agricultural machinery automatic driving obstacle recognition method, system, equipment and storage medium
CN112123338A (en) Transformer substation intelligent inspection robot system supporting deep learning acceleration
CN113570715B (en) Sensor fusion-based rotary laser real-time positioning modeling system and method
Gao et al. Kinect-based motion recognition tracking robotic arm platform
CN208751479U (en) A kind of synchronous spacescan system
US20230297120A1 (en) Method, apparatus, and device for creating map for self-moving device with improved map generation efficiency
Jo et al. Tracking and interaction based on hybrid sensing for virtual environments
Shukor et al. 3d modeling of indoor surfaces with occlusion and clutter
CN107253215B (en) Robot intelligent sensing module integrated with 2D camera, 3D camera and laser
Lee et al. Design and development of a monitoring system based on smart device for service robot applications
CN209086750U (en) The control device and service robot of service robot

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160511

Termination date: 20191027