CN108994832A - Robot hand-eye system based on an RGB-D camera and self-calibration method thereof - Google Patents

Robot hand-eye system based on an RGB-D camera and self-calibration method thereof

Info

Publication number
CN108994832A
CN108994832A (application CN201810804650.XA; granted publication CN108994832B)
Authority
CN
China
Prior art keywords
robot
camera
rgb
pose
method based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810804650.XA
Other languages
Chinese (zh)
Other versions
CN108994832B (en)
Inventor
李明洋 (Li Mingyang)
王家鹏 (Wang Jiapeng)
任明俊 (Ren Mingjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jieka Robot Co ltd
Original Assignee
Shanghai JAKA Robotics Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai JAKA Robotics Ltd.
Priority to CN201810804650.XA priority Critical patent/CN108994832B/en
Publication of CN108994832A publication Critical patent/CN108994832A/en
Application granted granted Critical
Publication of CN108994832B publication Critical patent/CN108994832B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1653Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0009Constructional details, e.g. manipulator supports, bases
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a robot hand-eye system based on an RGB-D camera and a self-calibration method for it, in the field of robotics, comprising the following steps: (1) the robot, fixed to a table by its base, drives the RGB-D camera through a motion by means of a rigid connector, acquiring a series of depth maps of the environment while simultaneously recording the robot pose; (2) the captured scene is reconstructed in three dimensions from the consecutive depth maps, yielding the camera pose at the moment each depth map was taken; (3) each synchronized camera pose and robot pose are combined into a constraint equation on the relative pose between the robot end and the camera; (4) the poses from all moments are combined into a system of equations on that relative pose; (5) the system of equations is solved by the Tsai-Lenz method, giving the relative pose of the robot end and the camera. The invention requires no special marker, uses a simple algorithm, can be used for online calibration, and greatly improves the efficiency of robot hand-eye calibration.

Description

Robot hand-eye system based on an RGB-D camera and self-calibration method thereof
Technical field
The present invention relates to the field of robotics, and more particularly to a robot hand-eye system based on an RGB-D camera and a self-calibration method for it.
Background technique
Against the background of the "Made in China 2025" plan, Chinese industry's demand for flexible production lines and intelligent robots is increasingly urgent. An important means of robot automation is to give robots the ability to see, using machine vision. Machine vision obtains image information about the environment through a visual sensor, giving the machine a visual-perception capability. Through machine vision, a robot can recognize objects and determine their positions.
The kinematic coordinate system of the robot and the coordinate system of the camera are linked by "hand-eye calibration": the robot's end-effector can be regarded as the hand, and the visual sensor as the eye. Robot hand-eye systems generally fall into two types, eye-in-hand and eye-to-hand. In the former, the visual sensor is fixed on the robot's end-effector and moves with the robot end; in the latter, the visual sensor is fixed in the environment and does not move with the robot end. The former usually offers greater flexibility and gives the robot's motion higher precision.
Traditional robot hand-eye systems use a high-precision marker, such as a cube or checkerboard target: corner points are extracted from the image, the camera's pose relative to the marker is computed by projective geometry, and the pose of the robot end is recorded at the same time; repeated shots yield multiple groups of data from which the relative pose between the robot end and the camera is computed. However, this kind of method generally requires offline calibration, and the need for a high-precision marker greatly limits its range of use.
Therefore, those skilled in the art are working to develop a robot hand-eye system based on an RGB-D camera and a self-calibration method for it. The method needs no special marker; it only requires that the robot's surroundings have a certain spatial complexity, and the hand-eye system can then be calibrated offline or online by the algorithm.
Summary of the invention
In view of the above drawbacks of the prior art, the technical problem to be solved by the present invention is to overcome the prior art's need for a high-precision marker, which limits the range of use of traditional calibration methods, and to solve the problem that this need for a high-precision marker restricts traditional methods to offline calibration.
To achieve the above object, the present invention provides a robot hand-eye system self-calibration method based on an RGB-D camera, comprising the following steps:
Step S1, the robot drives the RGB-D camera mounted at the robot end through a motion, acquiring a series of depth maps of the surrounding environment and recording the robot's pose information at the same moments;
Step S2, the captured scene is reconstructed in three dimensions from the series of consecutive depth maps, yielding the camera pose at the moment each depth map was taken;
Step S3, each synchronized camera pose and robot pose are combined into a data pair, and any two data pairs from different moments are combined to form a constraint equation on the relative pose between the robot end and the camera;
Step S4, all pairwise combinations of the recorded poses are assembled into one large system of equations on the relative pose between the robot end and the camera;
Step S5, the system of equations is solved with the Tsai-Lenz algorithm, giving the relative pose of the robot end and the camera.
The present invention also provides a robot hand-eye system based on an RGB-D camera, comprising a robot, a worktable, a robot base, a rigid connector, an RGB-D camera, and a robot end-effector. The robot base is arranged on the worktable; the robot is arranged on the robot base; the rigid connector is arranged at the robot end; the RGB-D camera is arranged on the rigid connector; and the robot end-effector is arranged on the rigid connector.
Further, acquiring the robot pose and the RGB-D camera's depth map at the same moment is done by hardware or software triggering: the trigger makes the RGB-D camera capture one frame while the robot pose is read at the same time through a socket connection or the robot API.
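Where only a software trigger is available, synchronization in practice amounts to pairing each captured frame with the nearest recorded robot-pose sample by timestamp. The sketch below is illustrative only, not part of the patented method; the function name and data layout are assumptions, and it presumes the pose timestamps are sorted:

```python
import bisect

def match_nearest(frame_ts, pose_ts):
    """For each camera-frame timestamp, return the index of the nearest
    robot-pose timestamp (pose_ts must be sorted ascending)."""
    idx = []
    for t in frame_ts:
        i = bisect.bisect_left(pose_ts, t)
        if i == 0:
            idx.append(0)
        elif i == len(pose_ts):
            idx.append(len(pose_ts) - 1)
        else:
            # pick whichever neighbour is closer in time
            idx.append(i if pose_ts[i] - t < t - pose_ts[i - 1] else i - 1)
    return idx
```

A hardware trigger makes this matching unnecessary, since both devices are sampled on the same signal edge.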
Further, the three-dimensional reconstruction of the captured scene from the series of consecutive depth maps fuses the multiple depth frames through a TSDF volume model, reconstructing the scene in three dimensions and estimating the RGB-D camera pose corresponding to each depth map.
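TSDF fusion of this kind maintains, per voxel, a truncated signed distance and a weight, updated frame by frame as a running weighted average. A schematic numpy sketch of that update rule follows; the truncation distance, weight cap, and uniform per-frame weight are illustrative defaults, not values from the patent:

```python
import numpy as np

def tsdf_update(tsdf, weight, sdf_obs, trunc=0.05, max_weight=50.0):
    """Fuse one frame's signed-distance observations into a TSDF voxel grid.

    tsdf, weight, sdf_obs are arrays of the same shape; sdf_obs holds each
    voxel's raw signed distance to the surface observed in this frame."""
    d = np.clip(sdf_obs / trunc, -1.0, 1.0)   # truncate and normalise to [-1, 1]
    seen = sdf_obs > -trunc                   # voxels far behind the surface are unobserved
    w_obs = seen.astype(float)
    w_total = weight + w_obs
    new_tsdf = np.where(seen,
                        (tsdf * weight + d * w_obs) / np.maximum(w_total, 1e-12),
                        tsdf)
    new_weight = np.minimum(w_total, max_weight)
    return new_tsdf, new_weight
```

Capping the weight keeps the model responsive to new observations in long sequences.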
Further, combining the data pairs of two different moments to form a constraint equation on the relative pose between the robot end and the camera means arbitrarily choosing the data of two moments out of the data of many consecutive moments; the resulting constraint equation is the most basic pose-matrix transformation and needs no other prior assumptions.
Further, solving the system of equations by the Tsai-Lenz method to obtain the relative pose of the robot end and the camera separates the rotation and translation parts of the relative transformation matrix between the robot end and the camera, represents the change of pose with Rodrigues parameters, and solves first for the rotation vector and then for the rotation matrix.
Further, in step S1 the RGB-D camera's position is set as the origin of the reference coordinate frame, with the identity matrix as its orientation matrix.
Further, during the robot's motion, any two poses of the robot satisfy the following relation:
Hgjgi · Hgc = Hgc · Hcjci (1)
where subscript g denotes the robot-end coordinate frame, c the camera coordinate frame, and i, j the serial numbers of the recorded poses. For example, c_i denotes the camera coordinate frame at the i-th recorded pose, and Hcjci transforms the coordinates of a point in space from the camera frame at pose i to the camera frame at pose j.
Further, the robot base is rigidly connected to the worktable, and the RGB-D camera is rigidly connected to the robot end-effector, so that the robot, the RGB-D camera, and the end-effector move together.
Further, the worktable refers to all environmental information around the visual sensor, including the work surface.
Compared with the prior art, the present invention requires no special marker, uses a simple algorithm, can be used for online calibration, and greatly improves the efficiency of robot hand-eye calibration.
The concept of the invention, its specific structure, and the technical effects produced are further described below with reference to the accompanying drawings, so that the purpose, features, and effects of the invention can be fully understood.
Detailed description of the invention
Fig. 1 is a schematic diagram of the basic hand-eye calibration model of a preferred embodiment of the invention;
Fig. 2 is a structural diagram of the hand-eye calibration system of a preferred embodiment of the invention.
Specific embodiment
Several preferred embodiments of the invention are described below with reference to the accompanying drawings, to make their technical content clearer and easier to understand. The invention can be embodied in many different forms, and its scope of protection is not limited to the embodiments mentioned herein.
In the drawings, components with identical structure are marked with the same numeral, and components with similar structure or function are marked with similar numerals. The size and thickness of each component shown in the drawings are arbitrary; the invention does not limit the size or thickness of any component. For clarity of illustration, the thickness of some components is exaggerated in places.
The invention adopts the following technical scheme:
The RGB-D camera is fixed on the robot end-effector so that it moves with the robot end, with a rigid connection between the two that allows no relative movement. The robot is operated to move while depth-map data are acquired through the depth sensor. The camera's pose when the first depth map is captured is taken as the reference coordinate frame, the environment around the robot is reconstructed in three dimensions in real time, and the robot pose at the time of each depth-map capture is recorded.
During reconstruction, the pose of the camera relative to the reference frame at the time of each depth-map capture is also estimated.
For a robot hand-eye system of this form, the basic model is shown in Fig. 1.
During the robot's motion, any two poses satisfy the following relation:
Hgjgi · Hgc = Hgc · Hcjci (1)
where subscript g denotes the robot-end coordinate frame, c the camera coordinate frame, and i, j the serial numbers of the recorded poses. For example, ci denotes the camera coordinate frame at the i-th recorded pose, and Hcjci transforms the coordinates of a point in space from the camera frame at pose i to the camera frame at pose j. Since the relative pose between the robot end and the camera is fixed, the subscripts i, j of Hgc are omitted. Letting A = Hgjgi, B = Hcjci, X = Hgc, formula (1) can be abbreviated as:
AX = XB (2)
Splitting the matrices above into block form gives:
[RA tA; 0 1] · [RX tX; 0 1] = [RX tX; 0 1] · [RB tB; 0 1] (3)
where R and t are respectively the rotation and translation components of each transformation matrix. Comparing the two sides of the equation yields the equation group:
RA · RX = RX · RB (4)
(RA − I) · tX = RX · tB − tA (5)
To solve equations (4) and (5), the present invention uses Tsai's two-step solution, commonly known as the Tsai-Lenz method, which is one of the most widely used robot hand-eye calibration methods.
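The two-step solution can be sketched compactly in numpy: the rotation is found first from the modified Rodrigues vectors of the paired motions, then the translation by linear least squares. This is a generic reading of the classical Tsai-Lenz scheme, not the patent's own code:

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rot_to_P(R):
    """Modified Rodrigues vector P = 2*sin(theta/2)*axis of a rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    axis = axis / (2.0 * np.sin(theta))
    return 2.0 * np.sin(theta / 2.0) * axis

def tsai_lenz(As, Bs):
    """Solve A X = X B for X from paired 4x4 motions (As: robot end, Bs: camera)."""
    # Step 1: rotation, from skew(Pa + Pb) P' = Pb - Pa stacked over all pairs.
    M = np.vstack([skew(rot_to_P(A[:3, :3]) + rot_to_P(B[:3, :3]))
                   for A, B in zip(As, Bs)])
    d = np.concatenate([rot_to_P(B[:3, :3]) - rot_to_P(A[:3, :3])
                        for A, B in zip(As, Bs)])
    Pp = np.linalg.lstsq(M, d, rcond=None)[0]
    P = 2.0 * Pp / np.sqrt(1.0 + Pp @ Pp)
    n2 = P @ P
    Rx = (1.0 - n2 / 2.0) * np.eye(3) + 0.5 * (np.outer(P, P)
                                               + np.sqrt(4.0 - n2) * skew(P))
    # Step 2: translation, from (Ra - I) t = Rx tb - ta stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    e = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, e, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

In practice the same solver is also available off the shelf as OpenCV's `cv2.calibrateHandEye` with `method=cv2.CALIB_HAND_EYE_TSAI`.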
As shown in Fig. 2, the robot hand-eye self-calibration system of the invention comprises: a robot base 1, a robot 2, a rigid connector 3 between the robot end and the visual sensor, a robot end-effector 4 (the mechanism with which the robot performs concrete tasks), an RGB-D visual sensor 5, and a worktable environment 6. In this patent, the worktable environment refers to all environmental information around the visual sensor, including the work surface.
(1) The robot 2 is moved to its initial position so that most of the worktable environment 6 lies within the field of view of the RGB-D camera 5. The camera position at this time is set as the origin of the reference coordinate frame, with the identity matrix as its orientation. The working space of the three-dimensional reconstruction is initialized; the reconstruction model used by the invention is the TSDF (Truncated Signed Distance Function) model.
(2) The robot 2, together with the RGB-D camera 5, moves along a set trajectory (for example, a zigzag path), continually imaging the scene 6 during the motion to obtain depth maps Ik, while recording the pose Tbgk of the robot end 4.
(3) For each acquired depth map Ik, the data in the TSDF volume are projected by ray casting according to the pose at time k−1, yielding I'k−1, and the relative pose Tck,ck−1 between I'k−1 and Ik is found by the ICP algorithm. The camera pose Tck at time k is then obtained by composing the relative pose Tck,ck−1 with the camera pose Tck−1 of the previous frame.
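The inner step of the ICP alignment used here, once point correspondences are fixed, is the closed-form rigid fit between two corresponding point sets (the Kabsch/SVD solution); full ICP re-matches nearest neighbours and iterates this fit until convergence. A generic sketch, not the patent's implementation:

```python
import numpy as np

def align_rigid(P, Q):
    """Return (R, t) minimising ||R @ P + t - Q|| for corresponding 3xN point sets."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Projecting the TSDF model into the previous frame before matching (as in step (3)) reduces drift compared with frame-to-frame ICP alone.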
(4) According to the camera pose Tck at time k, the depth map Ik is projected into three-dimensional space and then fused into the TSDF volume in the manner prescribed by the TSDF model.
(5) Cycling through steps (2) to (4) yields a series of camera poses Tck with the corresponding robot-end poses Tbgk. These data are grouped in pairs; for the group formed by the data of times i and j, there holds
Tbgj⁻¹ · Tbgi · Hgc = Hgc · Tcj⁻¹ · Tci (7)
(6) Suppose the data of N poses have been obtained in total; then N·(N−1)/2 groups of data are available, each of which yields one equation of the form of formula (7). These N·(N−1)/2 equations constitute one large system of equations.
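Assembling the N·(N−1)/2 groups can be sketched as follows, under the assumption that every recorded pose is stored as a 4×4 frame-to-reference matrix (gripper-to-base for the robot poses, camera-to-reference for the camera poses); the function name is illustrative:

```python
from itertools import combinations
import numpy as np

def motion_pairs(T_bg, T_c):
    """From N absolute poses, form the N*(N-1)/2 motion pairs (A, B) with A X = X B."""
    As, Bs = [], []
    for i, j in combinations(range(len(T_bg)), 2):
        As.append(np.linalg.inv(T_bg[j]) @ T_bg[i])  # robot-end motion H_gj,gi
        Bs.append(np.linalg.inv(T_c[j]) @ T_c[i])    # camera motion   H_cj,ci
    return As, Bs
```

Using all pairs rather than only consecutive ones over-determines the system, which helps average out pose noise in the least-squares solve.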
(7) After the multiple groups of data have been obtained, the system of equations is solved by the Tsai-Lenz method, finally yielding the relative pose Tgc between the camera and the robot end.
The preferred embodiments of the present invention have been described in detail above. It should be understood that a person of ordinary skill in the art can make many modifications and variations according to the concept of the invention without creative labour. Therefore, any technical scheme that a person skilled in the art can obtain on the basis of the prior art, through logical analysis, reasoning, or limited experimentation under the concept of the invention, shall fall within the scope of protection determined by the claims.

Claims (10)

1. A robot hand-eye system self-calibration method based on an RGB-D camera, characterized by comprising the following steps:
Step S1, the robot drives the RGB-D camera mounted at the robot end through a motion, acquiring a series of depth maps of the surrounding environment and recording the robot's pose information at the same moments;
Step S2, the captured scene is reconstructed in three dimensions from the series of consecutive depth maps, yielding the camera pose at the moment each depth map was taken;
Step S3, each synchronized camera pose and robot pose are combined into a data pair, and any two data pairs from different moments are combined to form a constraint equation on the relative pose between the robot end and the camera;
Step S4, all pairwise combinations of the recorded poses are assembled into one large system of equations on the relative pose between the robot end and the camera;
Step S5, the system of equations is solved with the Tsai-Lenz algorithm, giving the relative pose of the robot end and the camera.
2. A robot hand-eye system based on an RGB-D camera, characterized by comprising a robot, a worktable, a robot base, a rigid connector, an RGB-D camera, and a robot end-effector; the robot base is arranged on the worktable; the robot is arranged on the robot base; the rigid connector is arranged at the robot end; the RGB-D camera is arranged on the rigid connector; and the robot end-effector is arranged on the rigid connector.
3. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that acquiring the robot pose and the RGB-D camera's depth map at the same moment is done by hardware or software triggering: the trigger makes the RGB-D camera capture one frame while the robot pose is read at the same time through a socket connection or the robot API.
4. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that the three-dimensional reconstruction of the captured scene from the series of consecutive depth maps fuses the multiple depth frames through a TSDF volume model, reconstructing the scene in three dimensions and estimating the RGB-D camera pose corresponding to each depth map.
5. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that combining the data pairs of two different moments to form a constraint equation on the relative pose between the robot end and the camera means arbitrarily choosing the data of two moments out of the data of many consecutive moments; the resulting constraint equation is the most basic pose-matrix transformation and needs no other prior assumptions.
6. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that solving the system of equations by the Tsai-Lenz method to obtain the relative pose of the robot end and the camera separates the rotation and translation parts of the relative transformation matrix between the robot end and the camera, represents the change of pose with Rodrigues parameters, and solves first for the rotation vector and then for the rotation matrix.
7. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that in step S1 the RGB-D camera's position is set as the origin of the reference coordinate frame, with the identity matrix as its orientation matrix.
8. The robot hand-eye system self-calibration method based on an RGB-D camera of claim 1, characterized in that during the robot's motion any two poses of the robot satisfy the following relation:
Hgjgi · Hgc = Hgc · Hcjci (1)
where subscript g denotes the robot-end coordinate frame, c the camera coordinate frame, and i, j the serial numbers of the recorded poses; for example, ci denotes the camera coordinate frame at the i-th recorded pose, and Hcjci transforms the coordinates of a point in space from the camera frame at pose i to the camera frame at pose j.
9. The robot hand-eye system based on an RGB-D camera of claim 2, characterized in that the robot base is rigidly connected to the worktable and the RGB-D camera is rigidly connected to the robot end-effector, so that the robot, the RGB-D camera, and the end-effector move together.
10. The robot hand-eye system based on an RGB-D camera of claim 2, characterized in that the worktable refers to all environmental information around the visual sensor, including the work surface.
CN201810804650.XA 2018-07-20 2018-07-20 Robot eye system based on RGB-D camera and self-calibration method thereof Active CN108994832B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810804650.XA CN108994832B (en) 2018-07-20 2018-07-20 Robot eye system based on RGB-D camera and self-calibration method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810804650.XA CN108994832B (en) 2018-07-20 2018-07-20 Robot eye system based on RGB-D camera and self-calibration method thereof

Publications (2)

Publication Number Publication Date
CN108994832A true CN108994832A (en) 2018-12-14
CN108994832B CN108994832B (en) 2021-03-02

Family

ID=64596742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810804650.XA Active CN108994832B (en) 2018-07-20 2018-07-20 Robot eye system based on RGB-D camera and self-calibration method thereof

Country Status (1)

Country Link
CN (1) CN108994832B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109531577A (en) * 2018-12-30 2019-03-29 北京猎户星空科技有限公司 Mechanical arm calibration method, device, system, medium, controller and mechanical arm
CN110238831A (en) * 2019-07-23 2019-09-17 青岛理工大学 Robot teaching system and method based on RGB-D image and teaching device
CN110276803A (en) * 2019-06-28 2019-09-24 首都师范大学 Pose of camera estimated form method, apparatus, electronic equipment and storage medium
CN110281231A (en) * 2019-03-01 2019-09-27 浙江大学 The mobile robot 3D vision grasping means of unmanned FDM increasing material manufacturing
CN110480658A (en) * 2019-08-15 2019-11-22 同济大学 A kind of six-joint robot control system merging vision self-calibration
CN110580725A (en) * 2019-09-12 2019-12-17 浙江大学滨海产业技术研究院 Box sorting method and system based on RGB-D camera
CN111452048A (en) * 2020-04-09 2020-07-28 亚新科国际铸造(山西)有限公司 Calibration method and device for relative spatial position relationship of multiple robots
CN111474932A (en) * 2020-04-23 2020-07-31 大连理工大学 Mobile robot mapping and navigation method integrating scene experience
CN111890355A (en) * 2020-06-29 2020-11-06 北京大学 Robot calibration method, device and system
CN113479442A (en) * 2021-07-16 2021-10-08 上海交通大学烟台信息技术研究院 Device and method for realizing intelligent labeling of unstructured objects on production line

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102911A1 (en) * 2002-11-21 2004-05-27 Samsung Electronics Co., Ltd. Hand/eye calibration method using projective invariant shape descriptor of 2-dimensional image
CN102566577A (en) * 2010-12-29 2012-07-11 沈阳新松机器人自动化股份有限公司 Method for simply and easily calibrating industrial robot
CN106384353A (en) * 2016-09-12 2017-02-08 佛山市南海区广工大数控装备协同创新研究院 Target positioning method based on RGBD
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration
CN107818554A (en) * 2016-09-12 2018-03-20 索尼公司 Message processing device and information processing method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102911A1 (en) * 2002-11-21 2004-05-27 Samsung Electronics Co., Ltd. Hand/eye calibration method using projective invariant shape descriptor of 2-dimensional image
CN102566577A (en) * 2010-12-29 2012-07-11 沈阳新松机器人自动化股份有限公司 Method for simply and easily calibrating industrial robot
CN106384353A (en) * 2016-09-12 2017-02-08 佛山市南海区广工大数控装备协同创新研究院 Target positioning method based on RGBD
CN107818554A (en) * 2016-09-12 2018-03-20 索尼公司 Message processing device and information processing method
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109531577B (en) * 2018-12-30 2022-04-19 北京猎户星空科技有限公司 Mechanical arm calibration method, device, system, medium, controller and mechanical arm
CN109531577A (en) * 2018-12-30 2019-03-29 北京猎户星空科技有限公司 Mechanical arm calibration method, device, system, medium, controller and mechanical arm
CN110281231A (en) * 2019-03-01 2019-09-27 浙江大学 The mobile robot 3D vision grasping means of unmanned FDM increasing material manufacturing
CN110276803B (en) * 2019-06-28 2021-07-20 首都师范大学 Formalization method and device for camera pose estimation, electronic equipment and storage medium
CN110276803A (en) * 2019-06-28 2019-09-24 首都师范大学 Pose of camera estimated form method, apparatus, electronic equipment and storage medium
CN110238831A (en) * 2019-07-23 2019-09-17 青岛理工大学 Robot teaching system and method based on RGB-D image and teaching device
CN110480658A (en) * 2019-08-15 2019-11-22 同济大学 A kind of six-joint robot control system merging vision self-calibration
CN110480658B (en) * 2019-08-15 2022-10-25 同济大学 Six-axis robot control system integrating vision self-calibration
CN110580725A (en) * 2019-09-12 2019-12-17 浙江大学滨海产业技术研究院 Box sorting method and system based on RGB-D camera
CN111452048A (en) * 2020-04-09 2020-07-28 亚新科国际铸造(山西)有限公司 Calibration method and device for relative spatial position relationship of multiple robots
CN111452048B (en) * 2020-04-09 2023-06-02 亚新科国际铸造(山西)有限公司 Calibration method and device for relative spatial position relation of multiple robots
CN111474932B (en) * 2020-04-23 2021-05-11 大连理工大学 Mobile robot mapping and navigation method integrating scene experience
CN111474932A (en) * 2020-04-23 2020-07-31 大连理工大学 Mobile robot mapping and navigation method integrating scene experience
CN111890355A (en) * 2020-06-29 2020-11-06 北京大学 Robot calibration method, device and system
CN111890355B (en) * 2020-06-29 2022-01-11 北京大学 Robot calibration method, device and system
CN113479442A (en) * 2021-07-16 2021-10-08 上海交通大学烟台信息技术研究院 Device and method for realizing intelligent labeling of unstructured objects on production line

Also Published As

Publication number Publication date
CN108994832B (en) 2021-03-02

Similar Documents

Publication Publication Date Title
CN108994832A (en) A kind of robot eye system and its self-calibrating method based on RGB-D camera
CN106826833B (en) Autonomous navigation robot system based on 3D (three-dimensional) stereoscopic perception technology
Ueda et al. A hand-pose estimation for vision-based human interfaces
CN104376594B (en) Three-dimensional face modeling method and device
CN104077804B (en) A kind of method based on multi-frame video picture construction three-dimensional face model
CN105353873A (en) Gesture manipulation method and system based on three-dimensional display
CN109272537A (en) A kind of panorama point cloud registration method based on structure light
CN104570731A (en) Uncalibrated human-computer interaction control system and method based on Kinect
JP6483832B2 (en) Method and system for scanning an object using an RGB-D sensor
Gratal et al. Visual servoing on unknown objects
CN110189257A (en) Method, apparatus, system and the storage medium that point cloud obtains
Kennedy et al. A novel approach to robotic cardiac surgery using haptics and vision
CN109676602A (en) Self-adapting calibration method, system, equipment and the storage medium of walking robot
CN109318227B (en) Dice-throwing method based on humanoid robot and humanoid robot
Yekutieli et al. Analyzing octopus movements using three-dimensional reconstruction
CN210361314U (en) Robot teaching device based on augmented reality technology
Wang et al. A virtual end-effector pointing system in point-and-direct robotics for inspection of surface flaws using a neural network based skeleton transform
Ni et al. 3D-point-cloud registration and real-world dynamic modelling-based virtual environment building method for teleoperation
Gao et al. Kinect-based motion recognition tracking robotic arm platform
CN108961393A (en) A kind of human body modeling method and device based on point cloud data stream
Valentini Natural interface in augmented reality interactive simulations: This paper demonstrates that the use of a depth sensing camera that helps generate a three-dimensional scene and track user's motion could enhance the realism of the interactions between virtual and physical objects
CN116844189A (en) Detection method and application of anchor frame and acupoint site of human body part
CN110722547B (en) Vision stabilization of mobile robot under model unknown dynamic scene
Broun et al. Bootstrapping a robot’s kinematic model
Salfelder et al. Markerless 3D spatio-temporal reconstruction of microscopic swimmers from video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 200240 building 6, 646 Jianchuan Road, Minhang District, Shanghai

Applicant after: SHANGHAI JAKA ROBOTICS Ltd.

Address before: 200120 floor 1, building 1, No. 251, Yaohua Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant before: SHANGHAI JAKA ROBOTICS Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: Building 6, 646 Jianchuan Road, Minhang District, Shanghai 201100

Patentee after: Jieka Robot Co.,Ltd.

Address before: 200240 building 6, 646 Jianchuan Road, Minhang District, Shanghai

Patentee before: SHANGHAI JAKA ROBOTICS Ltd.
