CN110328669A - Terminal trajectory acquisition and tracking method and device for a practical training robot - Google Patents
Terminal trajectory acquisition and tracking method and device for a practical training robot
- Publication number
- CN110328669A (application CN201910726487.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- track
- camera
- image
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The present invention provides a terminal trajectory acquisition and tracking method and device for a practical training robot, which can be widely applied in robot practical training and allows trainees to see the robot's motion process clearly and intuitively. The method comprises the following steps. Step 1: a camera is mounted on the end flange beside a track pointer, so that the tip of the track pointer is visible in the camera lens, and the offset of the tip position relative to the robot's end-flange coordinate system is calculated. Step 2: the camera acquires image data, the image data is processed to obtain the trajectory to be tracked, and the position of each trajectory pixel in the camera coordinate system is obtained. Step 3: from the tip offset obtained in step 1 and the trajectory obtained in step 2, the position of the trajectory in the end-flange coordinate system is obtained, and the robot tracks the trajectory with the pointer tip according to that position.
Description
Technical field
The present invention relates to the technical field of multi-axis robots, and in particular to a terminal trajectory acquisition and tracking method and device for a practical training robot.
Background technique
In many cases, to complete a given motion, a robot must move its end effector along a given trajectory, often with prescribed velocity and acceleration.
For example, when a user only provides the target pose of a gripper, the path points to the target, the duration, the motion velocity, and other trajectory parameters must still be determined, and the required trajectory must be described internally by the computer. Finally, from this internal description, the displacement, velocity, and acceleration of the robot's motion are computed in real time to generate the motion trajectory. Describing the robot's end trajectory allows the robot's motion process to be reflected clearly and intuitively.
Summary of the invention
In view of the above problems, the present invention provides a terminal trajectory acquisition and tracking method and device for a practical training robot, which can be widely applied in robot practical training and allows trainees to see the robot's motion process clearly and intuitively. The technical solution is a terminal trajectory acquisition and tracking method for a practical training robot, characterized by comprising the following steps:
Step 1: mount a camera on the end flange beside the track pointer, so that the tip of the track pointer is visible in the camera lens, and calculate the offset of the tip position relative to the robot's end-flange coordinate system;
Step 2: acquire image data with the camera, process the image data to obtain the trajectory to be tracked, and simultaneously obtain the position of each trajectory pixel in the camera coordinate system;
Step 3: from the tip offset obtained in step 1 and the trajectory obtained in step 2, obtain the position of the trajectory in the robot's end-flange coordinate system; the robot then tracks the trajectory with the pointer tip according to this position.
Further, in step 1, the transformation matrix T1 between the camera coordinate system and the robot's end-flange coordinate system is obtained by a hand-eye calibration method; the transformation matrix T2 between the tip position and the end-flange coordinate system is obtained from the robot's mechanical design dimensions; the transformation matrix T3 between the tip position and the camera coordinate system is obtained by camera calibration; and the transformation matrix T between the tip and the end-flange coordinate system is obtained by the formula T = T1*T2*T3, yielding the offset of the tip position relative to the end-flange coordinate system.
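The chaining of calibration matrices described above can be sketched with 4x4 homogeneous transforms. The following is a minimal illustration only, not the patent's implementation: the three transforms are hypothetical pure translations with made-up values, standing in for the results of hand-eye calibration (T1), the mechanical design (T2), and camera calibration (T3).

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous transform that is a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical calibration results (pure translations for illustration):
T1 = translation(0.00, 0.00, 0.05)   # camera frame relative to end flange
T2 = translation(0.02, 0.00, 0.00)   # from the mechanical design dimensions
T3 = translation(0.00, 0.01, 0.00)   # from camera calibration

# The chain stated in the text: T = T1 * T2 * T3
T = T1 @ T2 @ T3

# The translation part of T is the tip offset in the end-flange frame;
# with pure translations the offsets simply add.
offset = T[:3, 3]
print(offset)  # [0.02 0.01 0.05]
```

With real calibration data each matrix would also carry a rotation block, and the same matrix product applies unchanged.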
Further, in step 2, the image data acquired by the camera is first pre-processed by image denoising, image filtering, and image dilation and erosion, and then processed by image edge detection to obtain the trajectory to be tracked, while the position of each trajectory pixel in the camera coordinate system is obtained.
Further, image denoising and image filtering use a median filtering algorithm: a nonlinear smoothing filter that replaces the value at each point of a digital image or number sequence with the median of the values in that point's neighborhood.
Further, image dilation and erosion comprise image dilation and image erosion. Image dilation expands the highlighted parts of an image, enlarging highlighted neighborhoods so that the result has larger highlight regions than the original image; image erosion erodes the highlighted parts, shrinking highlighted neighborhoods so that the result has smaller highlight regions than the original image.
Further, in step 3, the position of the trajectory in the robot coordinate system is processed by robot inverse kinematics and converted into motion information for each robot joint (joint angles and angular accelerations); according to this motion information, the robot controller drives the pointer tip on the robot to the corresponding positions, completing the trajectory tracking.
A terminal trajectory acquisition and tracking device for a practical training robot comprises a robot. A camera and a track pointer are mounted on the robot's end flange, the camera being mounted beside the track pointer with its lens facing the tip of the track pointer. The robot is configured to be driven by a robot controller. The device further comprises a processor, a memory, and a program; the program is stored in the memory, and the processor calls the program stored in the memory to execute the terminal trajectory acquisition and tracking method described above.
Further, the track pointer is fixed to the end flange by a pointer mounting seat, and pointer mounting holes are provided in the pointer mounting seat.
The terminal trajectory acquisition and tracking method and device for a practical training robot of the present invention use a tracking device comprising a pointer and a camera: the camera acquires the trajectory to be tracked, and the pointer tip makes it easy to follow the intended trajectory. The transformation matrix T1 between the camera coordinate system and the robot's end-flange coordinate system is obtained by hand-eye calibration; the transformation matrix T2 between the tip position and the end-flange coordinate system is obtained from the robot's mechanical design dimensions; the transformation matrix T3 between the tip position and the camera coordinate system is obtained by camera calibration; and the transformation matrix T between the tip and the end-flange coordinate system is obtained by the formula T = T1*T2*T3, yielding the offset of the tip position relative to the end-flange coordinate system. The camera acquires image data, which is processed to obtain the trajectory to be tracked together with the position of each trajectory pixel in the camera coordinate system; the robot then tracks the trajectory with the pointer tip according to the trajectory's position in the end-flange coordinate system. The present invention uses a camera to extract the trajectory and then drives the robot to the designated positions according to motion information such as joint angles, so that the motion of the robot's end trajectory is reflected clearly and intuitively. It can be widely applied in robot practical training, allowing trainees to see the robot's motion process clearly and intuitively.
Brief description of the drawings
Fig. 1 is a flow chart of the terminal trajectory acquisition and tracking method for a practical training robot of the present invention;
Fig. 2 is a schematic diagram of the terminal trajectory acquisition and tracking device for a practical training robot of the present invention;
Fig. 3 is a sample of a trajectory tracked in the present invention.
Detailed description of embodiments
Referring to Fig. 1, the terminal trajectory acquisition and tracking method for a practical training robot of the present invention comprises the following steps.
Step 1: mount a camera on the end flange beside the track pointer, so that the tip of the track pointer is visible in the camera lens, and calculate the offset of the tip position relative to the robot's end-flange coordinate system. Specifically, the transformation matrix T1 between the camera coordinate system and the end-flange coordinate system is obtained by hand-eye calibration; the transformation matrix T2 between the tip position and the end-flange coordinate system is obtained from the robot's mechanical design dimensions; the transformation matrix T3 between the tip position and the camera coordinate system is obtained by camera calibration; and the transformation matrix T between the tip and the end-flange coordinate system is obtained by the formula T = T1*T2*T3, yielding the offset of the tip position relative to the end-flange coordinate system.
Step 2: acquire image data with the camera and process it to obtain the trajectory to be tracked, while obtaining the position of each trajectory pixel in the camera coordinate system. Specifically, the acquired image data is first pre-processed by image denoising, image filtering, and image dilation and erosion, and then processed by image edge detection to obtain the trajectory to be tracked together with the position of each trajectory pixel in the camera coordinate system.
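The text does not spell out how a trajectory pixel maps to a position in the camera coordinate system. One common approach, sketched below under assumed pinhole-camera intrinsics and an assumed fixed working distance to the drawing plane (all values hypothetical), back-projects each pixel (u, v) at depth Z:

```python
import numpy as np

# Hypothetical pinhole intrinsics: focal lengths and principal point, in pixels.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
Z = 0.20  # assumed fixed distance (m) from camera to the trajectory plane

def pixel_to_camera(u, v, z=Z):
    """Back-project pixel (u, v) at depth z to a 3D point in the camera frame."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: trajectory pixels found by edge detection (made-up coordinates).
track_pixels = [(320, 240), (400, 240), (400, 320)]
track_cam = [pixel_to_camera(u, v) for (u, v) in track_pixels]
print(track_cam[0])  # the principal point maps to [0. 0. 0.2]
```

The intrinsics would in practice come from the camera calibration mentioned in step 1, and the depth from the known geometry of the camera mount.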
Specifically, image denoising and image filtering use a median filtering algorithm: a nonlinear smoothing filter that replaces the value at each point of a digital image or number sequence with the median of the values in that point's neighborhood. If f(x, y) denotes the gray value of pixel (x, y) of the digital image, the median filter with filter window A is defined as:
g(x, y) = median{ f(x - k, y - l) : (k, l) in A }
For n pixel values x1, x2, ..., xn: when n is odd, the median is the value in the middle position after sorting by magnitude; when n is even, the median is the average of the two middle values.
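A minimal median filter matching the definition above might look as follows (pure NumPy sketch; odd-sized square window, and border pixels are simply copied to keep the example short):

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each interior pixel with the median of its k x k neighborhood.

    k is assumed odd, so the median is the middle of k*k sorted values.
    Border pixels are copied unchanged for brevity.
    """
    r = k // 2
    out = img.copy()
    h, w = img.shape
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = img[y - r:y + r + 1, x - r:x + r + 1]
            out[y, x] = np.median(window)
    return out

# A single bright noise pixel in a flat image is removed by the filter.
img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 255
print(median_filter(img)[2, 2])  # 0
```

This removal of isolated outliers while preserving edges is why median filtering is preferred here over linear smoothing for the denoising step.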
Specifically, image dilation and erosion comprise image dilation and image erosion. Image dilation expands the highlighted parts of an image, enlarging highlighted neighborhoods so that the result has larger highlight regions than the original image; image erosion erodes the highlighted parts, shrinking highlighted neighborhoods so that the result has smaller highlight regions than the original image.
Image dilation seeks a local maximum. Step 1: define a convolution kernel B, which can be of any shape and size and has a separately defined reference point, the anchor point; here the kernel is a square or disk with a reference point, and the kernel is also called a template or mask. Step 2: convolve kernel B with image A, computing the maximum pixel value in the region covered by B. Step 3: assign this maximum value to the pixel specified by the reference point.
Image erosion seeks a local minimum. Step 1: define a convolution kernel B, which can be of any shape and size and has a separately defined reference point, the anchor point; here the kernel is a square or disk with a reference point, and the kernel is also called a template or mask. Step 2: convolve kernel B with image A, computing the minimum pixel value in the region covered by B. Step 3: assign this minimum value to the pixel specified by the reference point.
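The local-maximum and local-minimum steps above can be sketched directly in a few lines (pure NumPy; square kernel with the anchor at its center, and border pixels skipped for brevity; a production system would instead use a library morphology routine):

```python
import numpy as np

def morph(img, k=3, op=np.max):
    """Slide a k x k window (anchor at center) over the image and write
    op() of the covered pixels to the anchor: np.max gives dilation,
    np.min gives erosion. Border pixels are copied unchanged."""
    r = k // 2
    out = img.copy()
    h, w = img.shape
    for y in range(r, h - r):
        for x in range(r, w - r):
            out[y, x] = op(img[y - r:y + r + 1, x - r:x + r + 1])
    return out

img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 255                       # one bright pixel
dilated = morph(img, op=np.max)       # bright region grows to a 3x3 block
eroded = morph(dilated, op=np.min)    # erosion shrinks it back
print(int(dilated.sum() // 255))      # 9
```

Dilation followed by erosion (a morphological closing) is a typical way to bridge small gaps in the traced line before edge detection.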
Step 3: from the tip offset obtained in step 1 and the trajectory obtained in step 2, obtain the position of the trajectory in the robot's end-flange coordinate system. The position of the trajectory in the robot coordinate system is processed by robot inverse kinematics and converted into motion information for each robot joint (joint angles and angular accelerations); according to this motion information, the robot controller drives the pointer tip on the robot to the corresponding positions, completing the trajectory tracking.
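The inverse-kinematics step depends entirely on the particular robot. As a stand-in illustration only, a planar two-link arm has the closed-form solution below; the link lengths are made-up values, and a real six-axis practical training robot would use its controller's own IK routine.

```python
import math

L1, L2 = 0.30, 0.25  # hypothetical link lengths (m)

def ik_2link(x, y):
    """Closed-form IK for a planar 2-link arm: returns (shoulder, elbow)
    joint angles in radians for the elbow-down configuration."""
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

# Fully stretched pose: reaching (L1 + L2, 0) gives both angles = 0.
t1, t2 = ik_2link(L1 + L2, 0.0)
print(round(t1, 6), round(t2, 6))  # 0.0 0.0
```

Feeding each trajectory point through such an IK routine yields the per-joint angle sequence that the controller then interpolates into angular velocities and accelerations.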
Referring to Figs. 2 and 3, the terminal trajectory acquisition and tracking device for a practical training robot comprises a robot. A camera 1 and a track pointer 2 are mounted on the end flange 3 of the robot, the camera 1 being mounted beside the track pointer 2 with its lens facing the tip of the track pointer 2. Specifically, in this embodiment, the track pointer 2 is fixed to the end flange 3 by a pointer mounting seat 4, and multiple pointer mounting holes 5 are provided in the pointer mounting seat 4. The robot is configured to be driven by a robot controller. The device further comprises a processor, a memory, and a program; the program is stored in the memory, and the processor calls the program stored in the memory to execute the terminal trajectory acquisition and tracking method described above.
In the terminal trajectory acquisition and tracking device for a practical training robot of the present invention, providing multiple pointer mounting holes allows the positional relationship between the camera and the tip to be arranged rationally, ensuring that the camera has a sufficiently good field of view and that, within that field of view, the tip can follow the trajectory without interference or collision.
In the implementation of the above device, the memory and the processor are electrically connected, directly or indirectly, to enable the transmission or interaction of data. For example, these elements may be electrically connected to each other through one or more communication buses or signal lines, such as a bus. The memory stores computer-executable instructions implementing the method, including at least one software functional module that may be stored in the memory in the form of software or firmware; by running the software programs and modules stored in the memory, the processor performs the various functional applications and data processing.
The memory may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like. The memory stores the program, and the processor executes the program after receiving an execution instruction.
The processor may be an integrated circuit chip with signal processing capability. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like, capable of implementing or executing the methods, steps, and logic block diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The terminal trajectory acquisition and tracking method for a practical training robot of the present invention obtains the transformation matrix T1 between the camera coordinate system and the robot's end-flange coordinate system by hand-eye calibration, the transformation matrix T2 between the tip position and the end-flange coordinate system from the robot's mechanical design dimensions, and the transformation matrix T3 between the tip position and the camera coordinate system by camera calibration; the transformation matrix T between the tip and the end-flange coordinate system is obtained by the formula T = T1*T2*T3, yielding the offset of the tip position relative to the end-flange coordinate system. The camera acquires image data, which is processed to obtain the trajectory to be tracked together with the position of each trajectory pixel in the camera coordinate system; the robot then tracks the trajectory with the pointer tip according to the trajectory's position in the end-flange coordinate system. The present invention uses a camera to extract the trajectory and then drives the robot to the designated positions according to motion information such as joint angles, so that the motion of the robot's end trajectory is reflected clearly and intuitively. It can be widely applied in robot practical training, allowing trainees to see the robot's motion process clearly and intuitively.
It is obvious to those skilled in the art that the invention is not limited to the details of the above exemplary embodiments, and that the present invention can be realized in other specific forms without departing from its spirit or essential attributes. Therefore, from whichever point of view, the embodiments are to be considered illustrative and not restrictive, and the scope of the present invention is defined by the appended claims rather than by the above description; all changes falling within the meaning and scope of equivalents of the claims are therefore intended to be embraced by the present invention.
In addition, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution; this manner of description is merely for the sake of clarity. Those skilled in the art should consider the specification as a whole, and the technical solutions in the various embodiments may also be suitably combined to form other embodiments understandable to those skilled in the art.
Claims (8)
1. A terminal trajectory acquisition and tracking method for a practical training robot, characterized by comprising the following steps:
Step 1: mounting a camera on an end flange beside a track pointer, so that the tip of the track pointer is visible in the camera lens, and calculating the offset of the tip position relative to the robot's end-flange coordinate system;
Step 2: acquiring image data with the camera, processing the image data to obtain the trajectory to be tracked, and simultaneously obtaining the position of each trajectory pixel in the camera coordinate system;
Step 3: from the tip offset obtained in step 1 and the trajectory obtained in step 2, obtaining the position of the trajectory in the robot's end-flange coordinate system, the robot tracking the trajectory with the pointer tip according to this position.
2. The terminal trajectory acquisition and tracking method for a practical training robot according to claim 1, characterized in that: in step 1, the transformation matrix T1 between the camera coordinate system and the robot's end-flange coordinate system is obtained by a hand-eye calibration method; the transformation matrix T2 between the tip position and the end-flange coordinate system is obtained from the robot's mechanical design dimensions; the transformation matrix T3 between the tip position and the camera coordinate system is obtained by camera calibration; and the transformation matrix T between the tip and the end-flange coordinate system is obtained by the formula T = T1*T2*T3, yielding the offset of the tip position relative to the end-flange coordinate system.
3. The terminal trajectory acquisition and tracking method for a practical training robot according to claim 2, characterized in that: in step 2, the image data acquired by the camera is first pre-processed by image denoising, image filtering, and image dilation and erosion, and then processed by image edge detection to obtain the trajectory to be tracked, while the position of each trajectory pixel in the camera coordinate system is obtained.
4. The terminal trajectory acquisition and tracking method for a practical training robot according to claim 3, characterized in that: image denoising and image filtering use a median filtering algorithm, a nonlinear smoothing filter that replaces the value at each point of a digital image or number sequence with the median of the values in that point's neighborhood.
5. The terminal trajectory acquisition and tracking method for a practical training robot according to claim 3, characterized in that: image dilation and erosion comprise image dilation and image erosion; image dilation expands the highlighted parts of an image, enlarging highlighted neighborhoods so that the result has larger highlight regions than the original image; image erosion erodes the highlighted parts, shrinking highlighted neighborhoods so that the result has smaller highlight regions than the original image.
6. The terminal trajectory acquisition and tracking method for a practical training robot according to claim 3, characterized in that: in step 3, the position of the trajectory in the robot coordinate system is processed by robot inverse kinematics and converted into motion information for each robot joint (joint angles and angular accelerations); according to this motion information, the robot controller drives the pointer tip on the robot to the corresponding positions, completing the trajectory tracking.
7. A terminal trajectory acquisition and tracking device for a practical training robot, comprising a robot, characterized in that: a camera and a track pointer are mounted on the end flange of the robot, the camera being mounted beside the track pointer with its lens facing the tip of the track pointer; the robot is configured to be driven by a robot controller; the device further comprises a processor, a memory, and a program; the program is stored in the memory, and the processor calls the program stored in the memory to execute the terminal trajectory acquisition and tracking method according to claim 1.
8. The terminal trajectory acquisition and tracking device for a practical training robot according to claim 7, characterized in that: the track pointer is fixed to the end flange by a pointer mounting seat, and pointer mounting holes are provided in the pointer mounting seat.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910726487.4A CN110328669B (en) | 2019-08-07 | 2019-08-07 | Terminal track acquisition and tracking method and device for practical training robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110328669A true CN110328669A (en) | 2019-10-15 |
CN110328669B CN110328669B (en) | 2021-03-09 |
Family
ID=68148958
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910726487.4A Active CN110328669B (en) | 2019-08-07 | 2019-08-07 | Terminal track acquisition and tracking method and device for practical training robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110328669B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011011315A (en) * | 2009-07-06 | 2011-01-20 | Canon Inc | Component assembling method |
CN102554938A (en) * | 2010-12-31 | 2012-07-11 | 中国科学院计算技术研究所 | Tracking method for mechanical arm tail end trajectory of robot |
CN105590541A (en) * | 2016-03-18 | 2016-05-18 | 长沙工控帮教育科技有限公司 | Three-axle linkage training platform |
CN105710881A (en) * | 2016-03-16 | 2016-06-29 | 杭州娃哈哈精密机械有限公司 | Continuous trajectory planning transition method for robot tail end |
CN107081775A (en) * | 2017-05-31 | 2017-08-22 | 西京学院 | A kind of robot and the integrated Practical training equipment of NI Vision Builder for Automated Inspection |
CN107618030A (en) * | 2016-07-16 | 2018-01-23 | 深圳市得意自动化科技有限公司 | The Robotic Dynamic tracking grasping means of view-based access control model and system |
CN208744876U (en) * | 2018-05-28 | 2019-04-16 | 珠海格力智能装备有限公司 | Robot trajectory's mechanism for testing |
US20190143517A1 (en) * | 2017-11-14 | 2019-05-16 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems and methods for collision-free trajectory planning in human-robot interaction through hand movement prediction from vision |
-
2019
- 2019-08-07 CN CN201910726487.4A patent/CN110328669B/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113223048A (en) * | 2021-04-20 | 2021-08-06 | 深圳瀚维智能医疗科技有限公司 | Hand-eye calibration precision determination method and device, terminal equipment and storage medium |
CN113223048B (en) * | 2021-04-20 | 2024-02-27 | 深圳瀚维智能医疗科技有限公司 | Method and device for determining hand-eye calibration precision, terminal equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN110328669B (en) | 2021-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7071054B2 (en) | Information processing equipment, information processing methods and programs | |
JP5647155B2 (en) | Body feature detection and human pose estimation using inner distance shape relation | |
EP3753685A1 (en) | Control system and control method | |
CN106256512B (en) | Robot device including machine vision | |
US20030219147A1 (en) | Object tracking apparatus and method | |
JP6399832B2 (en) | Pattern matching method and pattern matching apparatus | |
CN112446917B (en) | Gesture determination method and device | |
JP6973444B2 (en) | Control system, information processing device and control method | |
US20150104068A1 (en) | System and method for locating fiducials with known shape | |
WO2020141468A1 (en) | Method and system for detecting position of a target area in a target subject | |
US10623629B2 (en) | Imaging apparatus and imaging condition setting method and program | |
US20080019568A1 (en) | Object tracking apparatus and method | |
CN110328669A (en) | The end orbit acquisition of robot for real training and tracking and device | |
CN114505864A (en) | Hand-eye calibration method, device, equipment and storage medium | |
JP2022152845A (en) | Calibration device for controlling robot | |
KR20130075712A (en) | A laser-vision sensor and calibration method thereof | |
JP7439410B2 (en) | Image processing device, image processing method and program | |
KR102333768B1 (en) | Hand recognition augmented reality-intraction apparatus and method | |
CN112347837A (en) | Image processing system | |
US9508192B2 (en) | Image processing device, image processing method, and image processing program | |
CN115272410A (en) | Dynamic target tracking method, device, equipment and medium without calibration vision | |
Baek et al. | Full state visual forceps tracking under a microscope using projective contour models | |
JP7404017B2 (en) | Image processing method, image processing device, production system, article manufacturing method, program, and recording medium | |
CN110900606B (en) | Hand-eye linkage system based on small mechanical arm and control method thereof | |
CN109389645A (en) | Camera method for self-calibrating, system, camera, robot and cloud server |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||