CN101568307B - Method and machine for manipulating toxic substances - Google Patents

Method and machine for manipulating toxic substances Download PDF

Info

Publication number
CN101568307B
CN101568307B CN2007800434719A CN200780043471A CN101568307B CN 101568307 B CN101568307 B CN 101568307B CN 2007800434719 A CN2007800434719 A CN 2007800434719A CN 200780043471 A CN200780043471 A CN 200780043471A CN 101568307 B CN101568307 B CN 101568307B
Authority
CN
China
Prior art keywords
image
pin
pixel
coordinate
container
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2007800434719A
Other languages
Chinese (zh)
Other versions
CN101568307A (en
Inventor
达尼埃莱·巴尔达萨里
阿尔费里诺·加巴里尼
保罗·吉里博纳
维尔纳·赖纳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omani Co.,Ltd.
Original Assignee
Health Robotics SRL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from ITBO20060840 external-priority patent/ITBO20060840A1/en
Application filed by Health Robotics SRL filed Critical Health Robotics SRL
Priority claimed from PCT/IB2007/003577 external-priority patent/WO2008062285A2/en
Publication of CN101568307A publication Critical patent/CN101568307A/en
Application granted granted Critical
Publication of CN101568307B publication Critical patent/CN101568307B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M5/00Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M5/178Syringes
    • A61M5/20Automatic syringes, e.g. with automatically actuated piston rod, with automatic needle injection, filling automatically
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M5/00Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M5/178Syringes
    • A61M5/31Details
    • A61M5/32Needles; Details of needles pertaining to their connection with syringe or hub; Accessories for bringing the needle into, or holding the needle on, the body; Devices for protection of needles
    • A61M5/3287Accessories for bringing the needle into the body; Automatic needle insertion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Infusion, Injection, And Reservoir Apparatuses (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

In a machine for manipulating toxic substances, an anthropomorphic robot (13) transfers a container (3), which has a mouth (4) closed by a cap (5) made of perforable material, from a magazine (2) for containers to a dosage station (6), set in which is a syringe (9) provided with a needle (11), axially aligns the mouth (4) with the needle (11), correcting the position of the mouth (4) according to the position of the tip of the needle (11) obtained from processing a number of images of the needle (11) acquired from at least two distinct observation points (OT, OS), and finally approaches the container (3) to the syringe (9) in such a way that the needle (11) perforates the cap (5) and penetrates into the container (3) to be able to inject or extract a substance into or out of the container (3).

Description

The method of manipulating toxic substances and machine
Technical field
The present invention relates to be used for the method and the machine of manipulating toxic substances.
Particularly, the present invention has found that the machine of automatic manipulating toxic substances is used for preparing automatically the favourable but not exclusive application of cytostatic medicament, and following description will clearly be introduced this application, but this application does not influence the versatility of this machine.
Background technology
There is the machine that is used for automatic manipulating toxic substances on the market, it comprises: storage device (magazine), be used for container, but each container is provided with the mouth of the respective cap sealing of being made by perforated material and comprises the necessary material of preparation medicine such as capsule, arrow-necked bottle etc.; Wherein there is at least one seat in dispensing table, and described seat is designed to keep the commercial style syringe of correspondence; And anthropomorphic robot, be provided with clamping head, this clamping head is used for picking up container from storage device, this container is being sent to dispensing table corresponding to the syringe place, and then make the pin of container by this way near syringe, that is, pin is by penetrating in the container medicated cap, thereby material can be injected into container or extract material from container.In addition, be provided with the optical pickocff of light cell type in these machines, with axially aligning of the pin that helps mouth and syringe.
The major defect of this machine is to carry out having sizable difficulty on time at the container that will have undersized mouth, and this is because the precision level of pick off when detecting the terminal position of pin is very low.In fact, frequently, because production defective or because storage and/or the damage that causes of transportation, the end portion of pin may be crooked, and undersized mouth often causes and can not suitably aim at, and therefore makes pin can not suitably insert container.
Summary of the invention
Main purpose of the present invention provides a kind of method and the machine of automatic manipulating toxic substances to prepare cytostatic medicament that be used for, and provide a kind of computer program of implementing described method, described method and described machine have been eliminated above-mentioned shortcoming, meanwhile, also convenient enforcement and cheap.
According to the present invention, method and machine that is used for manipulating toxic substances and the computer program of implementing described method are provided according to the qualification of claims.
Description of drawings
Describe the present invention with reference to the accompanying drawings, accompanying drawing illustrates the limiting examples of the embodiment of the invention, in the accompanying drawing:
Fig. 1 is the sketch map of the machine that is used for manipulating toxic substances constructed in accordance;
Fig. 2 shows the part of the machine of Fig. 1;
Fig. 3 shows at the Descartes's reference system and the syringe of expressing in another non-Cartesian reference system;
Fig. 4 is according to the flow chart of of the present invention and the method that is used for manipulating toxic substances implemented in Fig. 1 machine;
Fig. 5 to 7 is sketch maps of some steps of the flow chart of Fig. 4;
Fig. 8 is the flow chart according to the method that is used for manipulating toxic substances of further embodiment of this invention; And
Fig. 9 to 11 is sketch maps of some steps of the flow chart of Fig. 8.
The specific embodiment
In Fig. 1, the machine that is used for manipulating toxic substances is represented with 1 on the whole, this machine comprises: storage device 2 (being schematically shown by a parallelepiped), be used for container 3 such as capsule, arrow-necked bottle etc., these containers are used for the necessary poisonous and nontoxic material of medication preparation, in the described container each is provided with mouth 4, but respective cap 5 sealings that this mouth has axis 4a and made by perforated material; Dispensing table 6, wherein be provided with drug dispensing component 7, this drug dispensing component has three seats 8 that are designed to keep the syringe 9 of same quantity, these syringes are various forms of commercial type syringes, promptly, the syringe of such type, it comprises columniform body 10, is assemblied in the pin 11 at body 10 1 end places, plunger 12 in axial sliding in body 10; And anthropomorphic robot 13, it is provided with clamping head 14, this clamping head comprises holder 15 that is designed to holding vessels 3 and the articulated jib with six-freedom degree (articulated arm) 16, this articulated jib is designed to supporting member 14 so that container 3 can be sent to dispensing table 6 and then make the pin 11 of container 3 self near described syringe 9 by this way at the point corresponding to syringe 9 from storage device 2, that is, thus pin 11 is by penetrating in the container 3 medicated cap 5.
In addition, machine 1 is provided with control unit 17, and this control unit is used to the startup controlling robot 13, drug dispensing component 7 and be used for the member (not shown) of mobile storage device 2.Drug dispensing component 7 and robot 13 are contained in the aseptic cabin (not shown) that communicates with storage device 2, pollute medicine to prevent antibacterial or microorganism during the preparation of medicine.
Drug dispensing component 7 comprises the platform 18 that rotates around horizontal axis 18a, and three seats 8 that are used for syringe are installed on described platform 18.Each seat 8 includes: clamping element 19, and it is vertical substantially that this clamping element makes the longitudinal axis 9a of syringe 9 simultaneously with respect to platform 18 fixing bodies 10 with maintenance syringe 9; And actuation element 20, thereby be used for withdrawing plunger 12 with a certain amount of material suction syringe 9 or be used for plunger depressed 12 so that a certain amount of material is expelled in the container 3.Actuation element 20 is connected to each other rigidly by bar 21, and described bar is designed to vertically move along the linear guide 22 that forms on platform 18.The rotation of platform 18 can make that (in this position, the end of pin 11 upwards) (in this position, pin 11 downwards terminal changes direction between as shown in fig. 1) to syringe 9 with material being expelled to position in the container in the position of suction material.
This machine also comprises: the black and white that focuses simulated television video camera 23, and it is provided with the synthetic output device (not shown) of video, and is installed on 14 in mode regularly; And computer 24 (for example PC framework type computer), thereby its be connected to hospital computer system 25 be used for television camera 23 exchange about the data of preparation medicine obtain detected image and with control unit 17 exchanges about the data of preparation medicine above-mentioned member with control machine 1.
With reference to figure 2,14 holder 15 comprises two jaws 26, and these two jaws move along slip axis 26a, and by mutually near and the motion that retreats distinguish clamping and release container 3.Television camera 23 is installed on 14, and the optical axis 23a of television camera self is perpendicular to the residing plane of slip axis 26a.
Computer 24 comprises: image pick-up card 27, and its " frame grabber " type that is known, and have the synthetic input equipment (not shown) of video of the synthetic output device of video that is connected to television camera 23; Interface unit 28, touch screen for example is to realize the operation mutual with machine 1; Processing unit 29, it is designed to handle about the data of medicine and the image of acquisition, so that determine the order to control unit 17 to be sent; And communication unit 30, it is connected to control unit 17 and is connected to hospital computer system 25.
Being loaded in the processing unit 29 is control sequence, and this control sequence is designed to when implementing according to the method that is used for manipulating toxic substances of the present invention when moving on one's body at processing unit 29.
In order to simplify description, hereinafter will be with reference to the material of aequum is injected into processing example the container 3 from syringe 9, in wherein said syringe 9 has been located in the corresponding seat 8 and described material has been packed into the body 10.
In described example, this method is to arm 16 control of robot 13, so that pick up container 3 and be sent to the dispensing table 6 from storage device 2, and basically meanwhile, platform 18 rotates so that syringe 9 enters injection position, that is, pin 11 is faced down.At this moment, arm 16 is positioned at container 3 below of pin 11 and makes mouth 4 towards last.Arm makes the mouth 4 of container 3 axially aim at pin 11, that is, its axis 4a that makes mouth 4 basically with the dead in line of pin 11.Subsequently, arm vertically raises container 3 towards pin 11 by this way, that is, pin 11 is after passing medicated cap 5, and its sizable part is thrust in the container 3.At last, actuation element 20 is moved down, so that plunger 12 is pressed in the body 10 of syringe 9 and makes plunger just be enough to the material of requirement is expelled in the container 3.
According to the present invention, the mouth 4 of container 3 is designed to pin 11 axially aligned steps: two images that obtain pin 11 from two different separately points of observation; Handle described image with the position of the end of definite pin 11 and the gradient (being the gradient of the axis of pin 11) of pin 11 with respect to the axis 9a of syringe 9; And the people's 13 that starts the machine arm 16, so that come the position of Calibration Head 14 according to the gradient of the position of the end of pin 11 and pin 11, and and then the position of proofreading and correct container 3.Particularly, two treatment of picture are substantially: determine to calculate as the terminal position of the pin 11 of the function of described coordinate and the gradient of pin 11 along the space coordinates of two points of pin 11 and on the basis of known optical triangulation method principle at each image.
With reference to figure 3, identify the point of a section of having expressed pin 11 with P, this is expressed with respect to the cartesian coordinate system with three axle X, Y, Z, and its axis Z is perpendicular to the drawing plane, and the axis 9a of syringe 9 is parallel to a Z.In Fig. 3, represent two different finding a view (framing) direction perpendicular to axle Z with FT and FS.Being limited on view direction FT and the FS is two corresponding point of observation OT and OS, and these two points of observation are arranged on the substantially the same distance of range points P.In order to obtain image from each point of observation OT and OS, the arm 16 of robot 13 moves and makes 14 to rotate around axle Z, make optical axis 23a be parallel to view direction FT among the point of observation OT simultaneously thereby at first television camera 23 is positioned at, television camera is positioned at makes optical axis be parallel to view direction FS among the point of observation OS simultaneously subsequently.
It should be noted that if view direction FT is vertical mutually with FS then optical triangulation method will produce optimum result.Yet,, form acute angle between view direction FT and the FS for the mode with the external overall dimensions that reduces aseptic cabin limits the moving range of the arm 16 of robot 13.Particularly, refer again to Fig. 3, view direction FT and FS and an axle Y have formed two corresponding angle [alpha] and β, and view direction FT and FS be perpendicular to two planes of finding a view accordingly, and each plane limits by axle Z and perpendicular to corresponding axis t, the s of axle Z respectively.The value of angle [alpha] and β be make them and (alpha+beta) less than 90 °, and more specifically smaller or equal to 40 °.Point P with respect to the coordinate of cartesian axis X and Y by X PAnd Y PExpression, and some P with respect to the coordinate of axle t and s by t pAnd s pExpression.Under any circumstance, the coordinate with respect to axle t and s can come with reference to Descartes's referential by following triangular transformation:
Y = ( t cos ( α ) - s cos ( β ) ) · 1 tan ( α ) + tan ( β ) ; - - - ( 1 )
X = Y · tan ( β ) + s cos ( β ) . - - - ( 2 )
In order to determine the gradient of pin 11, must determine at least two points of pin 11 processes, these two points will be represented by P1 and P2 hereinafter.In case known cartesian coordinate X2, Y2, the Z2 of cartesian coordinate X1, Y1, Z1 and the some P2 of some P1, just can draw gradient by following triangle relation formula:
RX = arctan ( Y 1 - Y 2 Z 1 - Z 2 ) , - - - ( 3 )
RY = arctan ( X 1 - X 2 Z 1 - Z 2 ) , - - - ( 4 )
Wherein RX and RY are the angles that forms with axle X and Y respectively in the projection of the part between a P1 and the P2 on plane X-Z and Y-Z of pin 11.
Fig. 4 shows to have described and handles flow chart with the gradient of the terminal position of determining pin 11 and pin 11 to obtain two images from corresponding point of observation OT and OS.Fig. 5 subsequently, 6 and 7 is sketch maps of some steps of described processing.Flame Image Process is carried out in the part of described control sequence, and described control sequence is with known programming language
Figure G2007800434719D00073
Write and
Figure G2007800434719D00074
Figure G2007800434719D00075
Be assembled into data chained library (DLL) in the environment.
With reference to figure 4, the image of pin 11 at first obtains (module 100) by image pick-up card 27 with data mode, after this limit region-of-interest 31 (Fig. 5 of image,), that is, the rectangle part of image will be searched pin 11 (module 101) in this region-of-interest 31, and then the picture contrast threshold value is adjusted to minima, surpass this threshold value and just can detect pin (module 102).Region-of-interest 31 is to operate by the screening of the image of pin 11 being carried out known type to limit.
From the bottom (z=zi, Fig. 5) beginning delegation connects delegation ground and checks region-of-interest 31 (module 103), and searches the edge in described each enterprising line scanning of row so that use is called as the function of " IMAQEdge Tool ", and above-mentioned function setup is useful on
Figure G2007800434719D00076
The storehouse that is called as IMAQ Vision of language (module 104).Under the situation of finding edge (from module 105 output YES), then verify to assess terminal compatible that described edge whether can be with pin 11, promptly, whether described edge falls in compatibility area (compatibility area) 32 (Fig. 5), this compatibility area is defined as the given rectangle part in the region-of-interest 31, and the end of expectation pin 11 is positioned (module 106) in this given rectangle part.If do not find the edge, then check next line (z=z+dz, Fig. 5) (module 107).Do not find under the situation of any edge (from module 108 output YES) at the row (zmax) that arrives the number maximum, then produce the error message (module 109) of " pin can not be discerned " type.
If the edge of being found is compatible (from module 106 output YES), then from image, extract the coordinate at described edge, described edge is corresponding to first P1 that searches and the end (Fig. 5) (module 110) that limits pin 11; In addition, contrast threshold (module 111) is increased specified rate so that make the quality that the search of pin 11 is adapted to image, and since the first capable inspection (module 103) that repeats region-of-interest.But in compatibility area, do not find under the situation of any edge (from module 112 output YES) error message (module 109) that then produces " pin is in non-desired locations " type at the maximum that has reached contrast threshold.
After the coordinate of the some P1 that has determined to represent pin 11 ends, determine second P2 of pin 11 processes.
Particularly, the row that forwards capable scheduled volume (OFFSET) row of point of distance P1 to is checked (Fig. 6) (module 113), contrast threshold is adjusted to maximum (module 114), and in described row, scans so that search edge (module 115) with above-mentioned " IMAQ Edge Tool " function.
As long as found edge (from module 116 output YES), whether then verify to be evaluated at the first edge EF and the distance between the final edge EL found in the described row is compatible with the lateral dimension of pin 11, promptly, in given tolerance, whether described distance equals the diameter (module 117) of pin 11.Under the situation of not finding the distance incompatible with the size of pin 11 (from module 117 output NO) between edge (from module 116 output NO) or edge EF and the EL, then contrast threshold is reduced specified rate (module 118) and repeat the described line scanning of advancing (module 115).But do not find under the situation of the distance incompatible with the size of pin 11 (from module 119 output YES) between any edge or edge EF and the EL error message (module 119) that then produces " pin can not be discerned " type in the minima that reaches contrast threshold.
If the distance between edge EF that finds in described row and the EL and the size of pin 11 are compatible (from module 117 output YES), then put P2 and be calculated as intermediate point (Fig. 7) (module 120) between the point of determining by edge EF and EL.
It should be noted that the some P1 that provided by module 110 and 120 and the coordinate of P2 represent with pixel.Therefore, after determining some P2, undertaken the coordinate of a P1 and P2 is become the operation of millimeter (module 121) from pixel transitions, will be described below this operation by using the conversion coefficient that from calibration steps, obtains.
Each image in two images that obtain all repeated to carry out the algorithm (module 122) that limits by module 100 to 121.By this way, from the image that obtains from point of observation OT and what obtain is a P1 and the P2 coordinate with respect to axle t and z, and from the image that obtains from point of observation OS and what obtain is a P1 and the P2 coordinate with respect to axle s and z.Generally, coordinate t2, s2, the z2 of coordinate t1, s1, z1 and the some P2 of some P1 have been obtained.
At this moment, use expression formula (1) and (2) obtaining, and application expression formula (3) and (4) is calculated the gradient (module 124) as the pin 11 of the function of the coordinate of two some P1 and P2 as the cartesian coordinate X1 of the function of coordinate s1 and t1 and Y1 and as the cartesian coordinate X2 and the Y2 (module 123) of the function of coordinate s2 and t2.
About aforementioned calibration steps, it was carried out before any preparation of medicine, and this calibration steps is designed to obtain and is fixed in the normal place other three images of pin 11 of the syringe 9 of (for example, be contained in the corresponding seat 8 and make the terminal surface of pin 11 downward).Television camera 23 is positioned at (for example in an OT) in one of them point of observation and makes after optical axis 23a is parallel to corresponding view direction FT at arm 16, write down first image in the described other image by robot 13.After passing through arm 16 mobile TV video cameras 23 once more, obtain second image and the 3rd image, television camera for the first time only for the second time only moves through the corresponding known quantity of representing with millimeter along axle Z along axle t separately, and the axis 9a that keeps at a distance apart from substantially constant.By utilizing aforesaid algorithm, calculate of the displacement of the pixel of pin 11 in second image and the 3rd image with respect to the position that is recorded in the pin 11 in first image, and then, ratio by between the displacement of calculating the displacement represented with millimeter and representing with pixel has just obtained two conversion coefficients from the pixel to the millimeter.
According to still another embodiment of the invention, as shown in Figure 8, it shows a new flow chart, yet wherein, with corresponding some modules of the module of having described with Fig. 4 in the identical label of used label identify, and once more shown in Fig. 9 to 11, wherein with the element corresponding elements of describing respectively with Fig. 5 to 7 in the identical label of used label identify, the region-of-interest 31 of each data image by handling pin 11 is in a different manner determined the some P1 and the P2 that are searched.
Each data image of pin 11 constitutes by the picture element matrix that obtains according to known data acquisition technology.Therefore, Dui Ying region-of-interest 31 is made of the submatrix of described picture element matrix.
Fig. 9 shows a part of image 33 of region-of-interest 31 in a simplified manner, wherein can notice the end of pin 11.This part shown image 33 is divided into a plurality of pixels.Those pixels corresponding to the end of pin 11 are represented with oblique line.The position of relative pixel PX in the compatibility area 32 determined by a pair of coordinate of representing with pixel, and this represents with respect to axle Z in the coordinate first and identified by r, and this to second in the coordinate with respect to axle t, s represents and is identified by c.In fact, coordinate r has determined one-row pixels, and coordinate c has determined the string pixel.
In Figure 10 and 11, show that part of image 33 identical in some treatment step processes of the flow chart of Fig. 8 with Fig. 9, for the purpose of clearer, only some pixels with the end of pin 11 illustrate with oblique line.
With reference to Fig. 8, at the image that obtains pin 11 and determined corresponding region-of-interest 31 (module 100 and 101) afterwards, (pixel intensity threshold) sets predetermined value (module 200) for the pixel intensity threshold value.Selectively, the value of luminance threshold can be regulated by the operator.
Then, begin to check region-of-interest 31 (module 201) from minimum pixel column (r=0) by pixel column.In each pixel column (determining) by relative coordinate (generic coordinate) r, with respect to search the first pixel PXF and last pixel PXL (Figure 10) along the predetermined search direction 34 (Figure 10) of this pixel column, the brightness of these two pixels is higher than luminance threshold (module 202).Coordinate c, the r of pixel PXF are stored in the correspondence position of the first pixel vectors VF, and the coordinate of pixel PXL is stored in the correspondence position of the second pixel vectors VL (module 203).Subsequently, check next pixel column (r=r+1) (module 204), thereby each pixel column is repeated to be stored among the corresponding vectorial VF and VL, up to the row (r=rmax) (module 205) of the maximum number that arrives region-of-interest 31 for the search of pixel PXF and PXL and with them.These two pixel vectors VF and VL implement in the internal storage (not shown) of processing unit 29 (Fig. 2).
At this moment, carry out the linearisation circulation to determine two straight lines according to two vectorial VF and VL, these two straight lines can accurately be determined the gradient of the terminal position and the pin 11 of pin 11.
Particularly, with reference to Fig. 8 and 11, after cyclic variable I being initialized to zero (module 206), this cyclic design becomes at first to be carried out calculating first straight line (being identified by LF) on the basis of linear interpolation in Figure 11 to being stored in coordinate among the vectorial VF, and is carrying out calculating second straight line (being identified by LL) (module 207) on the basis of linear interpolation in Figure 11 to being stored in coordinate among another vectorial VL.Straight line LF is represented by following equation:
c = MF · r + QF, (5)
and the straight line LL is represented by the following equation:
c = ML · r + QL, (6)
where MF and ML are the angular coefficients, and QF and QL are the offsets, of the respective straight lines LF and LL.
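The linear interpolation of module 207 amounts to a least-squares fit of the model c = M·r + Q over the (c, r) pairs of one vector. The sketch below uses `numpy.polyfit` as one possible implementation, not as the method prescribed by the patent; the function name is illustrative.

```python
import numpy as np

def fit_edge_line(pixels):
    """Linear interpolation of one edge: least-squares fit of the model
    c = M * r + Q (equations (5) and (6)) over (c, r) pixel pairs.
    Returns the angular coefficient M and the offset Q."""
    c = np.array([p[0] for p in pixels], dtype=float)
    r = np.array([p[1] for p in pixels], dtype=float)
    M, Q = np.polyfit(r, c, 1)  # degree-1 polynomial: slope, intercept
    return M, Q
```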
Subsequently, in order to verify whether the calculation of the straight lines LF and LL is correct (module 208), it is checked whether two conditions are satisfied. The first condition concerns the angular coefficients MF and ML, namely whether the absolute value of their difference is smaller than a preset limit difference DM:
|MF - ML| < DM, (7)
and the second condition concerns the offsets QF and QL, namely whether their difference lies between a predetermined minimum difference DQmin and a predetermined maximum difference DQmax:
DQmin < (QF - QL) < DQmax. (8)
The limit difference DM preferably equals 0.2; the minimum difference DQmin preferably equals 10 pixels, and the maximum difference DQmax preferably equals 200 pixels.
If at least one of the two conditions (7) and (8) is not satisfied (output NO from module 208), an error message is generated (module 109) and the procedure terminates. Otherwise, i.e. if both conditions (7) and (8) are satisfied (output YES from module 208), the program proceeds as described below.
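Conditions (7) and (8) amount to a pair of sanity checks on the fit: the two edge lines must be nearly parallel and separated by a plausible needle width. A sketch using the preferred limit values from the text; the function name is illustrative.

```python
# Preferred limit values from the text: DM = 0.2, DQmin = 10, DQmax = 200 pixels.
DM, DQ_MIN, DQ_MAX = 0.2, 10, 200

def lines_are_valid(mf, qf, ml, ql):
    """Module 208: accept the fitted lines LF and LL only if they are
    nearly parallel (condition (7)) and their offsets differ by a
    plausible needle width (condition (8))."""
    nearly_parallel = abs(mf - ml) < DM            # condition (7)
    plausible_width = DQ_MIN < (qf - ql) < DQ_MAX  # condition (8)
    return nearly_parallel and plausible_width
```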
Pairs of pixels PXF, PXL are then rejected from the vectors VF and VL, namely those pairs that fall outside the shape of the tip of needle 11 delimited by the portion of the plane comprised between the straight lines LF and LL, being spaced from the lines LF and LL themselves, along the corresponding pixel row, by more than a predetermined amount (module 209). In particular, a rejected pair of pixels PXF, PXL must satisfy at least one of the following two conditions. The first condition is that the coordinate c of pixel PXF (denoted by c_PXF), increased by a target offset S, is smaller than the corresponding coordinate c_LF given by equation (5) for the coordinate r of pixel PXF itself, i.e.
c_PXF + S < c_LF, (9)
The other condition is that the coordinate c of pixel PXL (hereinafter denoted by c_PXL), reduced by the deviation S, is greater than the corresponding coordinate c_LL given by equation (6) for the coordinate r of pixel PXL itself, i.e.
c_PXL - S > c_LL. (10)
In Figure 11, PXFe and PXLe denote one of the pairs of pixels to be rejected.
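The rejection of module 209 according to conditions (9) and (10) could be sketched as follows, assuming the vectors VF and VL hold (c, r) pairs aligned row by row; the names are illustrative, not from the patent.

```python
def reject_outliers(vf, vl, mf, qf, ml, ql, s):
    """Module 209: drop the pairs PXF, PXL that fall outside the needle
    shape bounded by LF and LL by more than the offset S along their
    pixel row (conditions (9) and (10))."""
    kept_f, kept_l = [], []
    for (cf, r), (cl, _) in zip(vf, vl):
        c_lf = mf * r + qf          # LF at this row, equation (5)
        c_ll = ml * r + ql          # LL at this row, equation (6)
        if cf + s < c_lf or cl - s > c_ll:
            continue                # pair lies outside the shape: reject
        kept_f.append((cf, r))
        kept_l.append((cl, r))
    return kept_f, kept_l
```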
At this point, the loop variable I is incremented (I = I + 1) (module 210), and the calculation of the straight lines LF and LL is repeated as a function of the updated vectors VF and VL, until a maximum number of repetitions Imax is reached, which preferably equals 3 or 4 (module 211).
The first point P1 being searched for is determined as a function of the pair of pixels PXF1, PXL1 that, among all the pairs of pixels PXF, PXL stored in the vectors VF and VL, has the minimum coordinate r (hereinafter denoted by r1) and remains essentially within the shape delimited by the straight lines LF and LL (module 212). In particular, the selected pair of pixels PXF1, PXL1 must satisfy two further conditions: the distance between the corresponding coordinates c_PXF and c_PXL is smaller than a preset distance DPX, i.e.
|c_PXF - c_PXL| < DPX; (11)
and the mean of the coordinates c_PXF and c_PXL (hereinafter denoted by c1) lies, to within a tolerance TPX, between the straight lines LF and LL at coordinate r1, i.e.
c_LF - TPX < c1 < c_LL + TPX. (12)
The coordinates c1 and r1 of the pixel thus determined define the point P1 in the plane of the acquired image.
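The selection of the pair PXF1, PXL1 (module 212) according to conditions (11) and (12) might look like the sketch below; the tolerance handling follows the reading of condition (12) given above, which is an assumption, and the names are illustrative.

```python
def select_p1(vf, vl, mf, qf, ml, ql, dpx, tpx):
    """Module 212: among the surviving pairs (ordered by increasing r),
    pick the first pair satisfying conditions (11) and (12) and return
    the tip point P1 = (c1, r1), or None if no pair qualifies."""
    for (cf, r), (cl, _) in zip(vf, vl):
        if abs(cf - cl) >= dpx:            # condition (11) violated
            continue
        c1 = (cf + cl) / 2.0               # mean column coordinate
        c_lf, c_ll = mf * r + qf, ml * r + ql
        if c_lf - tpx < c1 < c_ll + tpx:   # condition (12), within TPX
            return c1, r
    return None
```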
To determine the second point P2 being searched for, a third straight line LN is calculated as the vector sum of the directions defined by the two straight lines LF and LL (Figure 11) (module 213). The direction defined by the straight line LN represents the tilt of needle 11 relative to the axis 9a in the plane of the image portion 33. The vector sum is calculated as an average angular coefficient Mm, equal to the mean of the two angular coefficients MF and ML, and as an average offset Qm, equal to the mean of the offsets QF and QL. The straight line LN is therefore defined by the average angular coefficient Mm and by the average offset Qm.
In the plane of the acquired image, the coordinates, expressed in pixels, of the second point P2 being searched for are given by the intersection of the straight line LN with the pixel row located at a distance of a number of rows equal to OFFSET from the row r1 (module 214); the row coordinate is therefore determined as:
r2 = r1 + OFFSET. (13)
This step is carried out by a simple triangulation calculation. The pixel row with coordinate r2, and hence the point P2, is not shown in Figure 11: the point P2 is set as far as possible from the point P1, in particular at the upper limit (r = rmax) of the region of interest 31, and therefore lies outside the image portion 33 of the illustrated example.
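The construction of LN and P2 (modules 213 and 214) reduces to averaging the two fitted lines and evaluating the average line at the row r1 + OFFSET; a sketch with illustrative names:

```python
def compute_p2(mf, qf, ml, ql, r1, offset):
    """Modules 213-214: the third line LN is defined by the average
    angular coefficient Mm and the average offset Qm; P2 is its
    intersection with the pixel row r2 = r1 + OFFSET (equation (13))."""
    mm = (mf + ml) / 2.0   # average angular coefficient Mm
    qm = (qf + ql) / 2.0   # average offset Qm
    r2 = r1 + offset       # equation (13)
    c2 = mm * r2 + qm      # column of LN at row r2
    return c2, r2
```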
In a manner similar to the flow chart of Fig. 4, the coordinates of the points P1 and P2, expressed in pixels, are transformed into coordinates expressed in millimetres (module 121), and the algorithm defined by modules 100, 101, 200 to 214 and 121 is repeated for each of the two acquired images, so that the tilt of needle 11 in Cartesian space is calculated as a function of the Cartesian coordinates of P1 and P2 (modules 123 and 124).
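Once P1 and P2 are expressed in Cartesian (millimetre) coordinates, the tilt of the needle relative to the longitudinal axis follows from elementary trigonometry; a sketch, assuming the axis 9a is taken as the z direction (an assumption of this sketch, since the patent does not fix the frame):

```python
import math

def needle_tilt_degrees(p1, p2):
    """Tilt of the needle relative to the longitudinal axis, from the
    Cartesian (millimetre) coordinates of P1 and P2; the axis is taken
    here as the z direction, which is an assumption of this sketch."""
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    # angle between the segment P1->P2 and the z axis
    return math.degrees(math.atan2(math.hypot(dx, dy), dz))
```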
As is clear from the above description, according to the present invention the axial alignment of the mouth 4 of container 3 with the needle 11 of syringe 9 can be carried out with syringe 9 in any position, for example in the position in which the syringe draws up material, in which syringe 9 is oriented with the end face of needle 11 pointing upwards. In this case, in fact, the arm 16 of robot 13 is controlled so as to move and rotate the head 14 in such a way that container 3 is turned upside down, with mouth 4 facing downwards, and is positioned above the tip of needle 11; at the same time, the television camera 23 mounted on head 14 is also turned upside down. The image acquired by television camera 23 therefore again represents needle 11 with the end face of the needle towards the bottom of the image.
The main advantage of the above-described method and machine 1 for manipulating toxic substances is that an accurate axial alignment is achieved between the mouth 4 of container 3 and the needle 11 of syringe 9, so that needle 11 can be inserted even through a mouth 4 of minimum size. In fact, an alignment error of less than 0.5 mm can be achieved. Moreover, the correct axial alignment allows needle 11 to draw up material effectively even when the needle is bent.

Claims (17)

1. A method for manipulating toxic substances, the method comprising the following steps:
removing a container (3) from a storage device (2) for containers by means of an anthropomorphic manipulating device (13) and transferring the container to a dispensing station (6), in which a dispenser (9) having a longitudinal axis (9a) and provided with a needle (11) is arranged; said container (3) having a mouth (4) sealed by a pierceable element (5);
axially aligning said mouth (4) with said needle (11) by means of said manipulating device (13);
moving said container (3) towards said dispenser (9) by means of said manipulating device (13) in such a way that said needle (11) passes through said pierceable element (5) and then penetrates into said container (3), so that material can be injected into said container (3) or drawn from said container (3);
characterised in that the step of axially aligning said mouth (4) with said needle (11) comprises the following sub-steps:
acquiring a plurality of images of said needle (11) from at least two different viewpoints (OT, OS) and along at least two viewing directions (FT, FS), said at least two viewing directions lying in a plane perpendicular to said longitudinal axis (9a), each viewing direction (FT, FS) passing through a corresponding one of said viewpoints (OT, OS) in such a way that an acute angle is formed between the viewing directions (FT, FS);
processing said images to determine the position of the tip of said needle (11) and the tilt of said needle (11) relative to said longitudinal axis (9a); and
actuating said manipulating device (13) to correct the position of said mouth (4) as a function of the position of the tip of said needle (11) and of the tilt of said needle (11).
2. A method according to claim 1, wherein the step of acquiring a plurality of images of said needle (11) is designed to acquire (100, 122) an image in digital form from each of said viewpoints (OT, OS); the step of processing said images comprising the step of determining (101-112; 101, 200-212), from each image, first coordinates (s1, t1, z1), with respect to a reference system, of a first point (P1) which defines the position of the tip of said needle (11), said reference system being defined in a plane perpendicular to said viewing direction (FT, FS).
3. A method according to claim 2, wherein the step of determining (101-112) the first coordinates (s1, t1, z1) from each image comprises the following sub-steps: delimiting (101) a region of interest (31) of the image;
scanning (102-108) each row of said region of interest (31) to find a first edge compatible with the possible position of the tip of said needle (11); and
determining (110) the coordinates of the first edge found, said first coordinates (s1, t1, z1) being determined as a function of the coordinates of said first edge.
4. A method according to claim 3, wherein the step of determining (101-112) the first coordinates (s1, t1, z1) from each image comprises the following sub-steps:
in the event that a first non-compatible edge is found, increasing (111) the contrast threshold of the image by a predetermined amount and repeating the row-by-row scanning of the image; and
in the event that a maximum value of said contrast threshold is reached (112) without a compatible first edge being found, generating (109) an error message.
5. A method according to any one of claims 2 to 4, wherein the step of processing said images comprises the sub-step of determining (113-120; 213, 214), from each image, second coordinates (s2, t2, z2), with respect to said reference system, of a second point (P2) through which said needle (11) passes, said second point being located at a predetermined distance (OFFSET) from said first point (P1) along said longitudinal axis (9a); the step of determining the tilt of said needle (11) relative to said longitudinal axis (9a) being designed to determine (123, 124) said tilt as a function of said first coordinates (s1, t1, z1) and of said second coordinates (s2, t2, z2).
6. A method according to claim 5, wherein determining (113-120) the second coordinates (s2, t2, z2) from each image comprises the following steps:
scanning (114-117) the row spaced from said first point (P1) by a predetermined number (OFFSET) of rows, to find at least two second edges (EF, EL) spaced apart by a distance compatible with the lateral dimension of said needle (11); and calculating the second coordinates (s2, t2, z2) as the coordinates of the intermediate point between said two edges (EF, EL).
7. A method according to claim 6, wherein determining (113-120) the second coordinates (s2, t2, z2) from each image comprises the following steps:
in the event that no second edges (EF, EL) are found, or the distance between said second edges (EF, EL) is not compatible with the lateral dimension of said needle (11), reducing (118) the contrast threshold of the image and rescanning said row; and
in the event that a minimum value of said contrast threshold is reached (119) without any second edges (EF, EL) being found, or without the distance between said second edges (EF, EL) being compatible with the lateral dimension of said needle (11), generating (109) an error message.
8. A method according to claim 2, wherein the image in digital form consists of a matrix of pixels; determining (101, 200-212) the first coordinates (s1, t1, z1) from each image comprising the following steps:
determining (101) a region of interest (31) of the image;
searching (200-205), along a predetermined search direction (34), in each pixel row of said region of interest (31), for a pair of pixels consisting of the first pixel (PXF) and the last pixel (PXL) whose brightness exceeds a predetermined luminance threshold;
determining (206-211) the shape of the tip of said needle (11) by linear interpolation of the positions (c, r), within said region of interest (31), of the pairs of pixels (PXF, PXL) found;
selecting (212), from among the pairs of pixels (PXF, PXL) found, a first pair of pixels (PXF1, PXL1) that remains essentially within said shape and lies on a pixel row compatible with the possible position of the tip of said needle (11); said first coordinates (s1, t1, z1) being determined as a function of said first pair of pixels (PXF1, PXL1).
9. A method according to claim 8, wherein the end face of said needle (11) faces downwards; said first pair of pixels (PXF1, PXL1) lying on the lowermost pixel row with respect to all the other pairs of pixels (PXF, PXL) within said shape.
10. A method according to claim 8 or 9, wherein determining (206-211) the shape of the tip of said needle (11) comprises:
determining (207) a first straight line (LF) and a second straight line (LL) by separate linear interpolation of the positions (c, r), within said region of interest (31), of the first pixels (PXF) and of the last pixels (PXL) of all the pairs of pixels (PXF, PXL) found; said shape being delimited by said straight lines (LF, LL).
11. A method according to claim 10, wherein determining (206-211) the shape of the tip of said needle (11) comprises:
rejecting (209), from among the pairs of pixels (PXF, PXL) found, those pairs of pixels (PXF, PXL) that fall outside said shape and are spaced from said straight lines (LF, LL), along the corresponding pixel row, by more than a predetermined amount (S); and
repeating (210, 211) the determination (207) of said straight lines (LF, LL).
12. A method according to claim 5, wherein the image in digital form consists of a matrix of pixels; the step of processing said images comprising the following sub-steps:
determining (101), for each image, a region of interest (31) of the image;
searching (200-205), along a predetermined search direction (34), in each pixel row of said region of interest (31), for a pair of pixels consisting of the first pixel (PXF) and the last pixel (PXL) whose brightness exceeds a predetermined luminance threshold; and
determining (207) a first straight line (LF) and a second straight line (LL) by separate linear interpolation of the positions, within said region of interest (31), of said first pixels (PXF) and of said last pixels (PXL) of all the pairs of pixels (PXF, PXL) found;
the step of determining (213, 214) the second coordinates (s2, t2, z2) from each image comprising the following sub-steps:
calculating (213) a third straight line (LN) as the vector sum of the directions defined by said first and second straight lines (LF, LL); and
determining (214) said second coordinates (s2, t2, z2) as a function of the intersection of said third straight line (LN) with the pixel row spaced from said first point (P1) by the predetermined number (OFFSET) of rows.
13. A machine for manipulating toxic substances, comprising: a dispensing station (6), in which a dispenser (9) provided with a needle (11) is arranged; a storage device (2) for storing containers (3), each having a respective mouth (4) sealed by a pierceable element (5); an anthropomorphic manipulating device (13) designed to pick up at least one container (3) from said storage device (2) and to transfer said container (3) into said dispensing station (6) to a point corresponding to said dispenser (9); and a control device (17) designed to control the movements of said manipulating device (13); characterised in that the machine comprises: an image acquisition device (23) for acquiring images of said needle (11); and a processing device (24), which is connected to said control device (17) in order to send commands to the machine, and also to said image acquisition device (23) in order to receive the images of said needle (11), said processing device being configured to carry out the method according to any one of claims 1 to 12.
14. A machine according to claim 13, wherein said image acquisition device (23) is mounted on said manipulating device (13).
15. A machine according to claim 13, wherein said manipulating device comprises an anthropomorphic robot (13), which comprises a gripping head (14) designed to grip said container (3), and an articulated arm (16) having six degrees of freedom and designed to support said head (14); said image acquisition device comprising a television camera (23) mounted in a fixed manner on said head (14).
16. A machine according to claim 15, wherein said head (14) comprises two jaws (26) movable along a sliding axis (26a), which grip and release said container (3) by moving towards and away from each other, respectively; said television camera (23) having an optical axis (23a) and being mounted on said head (14) with said optical axis (23a) perpendicular to the plane in which said sliding axis (26a) lies.
17. A machine according to claim 15 or 16, wherein said television camera (23) is of the black-and-white type with focusing, and comprises a composite video output device connected to a corresponding input device of said processing device (24).
CN2007800434719A 2006-11-22 2007-11-21 Method and machine for manipulating toxic substances Active CN101568307B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
PCT/IT2006/000816 WO2008062485A1 (en) 2006-11-22 2006-11-22 Device for detecting elongated bodies
ITPCT/IT2006/000816 2006-11-22
ITBO2006A000840 2006-12-06
ITBO20060840 ITBO20060840A1 (en) 2006-12-06 2006-12-06 METHOD AND MACHINE TO MANIPULATE TOXIC SUBSTANCES
PCT/IB2007/003577 WO2008062285A2 (en) 2006-11-22 2007-11-21 Method and machine for manipulating toxic substances

Publications (2)

Publication Number Publication Date
CN101568307A CN101568307A (en) 2009-10-28
CN101568307B true CN101568307B (en) 2011-06-15

Family

ID=37807852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2007800434719A Active CN101568307B (en) 2006-11-22 2007-11-21 Method and machine for manipulating toxic substances

Country Status (2)

Country Link
CN (1) CN101568307B (en)
WO (1) WO2008062485A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4517004B2 (en) * 2008-06-16 2010-08-04 ノリー株式会社 Injection needle guidance device
US9930297B2 (en) 2010-04-30 2018-03-27 Becton, Dickinson And Company System and method for acquiring images of medication preparations
CN103417375B (en) * 2012-05-18 2016-08-03 辽宁九洲龙跃医用科技股份有限公司 A kind of machine automatization venous medicine dispensing device
EP2913042B1 (en) * 2012-10-25 2019-10-02 Yuyama Mfg. Co., Ltd. Co-infusion device
JP6128019B2 (en) * 2014-03-05 2017-05-17 株式会社安川電機 Liquid transfer system, liquid transfer control method, liquid transfer control device, and drug manufacturing method
CN114111983A (en) 2014-09-08 2022-03-01 贝克顿·迪金森公司 System and method for preparing pharmaceutical compounds
DE102016007625A1 (en) * 2016-06-23 2018-01-18 Kiefel Gmbh APPARATUS FOR MANUFACTURING A MEDICAL BAG AND METHOD FOR OPERATING SUCH AN APPROPRIATE APPARATUS
CN106491358B (en) * 2016-10-31 2019-10-11 成都杰仕德科技有限公司 A kind of positioning device and method for automated dispensing system
CN106580696A (en) * 2016-11-09 2017-04-26 无锡安之卓医疗机器人有限公司 Syringe needle video positioning system and operating method of the same
CN110108248A (en) * 2018-02-01 2019-08-09 深圳市卫邦科技有限公司 A kind of steel needle is inserted in place testing agency, make up a prescription robot and detection method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1314452A1 (en) * 2001-11-23 2003-05-28 Nucletron B.V. Self controlled image guided device for inserting a needle in an animal body for effecting radiation therapy in said body
DE10249786A1 (en) * 2002-10-24 2004-05-13 Medical Intelligence Medizintechnik Gmbh Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6405072B1 (en) * 1991-01-28 2002-06-11 Sherwood Services Ag Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus
US5279309A (en) * 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
US5517990A (en) * 1992-11-30 1996-05-21 The Cleveland Clinic Foundation Stereotaxy wand and tool guide
US6973202B2 (en) * 1998-10-23 2005-12-06 Varian Medical Systems Technologies, Inc. Single-camera tracking of an object
US6314311B1 (en) * 1999-07-28 2001-11-06 Picker International, Inc. Movable mirror laser registration system
US6695779B2 (en) * 2001-08-16 2004-02-24 Siemens Corporate Research, Inc. Method and apparatus for spatiotemporal freezing of ultrasound images in augmented reality visualization
JP4865547B2 (en) * 2003-06-18 2012-02-01 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Remote control needle for CT fluoroscopy

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1314452A1 (en) * 2001-11-23 2003-05-28 Nucletron B.V. Self controlled image guided device for inserting a needle in an animal body for effecting radiation therapy in said body
DE10249786A1 (en) * 2002-10-24 2004-05-13 Medical Intelligence Medizintechnik Gmbh Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特开平11-14310A 1999.01.22

Also Published As

Publication number Publication date
CN101568307A (en) 2009-10-28
WO2008062485A1 (en) 2008-05-29

Similar Documents

Publication Publication Date Title
CN101568307B (en) Method and machine for manipulating toxic substances
CA2669926C (en) Method and machine for manipulating toxic substances
US8267129B2 (en) Control of fluid transfer operations
US11766700B2 (en) Robotic system for performing pattern recognition-based inspection of pharmaceutical containers
EP3725688B1 (en) Preparing a double chamber container
US7150138B2 (en) Method for filling a container having at least one flexible component
EP3115301A1 (en) Machine and method for the automatic preparation of substances for intravenous application
EP1361440A1 (en) Method and apparatus for transporting a plurality of test tubes in a measuring system
US11900540B2 (en) 3D particle imaging in pharmaceutical containers
US20180257051A1 (en) Method and device for making up a pharmaceutical preparation
WO2017158398A1 (en) Automatic compounding system
CN115158830B (en) Reagent bottle capable of displaying pre-drug information identification code
JP7442672B2 (en) Method for inspecting the side walls of objects
JP7302667B2 (en) Analysis equipment
WO2022197181A2 (en) Device for the automatic filling of syringes with injection liquid
ITBO20060840A1 (en) METHOD AND MACHINE TO MANIPULATE TOXIC SUBSTANCES
KR101835093B1 (en) Manufacturing system using tray and material feeding method for the same
CN114074906A (en) Integrated link module and link system including the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CI01 Publication of corrected invention patent application

Correction item: International Day of publication

Correct: 20080529

False: 20080814

Number: 24

Volume: 27

CI02 Correction of invention patent application

Correction item: International Day of publication

Correct: 20080529

False: 20080814

Number: 24

Page: The title page

Volume: 27

ERR Gazette correction

Free format text: CORRECT: INTERNATIONAL PROCLAMATION DATE; FROM: 2008.08.14 TO: 2008.05.29

CP03 Change of name, title or address

Address after: Italy, Trieste

Patentee after: Omani Co.,Ltd.

Address before: Italy Bolzano

Patentee before: HEALTH ROBOTICS S.R.L.

CP03 Change of name, title or address