WO2013115093A1 - Information processing system, information processing method, information processing apparatus and its control method and control program, and communication terminal and its control method and control program - Google Patents
- Publication number
- WO2013115093A1 (PCT/JP2013/051573)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- local feature
- feature
- local
- information processing
- medical article
- Prior art date
Classifications
- G16H30/20 — ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
- G06F18/22 — Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06V10/42 — Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/462 — Salient features, e.g. scale invariant feature transforms [SIFT]
- G06V10/758 — Image or video pattern matching involving statistics of pixels or of feature values, e.g. histogram matching
- G06V20/66 — Scene-specific elements; type of objects; trinkets, e.g. shirt buttons or jewellery items
- G16H40/20 — ICT for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
- G16H40/67 — ICT for the remote operation of medical equipment or devices
- G06V2201/034 — Recognition of patterns in medical or anatomical images of medical instruments
Definitions
- the present invention relates to a technique for identifying, by using local feature amounts, a medical article such as a medical device, a medical instrument, or a medicine contained in a captured image.
- Patent Document 1 describes a technique for identifying a medical device by comparing a singular point and the number of equidistant edges of an input image with a singular point and the number of equidistant edges of a template generated in advance.
- Japanese Patent Application Laid-Open No. 2004-228561 describes a technique for improving the recognition speed by clustering feature amounts when a query image is recognized using a model dictionary generated in advance from a model image.
- An object of the present invention is to provide a technique for solving the above-described problems.
- a system according to the present invention includes: first local feature quantity storage means for storing, in association with a medical article, m first local feature quantities each consisting of a feature vector of 1 to i dimensions generated for each of m local regions that include the m feature points of an image of the medical article; second local feature quantity generating means for extracting n feature points from an image captured by an imaging means and generating n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, for n local regions that include the n feature points; and recognition means for selecting the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities and the dimension number j of the feature vectors of the second local feature quantities, and recognizing, based on a comparison of the feature vectors up to the selected dimension number, that the medical article is present in the captured image.
- the method according to the present invention is an information processing method in an information processing system including first local feature quantity storage means for storing, in association with a medical article, m first local feature quantities each consisting of a feature vector of 1 to i dimensions generated for each of m local regions that include the m feature points of an image of the medical article, the method including: an imaging step; a second local feature quantity generation step of extracting n feature points from the image captured in the imaging step and generating n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, for n local regions that include the n feature points; and a recognition step of selecting the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities and the dimension number j of the feature vectors of the second local feature quantities, and recognizing, based on a comparison of the feature vectors up to the selected dimension number, that the medical article is present in the captured image.
- a communication terminal according to the present invention includes: second local feature quantity generating means for extracting n feature points from an image captured by an imaging means and generating n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, for n local regions that include the n feature points; first transmitting means for transmitting the n second local feature quantities to an information processing apparatus that recognizes a medical article included in the captured image based on a comparison of local feature quantities; and first receiving means for receiving, from the information processing apparatus, information indicating the medical article included in the captured image.
- the method according to the present invention is a control method for a communication terminal, including: a second local feature quantity generation step of extracting n feature points from an image captured by an imaging means and generating n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, for n local regions that include the n feature points; and a first transmission step of transmitting the n second local feature quantities to an information processing apparatus that recognizes a medical article included in the captured image based on a comparison of local feature quantities.
- a program according to the present invention causes a computer of a communication terminal to execute: a second local feature quantity generation step of extracting n feature points from an image captured by an imaging means and generating n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, for n local regions that include the n feature points; and a first transmission step of transmitting the n second local feature quantities to an information processing apparatus that recognizes a medical article included in the captured image based on a comparison of local feature quantities.
- an information processing apparatus according to the present invention includes: first local feature quantity storage means for storing, in association with a medical article, m first local feature quantities each consisting of a feature vector of 1 to i dimensions generated for each of m local regions that include the m feature points of an image of the medical article; second receiving means for receiving, from a communication terminal, n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, generated for n local regions that include n feature points extracted from an image in a video captured by the communication terminal; recognition means for selecting the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities and the dimension number j of the feature vectors of the second local feature quantities, and recognizing, based on a comparison of the feature vectors up to the selected dimension number, that the medical article is present in the image in the video; and second transmitting means for transmitting information indicating the recognized medical article to the communication terminal.
- the method according to the present invention is a control method for an information processing apparatus including first local feature quantity storage means for storing, in association with a medical article, m first local feature quantities each consisting of a feature vector of 1 to i dimensions generated for each of m local regions that include the m feature points of an image of the medical article, the method including: a second reception step of receiving, from a communication terminal, n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, generated for n local regions that include n feature points extracted from an image in a video captured by the communication terminal; a recognition step of selecting the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities and the dimension number j of the feature vectors of the second local feature quantities, and recognizing, based on a comparison of the feature vectors up to the selected dimension number, that the medical article is present in the image in the video; and a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
- a program according to the present invention is a control program for an information processing apparatus including a first local feature quantity storage unit that stores, in association with a medical article, m first local feature quantities each consisting of a feature vector of 1 to i dimensions generated for each of m local regions that include the m feature points of an image of the medical article, the program causing a computer to execute: a second reception step of receiving, from a communication terminal, n second local feature quantities, each consisting of a feature vector of 1 to j dimensions, generated for n local regions that include n feature points extracted from an image in a video captured by the communication terminal; and a recognition step of selecting the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities and the dimension number j of the feature vectors of the second local feature quantities, and recognizing, based on a comparison of the feature vectors up to the selected dimension number, that the medical article is present in the image in the video.
- the information processing system 100 is a system that recognizes medical articles in real time.
- medical article in this specification includes medical devices, medical instruments, and pharmaceuticals.
- the information processing system 100 includes a first local feature quantity storage unit 110, an imaging unit 120, a second local feature quantity generation unit 130, and a recognition unit 140.
- the first local feature quantity storage unit 110 stores, in association with the medical article 111, the m first local feature quantities 112 each consisting of a feature vector of 1 to i dimensions generated for each of the m local regions that include the m feature points of the image of the medical article.
- the second local feature quantity generation unit 130 extracts n feature points 131 from the image 101 in the video captured by the imaging unit 120.
- the second local feature quantity generation unit 130 generates, for the n local regions 132 that include each of the n feature points, n second local feature quantities 133 each consisting of a feature vector of 1 to j dimensions.
- the recognition unit 140 selects the smaller dimension number from the dimension number i of the feature vectors of the first local feature quantities 112 and the dimension number j of the feature vectors of the second local feature quantities 133. The recognition unit 140 then collates the m first local feature quantities 112, limited to feature vectors up to the selected dimension number, with the n second local feature quantities 133 likewise limited, and recognizes whether the medical article is present in the image.
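The dimension-selection step performed by the recognition unit 140 can be sketched as follows in Python. The distance threshold and the matched-fraction score are illustrative assumptions, not values taken from this disclosure; the sketch only shows truncating both feature sets to the smaller of the dimension numbers i and j before collating them.

```python
import numpy as np

def match_truncated(first_feats, second_feats, threshold=0.5):
    """Collate two sets of local feature vectors after truncating both
    to the smaller of their dimension numbers (i vs. j), as the
    recognition unit 140 does.  The threshold and the returned matched
    fraction are illustrative assumptions."""
    i = first_feats.shape[1]   # dimension number i of the stored features
    j = second_feats.shape[1]  # dimension number j of the query features
    d = min(i, j)              # select the smaller dimension number
    a = first_feats[:, :d]
    b = second_feats[:, :d]
    # Pairwise Euclidean distances between the truncated vectors
    dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    # Count stored features whose nearest query feature is close enough
    matches = (dists.min(axis=1) < threshold).sum()
    return matches / len(first_feats)  # fraction of matched features
```

A caller would then treat a high matched fraction as evidence that the medical article is present in the captured image.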
- FIG. 2 is a block diagram illustrating a configuration of the information processing system 200 according to the present embodiment.
- the information processing system 200 includes a hospital 201 and a pharmacy 202.
- a hospital computer 201 a installed in the hospital 201 and a pharmacy computer 202 a installed in the pharmacy 202 are connected via a network 270.
- the hospital computer 201a and the pharmacy computer 202a communicate prescription data with each other.
- the hospital computer 201a may comprehensively control the pharmacy.
- the communication terminal 211 images the examination room or the desk and generates a local feature amount from the image.
- the generated local feature amount is sent to the hospital computer 201a.
- in the hospital computer 201a, a medical device or a medical instrument in the examination room or on the desk is identified from the local feature amount.
- the arrangement of the medical devices and medical instruments, or whether their status is normal, is then determined.
- the determination result may be notified to the communication terminal 211 if the communication terminal 211 is a mobile terminal.
- doctors and nurses can also monitor through a center PC (not shown).
- a medical chart on the desk may be recognized.
- the communication terminal 221 images the hospital room and generates a local feature amount from the video.
- the generated local feature amount is sent to the hospital computer 201a.
- a medical device or a medical instrument in a hospital room is identified from the local feature amount.
- the arrangement of the medical devices and medical instruments, or whether their status is normal, is then determined.
- the determination result may be notified to the communication terminal 221 if the communication terminal 221 is a mobile terminal.
- a doctor and a nurse monitor through a center PC (not shown).
- a thermometer and an infusion facility are included.
- the communication terminal 231 images the operating room, the surgical instrument tray 232, or the patient and medical equipment.
- a local feature amount is generated from the captured image.
- the generated local feature amount is sent to the hospital computer 201a.
- a medical device or a medical instrument in the operating room is identified from the local feature amount.
- the arrangement of the medical devices and medical instruments, or whether their status is normal, is then determined.
- the arrangement of the surgical instruments in the surgical instrument tray 232, or whether their status is normal, is also determined.
- the determination result may be notified to the communication terminal 231 if the communication terminal 231 is a mobile terminal.
- the communication terminal 241 carried by the staff or installed at the window images the medicine bag 242 and the medicine basket.
- a local feature amount is generated from the captured image.
- the generated local feature amount is sent to the pharmacy computer 202a.
- the medicine bag and the medicine at the window are identified from the local feature amount. Then, it is determined whether the type and number of medicines correspond to the prescription read by the prescription reader 243, or whether the medicine itself is normal. The determination result may be notified to the communication terminal 241 if the communication terminal 241 is a mobile terminal. Also, the operator monitors through the operator PC 244.
- in the medicine tray process 250, the medicine tray 252 is photographed. A local feature amount is generated from the captured image and sent to the pharmacy computer 202a. In the pharmacy computer 202a, the medicine in the medicine tray 252 is identified from the local feature amount. It is then determined whether the type and number of medicines correspond to the prescription read by the prescription reader 243, and whether each medicine itself is normal. The determination result is notified by the communication terminal 251. When recognizing individual medicine bags or a plurality of medicines in a basket, control may be performed so as to generate local feature amounts with different accuracy.
- a desired shelf is imaged by the communication terminal 261 carried by the employee.
- a local feature amount is generated from the captured image.
- the generated local feature amount is sent to the pharmacy computer 202a.
- in the inventory process 260, it is necessary not only to recognize the shelf but also to recognize each of the medicines displayed on the shelf; the inventory process 260 therefore requires higher accuracy than the window process 240 and the medicine tray process 250.
- the accuracy of the local feature quantity is controlled by the number of feature points and the dimension number of the feature vector (see FIGS. 11A to 11F).
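Controlling accuracy through the number of feature points and the descriptor dimension number could look like the following sketch. The preset names and all concrete numbers are hypothetical, chosen only to contrast a coarse setting (e.g. locating a shelf) with a fine one (e.g. identifying each medicine on the shelf).

```python
import numpy as np

# Hypothetical accuracy presets: coarse recognition uses few feature
# points and low-dimensional vectors; fine recognition uses more of
# both.  The concrete numbers are illustrative, not from the patent.
PRESETS = {
    "coarse": {"max_points": 50,  "dims": 32},
    "fine":   {"max_points": 300, "dims": 128},
}

def generate_local_features(keypoint_strengths, descriptors, accuracy):
    """Keep the strongest feature points and truncate their feature
    vectors according to the requested accuracy preset."""
    p = PRESETS[accuracy]
    # Sort feature points by strength, strongest first, and keep the top ones
    order = np.argsort(keypoint_strengths)[::-1][:p["max_points"]]
    return descriptors[order, :p["dims"]]
```

Because the descriptors are truncated rather than recomputed, the coarse and fine outputs remain comparable up to the shared dimension number, which is what the dimension-selection step of the recognition side relies on.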
- the examination room process 210, the hospital room process 220, the operating room process 230, the window process 240, the medicine tray process 250, and the medicine inventory process 260 can each be realized in real time merely by imaging with the respective communication terminals 211 to 261.
- FIG. 3 is a diagram showing a display screen example in the communication terminals 221, 231 and 251 according to the present embodiment.
- FIG. 3 shows the communication terminal 221 in the hospital room, the communication terminal 231 in the operating room, and the communication terminal 251 for the medicine tray; the other cases are similar.
- a local feature amount is generated from the video screen 311 in the left diagram and collated with the local feature amounts generated in advance from each medical device. The status of each medical device is then determined as a recognition result, and a screen 312 in which the status 313 is superimposed on the video screen is displayed as shown in the right figure.
- the screen 312 may be displayed on the center PC.
- FIG. 3 is a display screen in which the communication terminal 231 in the operating room images the surgical instrument tray.
- a local feature amount is generated from the video screen 321 in the left diagram and collated with the local feature amounts generated in advance from each surgical instrument. The number and arrangement of the surgical instruments are then determined as a recognition result and shown on the screen 322.
- the screen 322 may be displayed on the center PC.
- FIG. 3 is a display screen in which the communication terminal 251 in the pharmacy images the medicine tray.
- a local feature amount is generated from the video screen 331 in the left diagram and collated with the local feature amounts generated in advance from each medicine. The number and status of each medicine are then determined as a recognition result and compared with the prescription, and a screen 332 in which the status 333 is superimposed on the video screen is displayed as shown in the right figure.
- the screen 332 may be displayed on the operator PC.
- FIG. 4 is a sequence diagram showing an operation procedure in the hospital room of the information processing system 200 according to the present embodiment.
- in step S400, an application and/or data is downloaded from the hospital computer 201a to the communication terminal 221 or the center PC.
- in step S401, the application is activated and initialized to perform the processing of this embodiment.
- in step S403, the communication terminal images a hospital room.
- in step S405, a local feature amount is generated from the image of the hospital room.
- in step S407, the local feature amount is encoded together with the feature point coordinates.
- in step S409, the encoded local feature amount is transmitted from the communication terminal to the hospital computer 201a.
- in step S411, the hospital computer 201a recognizes the medical device by referring to the local feature DB 410, which is generated and stored for each medical device that is a medical article.
- in step S413, the status of the medical device is determined with reference to the medical device DB 420, which stores the normal status of each medical device.
- in step S415, the status determination result is transmitted from the hospital computer 201a to the communication terminal or the center PC.
- in step S417, the communication terminal notifies the user of the received determination result, and in step S419, the center PC does likewise.
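The generate-encode-transmit portion of this sequence (steps S405 to S409) can be sketched as follows. The byte-level wire format shown here is a made-up example, since the concrete encoding used by the terminal is not specified at this point in the description: a small header, then per feature two uint16 coordinates followed by one byte per (quantized) feature vector component.

```python
import struct

def encode_local_features(coords, descriptors):
    """Pack feature point coordinates together with quantized feature
    vectors into a compact byte stream for transmission.  Hypothetical
    wire format: uint16 feature count, uint16 dimension number, then
    per feature (x, y) as uint16 pairs plus one byte per component."""
    dims = len(descriptors[0]) if descriptors else 0
    out = bytearray(struct.pack("<HH", len(coords), dims))
    for (x, y), vec in zip(coords, descriptors):
        out += struct.pack("<HH", x, y)
        # Quantize each component to one byte (clamped to 0..255)
        out += bytes(min(255, max(0, int(v))) for v in vec)
    return bytes(out)

def decode_local_features(blob):
    """Inverse of encode_local_features, as run on the receiving side."""
    n, dims = struct.unpack_from("<HH", blob, 0)
    off, coords, descs = 4, [], []
    for _ in range(n):
        x, y = struct.unpack_from("<HH", blob, off)
        off += 4
        coords.append((x, y))
        descs.append(list(blob[off:off + dims]))
        off += dims
    return coords, descs
```

Transmitting the coordinates alongside the vectors is what later lets the receiver reason about the arrangement of recognized articles, not just their identity.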
- FIG. 5 is a sequence diagram showing an operation procedure in the operating room of the information processing system 200 according to the present embodiment.
- in step S500, an application and/or data is downloaded from the hospital computer 201a to the communication terminal 231 or the center PC.
- in step S501, the application is activated and initialized to perform the processing of this embodiment.
- in step S503, the communication terminal images the operating room.
- in step S505, a local feature amount is generated from the operating room image.
- in step S507, the local feature amount is encoded together with the feature point coordinates.
- in step S509, the encoded local feature amount is transmitted from the communication terminal to the hospital computer 201a.
- in step S511, the hospital computer 201a recognizes the medical device by referring to the local feature DB 410, which is generated and stored for each medical device that is a medical article.
- in step S513, the status of the medical device is determined with reference to the medical device DB 420, which stores the normal status of each medical device.
- in step S515, the status determination result is transmitted from the hospital computer 201a to the communication terminal or the center PC.
- in step S517, the communication terminal notifies the user of the received determination result, and in step S519, the center PC does likewise.
- in step S521, the communication terminal photographs the surgical instrument tray.
- in step S523, local feature amounts are generated from the surgical instrument tray image.
- in step S525, the local feature amounts are encoded together with the feature point coordinates.
- in step S527, the encoded local feature amounts are transmitted from the communication terminal to the hospital computer 201a.
- the hospital computer 201a recognizes the surgical instruments by referring to the local feature DB 410, which is generated and stored for each surgical instrument that is a medical article.
- the surgical instrument DB 530, which stores the normal status of each surgical instrument, is referenced to determine the status of the surgical instruments, such as an error or a defect.
- the status determination result is transmitted from the hospital computer 201a to the communication terminal or the center PC.
- in step S535, the communication terminal notifies the user of the received determination result, and in step S537, the center PC does likewise.
- FIG. 6 is a sequence diagram illustrating an operation procedure in the pharmacy of the information processing system 200 according to the present embodiment.
- in step S600, an application and/or data is downloaded from the pharmacy computer 202a to the communication terminal 251 or the operator PC.
- in step S601, the application is activated and initialized to perform the processing of this embodiment.
- in step S603, the communication terminal photographs the medicine tray.
- in step S605, a local feature amount is generated from the image of the medicine tray.
- in step S607, the local feature amount is encoded together with the feature point coordinates. In step S609, the encoded local feature amount is transmitted from the communication terminal to the pharmacy computer 202a.
- in step S611, the pharmacy computer 202a recognizes the medicine by referring to the local feature DB 610, which is generated and stored for each medicine that is a medical article.
- in step S613, the status of the medicine is determined with reference to the prescription DB 620, which stores the medicines and the number of each medicine.
- in step S615, the status determination result is transmitted from the pharmacy computer 202a to the communication terminal or the operator PC.
- in step S617, the communication terminal notifies the user of the received determination result, and in step S619, the operator PC does likewise.
- in step S621, the communication terminal images the medicine shelf.
- in step S623, a local feature amount is generated from the medicine shelf image.
- in step S625, the local feature amount is encoded together with the feature point coordinates.
- in step S627, the encoded local feature amount is transmitted from the communication terminal to the pharmacy computer 202a.
- in step S629, the pharmacy computer 202a recognizes the medicine shelf and the medicines by referring to the local feature DB 610, which is generated and stored for each medicine that is a medical article.
- in step S631, the placement and number of medicines on the medicine shelf are determined with reference to the inventory management DB 630, which stores the stock of each medicine.
- in step S633, the determination result is transmitted from the pharmacy computer 202a to the communication terminal or the operator PC.
- in step S635, the communication terminal notifies the user of the received determination result, and in step S637, the operator PC does likewise.
- FIG. 7 is a block diagram illustrating a functional configuration of the communication terminals 211, 221, 231, 241, and 251 according to the present embodiment.
- the imaging unit 701 inputs a query image.
- the local feature value generation unit 702 generates a local feature value from the video from the imaging unit 701.
- the generated local feature amount is encoded, together with the feature point coordinates, by the encoding unit 703a in the local feature amount transmission unit 703, and is transmitted via the communication control unit 704 to the hospital computer or pharmacy computer that recognizes the medical article and determines its status from the local feature amount.
- the medical article result receiving unit 705 receives the medical article determination result via the communication control unit 704. Then, the determination result notification unit 706 notifies the user of the received medical article determination result.
- the determination result notification unit 706 includes a display in which the video from the imaging unit 701 and the medical article determination result are superimposed.
- FIG. 8A is a block diagram illustrating a functional configuration of the hospital computer 201a according to the present embodiment.
- the local feature receiving unit 812 decodes the local feature received from the communication terminal via the communication control unit 811 by the decoding unit 812a.
- the medical article recognition unit 813 recognizes the medical article by comparing the received local feature quantity with the local feature quantity in the local feature quantity DB 410 that stores the local feature quantity corresponding to the medical article.
- the determination article selection unit 814 selects different determinations depending on whether the recognized medical article is a medical device or a surgical instrument.
- the medical device status determination unit 815 refers to the medical device DB 420 to determine the status of the medical device.
- the medical device determination result generation unit 816 generates determination result data.
- the surgical instrument status determination unit 817 refers to the surgical instrument DB 530 to determine the arrangement and number of surgical instruments.
- the surgical instrument determination result generation unit 818 generates determination result data.
- the determination result transmission unit 819 transmits the determination result data to the communication terminal and the center PC via the communication control unit 811.
- FIG. 8B is a block diagram showing a functional configuration of the pharmacy computer 202a according to the present embodiment.
- the local feature receiving unit 822 uses the decoding unit 822a to decode the local feature received from the communication terminal via the communication control unit 821.
- the pharmaceutical product recognition unit 823 recognizes the pharmaceutical product (medicine shelf) by comparing the received local feature value with the local feature value of the local feature value DB 610 that stores the local feature value corresponding to the pharmaceutical product (medicine shelf).
- The determination article selection unit 824 selects different determinations depending on whether the recognized medical article is a medicine or a medicine shelf.
- the prescription status determination unit 825 refers to the prescription DB 620 to determine the status of the medicine.
- the prescription determination result generation unit 826 generates determination result data.
- the medicine shelf status determination unit 827 refers to the inventory management DB 630 to determine the status of the arrangement and number of medicines in the medicine shelf.
- the inventory management result generation unit 828 generates determination result data.
- the determination result transmission unit 829 transmits the determination result data to the communication terminal and the operator PC via the communication control unit 821.
- FIG. 9A is a diagram showing a configuration of the local feature DB 410 of the hospital according to the present embodiment. Note that the present invention is not limited to such a configuration.
- The local feature amount DB 410 stores, in association with the medical article ID (medical device ID or surgical instrument ID) 911 and the name/type 912, the first local feature amount 913, the second local feature amount 914, ..., and the m-th local feature amount 915.
- Each local feature amount is stored as a feature vector of first- through 150th-dimensional elements, arranged hierarchically in units of 25 dimensions corresponding to the 5 × 5 sub-regions (see FIG. 11F).
- Here, m is a positive integer and may differ for each medical article ID.
- the feature point coordinates used for the matching process are stored together with the respective local feature amounts.
- FIG. 9B is a diagram showing a configuration of the medical device DB 420 according to the present embodiment. Note that the present invention is not limited to such a configuration.
- The medical device DB 420 stores, in association with the medical device ID 921 and the name/type 922, a maker/type 923, a switch state 924, a meter needle position (display waveform position) 925, a hospital room layout 926, and an operating room layout 927.
- FIG. 9C is a diagram showing a configuration of the surgical instrument DB 530 according to the present embodiment.
- the surgical instrument DB 530 includes a DB 930 that stores information on each surgical instrument, and a DB 940 that stores the arrangement and number of surgical instruments in the tray corresponding to the surgery. Note that the present invention is not limited to such a configuration.
- the DB 930 that stores information on each surgical instrument stores a manufacturer / model 933, a size 934, a shape 935, and a surface state 936 in association with the surgical instrument ID 931 and the name / type 932.
- The DB 940, which stores the arrangement and number in the tray, stores, in association with the surgery type 941, the tray arrangement and number 942 of the first surgical instrument ID, the tray arrangement and number 943 of the second surgical instrument ID, ..., and the tray arrangement and number 944 of the k-th surgical instrument ID.
- FIG. 10A is a diagram showing a configuration of a local feature DB 610 of a pharmacy according to the present embodiment. Note that the present invention is not limited to such a configuration.
- The local feature amount DB 610 stores, in association with the medical article ID (medicine ID or medicine shelf ID) 1011 and the name/type 1012, the first local feature amount 1013, the second local feature amount 1014, ..., and the m-th local feature amount 1015.
- Each local feature amount is stored as a feature vector of first- through 150th-dimensional elements, arranged hierarchically in units of 25 dimensions corresponding to the 5 × 5 sub-regions (see FIG. 11F).
- Here, m is a positive integer and may differ for each medical article ID.
- the feature point coordinates used for the matching process are stored together with the respective local feature amounts.
- FIG. 10B is a diagram showing a configuration of the prescription DB 620 according to the present embodiment. Note that the present invention is not limited to such a configuration.
- the prescription DB 620 stores a prescription 1024 in association with the patient ID 1021, the patient name 1022, and the date 1023.
- Each column of the prescription 1024 stores a medicine ID or a generic ID.
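For illustration, the comparison that the prescription status determination unit 825 performs against this DB can be sketched as follows; the function name, key layout, and return convention are hypothetical, not part of the embodiment:

```python
def check_prescription(prescription_db, patient_id, date, recognized_ids):
    """Compare medicine IDs recognized in the tray with the prescription 1024
    stored for the key (patient ID 1021, date 1023).

    Returns (missing, unexpected): prescribed-but-absent medicine IDs and
    present-but-unprescribed medicine IDs. Both empty means the status is OK.
    """
    prescribed = set(prescription_db[(patient_id, date)])
    found = set(recognized_ids)
    return sorted(prescribed - found), sorted(found - prescribed)
```

A non-empty result would then be reported as determination result data by the prescription determination result generation unit 826.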
- FIG. 10C is a diagram showing a configuration of the inventory management DB 630 according to the present embodiment. Note that the present invention is not limited to such a configuration.
- FIG. 11A is a block diagram illustrating a configuration of a local feature value generation unit 702 according to the present embodiment.
- the local feature quantity generation unit 702 includes a feature point detection unit 1111, a local region acquisition unit 1112, a sub region division unit 1113, a sub region feature vector generation unit 1114, and a dimension selection unit 1115.
- the feature point detection unit 1111 detects a large number of characteristic points (feature points) from the image data, and outputs the coordinate position, scale (size), and angle of each feature point.
- the local region acquisition unit 1112 acquires a local region where feature amount extraction is performed from the coordinate value, scale, and angle of each detected feature point.
- the sub area dividing unit 1113 divides the local area into sub areas.
- the sub-region dividing unit 1113 can divide the local region into 16 blocks (4 ⁇ 4 blocks) or divide the local region into 25 blocks (5 ⁇ 5 blocks).
- the number of divisions is not limited. In the present embodiment, the case where the local area is divided into 25 blocks (5 ⁇ 5 blocks) will be described below as a representative.
- the sub-region feature vector generation unit 1114 generates a feature vector for each sub-region of the local region.
- a gradient direction histogram can be used as the feature vector of the sub-region.
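A minimal sketch of such a sub-region gradient histogram (5 × 5 blocks, six quantized directions) is shown below; the patch size and the use of gradient magnitudes as weights are assumptions, not the embodiment's exact procedure:

```python
import numpy as np

def subregion_gradient_histograms(patch, blocks=5, directions=6):
    """Quantize each pixel's gradient direction and histogram it per sub-region.

    patch: 2-D float array (a square local region around a feature point).
    Returns an array of shape (blocks, blocks, directions), i.e. the
    5 x 5 x 6 = 150-dimensional feature vector described in the text.
    """
    gy, gx = np.gradient(patch.astype(float))
    angle = np.arctan2(gy, gx)                      # -pi .. pi
    bins = (np.floor((angle + np.pi) / (2 * np.pi) * directions)
            .astype(int) % directions)              # quantize to 6 directions
    mag = np.hypot(gx, gy)

    h, w = patch.shape
    hist = np.zeros((blocks, blocks, directions))
    for y in range(h):
        for x in range(w):
            by = min(y * blocks // h, blocks - 1)   # which fifth of the height
            bx = min(x * blocks // w, blocks - 1)
            # sum gradient magnitudes rather than simple frequencies,
            # as the text notes is also possible
            hist[by, bx, bins[y, x]] += mag[y, x]
    return hist
```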
- the dimension selection unit 1115 selects a dimension to be output as a local feature amount (for example, thinning out) so that the correlation between feature vectors of adjacent sub-regions becomes low based on the positional relationship of the sub-regions.
- the dimension selection unit 1115 can not only select a dimension but also determine a selection priority. That is, the dimension selection unit 1115 can select dimensions with priorities so that, for example, dimensions in the same gradient direction are not selected between adjacent sub-regions. Then, the dimension selection unit 1115 outputs a feature vector composed of the selected dimensions as a local feature amount.
- The dimension selection unit 1115 can also output the local feature amount with its dimensions rearranged based on the priority.
- FIGS. 11B to 11F are diagrams showing the processing of the local feature amount generation unit 702 according to the present embodiment.
- FIG. 11B is a diagram showing a series of processing of feature point detection / local region acquisition / sub-region division / feature vector generation in the local feature amount generation unit 702.
- Such a series of processes is described in detail in U.S. Pat. No. 6,711,293 and in David G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints" (USA), International Journal of Computer Vision, 60(2), 2004, pp. 91-110.
- An image 1121 in FIG. 11B illustrates a state in which feature points have been detected from an image in the video by the feature point detection unit 1111 in FIG. 11A. The starting point of each arrow in the feature point data 1121a indicates the coordinate position of the feature point, the length of the arrow indicates the scale (size), and the direction of the arrow indicates the angle.
- For the scale (size) and direction, brightness, saturation, hue, or the like can be selected according to the target image.
- In FIG. 11B, the case of six directions at intervals of 60 degrees is described, but the present invention is not limited to this.
- the local region acquisition unit 1112 in FIG. 11A generates a Gaussian window 1122a around the starting point of the feature point data 1121a, and generates a local region 1122 that substantially includes the Gaussian window 1122a.
- the local region acquisition unit 1112 generates a square local region 1122, but the local region may be circular or have another shape. This local region is acquired for each feature point. If the local area is circular, there is an effect that the robustness is improved with respect to the imaging direction.
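The Gaussian window generation can be sketched as follows; the relation between feature scale and window size, and the sigma parameter, are hypothetical:

```python
import numpy as np

def gaussian_window(scale, sigma_ratio=0.5):
    """Generate a Gaussian weight window 1122a around a feature point.

    The square local region 1122 is then taken just large enough to contain
    this window. `scale` is the feature point scale, used here as the window
    side length (forced odd); sigma_ratio is an assumed parameter.
    """
    size = int(scale) | 1                      # force an odd side length
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    sigma = sigma_ratio * size / 2.0
    return np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
```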
- FIG. 11B also shows a state in which the sub-region dividing unit 1113 divides the scale and angle of each pixel included in the local region 1122 of the feature point data 1121a into sub-regions 1123.
- the gradient direction is not limited to 6 directions, but may be quantized to an arbitrary quantization number such as 4 directions, 8 directions, and 10 directions.
- the sub-region feature vector generation unit 1114 may add up the magnitudes of the gradients instead of adding up the simple frequencies.
- When aggregating the gradient histogram, the sub-region feature vector generation unit 1114 may add weight values not only to the sub-region to which a pixel belongs but also to nearby sub-regions (such as adjacent blocks) according to the distance between the sub-regions. Further, it may add weight values to the gradient directions before and after the quantized gradient direction. Note that the feature vector of the sub-region is not limited to a gradient direction histogram and may be anything having a plurality of dimensions (elements), such as color information. In the present embodiment, a gradient direction histogram is used as the feature vector of the sub-region.
- the dimension selection unit 1115 selects (decimates) a dimension (element) to be output as a local feature amount based on the positional relationship between the sub-regions so that the correlation between feature vectors of adjacent sub-regions becomes low. More specifically, the dimension selection unit 1115 selects dimensions such that at least one gradient direction differs between adjacent sub-regions, for example.
- The dimension selection unit 1115 mainly treats sub-regions adjacent to each other as nearby sub-regions. However, nearby sub-regions are not limited to adjacent ones; for example, a sub-region within a predetermined distance may also be treated as a nearby sub-region.
- FIG. 11C shows an example in which a dimension is selected from a feature vector 1131 of a 150-dimensional gradient histogram generated by dividing a local region into 5 ⁇ 5 block sub-regions and quantizing gradient directions into six directions 1131a.
- FIG. 11C is a diagram showing a state of feature vector dimension number selection processing in the local feature quantity generation unit 702.
- the dimension selection unit 1115 selects a feature vector 1132 of a half 75-dimensional gradient histogram from a feature vector 1131 of a 150-dimensional gradient histogram.
- dimensions can be selected so that dimensions in the same gradient direction are not selected in adjacent left and right and upper and lower sub-region blocks.
- the dimension selection unit 1115 selects the feature vector 1133 of the 50-dimensional gradient histogram from the feature vector 1132 of the 75-dimensional gradient histogram.
- In this case, dimensions can be selected so that only one gradient direction is the same (and the remaining one differs) between sub-region blocks positioned diagonally at 45 degrees.
- Similarly, when the dimension selection unit 1115 selects the feature vector 1134 of the 25-dimensional gradient histogram from the feature vector 1133 of the 50-dimensional gradient histogram, dimensions can be selected so that the selected gradient directions do not match between sub-region blocks positioned diagonally at 45 degrees.
- In this example, the dimension selection unit 1115 selects one gradient direction from each sub-region for the 1st through 25th dimensions, two gradient directions for the 26th through 50th dimensions, and three gradient directions for the 51st through 75th dimensions.
- It is desirable that the gradient directions do not overlap between adjacent sub-region blocks and that all gradient directions be selected uniformly.
- It is also desirable that dimensions be selected uniformly from the entire local region. Note that the dimension selection method illustrated in FIG. 11C is an example, and the selection method is not limited to this.
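One concrete rule that satisfies the adjacency constraint above can be sketched as follows; the checkerboard parity rule is an assumption standing in for the selection table of FIG. 11C:

```python
import numpy as np

def select_half_dimensions(hist):
    """Thin a 5x5x6 gradient histogram (150 dims) down to 75 dims.

    One possible rule consistent with the text (an assumption, not the
    patent's exact table): blocks whose (row + col) parity is even keep
    gradient directions {0, 2, 4}; odd-parity blocks keep {1, 3, 5}.
    Horizontally and vertically adjacent blocks then never keep the same
    gradient direction.
    """
    blocks, _, directions = hist.shape
    out = []
    for i in range(blocks):
        for j in range(blocks):
            start = (i + j) % 2
            out.extend(hist[i, j, start::2])   # every other direction
    return np.asarray(out)
```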
- FIG. 11D is a diagram illustrating an example of the selection order of feature vectors from sub-regions in the local feature value generation unit 702.
- The dimension selection unit 1115 can not only select dimensions but also determine a selection priority so that dimensions contributing more to the feature of the feature point are selected first. That is, for example, the dimension selection unit 1115 can select dimensions with priorities so that dimensions in the same gradient direction are not selected between adjacent sub-region blocks. The dimension selection unit 1115 then outputs a feature vector composed of the selected dimensions as a local feature amount. It can also output the local feature amount with its dimensions rearranged based on the priority.
- For example, the dimension selection unit 1115 may select dimensions for the 1st to 25th, 26th to 50th, and 51st to 75th dimensions by adding sub-region blocks in the order shown in the matrix 1141 in FIG. 11D.
- the dimension selection unit 1115 can select the gradient direction by increasing the priority order of the sub-region blocks close to the center.
- FIG. 11E is a diagram illustrating an example of element numbers of the 150-dimensional feature vector in accordance with the selection order of FIG. 11D.
- the element number of the feature vector is 6 ⁇ p + q.
- The matrix 1161 in FIG. 11F shows the 150 dimensions, ordered according to the selection order in FIG. 11E, arranged hierarchically in units of 25 dimensions.
- That is, the matrix 1161 in FIG. 11F illustrates a configuration example of local feature amounts obtained by selecting the elements illustrated in FIG. 11E according to the priority order shown in the matrix 1141 in FIG. 11D.
- the dimension selection unit 1115 can output dimension elements in the order shown in FIG. 11F. Specifically, for example, when outputting a 150-dimensional local feature amount, the dimension selection unit 1115 can output all 150-dimensional elements in the order shown in FIG. 11F.
- When the dimension selection unit 1115 outputs, for example, a 25-dimensional local feature amount, it can output the elements 1171 in the first row shown in FIG. 11F (the 76th, 45th, 83rd, ..., 120th elements) in the order shown in FIG. 11F (from left to right). When outputting, for example, a 50-dimensional local feature amount, it can additionally output the elements 1172 in the second row shown in FIG. 11F in the same order (from left to right).
- In this way, the local feature amount has a hierarchical structure; for example, the 25-dimensional local feature amount and the first 25 dimensions of the 150-dimensional local feature amount have the same arrangement of elements 1171 to 1176.
- By selecting dimensions hierarchically (progressively) in this way, the dimension selection unit 1115 can extract and output local feature amounts of a size suited to the application, the communication capacity, the terminal specifications, and the like.
- Further, since the dimension selection unit 1115 selects dimensions hierarchically and outputs them sorted based on the priority order, images can be collated using local feature amounts of different numbers of dimensions. For example, when images are collated using a 75-dimensional local feature amount and a 50-dimensional local feature amount, the distance between the local feature amounts can be calculated using only the first 50 dimensions.
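This prefix comparison can be sketched as follows (a minimal illustration, not the embodiment's matching code):

```python
import numpy as np

def prefix_distance(f1, f2):
    """Distance between two hierarchical local features of possibly different
    dimensionality (e.g. 75-dim vs 50-dim): since the first 25/50/75...
    elements are arranged identically across the hierarchy, only the common
    prefix is compared."""
    f1, f2 = np.asarray(f1, float), np.asarray(f2, float)
    n = min(f1.size, f2.size)
    return float(np.linalg.norm(f1[:n] - f2[:n]))
```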
- the priorities shown in the matrix 1141 in FIG. 11D to FIG. 11F are merely examples, and the order of selecting dimensions is not limited to this.
- the order of blocks may be the order shown in the matrix 1142 in FIG. 11D or the matrix 1143 in FIG. 11D in addition to the example of the matrix 1141 in FIG. 11D.
- the priority order may be determined so that dimensions are selected from all the sub-regions.
- Alternatively, regarding the vicinity of the center of the local region as important, the priority order may be determined so that the selection frequency of sub-regions near the center is increased.
- the information indicating the dimension selection order may be defined in the program, for example, or may be stored in a table or the like (selection order storage unit) referred to when the program is executed.
- As another example, the dimension selection unit 1115 may select dimensions block by block: for example, 6 dimensions are selected in a certain sub-region and 0 dimensions in the other sub-regions near it. Even in such a case, it can be said that dimensions are selected for each sub-region so that the correlation between nearby sub-regions becomes low.
- the shape of the local region and sub-region is not limited to a square, and can be any shape.
- the local region acquisition unit 1112 may acquire a circular local region.
- In this case, the sub-region dividing unit 1113 can divide the circular local region concentrically into, for example, 9 or 17 sub-regions.
- the dimension selection unit 1115 can select a dimension in each sub-region.
- As described above, the dimensions of the generated feature vector are selected hierarchically while maintaining the information content of the local feature amount.
- This processing enables real-time medical article recognition and display of the recognition result while maintaining recognition accuracy.
- the configuration and processing of the local feature value generation unit 702 are not limited to this example. Of course, other processes that enable real-time medical article recognition and recognition result display while maintaining recognition accuracy can be applied.
- FIG. 11G is a block diagram showing the encoding unit 703a according to this embodiment. Note that the encoding unit is not limited to this example, and other encoding processes can be applied.
- the encoding unit 703a has a coordinate value scanning unit 1181 that inputs the coordinates of feature points from the feature point detection unit 1111 of the local feature quantity generation unit 702 and scans the coordinate values.
- the coordinate value scanning unit 1181 scans the image according to a specific scanning method, and converts the two-dimensional coordinate values (X coordinate value and Y coordinate value) of the feature points into one-dimensional index values.
- This index value corresponds to the scanning distance from the origin along the scan. There is no restriction on the scanning direction.
- The encoding unit 703a also has a sorting unit 1182 that sorts the index values of the feature points and outputs permutation information after sorting. The sorting unit 1182 sorts, for example, in ascending order; it may instead sort in descending order.
- The encoding unit 703a further has a difference calculation unit 1183 that calculates the difference between adjacent index values in the sorted sequence and outputs a series of difference values, and a differential encoding unit 1184 that encodes the series of difference values in series order.
- The series of difference values may be encoded with a fixed bit length, for example. If the bit length is specified in advance, it must be large enough to express the largest possible difference value, so the encoded size does not become small. Therefore, when encoding with a fixed bit length, the differential encoding unit 1184 can determine the bit length based on the input series of difference values.
- Specifically, the differential encoding unit 1184 can obtain the maximum value from the input series of difference values, obtain the number of bits (number of expression bits) necessary to express that maximum value, and encode the series of difference values with the obtained number of expression bits.
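The steps of the coordinate encoding in FIG. 11G can be sketched as follows; the raster scan, the handling of the first index value, and the bit-string output format are assumptions:

```python
def encode_feature_point_coords(points, width):
    """Sketch of coordinate value scanning, sorting, difference calculation,
    and fixed-bit-length differential encoding.

    points: list of (x, y) feature point coordinates.
    width: image width used for the (assumed) raster scan.
    Returns (permutation, bit_length, encoded_bit_string).
    """
    # 1. coordinate value scanning: 2-D (x, y) -> 1-D raster index
    index = [y * width + x for (x, y) in points]
    # 2. sort ascending and remember the permutation
    order = sorted(range(len(index)), key=lambda i: index[i])
    s = [index[i] for i in order]
    # 3. differences between adjacent sorted indices
    #    (the first index is kept as-is here -- an assumption)
    diffs = [s[0]] + [b - a for a, b in zip(s, s[1:])]
    # 4. fixed bit length determined from the largest difference value
    bits = max(1, max(diffs).bit_length())
    encoded = "".join(format(d, "0{}b".format(bits)) for d in diffs)
    return order, bits, encoded
```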
- The encoding unit 703a also has a local feature amount encoding unit 1185 that encodes the local feature amounts of the corresponding feature points in the same permutation as the sorted feature point index values. The local feature amount encoding unit 1185 can encode the dimension-selected local feature amount of one feature point, selected from its 150-dimensional local feature amount, with, for example, one byte per dimension, that is, with as many bytes as the number of dimensions.
- FIG. 11H, FIG. 11J, and FIG. 11K are diagrams showing processing of the medical article recognition unit 813 and the pharmaceutical recognition unit 823 according to the present embodiment.
- FIG. 11H is a diagram showing processing of the medical article recognition unit 813 in the hospital room of FIG.
- Local feature amounts 1191 to 1193 generated according to the present embodiment in advance from a medical device or an infusion bag shown in FIG. 11H are stored in the local feature amount DB 410.
- a local feature amount is generated according to the present embodiment from the video screen 311 captured by the communication terminal 221 in the left diagram of FIG. 11H. Then, it is verified whether or not the local feature amounts 1191 to 1193 stored in the local feature amount DB 410 are among the local feature amounts generated from the video screen 311.
- The medical article recognition unit 813 associates, as shown by thin lines, each pair of feature points whose local feature amounts match those stored in the local feature amount DB 410. Note that the medical article recognition unit 813 determines that feature points match when a predetermined ratio or more of their local feature amounts match. It then recognizes the target medical article if the positional relationship between the associated sets of feature points is linear. Such recognition is possible even under differences in size, differences in orientation (viewpoint), or inversion. Moreover, since sufficient recognition accuracy is obtained as long as there are a predetermined number or more of associated feature points, a medical article can be recognized even if part of it is hidden from view.
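A minimal sketch of this recognition rule follows; the distance and residual thresholds, and the use of a least-squares affine fit for the linear-relationship check, are assumptions not specified in the text:

```python
import numpy as np

def recognize(db_feats, db_pts, q_feats, q_pts,
              dist_thresh=0.5, ratio_thresh=0.5, resid_thresh=2.0):
    """Recognize a stored article in a query image: require that at least a
    predetermined ratio of the stored feature points match, and that the
    matched point pairs are related by a linear (here: affine) mapping.
    All thresholds are hypothetical."""
    matches = []
    for i, f in enumerate(db_feats):
        d = np.linalg.norm(q_feats - f, axis=1)   # nearest-descriptor match
        j = int(np.argmin(d))
        if d[j] < dist_thresh:
            matches.append((i, j))
    if len(matches) < ratio_thresh * len(db_feats) or len(matches) < 3:
        return False
    # linear-relationship check: fit dst = A @ [src, 1] by least squares
    src = np.array([db_pts[i] for i, _ in matches], float)
    dst = np.array([q_pts[j] for _, j in matches], float)
    A = np.hstack([src, np.ones((len(src), 1))])
    sol, *_ = np.linalg.lstsq(A, dst, rcond=None)
    resid = np.linalg.norm(A @ sol - dst, axis=1)
    return bool(np.all(resid < resid_thresh))
```

Because only the linear consistency of matched point positions is checked, recognition tolerates scaling, rotation, and translation of the article in the query image.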
- FIG. 11J is a diagram showing processing of the medical article recognition unit 813 for the medical article (surgical instrument) in the operating room of FIG.
- Local feature amounts 1194 to 1196 generated in advance according to the present embodiment from surgical instruments such as a scalpel, forceps, and tweezers shown in FIG. 11J are stored in the local feature amount DB 410.
- local feature amounts are generated from the video screen 321 captured by the communication terminal 231 in the left diagram of FIG. 11J according to the present embodiment. Then, it is checked whether or not the local feature amounts 1194 to 1196 stored in the local feature amount DB 410 are included in the local feature amounts generated from the video screen 321.
- The medical article recognition unit 813 associates, as shown by thin lines, each pair of feature points whose local feature amounts match those stored in the local feature amount DB 410. Note that the medical article recognition unit 813 determines that feature points match when a predetermined ratio or more of their local feature amounts match. It then recognizes the target medical article if the positional relationship between the associated sets of feature points is linear. Such recognition is possible even under differences in size, differences in orientation (viewpoint), or inversion. Moreover, since sufficient recognition accuracy is obtained as long as there are a predetermined number or more of associated feature points, a medical article can be recognized even if part of it is hidden from view.
- In this way, surgical instruments of different orientations in the surgical instrument tray that match the local feature amounts 1194 to 1196 of the three medical articles in the local feature amount DB 410 are recognized with a precision corresponding to the accuracy of the local feature amounts.
- In FIG. 11J, to avoid clutter, only one instance of each surgical instrument in the tray is shown associated, but the process for recognizing the other identical surgical instruments is the same.
- FIG. 11K is a diagram showing processing of the drug recognition unit 823 for the drug in the drug tray of the pharmacy in FIG.
- Local feature amounts 1197 to 1199 generated according to the present embodiment in advance from each medicine shown in FIG. 11K are stored in the local feature amount DB 610.
- a local feature amount is generated according to the present embodiment from the video screen 331 captured by the communication terminal 251 in the left diagram of FIG. 11K. Then, it is verified whether or not the local feature values 1197 to 1199 stored in the local feature value DB 610 are in the local feature values generated from the video screen 331.
- The drug recognition unit 823 associates, as shown by thin lines, each pair of feature points whose local feature amounts match those stored in the local feature amount DB 610.
- Note that the drug recognition unit 823 determines that feature points match when a predetermined ratio or more of their local feature amounts match. It then recognizes the target medical article if the positional relationship between the associated sets of feature points is linear. Such recognition is possible even under differences in size, differences in orientation (viewpoint), or inversion. Moreover, since sufficient recognition accuracy is obtained as long as there are a predetermined number or more of associated feature points, a medical article can be recognized even if part of it is hidden from view.
- In the present embodiment, collation is performed based on the feature point coordinates and the local feature amounts, but recognition is also possible based only on the linear relationship of the arrangement order between the local feature amounts generated from the matching medical article and the local feature amounts generated from the image in the video.
- Although the present embodiment has been described using two-dimensional images, the same processing can be performed using three-dimensional feature point coordinates.
- FIG. 12A is a block diagram showing a hardware configuration of the communication terminals 211 to 261 according to the present embodiment.
- a CPU 1210 is a processor for arithmetic control, and implements each functional component of the communication terminals 211 to 261 by executing a program.
- The ROM 1220 stores fixed data and programs, such as initial data and initial programs.
- The communication control unit 704 communicates, in the present embodiment, with the hospital computer 201a and the pharmacy computer 202a via the network. Note that the number of CPUs 1210 is not limited to one; there may be a plurality of CPUs, and a GPU (Graphics Processing Unit) for image processing may be included.
- the RAM 1240 is a random access memory that the CPU 1210 uses as a work area for temporary storage.
- the RAM 1240 has an area for storing data necessary for realizing the present embodiment.
- An input video 1241 indicates an input video imaged and input by the imaging unit 701.
- the feature point data 1242 indicates feature point data including the feature point coordinates, scale, and angle detected from the input video 1241.
- The local feature amount generation table 1243 holds data until a local feature amount is generated (see FIG. 12B).
- the local feature quantity 1244 is generated using the local feature quantity generation table 1243, and indicates a local feature quantity to be sent to a transmission destination that recognizes and determines a medical article via the communication control unit 704.
- the medical article determination result 1245 indicates the medical article determination result returned from the transmission destination via the communication control unit 704.
- the display screen data 1246 indicates display screen data for notifying the user of information including the medical article determination result 1245. In the case of outputting audio, audio data may be included.
- Input / output / transmission / reception data 1247 indicates input / output data input / output via the input / output interface 1260 and transmission / reception data transmitted / received via the communication control unit 704.
- the storage 1250 stores a database, various parameters, or the following data or programs necessary for realizing the present embodiment.
- the storage 1250 stores the following programs.
- the communication terminal control program 1251 is a communication terminal control program that controls the entire communication terminals 211 to 261.
- the communication terminal control program 1251 includes the following modules.
- the local feature amount generation module 1252 is a module that generates a local feature amount from an input video according to FIGS. 11B to 11F in the communication terminal control program 1251.
- the encoding module 1258 is a module that encodes the local feature generated by the local feature generating module 1252 for transmission.
- the medical article determination result notification module 1259 is a module for receiving a medical article determination result and notifying the user by display or voice.
- the input / output interface 1260 interfaces input / output data with input / output devices.
- the input / output interface 1260 is connected to a display unit 1261, a touch panel or keyboard as the operation unit 1262, a speaker 1263, a microphone 1264, and an imaging unit 701.
- the input / output device is not limited to the above example.
- a GPS (Global Positioning System) position generation unit 1265 is mounted to acquire a current position based on a signal from a GPS satellite.
- In FIG. 12A, only data and programs essential to the present embodiment are shown; data and programs not related to the present embodiment are omitted.
- FIG. 12B is a diagram showing a local feature value generation table 1243 in the communication terminals 211 to 261 according to the present embodiment.
- a plurality of detected feature points 1202, feature point coordinates 1203, and local region information 1204 corresponding to the feature points are stored in association with the input image ID 1201.
- the local feature 1209 is generated from the above data for each detected feature point 1202.
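As one illustrative way of holding such a table in memory, the rows could be modeled as below. All field names are assumptions for illustration; the patent only specifies the logical columns (input image ID 1201, feature points 1202, coordinates 1203, local region information 1204, local feature 1209):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FeaturePointEntry:
    feature_point_id: int                 # detected feature point 1202
    coords: Tuple[float, float]           # feature point coordinates 1203
    scale: float                          # local region information 1204 (scale)
    angle: float                          # local region information 1204 (angle)
    local_feature: List[float] = field(default_factory=list)  # local feature 1209

@dataclass
class LocalFeatureTable:
    input_image_id: str                   # input image ID 1201
    entries: List[FeaturePointEntry] = field(default_factory=list)

# one table per input image, one entry per detected feature point
table = LocalFeatureTable("img-001")
table.entries.append(FeaturePointEntry(0, (12.5, 40.0), 2.0, 0.3, [0.1] * 75))
```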
- FIG. 13 is a flowchart showing a processing procedure of the communication terminals 211 to 261 according to the present embodiment. This flowchart is executed by the CPU 1210 of FIG. 12A using the RAM 1240, thereby realizing the functional components of the communication terminal.
- In step S1311, it is determined whether or not there is a video input for recognizing a medical article.
- In step S1321, it is determined whether data has been received. If neither applies, other processing is performed in step S1331. Note that description of normal transmission processing is omitted.
- If there is a video input, the process proceeds to step S1313, where local feature generation processing is executed on the input video (see FIG. 14A).
- In step S1315, the local feature quantities and feature point coordinates are encoded (see FIGS. 14B and 14C).
- In step S1317, the encoded data is transmitted to the hospital computer 201a or the pharmacy computer 202a.
- In step S1323, it is determined whether or not a medical article determination result has been received from the hospital computer 201a or the pharmacy computer 202a. If a medical article determination result has been received, the process proceeds to step S1325 to notify the user of the result.
- FIG. 14A is a flowchart illustrating a processing procedure of local feature generation processing S1313 according to the present embodiment.
- In step S1411, the position coordinates, scale, and angle of the feature points are detected from the input video.
- In step S1413, a local region is acquired for one of the feature points detected in step S1411.
- In step S1415, the local region is divided into sub-regions.
- In step S1417, a feature vector for each sub-region is generated to produce the feature vector of the local region. The processing of steps S1411 to S1417 is illustrated in FIG. 11B.
- In step S1419, dimension selection is performed on the feature vector of the local region generated in step S1417.
- the dimension selection is illustrated in FIGS. 11D to 11F.
- In step S1421, it is determined whether the generation of local features and dimension selection have been completed for all feature points detected in step S1411. If not completed, the process returns to step S1413 to repeat the processing for the next feature point.
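The per-feature-point steps above (sub-region division, per-sub-region gradient-direction histograms, dimension selection) can be sketched as follows. This is a simplified stand-in, not the patent's exact algorithm: the patch is assumed to be already extracted and oriented, a 5 × 5 sub-region grid with 6 direction bins is used, and dimension selection is approximated by keeping every other direction bin (150 → 75 dimensions):

```python
import numpy as np

def gradient_histograms(patch, grid=5, bins=6):
    """Divide a square patch (local region) into grid x grid sub-regions and
    build a gradient-direction histogram with `bins` directions for each."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)              # direction in [0, 2*pi)
    bin_idx = np.minimum((ang / (2 * np.pi) * bins).astype(int), bins - 1)
    h = np.zeros((grid, grid, bins))
    step = patch.shape[0] // grid
    for i in range(grid):
        for j in range(grid):
            sl = (slice(i * step, (i + 1) * step), slice(j * step, (j + 1) * step))
            for b in range(bins):
                # magnitude-weighted vote of this sub-region into direction bin b
                h[i, j, b] = mag[sl][bin_idx[sl] == b].sum()
    return h                                                  # 5 x 5 x 6 = 150 dims

def select_dimensions(hist, keep_bins=3):
    """Simplified dimension selection: keep every other direction bin,
    e.g. 6 -> 3 bins, reducing 150 dims to 75 dims."""
    step = hist.shape[2] // keep_bins
    return hist[:, :, ::step].reshape(-1)
```

The real dimension selection (FIGS. 11D to 11F) follows a specific priority order over sub-regions and directions; taking every other bin is only the simplest possible approximation of it.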
- FIG. 14B is a flowchart illustrating a processing procedure of the encoding processing S1315 according to the present embodiment.
- In step S1431, the coordinate values of the feature points are scanned in a desired order.
- In step S1433, the scanned coordinate values are sorted.
- In step S1435, difference values between coordinate values are calculated in the sorted order.
- In step S1437, the difference values are encoded (see FIG. 14C).
- In step S1439, the local feature amounts are encoded in the coordinate-value sorting order. The difference value encoding and the local feature amount encoding may be performed in parallel.
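Steps S1431 to S1439 can be sketched in Python as below. The concrete scan order (here, raster order by y then x) and the use of the previous point as the base of each difference are assumptions for illustration; the patent only requires that the transmitter and receiver agree on the ordering:

```python
def encode_feature_set(points, features):
    """points: list of (x, y) feature-point coordinates;
    features: one local feature per point, in the same original order.
    Returns (difference-coded coordinates, features reordered to match)."""
    # steps S1431/S1433: scan and sort the coordinates (raster order assumed)
    order = sorted(range(len(points)), key=lambda k: (points[k][1], points[k][0]))
    sorted_pts = [points[k] for k in order]
    # step S1435: difference values between consecutive coordinates
    diffs, prev = [], (0, 0)
    for x, y in sorted_pts:
        diffs.append((x - prev[0], y - prev[1]))
        prev = (x, y)
    # step S1439: local features follow the same sorting order
    sorted_feats = [features[k] for k in order]
    return diffs, sorted_feats
```

Because consecutive sorted coordinates tend to be close together, the difference values are small, which is what makes the escape-code scheme of FIG. 14C effective.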
- FIG. 14C is a flowchart illustrating a processing procedure of difference value encoding processing S1437 according to the present embodiment.
- In step S1441, it is determined whether or not the difference value is within the range that can be encoded. If it is within the encodable range, the process proceeds to step S1447 to encode the difference value, and then proceeds to step S1449. If it is outside the encodable range, the process proceeds to step S1443 to encode an escape code.
- In step S1445, the difference value is encoded by an encoding method different from that of step S1447, and the process then proceeds to step S1449.
- In step S1449, it is determined whether the processed difference value is the last element in the series of difference values. If it is the last, the process ends. If it is not the last, the process returns to step S1441 and the next difference value in the series is processed.
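The escape-code scheme of FIG. 14C can be sketched as follows. The one-byte range [-127, 127], the escape value 0x80, and the two-byte fallback are illustrative assumptions; the patent leaves the concrete ranges and codes open:

```python
ESCAPE = 0x80  # assumed escape code; 0x80 is excluded from the one-byte range below

def encode_diff(d):
    """Encode one difference value (step S1437): a single signed byte when the
    value fits in [-127, 127], otherwise an escape code followed by a 2-byte
    encoding (the 'different encoding method' of step S1445)."""
    out = bytearray()
    if -127 <= d <= 127:
        out.append(d & 0xFF)                           # step S1447: in-range encoding
    else:
        out.append(ESCAPE)                             # step S1443: escape code
        out += int(d).to_bytes(2, "big", signed=True)  # step S1445: wider encoding
    return bytes(out)
```

Reserving one bit pattern (here 0x80, i.e. -128) as the escape keeps the common small differences at one byte each while still admitting arbitrary values.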
- FIG. 15 is a block diagram illustrating a hardware configuration of the hospital computer 201a according to the present embodiment.
- a CPU 1510 is a processor for arithmetic control, and implements each functional component of the hospital computer 201a by executing a program.
- the ROM 1520 stores fixed data and programs such as initial data and programs.
- the communication control unit 811 communicates, in this embodiment, with a communication terminal or the pharmacy computer 202a via the network. Note that the number of CPUs 1510 is not limited to one; there may be a plurality of CPUs, and a GPU for image processing may be included.
- the RAM 1540 is a random access memory that the CPU 1510 uses as a work area for temporary storage.
- the RAM 1540 has an area for storing data necessary for realizing the present embodiment.
- the received local feature value 1541 indicates a local feature value including the feature point coordinates received from the communication terminal.
- the read local feature value 1542 indicates the local feature value when including the feature point coordinates read from the local feature value DB 410.
- the medical article recognition result 1543 indicates the medical article recognized by collating the received local feature quantity against the local feature quantities stored in the local feature quantity DB 410.
- the medical article arrangement determination result 1544 indicates the result of determining the arrangement of a medical device or surgical instrument.
- the recognized number of medical articles 1545 indicates the number of recognized medical articles, such as the number of recognized surgical instruments.
- Transmission / reception data 1547 indicates transmission / reception data transmitted / received via the communication control unit 811.
- the storage 1550 stores a database, various parameters, or the following data or programs necessary for realizing the present embodiment.
- the local feature DB 410 is a local feature DB similar to that shown in FIG. 9A.
- the medical device DB 420 is the same medical device DB as shown in FIG. 9B.
- the surgical instrument DB 530 is the same surgical instrument DB as that shown in FIG. 9C.
- the storage 1550 stores the following programs.
- the hospital computer control program 1551 controls the hospital computer as a whole.
- the local feature DB creation module 1552 is a module that generates a local feature from an image of a medical article and stores it in the local feature DB in the hospital computer control program 1551.
- the medical article recognition module 1553 is a module that recognizes a medical article by comparing the received local feature quantity with the local feature quantity stored in the local feature quantity DB 410 in the hospital computer control program 1551.
- the medical device arrangement / status determination module 1554 is a module that determines the arrangement and status based on the medical device recognized from the local feature amount in the hospital computer control program 1551.
- the surgical instrument arrangement / status determination module 1555 is a module that determines the arrangement and status based on the surgical instrument recognized from the local feature amount in the hospital computer control program 1551.
- the determination result transmission module 1556 is a module that transmits the determination result to the communication terminal or the center PC in the hospital computer control program 1551.
- FIG. 15 shows only data and programs essential to the present embodiment, and does not illustrate data and programs not related to the present embodiment.
- FIG. 16 is a flowchart showing a processing procedure of the hospital computer 201a according to this embodiment. This flowchart is executed by the CPU 1510 of FIG. 15 using the RAM 1540, and implements each functional component of FIG. 8A.
- In step S1611, it is determined whether or not to generate the local feature DB.
- In step S1621, it is determined whether a local feature amount has been received from a communication terminal. If neither applies, other processing is performed in step S1641.
- If the local feature DB is to be generated, the process advances to step S1613 to execute the local feature DB generation process (see FIG. 17). If a local feature amount has been received, the process proceeds to step S1623 to perform the medical article recognition process (see FIGS. 18A and 18B).
- In step S1625, it is determined whether the recognized medical article is a medical device or a surgical instrument. If it is a medical device, the process proceeds to step S1627, and the arrangement and status of the medical device are determined with reference to the medical device DB 420 (FIG. 9B); in step S1629, the determination result is transmitted. If it is a surgical instrument, the process proceeds to step S1631, and the arrangement, number, and correctness of the surgical instruments are determined with reference to the surgical instrument DB 530 (FIG. 9C); in step S1633, the determination result is transmitted.
- FIG. 17 is a flowchart showing a processing procedure of local feature DB generation processing S1613 according to the present embodiment.
- In step S1701, an image of a medical article is acquired.
- In step S1703, the position coordinates, scale, and angle of the feature points are detected.
- In step S1705, a local region is acquired for one of the feature points detected in step S1703.
- In step S1707, the local region is divided into sub-regions.
- In step S1709, a feature vector for each sub-region is generated to produce the feature vector of the local region. The processing from steps S1705 to S1709 is illustrated in FIG. 11B.
- In step S1711, dimension selection is performed on the feature vector of the local region generated in step S1709.
- the dimension selection is illustrated in FIGS. 11D to 11F.
- Although hierarchization is performed in the dimension selection, it is desirable to store all generated feature vectors.
- In step S1713, it is determined whether the generation of local feature values and dimension selection have been completed for all feature points detected in step S1703. If not completed, the process returns to step S1705 to repeat the processing for the next feature point. If all feature points have been processed, the process advances to step S1715 to register the local feature values and feature point coordinates in the local feature value DB 410 in association with the medical article.
- In step S1717, it is determined whether there is an image of another medical article. If so, the process returns to step S1701, an image of the other medical article is acquired, and the processing is repeated.
- FIG. 18A is a flowchart showing a processing procedure of medical article recognition processing S1623 according to this embodiment.
- In step S1811, the local feature amount of one medical article is acquired from the local feature amount DB 410.
- In step S1813, the local feature amount of the medical article is collated with the local feature amount received from the communication terminal (see FIG. 18B).
- In step S1815, it is determined whether or not they match. If they match, the process proceeds to step S1821, and the matched medical article is stored as being present in the video.
- In step S1817, it is determined whether or not all medical articles registered in the local feature DB 410 have been collated; if any remain, the process returns to step S1811 to collate the next medical article.
- Note that the range of medical articles to be collated may be limited in advance in order to improve the processing speed and reduce the load on the hospital computer.
- FIG. 18B is a flowchart showing a processing procedure of collation processing S1813 according to the present embodiment.
- In step S1833, the smaller of the number of dimensions i of the local feature quantity in the local feature quantity DB 410 and the number of dimensions j of the received local feature quantity is selected.
- In step S1835, data of the selected number of dimensions of the p-th local feature amount of the medical article stored in the local feature amount DB 410 is acquired; that is, the selected number of dimensions is acquired starting from the first dimension.
- In step S1837, the p-th local feature value acquired in step S1835 is sequentially checked against the local feature values of all feature points generated from the input video to determine whether or not they are similar.
- In step S1839, it is determined from the result of collation between the local feature quantities whether or not the similarity exceeds the threshold value α.
- If the similarity exceeds α, in step S1841 the matching combination of the local feature quantity and the positional relationship of the feature points between the input image and the medical article is stored.
- q, a counter of the number of matched feature points, is incremented by one.
- The feature point of the medical article is then advanced to the next feature point (p ← p + 1); if all feature points of the medical article have not yet been collated (p < m), the process returns to step S1835 to repeat the collation of the local feature quantities.
- Note that the threshold value α can be changed according to the recognition accuracy required for the medical article. If the medical article has a low correlation with other medical articles, accurate recognition is possible even if the recognition accuracy is lowered.
- In step S1845, it is determined whether or not the ratio of the number q of feature points whose local feature amounts match those of the input image to the number p of feature points of the medical article exceeds a threshold value β. If it does, the process proceeds to step S1849, and, as a further condition for the medical article candidate, it is determined whether the positional relationship between the feature points of the input image and the feature points of the medical article is one obtainable by a linear transformation.
- In step S1849, it is determined whether the positional relationship between the feature points of the input image and the feature points of the medical article, stored in step S1841 as matching in local feature amount, is one that can result from a change such as rotation, inversion, or a change of viewpoint position, or whether it is a positional relationship that cannot be obtained by such a transformation. Since such determination methods are geometrically known, detailed description thereof is omitted. If it is determined in step S1851 that linear transformation is possible, the process proceeds to step S1853 to determine that the collated medical article exists in the input video. Note that the threshold value β can be changed according to the recognition accuracy required for the medical article. If the medical article has a low correlation with other medical articles, or if its characteristics can be judged even from a part, accurate recognition is possible even with few matching feature points; that is, the medical article can be recognized even when a part of it is hidden, as long as a characteristic part is visible.
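The collation loop of FIG. 18B can be sketched as below. Cosine similarity and the default thresholds are illustrative assumptions (the patent does not fix a similarity measure), and the geometric linear-transformation check of step S1849 is omitted:

```python
import math

def cosine_sim(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def collate(db_feats, recv_feats, alpha=0.9, beta=0.5):
    """db_feats: local features of one medical article (from the local feature DB);
    recv_feats: local features received from the communication terminal.
    Returns True when the article is judged to be present in the video."""
    q = 0                                         # number of matched feature points
    for p_feat in db_feats:
        # use the smaller number of dimensions of the two sides (step S1833)
        dims = min(len(p_feat), len(recv_feats[0]) if recv_feats else 0)
        for r_feat in recv_feats:                 # step S1837: check all input points
            if cosine_sim(p_feat[:dims], r_feat[:dims]) > alpha:  # step S1839
                q += 1
                break
    p = len(db_feats)
    return p > 0 and q / p > beta                 # step S1845 (geometric check omitted)
```

Raising `alpha` and `beta` corresponds to demanding higher recognition accuracy for articles that correlate strongly with others; lowering them speeds up acceptance of weakly correlated articles, as the text describes.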
- the process of collating all medical articles by storing all medical articles in the local feature DB 410 is very heavy. Therefore, for example, it is conceivable that the user selects the range of the medical article from the menu before the medical article is recognized from the input video, and searches and collates the range from the local feature amount DB 410. Also, the load can be reduced by storing only the local feature amount in the range used by the user in the local feature amount DB 410.
- FIG. 19 is a block diagram showing a hardware configuration of the pharmacy computer 202a according to the present embodiment.
- a CPU 1910 is a processor for arithmetic control, and implements each functional component of the pharmacy computer 202a by executing a program.
- the ROM 1920 stores fixed data and programs such as initial data and programs.
- the communication control unit 821 communicates, in this embodiment, with a communication terminal or the hospital computer 201a via the network. Note that the number of CPUs 1910 is not limited to one; there may be a plurality of CPUs, and a GPU for image processing may be included.
- the RAM 1940 is a random access memory that the CPU 1910 uses as a work area for temporary storage.
- the RAM 1940 has an area for storing data necessary for realizing the present embodiment.
- the received local feature value 1941 indicates a local feature value including the feature point coordinates received from the communication terminal.
- the read local feature value 1942 indicates a local feature value including the feature point coordinates read from the local feature value DB 610.
- the medicine recognition result 1943 indicates the medicine recognition result recognized from the collation between the received local feature quantity and the local feature quantity stored in the local feature quantity DB 610.
- the medicine placement determination result 1944 indicates the result of determining the placement of medicines.
- the number of recognized medicines 1945 indicates the number of recognized medicines.
- the drug number determination result 1946 indicates the result of determining whether or not the number of medicines 1945 matches the number described in the prescription.
- Transmission / reception data 1947 indicates transmission / reception data transmitted / received via the communication control unit 821.
- the storage 1950 stores a database, various parameters, or the following data or programs necessary for realizing the present embodiment.
- the local feature DB 610 is a local feature DB similar to that shown in FIG. 10A.
- the prescription DB 620 is the same prescription DB as that shown in FIG. 10B.
- the inventory management DB 630 is an inventory management DB similar to that shown in FIG. 10C.
- the storage 1950 stores the following programs.
- the pharmacy computer control program 1951 controls the pharmacy computer as a whole.
- the local feature DB creation module 1952 is a module that generates a local feature from a drug image and stores it in the local feature DB 610 in the pharmacy computer control program 1951.
- the drug recognition module 1953 is a module for recognizing a drug by comparing the received local feature quantity with the local feature quantity stored in the local feature quantity DB 610 in the pharmacy computer control program 1951.
- the drug correctness / number determination module 1954 is a module that determines the correctness and number of drugs based on the drugs recognized from the local feature amounts in the pharmacy computer control program 1951.
- the medicine placement / inventory determination module 1955 is a module that performs placement determination and inventory management on the medicine shelf based on the medicine recognized from the local feature amount in the pharmacy computer control program 1951.
- the determination result transmission module 1956 is a module that transmits the determination result to the communication terminal or the operator PC in the pharmacy computer control program 1951.
- FIG. 19 shows only data and programs essential to the present embodiment, and data and programs not related to the present embodiment are not shown.
- FIG. 20 is a flowchart showing a processing procedure of the pharmacy computer 202a according to the present embodiment. This flowchart is executed by the CPU 1910 of FIG. 19 using the RAM 1940, and implements each functional component of FIG. 8B.
- In step S2011, it is determined whether or not to generate the local feature DB.
- In step S2021, it is determined whether a local feature amount has been received from a communication terminal. If neither applies, other processing is performed in step S2041.
- If the local feature DB is to be generated, the process proceeds to step S2013 to execute the local feature DB generation process. If a local feature amount has been received, the process advances to step S2023 to perform the drug recognition process.
- In step S2025, it is determined whether the recognized medicine relates to processing based on a prescription or to inventory processing. If it relates to a prescription, the process proceeds to step S2027, and the correctness of the medicine is determined with reference to the prescription DB 620 (FIG. 10B); in step S2029, the determination result is transmitted. If it relates to inventory, the process proceeds to step S2031, and the placement and number of medicines on the medicine shelf are determined with reference to the inventory management DB 630 (FIG. 10C); in step S2033, the determination result is transmitted.
- the information processing system according to the present embodiment differs from the second embodiment in that recognition processing and determination processing are performed with different accuracy by adjusting the accuracy of the local feature amount. Since other configurations and operations are the same as those of the second embodiment, the same configurations and operations are denoted by the same reference numerals, and detailed description thereof is omitted.
- FIG. 21 is a sequence diagram illustrating an operation procedure of the information processing system according to the present embodiment.
- the recognition and determination processing of surgical instruments in a surgical instrument tray in a hospital operating room is described as a representative example, but the same procedure can be realized for other processes, such as a medicine tray.
- In step S2100, an application and/or data is downloaded from the hospital computer 201a to the communication terminal or the center PC.
- the application is then started and initialized to perform the processing of this embodiment.
- In step S2103, the communication terminal images the surgical instrument tray.
- In step S2105, the initial accuracy of local feature generation is set.
- In step S2107, a local feature amount with the initial accuracy is generated from the surgical instrument tray image.
- In step S2109, the local feature amount is encoded together with the feature point coordinates. The encoded local feature amount is transmitted from the communication terminal to the hospital computer 201a in step S2111.
- the hospital computer 201a recognizes the surgical instruments by referring to the local feature DB 410 generated and stored for each surgical instrument, which is a medical article.
- In step S2117, the position (region in the image) and the accuracy to be adjusted are set in correspondence with the surgical instruments that require detailed checks for defects.
- In step S2119, the set surgical instrument position and accuracy parameter are transmitted to the communication terminal.
- In step S2121, the communication terminal adjusts the accuracy according to the received accuracy parameter (sets the accuracy parameter).
- In step S2123, the local feature amount of the surgical instrument at the specified position (region) is generated with high accuracy.
- In step S2125, the local feature amount is encoded together with the feature point coordinates. The encoded local feature amount is transmitted from the communication terminal to the hospital computer 201a in step S2127.
- In step S2129, the hospital computer 201a refers to the local feature DB 410 and the surgical instrument DB 530 for the specified surgical instrument and performs a detailed correctness determination of the surgical instrument.
- In step S2131, the determination result of the placement and number of surgical instruments from step S2115 and the per-instrument defect check result from step S2129 are transmitted to the communication terminal and the center PC.
- the communication terminal notifies the user of the determination result received in step S2133, and the center PC notifies the user of the determination result received in step S2135.
- FIG. 22 is a block diagram showing a functional configuration of a communication terminal according to the third embodiment of the present invention.
- the same reference numerals are attached to functional components similar to those of the second embodiment, and redundant description is omitted.
- the accuracy / video region receiving unit 2207 receives the region (position) in the video that generates the accuracy parameter to be adjusted and the local feature amount transmitted from the hospital computer 201a via the communication control unit 704.
- the accuracy adjustment unit 2208 holds the accuracy parameter 2208a for accuracy adjustment, and adjusts the accuracy of the local feature amount generated by the local feature amount generation unit 702 based on the accuracy parameter 2208a.
- the image area selection unit 2209 selects an area where a surgical instrument as a target in an image for generating a local feature amount is arranged.
- FIG. 23A is a block diagram showing a first configuration 2208-1 of the accuracy adjustment unit 2208 according to the present embodiment.
- In the first configuration, the number of dimensions is determined by the dimension number determination unit 2311.
- the dimension number determination unit 2311 determines the number of dimensions selected by the dimension selection unit 1115. For example, the dimension number determination unit 2311 can determine the number of dimensions by receiving information indicating the number of dimensions from the user. Note that the information indicating the number of dimensions need not indicate the number of dimensions itself, and may be, for example, information indicating collation accuracy or collation speed. Specifically, when receiving an input requesting that the local feature generation accuracy, communication accuracy, and collation accuracy be increased, the dimension number determination unit 2311 determines the number of dimensions so that it increases. Conversely, when receiving an input requesting that the local feature generation speed, communication speed, and collation speed be increased, it determines the number of dimensions so that it decreases.
- the dimension number determination unit 2311 may determine the same number of dimensions for all feature points detected from the image, or may determine a different number of dimensions for each feature point. For example, when the importance of feature points is given by external information, the dimension number determination unit 2311 may increase the number of dimensions for feature points with high importance and decrease it for feature points with low importance. In this way, the number of dimensions can be determined in consideration of the collation accuracy, the local feature generation speed, the communication speed, and the collation speed.
- FIG. 23B is a block diagram showing a second configuration 2208-2 of the accuracy adjustment unit 2208 according to the present embodiment.
- In the second configuration, the number of dimensions is changed by the feature vector expansion unit 2312 combining the values of a plurality of dimensions.
- the feature vector expansion unit 2312 can expand the feature vector by generating dimensions on a larger scale (expanded divided regions) using the feature vectors output from the sub-region feature vector generation unit 1114. Note that the feature vector expansion unit 2312 can expand the feature vector using only the feature vector information output from the sub-region feature vector generation unit 1114. Since it is therefore unnecessary to return to the original image and perform feature extraction again, the processing time for expanding the feature vector is very small compared to the processing time for generating the feature vector from the original image. For example, the feature vector expansion unit 2312 may generate a new gradient direction histogram by combining the gradient direction histograms of adjacent sub-regions.
- FIG. 23C is a diagram for explaining processing by the second configuration 2208-2 of the accuracy adjustment unit 2208 according to the present embodiment.
- the feature vector expansion unit 2312 can, for example, expand the 5 × 5 × 6 dimensional (150-dimensional) gradient direction histogram 2331 to generate a 4 × 4 × 6 dimensional (96-dimensional) gradient direction histogram 2341. That is, four blocks 2331a surrounded by a thick solid line are combined into one block 2341a, and four blocks 2331b indicated by thick broken lines are combined into one block 2341b.
- similarly, the feature vector expansion unit 2312 can generate a 3 × 3 × 6 dimensional (54-dimensional) gradient direction histogram 2351 by taking the sums of the gradient direction histograms of adjacent blocks of the 4 × 4 × 6 dimensional (96-dimensional) gradient direction histogram 2341. That is, four blocks 2341c indicated by a thick solid line are combined into one block 2351c, and four blocks 2341d indicated by thick broken lines are combined into one block 2351d.
- by dimension selection in the dimension selection unit 1115, the 5 × 5 × 6 dimensional (150-dimensional) gradient direction histogram 2331 becomes a 5 × 5 × 3 dimensional (75-dimensional) gradient direction histogram 2332, the 4 × 4 × 6 dimensional (96-dimensional) gradient direction histogram 2341 becomes a 4 × 4 × 3 dimensional (48-dimensional) gradient direction histogram 2342, and the 3 × 3 × 6 dimensional (54-dimensional) gradient direction histogram 2351 becomes a 3 × 3 × 3 dimensional (27-dimensional) gradient direction histogram 2352.
- FIG. 24 is a block diagram showing a third configuration 2208-3 of the accuracy adjustment unit 2208 according to this embodiment.
- In the third configuration, the feature point selection unit 2411 can change the number of feature points in the feature point selection, thereby changing the data amount of the local feature amount while maintaining its accuracy.
- the feature point selection unit 2411 can hold in advance specified number information indicating the “specified number” of feature points to be selected, for example.
- the designated number information may be information indicating the designated number itself, or information indicating the total size (for example, the number of bytes) of the local feature amounts in the image.
- In the latter case, the feature point selection unit 2411 can calculate the designated number by, for example, dividing the total size by the size of the local feature amount at one feature point. It is also possible to assign importance to all feature points (for example, randomly) and select feature points in descending order of importance; when the designated number of feature points has been selected, information on the selected feature points is output as the selection result.
- alternatively, only feature points whose scales fall within a specific scale region can be selected from the scales of all feature points.
- when the number of feature points thus selected still exceeds the designated number, the feature points can be reduced to the designated number based on importance, and information on the selected feature points can be output as the selection result.
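The selection procedure above can be sketched as follows. The per-point feature size of 75 dimensions and the function shape are assumptions for illustration:

```python
def select_feature_points(points, importances, specified_number=None,
                          total_size=None, per_point_size=75):
    """Select feature points in descending order of importance. The designated
    number is either given directly or derived from the allowed total size of
    the local feature amount (total size / size per feature point)."""
    if specified_number is None:
        specified_number = total_size // per_point_size
    order = sorted(range(len(points)), key=lambda k: importances[k], reverse=True)
    return [points[k] for k in order[:specified_number]]
```

Capping the count this way bounds the transmitted data amount while keeping the per-point accuracy (number of dimensions) unchanged, which is the trade-off this configuration targets.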
- FIG. 25 is a block diagram showing a fourth configuration 2208-4 of the accuracy adjustment unit 2208 according to this embodiment.
- the dimension number determination unit 2311 and the feature point selection unit 2411 cooperate to change the data amount of the local feature amount while maintaining accuracy.
- the feature point selection unit 2411 can select feature points based on the number of feature points determined by the dimension number determination unit 2311. The dimension number determination unit 2311 can, in turn, determine the number of selected dimensions so that the feature size matches the specified feature size, based on the number of feature points selected by the feature point selection unit 2411. The feature point selection unit 2411 selects feature points based on the feature point information output from the feature point detection unit 1111.
- the feature point selection unit 2411 can also output importance information indicating the importance of each selected feature point to the dimension number determination unit 2311, which can then determine the number of selected dimensions for each feature point based on that importance information.
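One way this cooperation could allocate per-point dimension counts under a total feature-size budget is sketched below. The 25-dimension step, the 150-dimension ceiling, and the allocation policy are assumptions for illustration, not the patented procedure; the idea is that more important feature points receive more dimensions.

```python
def allocate_dimensions(importances, target_total_dims, max_dims=150, step=25):
    """Assign a per-feature-point dimension count so the summed size
    approaches a specified total: every point gets a common base, and
    extra dimension steps go to the most important points first."""
    n = len(importances)
    base = min(max_dims, (target_total_dims // n) // step * step)
    dims = [base] * n
    remaining = target_total_dims - base * n
    for i in sorted(range(n), key=lambda i: -importances[i]):
        if remaining >= step and dims[i] + step <= max_dims:
            dims[i] += step
            remaining -= step
    return dims

print(allocate_dimensions([0.9, 0.5, 0.1], target_total_dims=200))
# [75, 75, 50] -- the two most important points get an extra step
```

The summed dimension count never exceeds the target, so the resulting feature size stays within the specified feature size.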
- FIG. 26 is a diagram showing the configuration of the accuracy parameter 2208a according to this embodiment.
- the accuracy parameter 2208a stores, as the feature point parameter 2601, a feature point selection threshold for selecting the number of feature points, feature points, or the like. Further, as the local region parameter 2602, an area (size) corresponding to a Gaussian window, a shape indicating a rectangle, a circle, or the like is stored. In addition, as the sub region parameter 2603, the number of divisions and the shape of the local region are stored. In addition, the number of directions (for example, 8 directions and 6 directions), the number of dimensions, a dimension selection method, and the like are stored as feature vector parameters 2604.
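A container mirroring the four parameter groups of FIG. 26 could look like the sketch below. All field names and default values are illustrative assumptions; only the grouping (feature point, local region, sub-region, feature vector parameters) comes from the text above.

```python
from dataclasses import dataclass

@dataclass
class AccuracyParams:
    """Illustrative container for the accuracy parameter 2208a."""
    # feature point parameters 2601
    num_feature_points: int = 200
    selection_threshold: float = 0.03
    # local region parameters 2602
    region_size: int = 16            # area covered by the Gaussian window
    region_shape: str = "square"     # "square", "circle", ...
    # sub-region parameters 2603
    subregion_divisions: int = 5     # e.g. 5x5 blocks
    subregion_shape: str = "square"
    # feature vector parameters 2604
    num_directions: int = 6          # e.g. 8 or 6 gradient directions
    num_dimensions: int = 150
    dimension_selection: str = "spiral"

high = AccuracyParams(num_feature_points=400, num_directions=8)
low = AccuracyParams(num_feature_points=100, num_dimensions=50)
print(high.num_feature_points, low.num_dimensions)   # 400 50
```

Switching between such parameter sets is one way the accuracy adjustment unit 2208 could trade recognition accuracy against local-feature size.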
- FIG. 27 is a block diagram showing a functional configuration of the hospital computer 2701a according to the present embodiment.
- functional components similar to those in FIG. 8 of the second embodiment are given the same reference numbers, and their description is omitted.
- the accuracy adjustment determination unit 2720 receives the determinations of the medical device status determination unit 815 and the surgical instrument status determination unit 817, refers to the accuracy adjustment DB 2740 (see FIG. 28), and determines the adjusted accuracy with which local features are to be generated again.
- the accuracy / sorting region transmission unit 2721 transmits the region information of the target medical device or surgical instrument and the determined accuracy parameter to the communication terminal via the communication control unit 811.
- FIG. 28 is a diagram showing a configuration of the accuracy adjustment DB 2740 according to the present embodiment.
- the configuration of the accuracy adjustment DB 2740 is not limited to FIG.
- the first adjustment value 2803 and the second adjustment value 2804 for generating the accuracy parameter 2208a of FIG. 26 are stored in association with the medical article ID 2801 and the name / type 2802. Any adjustment value may be used depending on the type of parameter. Since these parameters are related to each other, it is desirable to select appropriate parameters for the medical article to be recognized and determined. Therefore, parameters can be generated and stored in advance according to the target medical article, or learned and held.
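A minimal sketch of such a DB is shown below: each medical article ID maps to a name/type and adjustment values used to regenerate the accuracy parameter. The IDs, field names, and values are illustrative assumptions, not data from the patent.

```python
# Illustrative accuracy adjustment DB keyed by medical article ID,
# mirroring the medical article ID 2801, name/type 2802, and first and
# second adjustment values 2803/2804 of FIG. 28.
accuracy_adjustment_db = {
    "M001": {"name_type": "scalpel", "adj1": {"num_dimensions": 100},
             "adj2": {"num_feature_points": 300}},
    "M002": {"name_type": "forceps", "adj1": {"num_dimensions": 150},
             "adj2": {"num_feature_points": 400}},
}

def adjustments_for(article_id, level=1):
    """Return the first or second adjustment value for a medical article."""
    entry = accuracy_adjustment_db[article_id]
    return entry["adj1"] if level == 1 else entry["adj2"]

print(adjustments_for("M002"))            # {'num_dimensions': 150}
print(adjustments_for("M001", level=2))   # {'num_feature_points': 300}
```

Because the parameters interact, such per-article entries would typically be prepared or learned in advance, as the text notes.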
- compared with the second and third embodiments, the information processing system according to this embodiment differs in that the communication terminal has its own communication-terminal local feature DB and the medical article recognition process is shared between the communication terminal and the hospital computer.
- Other configurations and operations are the same as those in the second embodiment and the third embodiment. Therefore, the same configurations and operations are denoted by the same reference numerals, and detailed description thereof is omitted.
- when medical article recognition by the communication terminal alone is sufficient, there is no need to send local features from the communication terminal to the hospital computer. This reduces both the traffic between the communication terminal and the hospital computer and the load on the hospital computer.
- FIG. 29 is a sequence diagram showing an operation procedure of the information processing system according to the present embodiment.
- recognition and determination of medical articles in a hospital is described as a representative example, but the same procedure can be realized for processing in a pharmacy. Although the center PC is not shown in FIG. 29, it receives and reports the determination result as in the preceding sequence diagrams.
- in step S2900, the hospital computer 201a downloads the application and the communication-terminal local features to the communication terminal.
- the received local features are stored in the communication terminal local feature DB 2910 in association with the medical articles.
- in step S2903, the application is started and initialized to perform the processing of this embodiment.
- in step S2905, the communication terminal captures images and acquires video.
- in step S2907, the initial accuracy for local feature generation is set.
- in step S2909, local features are generated from the acquired video at the initial accuracy.
- in step S2911, the medical articles in the video are recognized with reference to the communication terminal local feature DB 2910.
- in step S2913, it is determined whether the medical article recognition of step S2911 is sufficiently reliable (that is, whether accuracy adjustment is necessary). If reliability is insufficient, the accuracy is adjusted, a local feature is regenerated, and high-accuracy medical article recognition is performed by the hospital computer: the accuracy is adjusted in step S2915, a high-accuracy local feature is generated in step S2917, and it is transmitted to the hospital computer in step S2919.
- in step S2921, the hospital computer recognizes the medical articles in the video with reference to the local feature DB 410, which stores the high-accuracy local features.
- if the medical article recognition in step S2911 is sufficiently reliable, the process proceeds to step S2923, and the determination result is transmitted to the hospital computer.
- in step S2925, the hospital computer refers to the medical device DB 420 and the surgical instrument DB 530 in accordance with the medical article recognition result, and determines the medical article.
- when recognition at the communication terminal is sufficient, the arrangement and number can be determined merely by transmitting the determination result (medical article ID and position) from the communication terminal to the hospital computer.
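The escalation decision in the FIG. 29 flow can be sketched as below. The helper functions, threshold, and return values are assumptions for illustration: the terminal first tries initial-accuracy recognition, and only regenerates a higher-accuracy local feature for the hospital computer when the match is not sufficiently reliable.

```python
def recognize_with_escalation(frame, terminal_match, server_match,
                              threshold=0.8):
    """Recognize on the terminal with initial-accuracy local features;
    escalate to the hospital computer only when confidence is low,
    which keeps traffic and server load down."""
    article, confidence = terminal_match(frame, accuracy="initial")
    if confidence >= threshold:
        return ("terminal", article)      # only the result is transmitted
    # regenerate with adjusted (higher) accuracy and send to the server
    return ("server", server_match(frame, accuracy="high"))

# toy stand-ins for the terminal-side and hospital-side matchers
def terminal_match(frame, accuracy):
    return ("gauze", 0.9) if frame == "clear" else ("unknown", 0.3)

def server_match(frame, accuracy):
    return "scalpel"

print(recognize_with_escalation("clear", terminal_match, server_match))
print(recognize_with_escalation("blurry", terminal_match, server_match))
```

A clear frame is resolved on the terminal; a blurry one is escalated with high-accuracy features.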
- the information processing system according to the present embodiment differs from the second to fourth embodiments in that the communication terminal independently recognizes and determines a medical article. Since other configurations and operations are the same as those of the second to fourth embodiments, the same configurations and operations are denoted by the same reference numerals, and detailed description thereof is omitted.
- FIG. 30 is a block diagram showing a functional configuration of the communication terminals 3011 to 3061 according to the present embodiment.
- functional components similar to those in FIG. 7 of the second embodiment are given the same reference numbers, and their description is omitted.
- the local feature DB 3001 stores the local feature downloaded from the hospital computer or pharmacy computer via the communication control unit 704 by the local feature receiving unit 3002.
- the local feature DB 3001 may accumulate the local feature generated by the local feature generating unit 702 of the communication terminal in association with the recognition result as learning.
- the medical article recognition unit 3003 recognizes the medical article by collating the local feature amount generated by the local feature amount generation unit 702 with the local feature amount of the local feature amount DB 3001.
- the medical article DB 3005 stores medical article information downloaded from the hospital computer or pharmacy computer by the medical article information receiving unit 3006 via the communication control unit 704.
- the medical article information may include the medical device DB 420, the surgical instrument DB 530, the prescription DB 620, and the like.
- the medical article determination unit 3007 refers to the medical article DB 3005 to determine the medical article from the recognition result of the medical article recognition unit 3003. Such determination includes the arrangement and number of medical articles, or errors and defects.
- the determination result is notified from the determination result notification unit 706 and transmitted to the hospital computer and the pharmacy computer by the medical article determination result transmission unit 3008 via the communication control unit 704.
- an accuracy adjustment parameter storage unit 3009 may be provided to adjust the accuracy of the local feature amount generation unit 702 according to the recognition result of the medical article recognition unit 3003.
- the present invention may be applied to a system composed of a plurality of devices, or to a single device. Furthermore, the present invention is applicable to the case where a control program realizing the functions of the embodiments is supplied to a system or apparatus directly or remotely. Accordingly, a control program installed in a computer to realize the functions of the present invention on that computer, a medium storing the control program, and a WWW (World Wide Web) server from which the control program is downloaded are also included in the scope of the present invention.
- (Appendix 1) An information processing system comprising: first local feature storage means for storing, in association with a medical article, m first local features each consisting of a feature vector of 1 to i dimensions, generated for each of m local regions containing the m feature points of an image of the medical article; second local feature generation means for extracting n feature points from an image in the video captured by imaging means and generating n second local features, each consisting of a feature vector of 1 to j dimensions, for the n local regions containing the n feature points; and recognition means for selecting the smaller of the dimension number i of the feature vectors of the first local features and the dimension number j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video when it determines that at least a predetermined ratio of the m first local features, each consisting of a feature vector up to the selected dimension number, corresponds to the n second local features, each consisting of a feature vector up to the selected dimension number.
- (Appendix 2) The information processing system according to appendix 1, further comprising notification means for notifying a recognition result of the recognition means.
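The claimed matching rule, truncating both feature sets to the smaller dimension number and requiring a predetermined ratio of correspondences, can be sketched as follows. The distance threshold, ratio, and nearest-neighbour test are illustrative assumptions; the claim specifies only the dimension selection and the ratio criterion.

```python
import numpy as np

def recognize(first_feats, second_feats, ratio=0.5, dist_thresh=0.3):
    """Return True when at least `ratio` of the stored (first) local
    features has a corresponding captured (second) local feature,
    comparing only up to the smaller dimension number."""
    d = min(first_feats.shape[1], second_feats.shape[1])
    f1, f2 = first_feats[:, :d], second_feats[:, :d]
    matched = 0
    for v in f1:
        dists = np.linalg.norm(f2 - v, axis=1)   # distance to every 2nd feature
        if dists.min() <= dist_thresh:
            matched += 1
    return matched >= ratio * len(f1)

m_first = np.array([[0.1, 0.2, 0.3, 0.4], [0.9, 0.8, 0.7, 0.6]])  # i = 4 dims
n_second = np.array([[0.1, 0.2, 0.3], [0.5, 0.5, 0.5]])           # j = 3 dims
print(recognize(m_first, n_second))   # True: 1 of 2 stored features matches
```

Because the first feature's extra fourth dimension is simply ignored, terminals generating features at different accuracies can still be matched against the same DB.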
- (Appendix 3) The information processing system according to appendix 2, wherein the information processing system includes a communication terminal carried by a user and an information processing apparatus that communicates with the communication terminal; the communication terminal includes the imaging means, the second local feature generation means, and the notification means, and transmits the m second local features from the communication terminal to the information processing apparatus; and the information processing apparatus includes the first local feature storage means and the recognition means, and transmits the recognition result of the recognition means from the information processing apparatus to the communication terminal.
- (Appendix 4) The information processing system according to any one of appendices 1 to 3, wherein the first local feature storage means stores, in association with a plurality of medical articles, the m first local features generated from images of those medical articles, and the recognition means recognizes a plurality of medical articles contained in the image captured by the imaging means and has arrangement determination means for determining, based on the sequence of the n second local features, the arrangement of the plurality of medical articles in the image captured by the imaging means.
- (Appendix 5) The information processing system according to appendix 4, wherein the medical article is a medical device, the image captured by the imaging means is of an examination room, hospital room, or operating room, the arrangement determination means recognizes the arrangement of the medical device in the examination room, hospital room, or operating room, the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and the recognition means further recognizes an error, defect, or state of the medical device based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
- (Appendix 6) The information processing system according to appendix 4, wherein the medical article is a medical instrument, the image captured by the imaging means is of a tray on which medical instruments are arranged, the arrangement determination means recognizes the arrangement of the medical instruments in the tray, the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and the recognition means further recognizes an error, defect, or state of the medical instrument based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
- (Appendix 7) The information processing system according to appendix 4, wherein the medical article is a medicine, the image captured by the imaging means is of a medicine shelf or medicine tray, the arrangement determination means recognizes the arrangement of the medicines in the medicine shelf or medicine tray, the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and the recognition means further recognizes an error, defect, or state of the medicine based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
- (Appendix 8) The information processing system according to appendix 7, further comprising management means for performing inventory based on the arrangement of the plurality of medicines recognized by the arrangement determination means.
- (Appendix 9) The information processing system according to any one of appendices 1 to 8, wherein the first local features and the second local features are generated by dividing a local region containing a feature point extracted from an image into a plurality of sub-regions, and generating a multi-dimensional feature vector consisting of histograms of the gradient directions in the plurality of sub-regions.
- (Appendix 10) The information processing system according to appendix 9, wherein the first local features and the second local features are generated by selecting, from the generated multi-dimensional feature vectors, dimensions having smaller correlation between adjacent sub-regions.
- (Appendix 11) The information processing system according to appendix 9 or 10, wherein the plurality of dimensions of the feature vector are arranged to cycle around the local region every predetermined number of dimensions, so that dimensions can be selected starting from those contributing most to the features of the feature point, in order from the first dimension in accordance with the accuracy required of the local feature.
- (Appendix 12) The information processing system according to appendix 11, wherein, in accordance with the correlation among medical articles, the second local feature generation means generates second local features with a higher number of dimensions for a medical article having a higher correlation with other medical articles.
- (Appendix 13) The information processing system according to appendix 12, wherein, in accordance with the correlation among medical articles, the first local feature storage means stores first local features with a higher number of dimensions for a medical article having a higher correlation with other medical articles.
- (Appendix 14) An information processing method in an information processing system comprising first local feature storage means for storing, in association with a medical article, m first local features each consisting of a feature vector of 1 to i dimensions, generated for each of m local regions containing the m feature points of an image of the medical article, the method comprising: an imaging step; a second local feature generation step of extracting n feature points from an image in the captured video and generating n second local features, each consisting of a feature vector of 1 to j dimensions, for the n local regions containing the n feature points; and a recognition step of selecting the smaller of the dimension number i of the feature vectors of the first local features and the dimension number j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video when it is determined that at least a predetermined ratio of the m first local features, each consisting of a feature vector up to the selected dimension number, corresponds to the n second local features, each consisting of a feature vector up to the selected dimension number.
- (Appendix 15) A communication terminal comprising: second local feature generation means for extracting n feature points from an image in the video captured by imaging means and generating n second local features, each consisting of a feature vector of 1 to j dimensions, for the n local regions containing the n feature points; first transmission means for transmitting the m second local features to an information processing apparatus that recognizes, based on collation of local features, a medical article contained in the captured image; and first reception means for receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
- (Appendix 16) A control method for a communication terminal, comprising: a second local feature generation step of extracting n feature points from an image in the video captured by imaging means and generating n second local features, each consisting of a feature vector of 1 to j dimensions, for the n local regions containing the n feature points; a first transmission step of transmitting the m second local features to an information processing apparatus that recognizes, based on collation of local features, a medical article contained in the captured image; and a first reception step of receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
- (Appendix 17) A control program for causing a computer to execute: the second local feature generation step, the first transmission step, and the first reception step of appendix 16.
- (Appendix 18) An information processing apparatus comprising: first local feature storage means for storing, in association with a medical article, m first local features each consisting of a feature vector of 1 to i dimensions, generated for each of m local regions containing the m feature points of an image of the medical article; second reception means for receiving, from a communication terminal, n second local features, each consisting of a feature vector of 1 to j dimensions, generated for n local regions containing n feature points extracted from an image in the video captured by the communication terminal; recognition means for selecting the smaller of the dimension number i of the feature vectors of the first local features and the dimension number j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video when it determines that at least a predetermined ratio of the m first local features, each consisting of a feature vector up to the selected dimension number, corresponds to the n second local features, each consisting of a feature vector up to the selected dimension number; and second transmission means for transmitting information indicating the recognized medical article to the communication terminal.
- (Appendix 19) A control method for an information processing apparatus comprising first local feature storage means as in appendix 18, the method comprising: a second reception step of receiving the n second local features from the communication terminal; a recognition step of recognizing, by the dimension selection and predetermined-ratio criterion of appendix 18, that the medical article is present in the image in the video; and a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
- (Appendix 20) A control program for an information processing apparatus comprising first local feature storage means as in appendix 18, the program causing a computer to execute the second reception step, the recognition step, and the second transmission step of appendix 19.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- General Business, Economics & Management (AREA)
- Epidemiology (AREA)
- Biomedical Technology (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Business, Economics & Management (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Medical Preparation Storing Or Oral Administration Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
An information processing system 100 as a first embodiment of the present invention will be described with reference to FIG. 1. The information processing system 100 is a system that recognizes medical articles in real time. In this specification, the term "medical article" includes medical devices, medical instruments, and medicines.
Next, an information processing system according to a second embodiment of the present invention will be described. This embodiment comprehensively shows a configuration in which medical articles are recognized and managed in each department of a hospital or pharmacy.
FIG. 2 is a block diagram showing the configuration of the information processing system 200 according to this embodiment.
FIG. 3 is a diagram showing examples of display screens on the communication terminals 221, 231, and 251 according to this embodiment. FIG. 3 shows the communication terminal 221 in a hospital room, the communication terminal 231 in an operating room, and the communication terminal 251 at a medicine tray, but other cases are similar.
With reference to FIGS. 4 to 6, operation procedures of the information processing system 200 according to this embodiment as applied to each department are shown.
FIG. 4 is a sequence diagram showing the operation procedure of the information processing system 200 in a hospital room according to this embodiment.
FIG. 5 is a sequence diagram showing the operation procedure of the information processing system 200 in an operating room according to this embodiment.
FIG. 6 is a sequence diagram showing the operation procedure of the information processing system 200 in a pharmacy according to this embodiment.
FIG. 7 is a block diagram showing the functional configuration of the communication terminals 211, 221, 231, 241, and 251 according to this embodiment.
FIG. 8A is a block diagram showing the functional configuration of the hospital computer 201a according to this embodiment.
FIG. 8B is a block diagram showing the functional configuration of the pharmacy computer 202a according to this embodiment.
FIG. 9A is a diagram showing the configuration of the hospital local feature DB 410 according to this embodiment; the configuration is not limited to this.
FIG. 9B is a diagram showing the configuration of the medical device DB 420 according to this embodiment; the configuration is not limited to this.
FIG. 9C is a diagram showing the configuration of the surgical instrument DB 530 according to this embodiment. The surgical instrument DB 530 includes a DB 930 that stores information on each surgical instrument and a DB 940 that stores, for each operation, the arrangement and number of surgical instruments in the tray. The configuration is not limited to this.
FIG. 10A is a diagram showing the configuration of the pharmacy local feature DB 610 according to this embodiment; the configuration is not limited to this.
FIG. 10B is a diagram showing the configuration of the prescription DB 620 according to this embodiment; the configuration is not limited to this.
FIG. 10C is a diagram showing the configuration of the inventory management DB 630 according to this embodiment; the configuration is not limited to this.
FIG. 11A is a block diagram showing the configuration of the local feature generation unit 702 according to this embodiment.
FIGS. 11B to 11F are diagrams showing the processing of the local feature generation unit 702 according to this embodiment.
The image 1121 in FIG. 11B shows the state in which the feature point detection unit 1111 of FIG. 11A has detected feature points in an image from the video. Generation of a local feature is described below using one piece of feature point data 1121a as a representative. The origin of the arrow of the feature point data 1121a indicates the coordinate position of the feature point, the length of the arrow indicates the scale (magnitude), and the direction of the arrow indicates the angle. For the scale and direction, brightness, saturation, hue, and the like can be selected according to the target video. The example of FIG. 11B uses six directions at 60-degree intervals, but this is not limiting.
The local region acquisition unit 1112 of FIG. 11A generates, for example, a Gaussian window 1122a centered on the origin of the feature point data 1121a, and generates a local region 1122 that substantially contains this Gaussian window. In the example of FIG. 11B the local region acquisition unit 1112 generated a square local region 1122, but the local region may be circular or another shape. This local region is acquired for each feature point. A circular local region has the advantage of improved robustness to the imaging direction.
Next, the sub-region division unit 1113 divides the scale and angle of each pixel contained in the local region 1122 of the feature point data 1121a into sub-regions 1123. FIG. 11B shows an example of division into 5 × 5 = 25 sub-regions of 4 × 4 = 16 pixels each; however, the sub-regions may also be 4 × 4 = 16, or have other shapes and division counts.
The sub-region feature vector generation unit 1114 generates and quantizes a histogram of the scale of each pixel in a sub-region in angle units of six directions, producing the sub-region feature vector 1124; the directions are normalized with respect to the angle output by the feature point detection unit 1111. The sub-region feature vector generation unit 1114 then totals the quantized frequencies of the six directions for each sub-region to generate a histogram. In this case, the sub-region feature vector generation unit 1114 outputs a feature vector consisting of a histogram of 25 sub-region blocks × 6 directions = 150 dimensions generated for each feature point. The gradient directions are not limited to quantization into 6 directions; they may be quantized into any number of directions, such as 4, 8, or 10. When quantizing the gradient direction into D directions, if the gradient direction before quantization is G (0 to 2π radians), the quantized gradient direction value Qq (q = 0, ..., D−1) can be obtained by, for example, equation (1) or equation (2), though the method is not limited to these:
Qq = floor(G × D / 2π) …(1)
Qq = round(G × D / 2π) mod D …(2)
Here, floor() is a function that truncates the fractional part, round() is a function that rounds to the nearest integer, and mod is the remainder operation. When generating the gradient histogram, the sub-region feature vector generation unit 1114 may total the gradient magnitudes instead of simple frequencies. When totaling the gradient histogram, it may add weight values not only to the sub-region to which a pixel belongs but also to nearby sub-regions (such as adjacent blocks) according to the distance between sub-regions, and it may add weight values to the gradient directions before and after the quantized gradient direction. The sub-region feature vector is not limited to a gradient direction histogram; it may be anything having multiple dimensions (elements), such as color information. In this embodiment, a gradient direction histogram is used as the sub-region feature vector.
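The two quantization formulas above can be sketched directly; only the `mode` argument name is an assumption:

```python
import math

def quantize_direction(g, d, mode="floor"):
    """Quantize a gradient direction G (0..2*pi rad) into D bins:
    floor form: Qq = floor(G * D / 2*pi)
    round form: Qq = round(G * D / 2*pi) mod D  (wraps back to bin 0)
    """
    if mode == "floor":
        return math.floor(g * d / (2 * math.pi))
    return round(g * d / (2 * math.pi)) % d

print(quantize_direction(math.pi, 6))                         # 3
print(quantize_direction(2 * math.pi - 0.01, 6, mode="round"))  # 0
```

The `mod D` in the round form maps directions just below 2π back to bin 0, since direction is cyclic.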
Next, the processing of the dimension selection unit 1115 in the local feature generation unit 702 is described with reference to FIGS. 11C to 11F.
FIG. 11C is a diagram showing the selection of the number of feature vector dimensions in the local feature generation unit 702.
FIG. 11D is a diagram showing an example of the selection order of feature vectors from the sub-regions in the local feature generation unit 702.
FIG. 11G is a block diagram showing the encoding unit 703a according to this embodiment. The encoding unit is not limited to this example; other encoding processes are also applicable.
FIGS. 11H, 11J, and 11K are diagrams showing the processing of the medical article recognition unit 813 and the medicine recognition unit 823 according to this embodiment.
FIG. 12A is a block diagram showing the hardware configuration of the communication terminals 211 to 261 according to this embodiment.
FIG. 12B is a diagram showing the local feature generation table 1243 in the communication terminals 211 to 261 according to this embodiment.
FIG. 13 is a flowchart showing the processing procedure of the communication terminals 211 to 261 according to this embodiment. This flowchart is executed by the CPU 1210 of FIG. 12A using the RAM 1240, and realizes the functional components of FIG. 7.
FIG. 14A is a flowchart showing the procedure of the local feature generation process S1313 according to this embodiment.
FIG. 14B is a flowchart showing the procedure of the encoding process S1315 according to this embodiment.
FIG. 14C is a flowchart showing the procedure of the difference value encoding process S1437 according to this embodiment.
FIG. 15 is a block diagram showing the hardware configuration of the hospital computer 201a according to this embodiment.
FIG. 16 is a flowchart showing the processing procedure of the hospital computer 201a according to this embodiment. This flowchart is executed by the CPU 1510 of FIG. 15 using the RAM 1540, and realizes the functional components of FIG. 8A.
FIG. 17 is a flowchart showing the procedure of the local feature DB generation process S1613 according to this embodiment.
FIG. 18A is a flowchart showing the procedure of the medical article recognition process S1623 according to this embodiment.
FIG. 18B is a flowchart showing the procedure of the collation process S1813 according to this embodiment.
FIG. 19 is a block diagram showing the hardware configuration of the pharmacy computer 202a according to this embodiment.
FIG. 20 is a flowchart showing the processing procedure of the pharmacy computer 202a according to this embodiment. This flowchart is executed by the CPU 1010 of FIG. 19 using the RAM 1940, and realizes the functional components of FIG. 8B.
Details of the local feature DB generation process (S2013) and the medicine recognition process (S2023) of FIG. 20 are similar to FIGS. 17, 18A, and 18B if "medical article" is read as "medicine", so those descriptions substitute for them.
Next, an information processing system according to a third embodiment of the present invention will be described. Compared with the second embodiment, the information processing system according to this embodiment differs in that it adjusts the accuracy of the local features and performs the recognition process and the determination process at different accuracies. The other configurations and operations are the same as in the second embodiment, so the same configurations and operations are given the same reference numbers and their detailed description is omitted.
FIG. 21 is a sequence diagram showing the operation procedure of the information processing system according to this embodiment. FIG. 21 describes, as a representative example, recognition and determination of surgical instruments in a surgical instrument tray in a hospital operating room, but other processes, such as for medicine trays, can be realized by a similar procedure.
FIG. 22 is a block diagram showing the functional configuration of the communication terminal according to the third embodiment of the present invention. Functional components similar to those of FIG. 7 of the second embodiment are given the same reference numbers and their description is omitted.
Several example configurations of the accuracy adjustment unit 2208 are described below with reference to FIGS. 23A to 23C, 24, and 25.
FIG. 23A is a block diagram showing the first configuration 2208-1 of the accuracy adjustment unit 2208 according to this embodiment. In the first configuration 2208-1, the dimension number determination unit 2311 can determine the number of dimensions.
(Second configuration)
FIG. 23B is a block diagram showing the second configuration 2208-2 of the accuracy adjustment unit 2208 according to this embodiment. In the second configuration 2208-2, the feature vector extension unit 2312 can change the number of dimensions by combining the values of multiple dimensions.
Although the present invention has been described above with reference to embodiments, the present invention is not limited to the above embodiments. Various changes that those skilled in the art can understand may be made to the configuration and details of the present invention within its scope. Systems or apparatuses combining in any way the separate features contained in the respective embodiments are also included in the scope of the present invention.
(Supplementary Note 1)
An information processing system comprising:
first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article;
second local feature generation means for extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions; and
recognition means for selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and for recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count.
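As an illustrative sketch of this recognition condition — compare descriptors using only the smaller of the two dimension counts i and j, then test whether a predetermined proportion of the m stored first local features find correspondences among the n query-side second local features — the distance metric, thresholds, and names below are assumptions, not the disclosed implementation:

```python
import numpy as np

def recognize(first_feats, second_feats, i, j, ratio=0.5, max_dist=0.4):
    """Return True when the medical article is judged present.

    first_feats  : m stored descriptors (rows), up to i dimensions each
    second_feats : n query descriptors (rows), up to j dimensions each
    ratio        : the 'predetermined proportion' of matched first features
    max_dist     : illustrative distance threshold for a correspondence
    """
    d = min(i, j)                            # use the smaller dimension count
    a = np.asarray(first_feats, dtype=float)[:, :d]
    b = np.asarray(second_feats, dtype=float)[:, :d]
    matched = 0
    for f in a:                              # nearest neighbour per stored feature
        if np.min(np.linalg.norm(b - f, axis=1)) <= max_dist:
            matched += 1
    return matched / len(a) >= ratio
```

Because both sides are truncated to the same leading dimensions, a terminal that generated coarse (low-dimension) features can still be matched against a database of fine (high-dimension) features.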
(Supplementary Note 2)
The information processing system according to Supplementary Note 1, further comprising notification means for reporting a recognition result of the recognition means.
(Supplementary Note 3)
The information processing system according to Supplementary Note 2, wherein the information processing system has a communication terminal carried by a user and an information processing apparatus that communicates with the communication terminal,
the communication terminal includes the imaging means, the second local feature generation means, and the notification means, and transmits the n second local features from the communication terminal to the information processing apparatus, and
the information processing apparatus includes the first local feature storage means and the recognition means, and transmits the recognition result of the recognition means from the information processing apparatus to the communication terminal.
(Supplementary Note 4)
The information processing system according to any one of Supplementary Notes 1 to 3, wherein the first local feature storage means stores the m first local features generated from an image of each of a plurality of medical articles, in association with each medical article, and
the recognition means recognizes the plurality of medical articles contained in the image captured by the imaging means, and has arrangement determination means for determining the arrangement of the plurality of medical articles in the image captured by the imaging means, based on the arrangement of the n second local features.
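As a minimal sketch of the arrangement-determination idea: once each article has been recognized, the image coordinates of its matched feature points locate it, and ordering the articles by centroid approximates their arrangement in the tray or shelf. The data layout and left-to-right ordering below are illustrative assumptions:

```python
def determine_arrangement(matches):
    """Order recognized articles left-to-right by feature-point centroid.

    matches : dict mapping article name -> list of (x, y) coordinates of
              its matched feature points in the captured image
    """
    centroids = {
        name: (sum(x for x, _ in pts) / len(pts),
               sum(y for _, y in pts) / len(pts))
        for name, pts in matches.items()
    }
    # sort by centroid x-coordinate to approximate the physical arrangement
    return sorted(centroids, key=lambda name: centroids[name][0])
```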
(Supplementary Note 5)
The information processing system according to Supplementary Note 4, wherein the medical articles are medical equipment and the image captured by the imaging means is of an examination room, hospital room, or operating room,
the arrangement determination means recognizes the arrangement of the medical equipment in the examination room, hospital room, or operating room,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medical equipment based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
(Supplementary Note 6)
The information processing system according to Supplementary Note 4, wherein the medical articles are medical instruments and the image captured by the imaging means is of a tray on which the medical instruments are arranged,
the arrangement determination means recognizes the arrangement of the medical instruments on the tray,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medical instruments based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
(Supplementary Note 7)
The information processing system according to Supplementary Note 4, wherein the medical articles are medicines and the image captured by the imaging means is of a medicine shelf or medicine tray,
the arrangement determination means recognizes the arrangement of the medicines on the medicine shelf or medicine tray,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medicines based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
(Supplementary Note 8)
The information processing system according to Supplementary Note 7, comprising management means for taking inventory based on the arrangement of the plurality of medicines recognized by the arrangement determination means.
(Supplementary Note 9)
The information processing system according to any one of Supplementary Notes 1 to 8, wherein the first local features and the second local features are generated by dividing a local region containing a feature point extracted from an image into a plurality of sub-regions and generating a feature vector of a plurality of dimensions consisting of histograms of gradient directions within the plurality of sub-regions.
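A SIFT-like sketch of this descriptor generation — split the local region around a feature point into sub-regions and concatenate a gradient-orientation histogram per sub-region — is shown below. The grid and bin counts are illustrative parameters, not values fixed by the disclosure:

```python
import numpy as np

def local_descriptor(patch, grid=4, bins=8):
    """Build a gradient-orientation-histogram descriptor for one local region.

    patch : 2-D grayscale local region centered on a feature point
    grid  : the region is split into grid x grid sub-regions
    bins  : orientation-histogram bins per sub-region
    """
    patch = np.asarray(patch, dtype=float)
    gy, gx = np.gradient(patch)                     # image gradients
    mag = np.hypot(gx, gy)                          # gradient magnitude
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)     # orientation in [0, 2*pi)
    h, w = patch.shape
    sh, sw = h // grid, w // grid
    desc = []
    for r in range(grid):
        for c in range(grid):
            m = mag[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].ravel()
            a = ang[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 2 * np.pi), weights=m)
            desc.append(hist)
    return np.concatenate(desc)                     # grid*grid*bins dimensions
```

With grid=4 and bins=8 this yields a 128-dimension vector, the layout commonly used by SIFT-family descriptors.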
(Supplementary Note 10)
The information processing system according to Supplementary Note 9, wherein the first local features and the second local features are generated by selecting, from the generated feature vector of a plurality of dimensions, dimensions having smaller correlation between adjacent sub-regions.
(Supplementary Note 11)
The information processing system according to Supplementary Note 9 or 10, wherein the plurality of dimensions of the feature vector are arranged so as to cycle around the local region once every predetermined number of dimensions, so that the dimensions can be selected in order from those contributing to the features of the feature point, and in order from the first dimension in accordance with the accuracy required of the local features.
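One way to read this ordering, sketched under illustrative assumptions: if a descriptor is stored cell-major (one block of bins per sub-region), re-ordering it bin-major makes each consecutive run of dimensions visit every sub-region once, so truncating to the first k dimensions still samples the whole local region:

```python
import numpy as np

def order_dimensions(desc, cells=16, bins=8):
    """Re-order a cell-major descriptor so consecutive dimensions cycle
    around the local region.

    desc  : flat descriptor of length cells * bins, grouped per sub-region
    cells : number of sub-regions (e.g. 4 x 4 = 16)
    bins  : histogram bins per sub-region
    """
    d = np.asarray(desc).reshape(cells, bins)
    # transpose: bin 0 of every cell first, then bin 1 of every cell, ...
    return d.T.ravel()
```

Truncation of the re-ordered vector (keeping the first k dimensions) is then a valid low-accuracy descriptor, which is how the accuracy-on-demand selection described here can be realized.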
(Supplementary Note 12)
The information processing system according to Supplementary Note 11, wherein, in accordance with the correlation between medical articles, the second local feature generation means generates second local features with a larger number of dimensions for a medical article having higher correlation with other medical articles.
(Supplementary Note 13)
The information processing system according to Supplementary Note 11 or 12, wherein, in accordance with the correlation between medical articles, the first local feature storage means stores first local features with a larger number of dimensions for a medical article having higher correlation with other medical articles.
(Supplementary Note 14)
An information processing method in an information processing system comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the method comprising:
a second local feature generation step of extracting n feature points from an image in a captured video and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions; and
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count.
(Supplementary Note 15)
A communication terminal comprising:
second local feature generation means for extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
first transmission means for transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
first reception means for receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
(Supplementary Note 16)
A control method of a communication terminal, comprising:
a second local feature generation step of extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
a first transmission step of transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
a first reception step of receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
(Supplementary Note 17)
A control program causing a computer to execute:
a second local feature generation step of extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
a first transmission step of transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
a first reception step of receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
(Supplementary Note 18)
An information processing apparatus comprising:
first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article;
second reception means for receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
recognition means for selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and for recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
second transmission means for transmitting information indicating the recognized medical article to the communication terminal.
(Supplementary Note 19)
A control method of an information processing apparatus comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the method comprising:
a second reception step of receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
(Supplementary Note 20)
A control program of an information processing apparatus comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the program causing a computer to execute:
a second reception step of receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
Claims (20)
1. An information processing system comprising:
first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article;
second local feature generation means for extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions; and
recognition means for selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and for recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count.
2. The information processing system according to claim 1, further comprising notification means for reporting a recognition result of the recognition means.
3. The information processing system according to claim 2, wherein the information processing system has a communication terminal carried by a user and an information processing apparatus that communicates with the communication terminal,
the communication terminal includes the imaging means, the second local feature generation means, and the notification means, and transmits the n second local features from the communication terminal to the information processing apparatus, and
the information processing apparatus includes the first local feature storage means and the recognition means, and transmits the recognition result of the recognition means from the information processing apparatus to the communication terminal.
4. The information processing system according to any one of claims 1 to 3, wherein the first local feature storage means stores the m first local features generated from an image of each of a plurality of medical articles, in association with each medical article, and
the recognition means recognizes the plurality of medical articles contained in the image captured by the imaging means, and has arrangement determination means for determining the arrangement of the plurality of medical articles in the image captured by the imaging means, based on the arrangement of the n second local features.
5. The information processing system according to claim 4, wherein the medical articles are medical equipment and the image captured by the imaging means is of an examination room, hospital room, or operating room,
the arrangement determination means recognizes the arrangement of the medical equipment in the examination room, hospital room, or operating room,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medical equipment based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
6. The information processing system according to claim 4, wherein the medical articles are medical instruments and the image captured by the imaging means is of a tray on which the medical instruments are arranged,
the arrangement determination means recognizes the arrangement of the medical instruments on the tray,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medical instruments based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
7. The information processing system according to claim 4, wherein the medical articles are medicines and the image captured by the imaging means is of a medicine shelf or medicine tray,
the arrangement determination means recognizes the arrangement of the medicines on the medicine shelf or medicine tray,
the second local feature generation means has accuracy adjustment means for adjusting the accuracy of the second local features, and
the recognition means further recognizes an error, defect, or state of the medicines based on second local features generated by the second local feature generation means with the accuracy adjusted higher.
8. The information processing system according to claim 7, comprising management means for taking inventory based on the arrangement of the plurality of medicines recognized by the arrangement determination means.
9. The information processing system according to any one of claims 1 to 8, wherein the first local features and the second local features are generated by dividing a local region containing a feature point extracted from an image into a plurality of sub-regions and generating a feature vector of a plurality of dimensions consisting of histograms of gradient directions within the plurality of sub-regions.
10. The information processing system according to claim 9, wherein the first local features and the second local features are generated by selecting, from the generated feature vector of a plurality of dimensions, dimensions having smaller correlation between adjacent sub-regions.
11. The information processing system according to claim 9 or 10, wherein the plurality of dimensions of the feature vector are arranged so as to cycle around the local region once every predetermined number of dimensions, so that the dimensions can be selected in order from those contributing to the features of the feature point, and in order from the first dimension in accordance with the accuracy required of the local features.
12. The information processing system according to claim 11, wherein, in accordance with the correlation between medical articles, the second local feature generation means generates second local features with a larger number of dimensions for a medical article having higher correlation with other medical articles.
13. The information processing system according to claim 11 or 12, wherein, in accordance with the correlation between medical articles, the first local feature storage means stores first local features with a larger number of dimensions for a medical article having higher correlation with other medical articles.
14. An information processing method in an information processing system comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the method comprising:
a second local feature generation step of extracting n feature points from an image in a captured video and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions; and
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count.
15. A communication terminal comprising:
second local feature generation means for extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
first transmission means for transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
first reception means for receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
16. A control method of a communication terminal, comprising:
a second local feature generation step of extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
a first transmission step of transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
a first reception step of receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
17. A control program causing a computer to execute:
a second local feature generation step of extracting n feature points from an image in a video captured by an imaging means and generating, for n local regions containing each of the n feature points, n second local features each consisting of feature vectors of 1 to j dimensions;
a first transmission step of transmitting the n second local features to an information processing apparatus that recognizes, based on matching of local features, a medical article contained in the captured image; and
a first reception step of receiving, from the information processing apparatus, information indicating the medical article contained in the captured image.
18. An information processing apparatus comprising:
first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article;
second reception means for receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
recognition means for selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and for recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
second transmission means for transmitting information indicating the recognized medical article to the communication terminal.
19. A control method of an information processing apparatus comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the method comprising:
a second reception step of receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
20. A control program of an information processing apparatus comprising first local feature storage means for storing a medical article in association with m first local features, each consisting of feature vectors of 1 to i dimensions, generated for each of m local regions containing each of m feature points of an image of the medical article, the program causing a computer to execute:
a second reception step of receiving, from a communication terminal, n second local features, each consisting of feature vectors of 1 to j dimensions, generated for n local regions containing each of n feature points extracted from an image in a video captured by the communication terminal;
a recognition step of selecting the smaller of the dimension count i of the feature vectors of the first local features and the dimension count j of the feature vectors of the second local features, and recognizing that the medical article is present in the image in the video upon determining that a predetermined proportion or more of the m first local features, each consisting of feature vectors up to the selected dimension count, correspond to the n second local features each consisting of feature vectors up to the selected dimension count; and
a second transmission step of transmitting information indicating the recognized medical article to the communication terminal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/375,452 US9418314B2 (en) | 2012-01-30 | 2013-01-25 | Information processing apparatus and control method and control program thereof, and communication terminal and control method and control program thereof |
JP2013556374A JP6226187B2 (ja) | 2012-01-30 | 2013-01-25 | Information processing system, information processing method, information processing apparatus and control method and control program thereof, and communication terminal and control method and control program thereof |
EP13743138.3A EP2811459B1 (en) | 2012-01-30 | 2013-01-25 | Information processing system, information processing method, information processing device, and control method and control program therefor, and communication terminal, and control method and control program therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012017383 | 2012-01-30 | ||
JP2012-017383 | 2012-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013115093A1 true WO2013115093A1 (ja) | 2013-08-08 |
Family
ID=48905133
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/051573 WO2013115093A1 (ja) | 2012-01-30 | 2013-01-25 | 情報処理システム、情報処理方法、情報処理装置およびその制御方法と制御プログラム、通信端末およびその制御方法と制御プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9418314B2 (ja) |
EP (1) | EP2811459B1 (ja) |
JP (1) | JP6226187B2 (ja) |
WO (1) | WO2013115093A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018132889A (ja) * | 2017-02-14 | 2018-08-23 | 浩平 田仲 | 情報提供システム、情報提供装置、情報提供方法及び情報提供プログラム |
WO2018221599A1 (ja) * | 2017-05-31 | 2018-12-06 | カリーナシステム株式会社 | 手術器具検出システムおよびコンピュータプログラム |
JP2019504678A (ja) * | 2016-01-29 | 2019-02-21 | ベクトン・ディッキンソン・ロワ・ジャーマニー・ゲーエムベーハー | 薬剤成分を移送するための輸送トレイを充填する方法およびそのような輸送トレイのための充填ステーション |
JP2022012643A (ja) * | 2020-07-02 | 2022-01-17 | オオクマ電子株式会社 | 医薬品の管理システム、および医薬品の管理方法 |
WO2023189097A1 (ja) * | 2022-03-29 | 2023-10-05 | テルモ株式会社 | プログラム、情報処理装置、情報処理システム及び情報処理方法 |
US12030683B2 (en) | 2022-11-01 | 2024-07-09 | Carefusion Germany 326 Gmbh | Filling station and method for filling a transport tray |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9792528B2 (en) * | 2012-01-30 | 2017-10-17 | Nec Corporation | Information processing system, information processing method, information processing apparatus and control method and control program thereof, and communication terminal and control method and control program thereof |
US11139048B2 (en) * | 2017-07-18 | 2021-10-05 | Analytics For Life Inc. | Discovering novel features to use in machine learning techniques, such as machine learning techniques for diagnosing medical conditions |
US11062792B2 (en) | 2017-07-18 | 2021-07-13 | Analytics For Life Inc. | Discovering genomes to use in machine learning techniques |
US11625834B2 (en) | 2019-11-08 | 2023-04-11 | Sony Group Corporation | Surgical scene assessment based on computer vision |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6711293B1 (en) | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
JP2010218149A (ja) | 2009-03-16 | 2010-09-30 | Institute Of National Colleges Of Technology Japan | 識別装置および識別方法 |
JP2011008507A (ja) * | 2009-06-25 | 2011-01-13 | Kddi Corp | 画像検索方法およびシステム |
JP2011198130A (ja) * | 2010-03-19 | 2011-10-06 | Fujitsu Ltd | 画像処理装置及び画像処理プログラム |
JP2011221688A (ja) | 2010-04-07 | 2011-11-04 | Sony Corp | 認識装置、認識方法、およびプログラム |
JP2011248757A (ja) * | 2010-05-28 | 2011-12-08 | Institute Of National Colleges Of Technology Japan | ピッキングシステムおよびピッキング方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4988408B2 (ja) * | 2007-04-09 | 2012-08-01 | 株式会社デンソー | 画像認識装置 |
US9041508B2 (en) * | 2008-08-08 | 2015-05-26 | Snap-On Incorporated | Image-based inventory control system and method |
JP4547639B2 (ja) * | 2008-08-26 | 2010-09-22 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
US8270724B2 (en) * | 2009-03-13 | 2012-09-18 | Nec Corporation | Image signature matching device |
US20110238137A1 (en) * | 2010-03-25 | 2011-09-29 | Fujifilm Corporation | Medical apparatus for photodynamic therapy and method for controlling therapeutic light |
JP5685390B2 (ja) * | 2010-05-14 | 2015-03-18 | 株式会社Nttドコモ | 物体認識装置、物体認識システムおよび物体認識方法 |
US20130113929A1 (en) * | 2011-11-08 | 2013-05-09 | Mary Maitland DeLAND | Systems and methods for surgical procedure safety |
- 2013-01-25 JP JP2013556374A patent/JP6226187B2/ja active Active
- 2013-01-25 WO PCT/JP2013/051573 patent/WO2013115093A1/ja active Application Filing
- 2013-01-25 EP EP13743138.3A patent/EP2811459B1/en active Active
- 2013-01-25 US US14/375,452 patent/US9418314B2/en active Active
Non-Patent Citations (3)
Title |
---|
DAVID G. LOWE: "Distinctive image features from scale-invariant keypoints", USA, INTERNATIONAL JOURNAL OF COMPUTER VISION, vol. 60, no. 2, 2004, pages 91 - 110, XP002756976, DOI: doi:10.1023/B:VISI.0000029664.99615.94 |
HIRONOBU FUJIYOSHI: "Gradient-Based Feature Extraction : SIFT and HOG", IEICE TECHNICAL REPORT, vol. 107, no. 206, 27 August 2007 (2007-08-27), pages 211 - 224, XP055150076 * |
See also references of EP2811459A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019504678A (ja) * | 2016-01-29 | 2019-02-21 | ベクトン・ディッキンソン・ロワ・ジャーマニー・ゲーエムベーハー | 薬剤成分を移送するための輸送トレイを充填する方法およびそのような輸送トレイのための充填ステーション |
JP2018132889A (ja) * | 2017-02-14 | 2018-08-23 | 浩平 田仲 | 情報提供システム、情報提供装置、情報提供方法及び情報提供プログラム |
WO2018221599A1 (ja) * | 2017-05-31 | 2018-12-06 | カリーナシステム株式会社 | 手術器具検出システムおよびコンピュータプログラム |
JPWO2018221599A1 (ja) * | 2017-05-31 | 2020-03-26 | Eizo株式会社 | 手術器具検出システムおよびコンピュータプログラム |
US11256963B2 (en) | 2017-05-31 | 2022-02-22 | Eizo Corporation | Surgical instrument detection system and computer program |
JP2022012643A (ja) * | 2020-07-02 | 2022-01-17 | オオクマ電子株式会社 | 医薬品の管理システム、および医薬品の管理方法 |
WO2023189097A1 (ja) * | 2022-03-29 | 2023-10-05 | テルモ株式会社 | プログラム、情報処理装置、情報処理システム及び情報処理方法 |
US12030683B2 (en) | 2022-11-01 | 2024-07-09 | Carefusion Germany 326 Gmbh | Filling station and method for filling a transport tray |
Also Published As
Publication number | Publication date |
---|---|
EP2811459B1 (en) | 2020-02-19 |
EP2811459A4 (en) | 2016-10-26 |
JPWO2013115093A1 (ja) | 2015-05-11 |
EP2811459A1 (en) | 2014-12-10 |
US9418314B2 (en) | 2016-08-16 |
US20150003704A1 (en) | 2015-01-01 |
JP6226187B2 (ja) | 2017-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6226187B2 (ja) | Information processing system, information processing method, information processing apparatus and control method and control program thereof, and communication terminal and control method and control program thereof | |
Adjabi et al. | Multi-block color-binarized statistical images for single-sample face recognition | |
JP6153087B2 (ja) | Information processing system, information processing method, information processing apparatus and control method and control program thereof, and communication terminal and control method and control program thereof | |
CN105793867A (zh) | 图像搜索方法及设备 | |
CN107944344A (zh) | 供电企业施工移动安全监督平台 | |
Chen et al. | Image retrieval based on image-to-class similarity | |
Yu et al. | A new feature descriptor for multimodal image registration using phase congruency | |
Karthik et al. | A hybrid feature modeling approach for content-based medical image retrieval | |
Martins et al. | Multispectral facial recognition in the wild | |
JP7409499B2 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
WO2013089004A1 (ja) | Video processing system, video processing method, video processing apparatus for portable terminal or for server, and control method and control program thereof | |
Contreras Alejo et al. | Recognition of a single dynamic gesture with the segmentation technique hs-ab and principle components analysis (pca) | |
Huang et al. | RWBD: learning robust weighted binary descriptor for image matching | |
EP1828959A1 (en) | Face recognition using features along iso-radius contours | |
Yang et al. | A pca-based kernel for kernel pca on multivariate time series | |
CN110929583A (zh) | 一种高检测精度人脸识别方法 | |
Cai et al. | Robust facial expression recognition using RGB-D images and multichannel features | |
Chen et al. | A new accurate pill recognition system using imprint information | |
Nurzynska et al. | Evaluation of Keypoint Descriptors for Flight Simulator Cockpit Elements: WrightBroS Database | |
Alraqibah et al. | X-ray image retrieval system based on visual feature discrimination | |
Pietkiewicz | Application of fusion of two classifiers based on principal component analysis method and time series comparison to recognize maritime objects upon FLIR images | |
Ma et al. | A Method of Protein Model Classification and Retrieval Using Bag‐of‐Visual‐Features | |
Navarro et al. | Gender classification of full-body biological motion of aperiodic actions using machine learning | |
Dodson et al. | Some information geometric aspects of cyber security by face recognition | |
Said et al. | 3D fast wavelet network model-assisted 3D face recognition |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13743138; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2013556374; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 2013743138; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 14375452; Country of ref document: US |