WO2024043190A1 - Inspection device, inspection system, and inspection method - Google Patents

Inspection device, inspection system, and inspection method

Info

Publication number
WO2024043190A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
specific
photographed image
object candidate
candidate
Prior art date
Application number
PCT/JP2023/029874
Other languages
French (fr)
Japanese (ja)
Inventor
一平 高石
琢磨 赤木
泰弘 大川
Original Assignee
株式会社 東芝
東芝インフラシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 東芝, 東芝インフラシステムズ株式会社
Publication of WO2024043190A1 publication Critical patent/WO2024043190A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N23/04 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
    • G01N23/046 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N23/06 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and measuring the absorption
    • G01N23/10 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and measuring the absorption the material being confined in a container, e.g. in a luggage X-ray scanners
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083 Shipping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • Embodiments of the present invention relate to an inspection device, an inspection system, and an inspection method.
  • Conventionally, inspection systems have been proposed that detect a specific object in baggage by analyzing an image (X-ray image) taken of the baggage to be inspected using electromagnetic waves such as X-rays.
  • Such inspection systems, for example, use X-rays to image baggage and determine whether the effective atomic number (the average atomic number of the substances contained in each pixel of the image) falls within a threshold range, thereby detecting what is presumed to be a specific object made of a specific substance.
  • the problem to be solved by the present invention is to provide an inspection device, an inspection system, and an inspection method that can accurately detect a specific object in an inspection target.
  • the inspection device includes an image acquisition unit and a processor.
  • the image acquisition unit acquires photographed image data including a photographed image photographed by irradiating the inspection object with electromagnetic waves and physical property information indicating physical properties of each part of the photographed image.
  • The processor detects an object candidate, which is a candidate for a specific object in the inspection target, based on the physical property information included in the captured image data; detects a specific part, which is a specific portion or object in the inspection target; calculates the relative relationship between the object candidate and the specific part; and notifies of any object candidate whose calculated relative relationship satisfies a predetermined condition.
  • FIG. 1 is a diagram schematically showing the overall configuration of an inspection system including an inspection apparatus according to an embodiment.
  • FIG. 2 is a block diagram showing a configuration example of an information management system including the inspection device according to the embodiment.
  • FIG. 3 is a block diagram illustrating a configuration example of a control system in the imaging device and the inspection device of the inspection system according to the embodiment.
  • FIG. 4 is a block diagram illustrating a configuration example of a higher-level management device in an information management system including the inspection device according to the embodiment.
  • FIG. 5 is a flowchart for explaining the overall flow of inspection processing by the inspection apparatus according to the embodiment.
  • FIG. 6(a) is a diagram illustrating an example of a photographed image taken by the photographing device and supplied to the inspection device according to the embodiment.
  • FIG. 6(b) is a diagram illustrating an example of an image of object candidates detected from a captured image by the inspection device according to the embodiment.
  • FIG. 6(c) is a diagram showing an example of an image of specific parts detected from a captured image by the inspection device according to the embodiment.
  • FIG. 6(d) is a diagram illustrating an example of an object candidate detected as a specific object based on the relative relationship between the object candidates detected by the inspection apparatus according to the embodiment and the specific parts.
  • FIG. 7 is a flowchart for explaining an operation example of object candidate detection processing in the inspection apparatus according to the embodiment.
  • FIG. 8 is a diagram illustrating an example of a two-dimensional image generated by slicing along each axis a three-dimensional image captured by an imaging device supplied to the inspection device according to the embodiment.
  • FIG. 9 is a flowchart for explaining an operation example of relative relationship calculation processing in the inspection apparatus according to the embodiment.
  • FIG. 10 is a diagram illustrating a display example of the inspection results of a specific object displayed on a display device by the inspection device according to the embodiment.
  • FIG. 11 is a diagram illustrating a display example in which the inspection device according to the embodiment displays a detection result of a specific target object in a captured image.
  • FIG. 12 is a diagram illustrating a display example in which the inspection device according to the embodiment displays the detection results of the specific portion in the captured image.
  • FIG. 13 is a diagram illustrating an example of a display in which the inspection apparatus according to the embodiment displays the relative distance between the specific part and the target object candidate in a captured image.
  • FIG. 14 is a diagram illustrating an example in which the inspection device according to the embodiment displays a detection result of a specific target object and also displays buttons for instructing display of target object candidates and specific parts.
  • FIG. 15 is a diagram schematically showing an example in which the inspection apparatus according to the embodiment displays a two-dimensional image obtained by slicing a three-dimensional captured image at a designated location.
  • FIG. 1 is a diagram for schematically explaining a configuration example of an inspection system 1 including an inspection apparatus 13 according to an embodiment.
  • The inspection system 1 according to the embodiment is a system for inspecting whether or not a specific detection object exists in the baggage to be inspected.
  • Specific detection objects are assumed to include, for example, dangerous goods, hazardous drugs, prohibited drugs, and substances prohibited from being brought into or out of designated areas such as Japan.
  • the specific object to be detected does not have to be a solid having a specific shape, and includes substances such as liquids and powders.
  • the inspection system 1 includes a conveyor 11, an imaging device 12, an inspection device 13, a display device 14, an operating device 15, a speaker 16, and the like.
  • the inspection device 13 is communicatively connected to the imaging device 12, the display device 14, the operating device 15, the speaker 16, and the like.
  • The conveyor 11 is a device that conveys the baggage M to be inspected.
  • The conveyor 11 transports the baggage M to be inspected to the imaging position (reading position) of the photographing device 12.
  • For example, the conveyor 11 conveys baggage M supplied by a worker.
  • The conveyor 11 may also be configured to transport baggage M supplied by a robot arm or the like.
  • the photographing device 12 acquires photographed image data including a photographed image of the inspection object and physical property information indicating the physical properties of each part of the photographed image by irradiating electromagnetic waves to the baggage M to be inspected.
  • the photographing device 12 supplies photographed image data of the baggage M to the inspection device 13.
  • The imaging device 12 may be any device that can obtain photographed image data usable by the inspection device 13 to detect candidates presumed to be the specific detection objects described later (hereinafter also referred to as object candidates) and specific parts or objects (hereinafter also referred to as specific parts) in the baggage M.
  • the photographing device 12 may be one that acquires two-dimensional image data as a photographed image, or may be one that acquires three-dimensional image data.
  • the imaging device 12 is, for example, an X-ray CT imaging device.
  • An X-ray CT imaging device as an example of the imaging device 12 acquires three-dimensional X-ray image data as a photographed image by irradiating X-rays from around the baggage M conveyed by the conveyor 11.
  • The X-ray CT imaging device as the imaging device 12 acquires photographed image data including a three-dimensional X-ray image of the baggage M and physical property information indicating the physical properties of each constituent unit (pixel or voxel) of the X-ray image.
  • An X-ray CT imaging device serving as the imaging device 12 supplies photographed image data acquired from the baggage M to the inspection device 13.
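  • As an illustration only, the photographed image data described above (a captured image together with per-pixel or per-voxel physical property information) could be represented as in the following sketch. The class and field names are assumptions for illustration, not part of the disclosure.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class CapturedImageData:
    """Hypothetical container mirroring the photographed image data
    described above: the X-ray image plus physical property information
    for each constituent unit (pixel or voxel)."""
    image: np.ndarray                    # 2D (H, W) or 3D (D, H, W) X-ray image
    density: np.ndarray                  # same shape as `image`
    effective_atomic_number: np.ndarray  # same shape as `image`
```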
  • The imaging device 12 is not limited to an X-ray CT imaging device, but in the embodiment described below, the description assumes that the imaging device 12 is an X-ray CT imaging device.
  • the inspection device 13 has various functions, such as a function to process a photographed image of the baggage M taken by the photographing device 12 using electromagnetic waves.
  • The inspection device 13 has a function (receiving section) for acquiring a photographed image of the baggage M taken by the photographing device 12 using electromagnetic waves, and a function (transmission unit) for outputting information based on the results of the inspection process described later using an output device such as the display device 14 or the speaker 16.
  • the inspection device 13 has a function (object candidate detection unit) that detects a candidate (object candidate) that is estimated to be a specific object existing in the luggage M from the captured image data obtained from the imaging device 12.
  • The inspection device 13 detects a candidate (object candidate) that is estimated to be a specific object in the photographed image of the baggage M obtained from the photographing device 12, based on setting values set according to the physical properties of the specific object.
  • the inspection device 13 detects object candidates based on the physical properties (density, effective atomic number, etc.) of each pixel or voxel in a captured image (X-ray image) captured by an X-ray CT imaging device as the imaging device 12.
  • A pixel is the smallest unit of two-dimensional image data.
  • A voxel is the smallest unit of data constituting three-dimensional data, and represents a value on a regular grid.
  • A voxel thus corresponds to a pixel in two-dimensional image data.
  • the inspection device 13 has a function (specific part detection unit) of detecting a specific part or a specific object (hereinafter also simply referred to as a specific part) from an image taken of the baggage M.
  • The specific part detection unit detects a specific part that is a specific region or a specific object in the captured image taken by the imaging device 12.
  • The specific part is a part or object in which, according to cases collected in advance, specific objects are often hidden.
  • the specific portion may be a portion or object having a predetermined shape, or may be a portion or object placed at a predetermined position in the luggage.
  • specific parts may include shoes, cameras, computers, walls of luggage such as trunks, and cigarettes.
  • The specific part detection unit of the inspection device 13 extracts, from the photographed image taken by the imaging device 12, an image of a region that is likely to be a specific part or object set (learned) in advance, and detects the specific part by recognizing the shape of the extracted image. Specifically, the specific part detection unit of the inspection device 13 can detect the specific part using semantic segmentation, which assigns labels and categories to all pixels in the captured image.
  • The method of detecting a specific part applied to the specific part detection unit is not limited to the above-mentioned method. General image recognition may be used, or a method using object detection such as SSD (Single Shot MultiBox Detector) based on machine learning or deep learning, taking features such as HOG (Histograms of Oriented Gradients) extracted from the image as input, may also be used.
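  • As a rough sketch of the semantic segmentation approach mentioned above: assuming a pretrained PyTorch segmentation network that maps an X-ray slice to per-pixel class logits, a specific-part mask could be obtained as follows. The model file name, class index, and function names are hypothetical.

```python
import torch

# Hypothetical pretrained network: maps a (1, 1, H, W) X-ray slice to
# per-pixel class logits of shape (1, C, H, W); the classes are assumed
# to include specific parts such as "shoe", "camera", or "trunk wall".
model = torch.load("specific_part_segmenter.pt", weights_only=False)
model.eval()

SHOE_CLASS = 3  # illustrative class index for the "shoe" label

def detect_specific_part(xray_slice: torch.Tensor) -> torch.Tensor:
    """Return a boolean (H, W) mask of pixels labeled as the specific part."""
    with torch.no_grad():
        logits = model(xray_slice.unsqueeze(0).unsqueeze(0))  # (1, C, H, W)
    labels = logits.argmax(dim=1).squeeze(0)                  # (H, W)
    return labels == SHOE_CLASS
```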
  • The inspection device 13 has a function (relative relationship calculation unit) that evaluates whether an object candidate is suspected to be a specific object based on a relative relationship, such as the distance, between each object candidate and the specific part in the image taken by the imaging device 12. For example, when an object candidate is adjacent to or included in a specific part, the inspection device 13 determines that the object candidate is highly likely to be the specific object. The inspection device 13 may also calculate the relative distance between the object candidate and the specific part as a positional relationship, and determine that the object candidate is highly likely to be a specific detection object if the calculated relative distance is less than or equal to a predetermined value.
  • the display device 14 is an output device for notifying the inspector of the test results.
  • the display device 14 displays a guidance screen and the like according to the control of the inspection device 13.
  • the display device 14 displays a guide screen showing the results of the inspection process on the photographed image of the luggage M, etc., as a guide screen to be presented to the inspector.
  • For example, the display device 14 displays an image, generated by the inspection device 13, that clearly shows the object candidates and the specific parts in the image captured by the imaging device 12.
  • the operating device 15 generates an operating signal according to the operating input of the inspector (operator), and supplies the operating signal to the inspection device 13. Further, the operating device 15 includes operating devices such as a keyboard and a pointing device. Further, the operating device 15 may be configured by a touch panel provided on the display screen of the display device 14 or the like.
  • the speaker 16 is an output device for notifying the inspector of test results and the like by voice.
  • the speaker 16 outputs audio for notifying the inspector of audio guidance according to the results of the inspection process and the like.
  • the inspection device 13 has a function of executing a notification process to notify (issue a notification) information such as object candidates to the inspector.
  • the inspection device 13 displays, on the display device 14, object candidates that are determined to be highly likely to be specific objects, along with a photographed image taken by the photographing device 12 of the baggage M being conveyed by the conveyor 11.
  • The inspection device 13 may display all object candidates in the photographed image of the baggage M on the display device 14, and may further distinguish object candidates determined to be likely to be specific objects from the other candidates by displaying them in a different color or with a different mark. Furthermore, the inspection device 13 may display on the display device 14 images that clearly indicate the object candidates and the specific parts in the photographed image of the baggage M. The inspection device 13 may also change the content displayed on the display device 14 in response to an instruction given by a worker using the operating device 15.
  • FIG. 2 is a diagram showing a configuration example of an information management system 100 including the inspection system 1 according to the embodiment.
  • the information management system 100 includes an upper management device 101 that is communicatively connected to the inspection device 13 of the inspection system 1 provided at each inspection site.
  • the upper management device 101 functions as an information management device that collects data from the inspection devices in each inspection system 1 and supplies data to each inspection device.
  • the upper management device 101 is composed of, for example, a computer such as a server device.
  • The upper management device 101 includes a storage device that stores information regarding inspections performed by each inspection system 1. Further, the upper management device 101 may include an interface that connects to a server device that stores information regarding inspections.
  • the upper management device 101 acquires information from the inspection devices 13 in each inspection system 1.
  • the upper management device 101 stores and aggregates information acquired from the inspection devices 13 of each inspection system 1.
  • the upper management device 101 supplies information to the inspection devices 13 in each inspection system 1.
  • the upper management device 101 distributes setting values used for inspection processing to the inspection devices 13 of each inspection system 1.
  • the upper management device 101 may distribute update data for a program for each inspection device 13 to execute inspection processing.
  • FIG. 3 is a block diagram illustrating a configuration example of a control system of the inspection device 13 in the inspection system 1 according to the embodiment.
  • the imaging device 12 includes an imaging section 21, a processing section 22, and an output section 23.
  • The imaging unit 21 takes an image by irradiating an object to be imaged, such as the baggage M to be inspected, with electromagnetic waves such as X-rays.
  • When the imaging device 12 is an X-ray CT imaging device, the imaging unit 21 acquires three-dimensional X-ray image data by emitting X-rays from around the baggage M to be inspected as it is transported by the conveyor 11.
  • The processing unit 22 includes a processor, various types of memory, and the like, and the processor executes various processes by running programs stored in the memory. For example, the processing unit 22 processes the image captured by the imaging unit 21 through electromagnetic wave irradiation, thereby generating captured image data that includes the captured image and physical property information indicating the physical properties of the constituent units (pixels or voxels) of the captured image.
  • the output unit 23 is an interface that outputs data such as photographed image data.
  • the output unit 23 includes an interface corresponding to the image interface 39 of the inspection device 13 and outputs captured image data to the inspection device 13. Further, the output unit 23 may be an input/output interface including an interface for inputting data such as control data from the connected inspection device 13.
  • The inspection device 13 includes a processor 31, a ROM 32, a RAM 33, a storage section 34, a communication section 35, a display interface (I/F) 36, an operation interface (I/F) 37, an audio interface (I/F) 38, and an image interface (I/F) 39.
  • the processor 31 executes arithmetic processing.
  • the processor 31 is, for example, a CPU (Central Processing Unit).
  • the processor 31 functions as a processing unit that executes various processes by executing programs stored in the ROM 32 or the storage unit 34 using the RAM 33.
  • the ROM 32 is a read-only nonvolatile memory.
  • the ROM 32 stores program data, control data, and the like.
  • the RAM 33 is a volatile memory that functions as a working memory. RAM 33 temporarily stores data.
  • the storage unit 34 is a rewritable nonvolatile memory.
  • the storage unit 34 includes a hard disk drive (HDD), a solid state drive (SSD), and the like.
  • the storage unit 34 stores information such as program data, setting values as control data, and test processing results.
  • the communication unit 35 is a communication interface for communicating with the upper management device 101.
  • the processor 31 communicates with the upper management device 101 via the communication unit 35.
  • the processor 31 transmits data such as processing results to the higher-level management device 101 via the communication unit 35, and receives data from the higher-level management device 101.
  • the display interface 36 is an interface for connecting to the display device 14 as an output device.
  • the display interface 36 may be any interface that is compatible with the interface included in the display device 14.
  • the processor 31 controls the display content displayed on the display device 14 via the display interface 36.
  • the operation interface 37 is an interface for connecting to the operation device 15.
  • the operation interface 37 may be any interface as long as it corresponds to the interface included in the operation device 15.
  • the processor 31 acquires information input by the operating device 15 via the operating interface 37 .
  • the audio interface 38 is an interface for connecting to the speaker 16 as an output device.
  • the audio interface 38 may be any interface that is compatible with the interface included in the speaker 16.
  • the processor 31 outputs audio from the speaker 16 as an output device via the audio interface 38.
  • the image interface 39 is an interface for connecting to the photographing device 12.
  • the image interface 39 is an image acquisition unit (image acquisition interface) for acquiring a photographed image from the photographing device 12.
  • the image interface 39 may be any interface as long as it is compatible with an interface included in the imaging device 12 such as an X-ray CT device.
  • the processor 31 acquires, via the image interface 39, a photographed image (X-ray image) taken by an X-ray CT device serving as the photographing device 12. Furthermore, the processor 31 may control the photographing operation of the baggage M by the photographing device 12 via the image interface 39.
  • FIG. 4 is a block diagram showing a configuration example of the upper management device 101 in the information management system 100 including the inspection system 1 according to the embodiment.
  • the upper management device 101 is an information management device that manages information about the entire inspection system 1 .
  • the upper management device 101 is a computer that is communicatively connected to the inspection device 13 of the inspection system 1 provided at each inspection site.
  • the upper management device 101 is configured by, for example, a server device.
  • the upper management device 101 includes a processor 41, a ROM 42, a RAM 43, a storage section 44, and a communication section 45.
  • Processor 41 executes arithmetic processing.
  • the processor 41 is, for example, a CPU (Central Processing Unit).
  • the processor 41 functions as a processing unit that executes various processes by executing programs stored in the ROM 42 or the storage unit 44 using the RAM 43.
  • the ROM 42 is a read-only nonvolatile memory.
  • the ROM 42 stores program data, control data, and the like.
  • RAM 43 is a volatile memory that functions as working memory. RAM 43 temporarily stores data.
  • the storage unit 44 is a rewritable nonvolatile memory.
  • the storage unit 44 includes a hard disk drive (HDD), solid state drive (SSD), and the like.
  • The storage unit 44 stores information such as program data, setting values as control data, and data collected from each inspection device 13.
  • the communication unit 45 is a communication interface for communicating with the inspection device 13 in each inspection system 1.
  • the processor 41 communicates with the inspection device 13 via the communication unit 45.
  • the processor 41 receives data such as processing results from the inspection device 13 via the communication unit 45, and transmits data to the inspection device 13.
  • FIG. 5 is a flowchart for schematically explaining the flow of inspection processing in the inspection system 1 according to the embodiment.
  • FIGS. 6(a) to 6(d) are diagrams schematically showing examples of images obtained by each process of the inspection process.
  • packages M to be inspected are sequentially placed on a conveyor 11.
  • the conveyor 11 transports the loaded baggage M to a position where an image is taken by the photographing device 12.
  • the imaging unit 21 of the photographing device 12 acquires a photographed image showing the contents of the baggage M by irradiating electromagnetic waves onto the baggage M conveyed by the conveyor 11.
  • the photographed image may be image data that shows the condition inside the luggage M.
  • The X-ray CT device serving as the photographing device 12 acquires, as a photographed image, three-dimensional data indicating the state inside the baggage M by irradiating the baggage M transported to the photographing position with X-rays, and also acquires physical property information indicating the physical properties (density and effective atomic number) of each pixel or voxel constituting the photographed image.
  • the processing section 22 of the photographing device 12 generates photographed image data that includes a photographed image of the luggage M taken by the photographing section 21 and physical property information indicating the physical properties of each pixel or voxel in the photographed image.
  • Upon acquiring the photographed image data of the baggage M, the processing unit 22 of the photographing device 12 outputs the photographed image data to the inspection device 13 through the output unit 23.
  • the inspection device 13 acquires photographed image data from the photographing device 12 through the image interface 39 (step S101).
  • the processor 31 executes a process of acquiring captured image data as a process performed by the receiving section described above.
  • the inspection device 13 acquires photographed image data including a photographed image as shown in FIG. 6(a) and physical property information indicating the physical properties of each region (pixel or voxel) of the photographed image.
  • The photographed image may be two-dimensional data, three-dimensional data, or a plurality of two-dimensional image data obtained by slicing three-dimensional data along a specific axis.
  • In the following, it is assumed that the photographed image is two-dimensional image data.
  • the processor 31 of the inspection device 13 executes an object candidate detection process of detecting an object candidate as a specific object candidate from the captured image data acquired from the imaging device 12 (step S102).
  • The processor 31 executes the object candidate detection processing as processing by the object candidate detection section described above. For example, the processor 31 detects, as object candidates, locations (pixels) in the photographed image whose physical properties are similar to or consistent with those of the specific object, based on setting values set according to the physical properties of the specific object to be detected.
  • FIG. 6(b) is a diagram showing an example of object candidates detected from the captured image data as shown in FIG. 6(a).
  • The processor 31 of the inspection device 13 executes a specific part detection process of detecting, based on the photographed image data acquired from the photographing device 12, a specific part or object (specific part) in the photographed image of the baggage M where a specific object to be detected is likely to be placed (step S103).
  • the processor 31 executes a specific part detection process as the process by the specific part detection unit described above. For example, the processor 31 sets in advance the shape of a region or object to be detected as a specific part, and detects, as a specific part, an image area having a shape similar to the shape of the specific part to be detected in the photographed image.
  • FIG. 6(c) is a diagram showing an example of a specific part detected from the captured image data as shown in FIG. 6(a).
  • The processor 31 executes a relative relationship calculation process that calculates the relative relationship between each detected object candidate and each specific part (step S104).
  • The processor 31 executes this relative relationship calculation process as processing by the relative relationship calculation unit described above.
  • The relative relationship between the object candidate and the specific part is information for evaluating whether or not the object candidate is suspected to be the specific object to be detected.
  • the processor 31 calculates, for example, the relative distance between the object candidate and the specific part as a relative relationship.
  • When the processor 31 has calculated the relative relationship between the object candidate and the specific part, it detects object candidates suspected to be the specific object based on that relative relationship (step S105). For example, the processor 31 determines that an object candidate is suspected of being a specific object (detects it as a specific object) when its relative relationship (for example, relative distance) with the specific part satisfies a predetermined condition defined by a preset value.
  • The processor 31 executes notification processing to notify (announce) the detection result of object candidates suspected to be the specific object, based on the relative relationship between the object candidate and the specific part, using the display device 14 and the speaker 16 as output devices (step S106). For example, when the processor 31 detects an object candidate suspected to be a specific object, it executes, as processing by the transmitting unit described above, alarm processing to notify the detection result through an output device such as the display device 14 or the speaker 16.
  • the processor 31 causes the display device 14 to display object candidates whose relative distance to the specific portion is less than or equal to a preset threshold as being suspected to be the specific object.
  • FIG. 6(d) is a diagram showing a display example of object candidates determined to be suspected of being specific objects based on the relative relationship between the object candidates shown in FIG. 6(b) and the specific parts shown in FIG. 6(c), together with the specific parts associated with those object candidates.
  • the baggage for which the detection result of the target object candidate is displayed on the display device 14 through the notification process is inspected by the inspector.
  • the inspector performs inspection work to inspect the contents of the baggage while referring to information such as the detection results displayed by the notification process.
  • In the inspection system 1, inspection work may be performed on all packages M, or only on packages M in which an object candidate suspected of being a specific object has been detected. The overall flow is sketched below.
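  • Tying the steps together, the flow of steps S102 to S106 could be sketched as follows. This is a minimal sketch: the detection and distance callables stand in for the processes detailed below, and all names are illustrative.

```python
from typing import Callable, List, Tuple
import numpy as np

def inspect(property_image: np.ndarray,
            detect_candidates: Callable[[np.ndarray], List[np.ndarray]],
            detect_specific_parts: Callable[[np.ndarray], List[np.ndarray]],
            relative_distance: Callable[[np.ndarray, np.ndarray], float],
            distance_limit: float) -> List[Tuple[int, int]]:
    """Sketch of steps S102-S106: detect object candidates and specific
    parts, evaluate their relative relationship, and return the pairs
    whose relative distance satisfies the notification condition."""
    candidates = detect_candidates(property_image)   # step S102
    parts = detect_specific_parts(property_image)    # step S103
    suspects = []
    for ci, cand in enumerate(candidates):
        for pi, part in enumerate(parts):
            if relative_distance(cand, part) <= distance_limit:  # S104/S105
                suspects.append((ci, pi))
                break
    return suspects  # step S106: these pairs would be notified/highlighted
```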
  • FIG. 7 is a flowchart for explaining an example of object candidate detection processing performed by the object candidate detection unit of the inspection device 13 according to the embodiment.
  • The processor 31 of the inspection device 13, acting as the object candidate detection unit, detects object candidates whose physical properties are similar to or coincide with those of the specific object to be detected, based on, for example, the photographed image data of the baggage M obtained from the photographing device 12 and the effective atomic number and density data of each part of the photographed image. Specifically, each pixel or voxel of the captured image serving as input data is determined to be part of an object candidate depending on whether its effective atomic number and density fall within a threshold range determined from the physical property values of the specific object collected in advance.
  • the processor 31 of the inspection device 13 acquires the physical property value Z of the specific object to be detected (step S201).
  • the physical property value Z of the specific object may be input by a manager or an inspector using the operating device 15, or may be obtained from an external device such as the upper management device 101.
  • Upon acquiring the physical property value Z of the specific object, the processor 31 sets a threshold value E based on the physical property value Z (step S202). For example, the processor 31 sets the threshold value E so as to define a predetermined tolerance range around the physical property value Z of the specific object.
  • Steps S201 and S202 may be replaced with a process in which the administrator or inspector sets the threshold value E using the operating device 15. Alternatively, in the processing of steps S201 and S202, information specifying the threshold E may be obtained from the upper management device 101, and the threshold E obtained from the upper management device 101 may be set as the setting value for the object candidate detection process.
  • The processor 31 of the inspection device 13, having set the threshold value E as the setting value for the object candidate detection process, acquires captured image data including a photographed image of the baggage M to be inspected taken by the photographing device 12 using electromagnetic waves such as X-rays, together with physical property information indicating the physical properties of each part of the photographed image (step S203). Upon acquiring the photographed image data of the baggage M from the photographing device 12, the processor 31 executes a process of detecting object candidates included in the photographed image (steps S204 to S207).
  • In the processing example shown in FIG. 7, in step S205 the processor 31 determines, for each pixel p of the photographed image, whether the difference |I(p) - Z| between the physical property value I(p) of the pixel and the physical property value Z of the specific object is less than the threshold value E, and detects pixels satisfying this condition as object candidate pixels.
  • The processor 31 detects, as an object candidate, an image region consisting of a set of pixels detected as object candidate pixels (step S208). For example, the processor 31 detects, as an object candidate, an image region formed by pixels whose physical property value I(p) differs from the physical property value Z of the specific object by less than the threshold value E (for example, a region formed by such pixels whose relative distances to one another are less than or equal to a predetermined distance), as sketched below.
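  • A minimal sketch of this candidate pixel test and grouping (steps S204 to S208), assuming the physical property values are given as a NumPy array; the connected-component grouping via SciPy and the minimum-region-size filter are added assumptions, not part of the disclosure.

```python
import numpy as np
from scipy import ndimage

def detect_object_candidates(property_image: np.ndarray,
                             target_z: float,
                             threshold_e: float,
                             min_pixels: int = 10) -> list:
    """Flag pixels whose physical property value I(p) differs from the
    specific object's value Z by less than E, then group adjacent
    flagged pixels into candidate regions."""
    # Per-pixel test |I(p) - Z| < E (step S205).
    candidate_pixels = np.abs(property_image - target_z) < threshold_e
    # Connected sets of candidate pixels form object candidates (step S208).
    labeled, num_regions = ndimage.label(candidate_pixels)
    regions = []
    for region_id in range(1, num_regions + 1):
        mask = labeled == region_id
        if mask.sum() >= min_pixels:  # illustrative filter against stray pixels
            regions.append(mask)
    return regions
```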
  • In this way, the inspection device detects object candidates based on whether the difference between the physical property values (such as effective atomic number and density) of each part of the photographed image obtained by irradiating electromagnetic waves such as X-rays and the physical property value of the specific object is within a predetermined threshold.
  • Thereby, the inspection device according to the embodiment can detect candidates for the specific object present in the baggage even if the specific object to be detected is a substance, such as a liquid or powder, that does not have a specific shape.
  • In the processing example described above, a two-dimensional image is used as the processing target, and it is determined for each pixel whether or not it is an object candidate pixel. If the photographed image is three-dimensional data, the same determination may be performed for each voxel, so that object candidates in a three-dimensional photographed image are detected.
  • FIG. 8 is a diagram showing an example of a plurality of two-dimensional images obtained by slicing (dividing) three-dimensional data as a captured image along each axis (x-axis, y-axis, z-axis).
  • In this case, the processor 31 executes the object candidate detection process described above for each piece of two-dimensional image data. After detecting the object candidates in each two-dimensional image, the processor 31 may detect the object candidates in three-dimensional space by superimposing the detection results from each two-dimensional image, as in the sketch below.
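  • A sketch of this slice-and-superimpose approach in NumPy follows; the function names are illustrative, and the per-slice detector is assumed to return a boolean mask.

```python
import numpy as np

def slices_along_axes(volume: np.ndarray):
    """Yield every 2D slice of a 3D captured image along the x, y, and
    z axes, as in FIG. 8."""
    for axis in range(3):
        for index in range(volume.shape[axis]):
            yield axis, index, np.take(volume, index, axis=axis)

def merge_slice_detections(volume_shape, detections) -> np.ndarray:
    """Superimpose per-slice boolean detection masks back into a 3D mask."""
    merged = np.zeros(volume_shape, dtype=bool)
    for axis, index, mask_2d in detections:
        slicer = [slice(None)] * 3
        slicer[axis] = index          # put the 2D mask back at its slice
        merged[tuple(slicer)] |= mask_2d
    return merged
```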
  • the process for detecting object candidates is not limited to the method described above, but may be any method that can detect object candidates using data on specific objects collected in advance.
  • a hash table, machine learning, or the like may be used to detect object candidates.
  • general object recognition may be used to detect the target object candidate.
  • The processor 31 of the inspection device 13 calculates the relative relationship between each object candidate detected from the captured image and each specific part.
  • the relative relationship is calculated as an index value for evaluating whether the object candidate is a specific object to be detected. For example, as the relative relationship, the distance (relative distance) between the target object candidate and the specific part is calculated. In this case, it can be determined that the object candidate that is close to the specific part is suspected to be the specific object.
  • an inclusive relationship between the target object candidate and the specific part, or a positional relationship such as whether they are adjacent or not, may be calculated.
  • the object candidate within the specific portion or the object candidate adjacent to the specific portion can be evaluated as being suspected to be the specific object.
  • As the relative relationship, physical property information (for example, average values of density, effective atomic number, and the like) of the object candidate and of the specific part may also be calculated. Thereby, an object candidate suspected of being a specific object can be detected even in a state where the substance forming the specific part and the substance forming the object candidate coexist.
  • FIG. 9 is a flowchart for explaining the process of calculating the relative distance between the object candidate and the specific part as an example of the process of calculating the relative relationship by the relative relationship calculation unit of the inspection apparatus 13 according to the embodiment.
  • a processing example will be described in which the relative distance between the target object candidate and the specific portion is calculated based on the distance between each pixel detected as the target object candidate and each pixel forming the specific portion.
  • One example of calculating the relative distance as the relative relationship between the object candidate and the specific part is to compute the Euclidean distance between the center position of the object candidate and the center position of the specific part.
  • However, the distance between center positions is greatly influenced by the shapes of the object candidate and the specific part.
  • In the processing example described here, therefore, the relative distance between the object candidate and the specific part is calculated based on the distances between each pixel of the object candidate and each pixel of the specific part.
  • the processor 31 of the inspection device 13 acquires information indicating the area of the object candidate T and the area of the specific part W detected in the photographed image of the baggage M to be inspected (step S301).
  • The processor 31 acquires the image area of the object candidate T and the image area of the specific part W for which the relative relationship is to be calculated. The processor 31 calculates the relative relationship (relative distance) by executing the processes of steps S301 to S307 for all combinations of each object candidate and each specific part.
  • Upon acquiring information indicating the image area of the specific part W in the captured image, the processor 31 sets the total number of pixels forming the image of the specific part W as the number of loops for the specific part (step S302). Similarly, upon acquiring information indicating the image area of the object candidate T in the photographed image, the processor 31 sets the total number of pixels forming the image of the object candidate T as the number of loops for the object candidate (step S303).
  • The processor 31 calculates the distance between each pixel of the object candidate and each pixel of the specific part, and stores the calculated inter-pixel distances in the RAM 33 or the storage unit 34 as distances for calculating the relative distance between the specific part W and the object candidate T (step S304).
  • The processor 31 repeatedly executes the process of calculating the distance from one pixel of the object candidate to each pixel of the specific part, for the number of pixels of the specific part (the number of loops for the specific part) (step S305).
  • The processor 31 then executes the process of calculating the distance from the next pixel selected from the object candidate to each pixel of the specific part.
  • The processor 31 repeats this process for pixels sequentially selected from the object candidate, for the total number of pixels of the object candidate (the number of loops for the object candidate) (step S306). Through the processing of steps S302 to S306, the processor 31 calculates the distances between all pixels of the specific part and all pixels of the object candidate.
  • The processor 31 calculates the relative distance between the object candidate T and the specific part W, as their relative relationship, based on the inter-pixel distances stored in the RAM 33 or the storage unit 34 (step S307). For example, the processor 31 may take, as the relative relationship, the minimum value among the distances calculated for all combinations of each pixel of the object candidate T and each pixel of the specific part W.
  • However, noise may be mistakenly detected as a specific part, and such noise may appear at a position close to the object candidate. In that case, if even a small amount of noise lies closer to the object candidate than the actual specific part does, the minimum distance to the object candidate will be calculated as a value smaller than the actual distance to the specific part.
  • Therefore, after calculating the distances between each pixel detected as the object candidate and each pixel of the specific part as the relative relationship (relative distance), the processor 31 may take a percentile value instead of the minimum value. By using a percentile value for the relative relationship indicating the distances between the pixels of the object candidate and the pixels of the specific part, the evaluation of whether an object candidate is likely to be a specific object remains robust even when noise is detected as the specific part.
  • Further, the processor 31 may reduce the number of pixel combinations for which distances are calculated by computing distances only for pixels sampled from the object candidate and pixels sampled from the specific part. Reducing the number of pixel combinations in the processing example shown in FIG. 9 reduces the amount of calculation in the relative relationship calculation process; a sketch covering both the percentile variant and this sampling follows.
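  • The pixel-to-pixel distance calculation of steps S301 to S307, together with the percentile variant and the optional sampling just described, could be sketched as follows; the parameter names and defaults are assumptions.

```python
import numpy as np

def relative_distance(candidate_mask: np.ndarray,
                      part_mask: np.ndarray,
                      percentile: float = 0.0,
                      sample: int = None,
                      seed: int = 0) -> float:
    """Distance between every object candidate pixel (or voxel) and every
    specific-part pixel; percentile=0 gives the minimum, larger values
    give a noise-robust percentile distance."""
    cand = np.argwhere(candidate_mask).astype(float)  # (Nc, 2) or (Nc, 3)
    part = np.argwhere(part_mask).astype(float)       # (Nw, 2) or (Nw, 3)
    if sample is not None:  # optional subsampling to cut computation
        rng = np.random.default_rng(seed)
        cand = cand[rng.choice(len(cand), min(sample, len(cand)), replace=False)]
        part = part[rng.choice(len(part), min(sample, len(part)), replace=False)]
    # All pairwise Euclidean distances, shape (Nc, Nw).
    dists = np.linalg.norm(cand[:, None, :] - part[None, :, :], axis=-1)
    return float(np.percentile(dists, percentile))  # 0th percentile == minimum
```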
  • In the above processing example, the distances between pixels are calculated using a two-dimensional image as the processing target; if the captured image is three-dimensional data, it suffices to calculate the distances between each voxel of the object candidate and each voxel of the specific part region. Thereby, even if the photographed image is three-dimensional data, the relative distance can be calculated as the relative relationship between the object candidate and the specific part.
  • The processor 31 of the inspection device 13 executes notification processing to report the detection results of specific objects based on the relative relationships using an output device. For example, the processor 31 of the inspection device 13 displays on the display device 14 an inspection result screen in which the image areas suspected to be the specific object (the image areas detected as the specific object) in the photographed image of the baggage M are highlighted.
  • FIG. 10 is a diagram illustrating a display example in which the inspection device 13 displays the inspection results of the specific object in the baggage M on the display device 14 as a notification process.
  • The display example shown in FIG. 10 is a guidance display in which a photographed image of the baggage M taken by the photographing device 12 is shown, and the image regions of object candidates determined to be suspected of being a specific object based on their relative relationship with a specific part (the image regions detected as the specific object) are highlighted on the photographed image.
  • the image area suspected to be a specific object may be highlighted so that it can be visually distinguished from other image areas. For example, an image area suspected of being a specific object may be highlighted by displaying it in a different color from other areas. Further, an image area suspected of being a specific object may be displayed in a highlighted manner by surrounding it with a square, an ellipse, or the like.
  • As the results of the inspection process displayed on the display device 14 through the notification process, the inspection device 13 may display not only the image areas of object candidates determined to be suspected of being the specific object, but also the specific parts, the other object candidates, and information indicating the relative relationships.
  • FIG. 11 is a diagram showing an example of a display in which information indicating the object candidates detected by the object candidate detection process is shown together with the image areas determined to be suspected of being specific objects in the photographed image (the detection results of specific objects).
  • In the display example shown in FIG. 11, the image areas of object candidates other than those determined to be suspected of being specific objects are surrounded by elliptical guide lines so that the inspector can visually recognize them.
  • the area detected as a target object candidate may be displayed so as to be surrounded by a rectangle or the like, or may be displayed in a specific color set as the area of the target object candidate.
  • FIG. 12 is a diagram illustrating a display example in which a guide indicating a specific part detected by the specific part detection process is displayed together with an image area determined to be suspected of being a specific target object in a photographed image.
  • the display example shown in FIG. 12 is an example in which a rectangular guide line surrounds the image area of the specific part so that the inspector can visually recognize the image area detected as the specific part in the photographed image.
  • The image area detected as the specific part may instead be displayed surrounded by an ellipse or the like, or may be displayed in a specific color set for the specific part area.
  • FIG. 13 is a diagram illustrating a display example in which information indicating the relative relationship (relative distance) of each object candidate with respect to the specific part is displayed together with the image areas determined to be suspected of being a specific object in a photographed image.
  • information (number of pixels) indicating the relative distance as the relative relationship of each object candidate to the part detected as the specific part is displayed.
  • object candidates and specific parts may be displayed in the captured image, and information such as density and effective atomic number in the image area of the object candidates and specific parts may be displayed side by side (or overlapping).
  • FIG. 14 is a diagram illustrating a display example in which a detection result of a specific target object is displayed and buttons for instructing display of target object candidates and specific parts are provided.
  • In the display example shown in FIG. 14, a button for instructing display of all object candidates, a button for instructing display of the specific parts, and buttons for selecting individual object candidates are displayed.
  • When display of all object candidates is instructed, images of all detected object candidates are displayed on the display device 14.
  • When an individual object candidate is selected, the display device 14 displays an image of the selected object candidate together with information regarding that object candidate (for example, information indicating its physical properties, or information indicating its relative relationship with the specific part).
  • the processor 31 of the inspection device 13 may display the captured image on the display device 14 in 3D.
  • the processor 31 may display the area of the object candidate detected as the specific object in a different color from the other areas.
  • the processor 31 may display object candidates, specific parts, etc. on the photographed image according to instructions from the inspector.
  • The processor 31 may rotate, enlarge, reduce, or move the 3D photographed image in accordance with operations by the inspector.
  • the processor 31 may display on the display device 14 a two-dimensional image sliced at a location (plane) designated by the inspector.
  • FIG. 15 is a diagram schematically showing an example of displaying a two-dimensional image obtained by slicing a three-dimensional captured image at a designated location.
  • In the display example shown in FIG. 15, the inspection device 13 displays on the display device 14 a two-dimensional image obtained by slicing the three-dimensional captured image at a designated location, and further displays on that image the object candidates, the specific parts, the areas detected as specific objects, and the like.
  • the location where the three-dimensional photographed image is sliced may be a predetermined location set in advance, or a location designated by the inspector using the operating device 15.
  • The processing of the inspection device 13 described above may be configured to be executed by the imaging device 12. That is, in the inspection system 1 described above, the photographing device 12 and the inspection device 13 may be realized as an integrated device.
  • Information indicating the various processing results and inspection work results of the inspection device 13 may be collected by the upper management device 101; that is, such information may be transmitted from the inspection device 13 to the upper management device 101 and stored there.
  • The upper management device 101 may analyze the information collected from the inspection devices 13 of the inspection systems 1 located in various places, and change the parameters of the various processes executed by each inspection device 13 according to the analysis results.
  • For example, the upper management device may collect, from each inspection device, information indicating the parts and objects in which specific objects were actually found during inspection work, and set parameters for the specific part detection process in each inspection device so that those parts and objects are detected as specific parts. In addition, the upper management device may calculate, from the information collected from each inspection device, statistics such as the probability (accuracy) that an object candidate detected as a specific object is confirmed to be a specific object in actual inspection work, and adjust, according to those statistics, the setting values used to determine an object candidate as a specific object based on its relative relationship with the specific part.
  • the inspection apparatus acquires photographed image data including a photographed image photographed by irradiating an inspection target with electromagnetic waves and physical property information on each part of the photographed image.
  • the inspection device detects the target object candidates and the specific parts in the photographed image, and reports any target object candidate that is suspected to be the specific target object based on the relative relationship between the target object candidate and the specific part.
  • in this way, the inspection device can detect a specific part where a specific target object is likely to be placed, and can identify an area suspected of being a specific target object based on the relative relationship between the specific part and the object candidate.
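
As a non-limiting illustration of the slice display described in the list above, the following Python sketch cuts a two-dimensional cross-section out of a three-dimensional photographed volume at a designated plane. The (axis, index) parameterization of the designated location and the use of numpy are assumptions for illustration, not details from the embodiment.

```python
# Minimal sketch: extract a 2-D display slice from a 3-D photographed volume
# at a designated plane. The (axis, index) encoding of the "designated
# location" is a hypothetical choice.
import numpy as np

def slice_for_display(volume: np.ndarray, axis: int, index: int) -> np.ndarray:
    """Return the 2-D cross-section of `volume` at `index` along `axis` (0-2)."""
    return np.take(volume, index, axis=axis)

# Example: a slice through the middle of a volume along its third axis.
volume = np.zeros((64, 64, 64))
middle = slice_for_display(volume, axis=2, index=32)
```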

Abstract

Provided are an inspection device, an inspection system, and an inspection method capable of accurately detecting a specific target object in an inspection target. According to an embodiment, the inspection device includes an image acquiring unit and a processor. The image acquiring unit acquires captured image data including a captured image captured by emitting electromagnetic waves onto an inspection target, and physical property information representing a physical property of each portion of the captured image. The processor: detects a target object candidate, which is a candidate of a specific target object in the inspection target, on the basis of the physical property information included in the captured image data; detects a specific portion, which is a specific part or object in the inspection target; calculates a relative relationship between the target object candidate and the specific portion; and reports a target object candidate for which the calculated relative relationship satisfies a predetermined condition.

Description

Inspection device, inspection system, and inspection method
 Embodiments of the present invention relate to an inspection device, an inspection system, and an inspection method.
 Conventionally, at an inspection site, it is sometimes necessary to perform inspection work to determine whether a predetermined object (hereinafter also referred to as a specific object) exists within an object to be inspected, such as baggage. When there are many packages to be inspected, it takes a great deal of time and effort for inspectors to open and inspect all of them. For this reason, at an inspection site where a large number of packages are inspected, it is desirable to perform open inspections only on packages that various screening methods estimate are likely to contain specific objects.
 In recent years, inspection systems have been proposed that detect a specific object in baggage by analyzing a photographed image (X-ray image) taken of the baggage using electromagnetic waves such as X-rays. Such an inspection system detects what is presumed to be a specific object made of a specific substance, for example, by determining whether the effective atomic number (the average atomic number of the substances contained in each pixel of the X-ray image) is within a threshold. However, it is difficult to distinguish a specific object from an object having similar physical properties using only the effective atomic number and density information obtained from an X-ray image.
Japanese Patent Application Publication No. 2009-92659
 The problem to be solved by the present invention is to provide an inspection device, an inspection system, and an inspection method that can accurately detect a specific object in an inspection target.
 According to the embodiment, the inspection device includes an image acquisition unit and a processor. The image acquisition unit acquires photographed image data including a photographed image captured by irradiating the inspection target with electromagnetic waves and physical property information indicating the physical properties of each part of the photographed image. The processor detects object candidates, which are candidates for a specific object in the inspection target, based on the physical property information included in the photographed image data; detects specific parts, which are specific regions or objects in the inspection target; calculates the relative relationship between each object candidate and each specific part; and reports any object candidate whose calculated relative relationship satisfies a predetermined condition.
FIG. 1 is a diagram schematically showing the overall configuration of an inspection system including an inspection device according to an embodiment.
FIG. 2 is a block diagram showing a configuration example of an information management system including the inspection device according to the embodiment.
FIG. 3 is a block diagram showing a configuration example of the control systems of the imaging device and the inspection device in the inspection system according to the embodiment.
FIG. 4 is a block diagram showing a configuration example of an upper management device in the information management system including the inspection device according to the embodiment.
FIG. 5 is a flowchart for explaining the overall flow of inspection processing by the inspection device according to the embodiment.
FIG. 6(a) is a diagram showing an example of a photographed image captured by the imaging device and supplied to the inspection device according to the embodiment. FIG. 6(b) is a diagram showing an example of an image of object candidates that the inspection device according to the embodiment detects from the photographed image. FIG. 6(c) is a diagram showing an example of an image of specific parts that the inspection device according to the embodiment detects from the photographed image. FIG. 6(d) is a diagram showing an example of object candidates detected as specific objects based on the relative relationship between the object candidates detected by the inspection device according to the embodiment and the specific parts.
FIG. 7 is a flowchart for explaining an operation example of object candidate detection processing in the inspection device according to the embodiment.
FIG. 8 is a diagram showing an example of two-dimensional images generated by slicing, along each axis, a three-dimensional image captured by the imaging device and supplied to the inspection device according to the embodiment.
FIG. 9 is a flowchart for explaining an operation example of relative relationship calculation processing in the inspection device according to the embodiment.
FIG. 10 is a diagram showing a display example of inspection results for a specific object displayed on the display device by the inspection device according to the embodiment.
FIG. 11 is a diagram showing a display example in which the inspection device according to the embodiment displays the detection results of specific objects in a photographed image.
FIG. 12 is a diagram showing a display example in which the inspection device according to the embodiment displays the detection results of specific parts in a photographed image.
FIG. 13 is a diagram showing a display example in which the inspection device according to the embodiment displays the relative distance between a specific part and an object candidate in a photographed image.
FIG. 14 is a diagram showing an example in which the inspection device according to the embodiment displays the detection results of a specific object together with buttons for instructing the display of object candidates and specific parts.
FIG. 15 is a diagram schematically showing an example in which the inspection device according to the embodiment displays a two-dimensional image obtained by slicing a three-dimensional photographed image at a designated location.
Embodiment
 Embodiments will be described below with reference to the drawings.
 FIG. 1 is a diagram schematically explaining a configuration example of an inspection system 1 including an inspection device 13 according to an embodiment.
 The inspection system 1 according to the embodiment is a system for inspecting whether a specific detection object exists in baggage to be inspected. Specific detection objects (specific objects) are assumed to include, for example, dangerous goods, dangerous chemicals, drugs whose handling is prohibited, and substances prohibited from being brought into or out of a predetermined area such as a country. Furthermore, a specific detection object need not be a solid with a specific shape; it also includes substances such as liquids and powders.
 In the configuration example shown in FIG. 1, the inspection system 1 includes a conveyor 11, an imaging device 12, an inspection device 13, a display device 14, an operating device 15, a speaker 16, and the like. The inspection device 13 is communicatively connected to the imaging device 12, the display device 14, the operating device 15, the speaker 16, and the like.
 The conveyor 11 is a device that conveys the baggage M to be inspected. The conveyor 11 transports the baggage M to be inspected to the position where the imaging device 12 captures an image (the reading position). For example, the conveyor 11 conveys baggage M supplied by a worker. The conveyor 11 may also be configured to convey baggage M supplied by a robot arm or the like.
 The imaging device 12 irradiates the baggage M to be inspected with electromagnetic waves to acquire photographed image data including a photographed image of the inspection target and physical property information indicating the physical properties of each part of the photographed image. The imaging device 12 supplies the photographed image data of the baggage M to the inspection device 13. The imaging device 12 need only acquire photographed image data from which the inspection device 13 can detect, in the baggage M, candidates presumed to be specific detection objects described later (hereinafter also referred to as object candidates) and specific regions or objects (hereinafter also referred to as specific parts). The imaging device 12 may acquire two-dimensional image data or three-dimensional image data as the photographed image.
 The imaging device 12 is, for example, an X-ray CT imaging device. An X-ray CT imaging device as an example of the imaging device 12 acquires three-dimensional X-ray image data as the photographed image by irradiating X-rays from around the baggage M conveyed by the conveyor 11. The X-ray CT imaging device as the imaging device 12 acquires photographed image data including a three-dimensional X-ray image of the baggage M and physical property information indicating the physical properties of each constituent unit (pixel or voxel) of the X-ray image. The X-ray CT imaging device as the imaging device 12 supplies the photographed image data acquired from the baggage M to the inspection device 13.
 Note that the imaging device 12 is not limited to an X-ray CT imaging device; however, the embodiment described below assumes that the imaging device 12 is an X-ray CT imaging device.
 The inspection device 13 has various functions, such as a function to process the photographed image of the baggage M taken by the imaging device 12 using electromagnetic waves.
 For example, the inspection device 13 has a function (receiving unit) to acquire the photographed image of the baggage M taken by the imaging device 12 using electromagnetic waves, and a function (transmitting unit) to output, using an output device such as the display device 14 or the speaker 16, output information based on the information obtained by the inspection processing described later.
 The inspection device 13 also has a function (object candidate detection unit) of detecting, from the photographed image data acquired from the imaging device 12, candidates presumed to be specific objects present in the baggage M (object candidates). The inspection device 13 detects object candidates in the photographed image of the baggage M acquired from the imaging device 12 based on setting values determined from the physical properties of the specific object.
 For example, the inspection device 13 detects object candidates based on the physical properties (such as density and effective atomic number) of each pixel or voxel in the photographed image (X-ray image) captured by the X-ray CT imaging device as the imaging device 12. A pixel is the smallest unit constituting two-dimensional image data. A voxel is the smallest unit of data constituting three-dimensional data and represents a value on a regular grid; it corresponds to a pixel in two-dimensional image data.
 The inspection device 13 also has a function (specific part detection unit) of detecting a specific region or specific object (hereinafter also simply referred to as a specific part) from the image of the baggage M. For example, the inspection device 13 detects a specific part, which is a specific region or object, in the photographed image captured by the imaging device 12. A specific part is a region or object in which, according to previously collected data, specific objects are often hidden. A specific part may be a region or object having a predetermined shape, or a region or object located at a predetermined position in the baggage. Examples of specific parts include shoes, cameras, personal computers, the walls of luggage such as trunks, and cigarettes.
 The specific part detection unit of the inspection device 13 extracts, from the photographed image captured by the imaging device 12, an image of a region likely to be a part or object set (learned) in advance as a specific part, and detects the specific part by recognizing the shape of the extracted image. Specifically, the specific part detection unit of the inspection device 13 can detect specific parts using semantic segmentation, which associates a label or category with every pixel in the photographed image.
 However, the method for detecting specific parts applied to the specific part detection unit is not limited to the method described above; it may also be a method using general image recognition, machine learning that takes as input feature values such as HOG (Histograms of Oriented Gradients) features extracted from the image, or object recognition such as SSD (Single Shot Detector / Single Shot MultiBox Detector) using deep learning.
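
 Purely as an illustration of the semantic segmentation approach mentioned above, the following Python sketch labels every pixel and keeps the pixels assigned to specific-part classes. The choice of a torchvision DeepLabV3 model, the number of classes, and the label IDs are assumptions; the embodiment does not specify a particular network.

```python
# Minimal sketch of segmentation-based specific part detection. The model,
# class count, and SPECIFIC_PART_CLASSES label IDs are hypothetical.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

SPECIFIC_PART_CLASSES = {1, 2}  # hypothetical IDs for e.g. "shoe", "camera"

model = deeplabv3_resnet50(weights=None, num_classes=21).eval()

def detect_specific_parts(image: torch.Tensor) -> torch.Tensor:
    """image: normalized float tensor of shape (3, H, W).
    Returns a boolean (H, W) mask of pixels labeled as a specific part."""
    with torch.no_grad():
        logits = model(image.unsqueeze(0))["out"]   # (1, C, H, W) class scores
    labels = logits.argmax(dim=1).squeeze(0)        # per-pixel class IDs
    mask = torch.zeros_like(labels, dtype=torch.bool)
    for cls in SPECIFIC_PART_CLASSES:
        mask |= labels == cls
    return mask
```

 In practice such a network would be trained on photographed images annotated with the specific parts (shoes, cameras, trunk walls, and so on) described above.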
 The inspection device 13 also has a function (relative relationship calculation unit) of evaluating, based on a relative relationship such as the distance between each object candidate and a specific part in the image captured by the imaging device 12, whether an object candidate appears to be a specific object (that is, is suspected to be a specific object). For example, the inspection device 13 determines that an object candidate is highly likely to be a specific object when the candidate is adjacent to or contained in a specific part. The inspection device 13 may also calculate the relative distance between the object candidate and the specific part as a positional relationship, and determine that the object candidate is highly likely to be a specific detection object when the calculated relative distance is less than or equal to a predetermined value.
 The display device 14 is an output device for notifying the inspector of inspection results. The display device 14 displays guidance screens and the like under the control of the inspection device 13. As a guidance screen presented to the inspector, the display device 14 displays a screen showing, for example, the results of the inspection processing on the photographed image of the baggage M. For example, the display device 14 displays an image, generated by the inspection device 13, in which the object candidates and specific parts are clearly indicated on the image captured by the imaging device 12.
 The operating device 15 generates an operation signal according to an operation input by the inspector (operator) and supplies the operation signal to the inspection device 13. The operating device 15 includes operating devices such as a keyboard and a pointing device. The operating device 15 may also be configured as a touch panel provided on the display screen of the display device 14.
 The speaker 16 is an output device for notifying the inspector of inspection results and the like by sound. The speaker 16 outputs audio guidance to the inspector according to the results of the inspection processing.
 The inspection device 13 also has a function of executing notification processing that reports information such as object candidates to the inspector. For example, the inspection device 13 displays on the display device 14 the object candidates determined to be highly likely to be specific objects, together with the photographed image of the baggage M conveyed by the conveyor 11 and captured by the imaging device 12.
 The inspection device 13 may also display all object candidates in the photographed image of the baggage M on the display device 14, with the candidates determined to be highly likely to be specific objects shown in a different color or with a different mark from the other candidates. The inspection device 13 may also display on the display device 14 images in which the object candidates and the specific parts are each clearly indicated on the photographed image of the baggage M. Furthermore, the inspection device 13 may switch the content displayed on the display device 14 according to instructions from the worker via the operating device 15.
 Next, the configuration of the information management system 100 including the inspection system 1 according to the embodiment will be described.
 FIG. 2 is a diagram showing a configuration example of the information management system 100 including the inspection system 1 according to the embodiment.
 As shown in FIG. 2, the information management system 100 includes an upper management device 101 communicatively connected to the inspection devices 13 of the inspection systems 1 provided at the respective inspection sites. The upper management device 101 functions as an information management device that collects data from, and supplies data to, the inspection devices of each inspection system 1.
 The upper management device 101 is composed of, for example, a computer such as a server device. The upper management device 101 includes a storage device that stores information about the inspections performed by each inspection system 1. The upper management device 101 may also include an interface for connecting to a server device that stores information about the inspections.
 The upper management device 101 acquires information from the inspection device 13 of each inspection system 1, and stores and aggregates the acquired information. The upper management device 101 also supplies information to the inspection device 13 of each inspection system 1. For example, the upper management device 101 distributes setting values used in inspection processing to the inspection device 13 of each inspection system 1. The upper management device 101 may also distribute update data for the programs with which each inspection device 13 executes inspection processing.
 Next, the configuration of the control system of the inspection device 13 in the inspection system 1 according to the embodiment will be described.
 FIG. 3 is a block diagram showing a configuration example of the control system of the inspection device 13 in the inspection system 1 according to the embodiment.
 As shown in FIG. 3, the imaging device 12 includes an imaging unit 21, a processing unit 22, and an output unit 23.
 The imaging unit 21 captures an image by irradiating an object to be photographed, such as the baggage M to be inspected, with electromagnetic waves such as X-rays. For example, when the imaging device 12 is an X-ray CT imaging device, the imaging unit 21 acquires three-dimensional X-ray image data by irradiating X-rays from around the baggage M conveyed by the conveyor 11.
 The processing unit 22 includes a processor, various memories, and the like; the processor executes various processes by running programs stored in the memory. For example, the processing unit 22 processes the image captured by the imaging unit 21 using electromagnetic waves and generates photographed image data including the photographed image and physical property information indicating the physical properties of each constituent unit (pixel or voxel) of the photographed image.
 The output unit 23 is an interface that outputs data such as photographed image data. The output unit 23 includes an interface corresponding to the image interface 39 of the inspection device 13 and outputs photographed image data to the inspection device 13. The output unit 23 may also be an input/output interface that additionally accepts data such as control data from the connected inspection device 13.
 As also shown in FIG. 3, the inspection device 13 includes a processor 31, a ROM 32, a RAM 33, a storage unit 34, a communication unit 35, a display interface (I/F) 36, an operation interface (I/F) 37, an audio interface (I/F) 38, and an image interface (I/F) 39.
 The processor 31 executes arithmetic processing. The processor 31 is, for example, a CPU (Central Processing Unit). The processor 31 functions as a processing unit that executes various processes by running programs stored in the ROM 32 or the storage unit 34, using the RAM 33 as a working area.
 The ROM 32 is a read-only nonvolatile memory that stores program data, control data, and the like. The RAM 33 is a volatile memory that functions as a working memory and temporarily stores data.
 The storage unit 34 is a rewritable nonvolatile memory, configured as a hard disk drive (HDD), a solid state drive (SSD), or the like. The storage unit 34 stores information such as program data, setting values as control data, and the results of inspection processing.
 The communication unit 35 is a communication interface for communicating with the upper management device 101. The processor 31 communicates with the upper management device 101 via the communication unit 35, transmitting data such as processing results to the upper management device 101 and receiving data from it.
 The display interface 36 is an interface for connecting to the display device 14 as an output device. The display interface 36 need only correspond to the interface provided in the display device 14. The processor 31 controls the content displayed on the display device 14 via the display interface 36.
 The operation interface 37 is an interface for connecting to the operating device 15. The operation interface 37 need only correspond to the interface provided in the operating device 15. The processor 31 acquires the information input via the operating device 15 through the operation interface 37.
 The audio interface 38 is an interface for connecting to the speaker 16 as an output device. The audio interface 38 need only correspond to the interface provided in the speaker 16. The processor 31 outputs audio from the speaker 16 via the audio interface 38.
 The image interface 39 is an interface for connecting to the imaging device 12, and serves as an image acquisition unit for acquiring photographed images from the imaging device 12. The image interface 39 need only correspond to the interface provided in the imaging device 12, such as an X-ray CT device. The processor 31 acquires, via the image interface 39, the photographed image (X-ray image) captured by the X-ray CT device as the imaging device 12. The processor 31 may also control the photographing operation of the imaging device 12 on the baggage M via the image interface 39.
 Next, the configuration of the upper management device 101 in the information management system 100 including the inspection system 1 according to the embodiment will be described.
 FIG. 4 is a block diagram showing a configuration example of the upper management device 101 in the information management system 100 including the inspection system 1 according to the embodiment.
 The upper management device 101 is an information management device that manages information for the inspection systems 1 as a whole. The upper management device 101 is a computer communicatively connected to the inspection devices 13 of the inspection systems 1 provided at the respective inspection sites, and is configured by, for example, a server device.
 In the configuration example shown in FIG. 4, the upper management device 101 includes a processor 41, a ROM 42, a RAM 43, a storage unit 44, and a communication unit 45.
 The processor 41 executes arithmetic processing. The processor 41 is, for example, a CPU (Central Processing Unit). The processor 41 functions as a processing unit that executes various processes by running programs stored in the ROM 42 or the storage unit 44, using the RAM 43 as a working area.
 The ROM 42 is a read-only nonvolatile memory that stores program data, control data, and the like. The RAM 43 is a volatile memory that functions as a working memory and temporarily stores data.
 The storage unit 44 is a rewritable nonvolatile memory, configured as a hard disk drive (HDD), a solid state drive (SSD), or the like. The storage unit 44 stores information such as program data, setting values as control data, and the data collected from each inspection device 13.
 The communication unit 45 is a communication interface for communicating with the inspection device 13 of each inspection system 1. The processor 41 communicates with the inspection devices 13 via the communication unit 45, receiving data such as processing results from the inspection devices 13 and transmitting data to them.
 Next, the inspection processing for inspecting the baggage M to be inspected in the inspection system 1 according to the embodiment will be described.
 FIG. 5 is a flowchart schematically explaining the flow of the inspection processing in the inspection system 1 according to the embodiment. FIGS. 6(a) to 6(d) are diagrams schematically showing examples of the images obtained in each step of the inspection processing.
 In the inspection system 1 configured as shown in FIG. 1, packages M to be inspected are placed on the conveyor 11 one after another. The conveyor 11 transports each loaded baggage M to the position where the imaging device 12 captures an image. The imaging unit 21 of the imaging device 12 acquires a photographed image showing the contents of the baggage M by irradiating the baggage M conveyed by the conveyor 11 with electromagnetic waves.
 The photographed image need only be image data showing the state inside the baggage M. For example, the X-ray CT device as the imaging device 12 acquires, as the photographed image, three-dimensional data showing the state inside the baggage M by irradiating the baggage M transported to the imaging position with X-rays. The imaging device also acquires physical property information indicating the physical properties (density and effective atomic number) of each pixel or voxel constituting the photographed image.
 The processing unit 22 of the imaging device 12 generates photographed image data including the photographed image of the baggage M captured by the imaging unit 21 and the physical property information indicating the physical properties of each pixel or voxel in the photographed image. Upon acquiring the photographed image data of the baggage M, the processing unit 22 of the imaging device outputs the data to the inspection device 13 via the output unit 23.
 The inspection device 13 acquires the photographed image data from the imaging device 12 via the image interface 39 (step S101). The processor 31 executes this acquisition of the photographed image data as the processing of the receiving unit described above. For example, the inspection device 13 acquires photographed image data including a photographed image as shown in FIG. 6(a) and physical property information indicating the physical properties of each part (pixel or voxel) of the photographed image. The photographed image may be two-dimensional data, three-dimensional data, or a plurality of two-dimensional image data obtained by slicing three-dimensional data along a specific axis. In the following description, to keep the explanation simple, the photographed image is assumed to be two-dimensional image data.
 The processor 31 of the inspection device 13 executes object candidate detection processing to detect, from the photographed image data acquired from the imaging device 12, object candidates as candidates for the specific object (step S102). The processor 31 executes the object candidate detection processing as the processing of the object candidate detection unit described above. For example, based on setting values determined from the physical properties of the specific object to be detected, the processor 31 detects, as object candidates, locations (pixels) in the photographed image whose physical properties are similar to or match those of the specific object. FIG. 6(b) shows an example of object candidates detected from the photographed image data shown in FIG. 6(a).
 Based on the photographed image data acquired from the imaging device 12, the processor 31 of the inspection device 13 also executes specific part detection processing to detect, in the photographed image of the baggage M, a specific region or object (specific part) in which the specific object to be detected is likely to be placed (step S103).
 The processor 31 executes the specific part detection processing as the processing of the specific part detection unit described above. For example, the processor 31 sets in advance the shape of the region or object to be detected as a specific part, and detects as a specific part an image area in the photographed image whose shape is similar to that of the specific part to be detected. FIG. 6(c) shows an example of specific parts detected from the photographed image data shown in FIG. 6(a).
 After executing the object candidate detection processing and the specific part detection processing, the processor 31 executes relative relationship calculation processing to calculate the relative relationship between the detected object candidates and the specific parts (step S104). The processor 31 executes this calculation as the processing of the relative relationship calculation unit described above. The relative relationship between an object candidate and a specific part is information for evaluating whether the object candidate is suspected to be the specific object to be detected. For example, the processor 31 calculates the relative distance between the object candidate and the specific part as the relative relationship.
 After calculating the relative relationships between the object candidates and the specific parts, the processor 31 detects the object candidates suspected to be specific objects based on their relative relationships with the specific parts (step S105). For example, the processor 31 judges an object candidate to be suspected of being a specific object (detects it as a specific object) when its relative relationship with a specific part (for example, its relative distance) satisfies a predetermined condition defined by a preset setting value.
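 As a hedged sketch of this step S105 decision, the following Python fragment flags a candidate when its relative distance to any specific part is at or below a threshold; the data layout and the threshold value are illustrative assumptions rather than values from the embodiment.

```python
# Minimal sketch of the step S105 decision rule: report a candidate as a
# suspected specific object when some relative relationship (here, relative
# distance) satisfies a preset condition. Field names and the threshold are
# hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    region_id: int
    distances_to_parts: dict[str, float]  # specific-part name -> distance

DISTANCE_THRESHOLD = 5.0  # hypothetical preset setting value, in pixels

def suspected_candidates(candidates: list[Candidate]) -> list[Candidate]:
    """Keep the candidates within the threshold distance of any specific part."""
    return [
        c for c in candidates
        if any(d <= DISTANCE_THRESHOLD for d in c.distances_to_parts.values())
    ]
```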
 The processor 31 executes notification processing to report, via the display device 14 or the speaker 16 as output devices, the detection results of the object candidates suspected to be specific objects based on their relative relationships with the specific parts (step S106). For example, when the processor 31 detects an object candidate suspected to be a specific object as the processing of the transmitting unit described above, it reports the detection result via an output device such as the display device 14 or the speaker 16.
 For example, the processor 31 causes the display device 14 to display, as suspected specific objects, the object candidates whose relative distance to a specific part is less than or equal to a preset threshold. FIG. 6(d) shows a display example of an object candidate judged to be suspected of being a specific object based on the relative relationship between the object candidates shown in FIG. 6(b) and the specific parts shown in FIG. 6(c), together with the specific part associated with that object candidate.
 A baggage for which the detection results of object candidates are displayed on the display device 14 by the notification processing is then inspected by an inspector. The inspector inspects the contents of the baggage while referring to information such as the displayed detection results. The inspection system 1 may be operated so that inspection work is performed on all packages M, or so that it is performed only on packages M in which an object candidate suspected of being a specific object has been detected.
 Next, an example of the object candidate detection processing by the object candidate detection unit of the inspection device 13 according to the embodiment will be described.
 FIG. 7 is a flowchart for explaining an example of the object candidate detection processing by the object candidate detection unit of the inspection device 13 according to the embodiment.
 As the object candidate detection unit, the processor 31 of the inspection device 13 detects object candidates whose physical properties are similar to or match those of the specific object to be detected, based on, for example, photographed image data including the photographed image of the baggage M obtained from the imaging device 12 and the effective atomic number and density data of each part of the photographed image. Specifically, whether each pixel or voxel of the input photographed image is an object candidate is determined by whether its effective atomic number and density fall within a threshold range determined from previously collected physical property values of the specific object to be detected.
 In the processing example shown in FIG. 7, the processor 31 of the inspection device 13 acquires the physical property value Z of the specific object to be detected (step S201). The physical property value Z of the specific object may be input by an administrator or inspector using the operating device 15, or may be acquired from an external device such as the upper management device 101.
 Upon acquiring the physical property value Z of the specific object, the processor 31 sets a threshold E based on the physical property value Z (step S202). For example, the processor 31 sets a threshold E that defines a predetermined tolerance around the physical property value Z of the specific object.
 Note that the processing of steps S201 and S202 may be replaced by processing in which the administrator or inspector sets the threshold E via the operating device 15. Alternatively, in steps S201 and S202, information specifying the threshold E may be acquired from the upper management device 101, and the threshold E acquired from the upper management device 101 may be set as the setting value for the object candidate detection processing.
 Having set the threshold E, the setting value for the object candidate detection processing, the processor 31 of the inspection device 13 acquires from the imaging device 12 photographed image data including a photographed image of the baggage M to be inspected, taken using electromagnetic waves such as X-rays, and physical property information indicating the physical properties of each part of the photographed image (step S203). Upon acquiring the photographed image data of the baggage M from the imaging device 12, the processor 31 executes processing to detect the object candidates contained in the photographed image (steps S204 to S207).
 In the processing example shown in FIG. 7, the processor 31 determines, for each pixel p constituting the photographed image of the baggage M, whether the difference between the physical property value I(p) of the pixel and the physical property value Z of the specific object is less than the threshold E, that is, whether |I(p) - Z| < E (step S205). If |I(p) - Z| < E (step S205, YES), the processor 31 selects the pixel p as an object candidate pixel (step S206). The processor 31 repeats the processing of steps S205 and S206 for all pixels constituting the photographed image of the baggage M (steps S204, S207).
 When the processing of steps S205 and S206 has been completed for all pixels in the photographed image, the processor 31 detects, as an object candidate, an image area formed by the set of pixels detected as object candidate pixels (step S208). For example, the processor 31 detects as an object candidate an image area of contiguous pixels whose physical property values I(p) differ from the physical property value Z of the specific object by less than the threshold E (or an image area formed by a set of such pixels whose mutual distances are within a predetermined distance).
 According to the above example of the object candidate detection processing, the inspection device detects object candidates based on whether the difference between the physical property values of each part of the photographed image obtained by irradiating electromagnetic waves (such as the effective atomic number and density data of an X-ray image) and the physical property value of the specific object is within a predetermined threshold range. As a result, the inspection device according to the embodiment can detect candidates for the specific object present in the baggage even when the specific object to be detected is a substance with no fixed shape, such as a liquid or powder.
 Note that in the object candidate detection processing described above, a two-dimensional image was processed and each pixel was judged as to whether it is an object candidate. If the photographed image is three-dimensional data, object candidates in the three-dimensional photographed image may be detected by making the same judgment for each voxel.
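 The per-pixel thresholding of FIG. 7 can be sketched as follows; numpy and scipy are assumed tools, the connected-component grouping stands in for the region formation of step S208, and the same code applies unchanged to three-dimensional voxel arrays.

```python
# Minimal sketch of FIG. 7: mark every pixel p with |I(p) - Z| < E (steps
# S204-S207), then group marked pixels into connected candidate regions
# (step S208).
import numpy as np
from scipy import ndimage

def detect_object_candidates(prop: np.ndarray, z: float, e: float):
    """prop: per-pixel (2-D) or per-voxel (3-D) physical property values,
    e.g. effective atomic number or density. Returns (labels, count), where
    labels assigns a region ID to each candidate pixel and 0 elsewhere."""
    mask = np.abs(prop - z) < e              # |I(p) - Z| < E
    labels, count = ndimage.label(mask)      # contiguous pixels -> one region
    return labels, count
```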
 The processor 31 of the inspection device 13 may also execute the object candidate detection processing described above on a plurality of two-dimensional images obtained by slicing (dividing) the three-dimensional photographed image along each axis.
 FIG. 8 is a diagram showing an example of a plurality of two-dimensional images obtained by slicing (dividing) three-dimensional data as a photographed image along each axis (x-axis, y-axis, z-axis).
 When a plurality of two-dimensional image data as shown in FIG. 8 are obtained, the processor 31 executes the object candidate detection processing described above on each two-dimensional image. After detecting the object candidates in each two-dimensional image, the processor 31 may detect the object candidates in the three-dimensional space by superimposing the detection results for the individual two-dimensional images.
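 The slice-and-superimpose procedure of FIG. 8 might look like the following Python sketch, where the per-slice detector is the simple threshold above; numpy is an assumed tool.

```python
# Minimal sketch: slice a 3-D volume along each axis (x, y, z), detect
# candidate pixels in each 2-D slice, and superimpose the per-slice results
# back into the 3-D space.
import numpy as np

def detect_in_volume(volume: np.ndarray, z: float, e: float) -> np.ndarray:
    combined = np.zeros(volume.shape, dtype=bool)
    for axis in range(3):                        # slice along x, y, z in turn
        for i in range(volume.shape[axis]):
            sl = np.take(volume, i, axis=axis)   # one 2-D slice
            mask2d = np.abs(sl - z) < e          # per-slice candidate pixels
            idx = [slice(None)] * 3
            idx[axis] = i                        # write the slice result back
            combined[tuple(idx)] |= mask2d
    return combined
```

 With a purely per-pixel threshold this superposition reduces to thresholding the volume directly; slicing becomes meaningful when the two-dimensional detector also uses shape features or a learned model, as the alternatives mentioned below allow.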
 The object candidate detection processing is not limited to the method described above; any method that can detect object candidates using previously collected data on the specific object may be used. For example, object candidates may be detected using a hash table or machine learning. Furthermore, if the specific object has a characteristic shape, object candidates may be detected using general object recognition.
 Next, the relative relationship calculation processing by the relative relationship calculation unit of the inspection device 13 according to the embodiment will be described in detail.
 As the relative relationship calculation unit, the processor 31 of the inspection device 13 calculates the relative relationship between each object candidate detected from the photographed image and each specific part. The relative relationship is calculated as an index value for evaluating whether the object candidate is the specific object to be detected. For example, the distance (relative distance) between the object candidate and the specific part is calculated as the relative relationship. In this case, an object candidate close to a specific part can be judged to be suspected of being a specific object.
 The relative relationship may also be a positional relationship, such as whether the object candidate is contained in or adjacent to the specific part. In this case, an object candidate inside a specific part, or adjacent to one, can be evaluated as suspected of being a specific object.
 The relative relationship may also be physical property information calculated for the combination of the physical property information of the object candidate and that of the specific part (for example, averages of density, effective atomic number, and so on). In this case, an object candidate suspected of being a specific object can be detected even when the substance forming the specific part and the substance forming the object candidate are mixed together.
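 As a sketch of the containment and adjacency relationships mentioned above, the following Python fragment tests a candidate mask against a specific-part mask; binary dilation (scipy, an assumed tool) is one way to test whether two regions touch.

```python
# Minimal sketch: containment and adjacency between a candidate mask T and a
# specific-part mask W, both boolean arrays of the same shape.
import numpy as np
from scipy import ndimage

def is_contained(t_mask: np.ndarray, w_mask: np.ndarray) -> bool:
    """True if every candidate pixel lies inside the specific part."""
    return bool(np.all(w_mask[t_mask]))

def is_adjacent(t_mask: np.ndarray, w_mask: np.ndarray) -> bool:
    """True if the candidate touches the specific part (within one pixel)."""
    grown = ndimage.binary_dilation(w_mask)   # expand W by one pixel
    return bool(np.any(grown & t_mask))
```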
FIG. 9 is a flowchart for explaining the process of calculating the relative distance between the object candidate and the specific part, as an example of the relative relationship calculation process performed by the relative relationship calculation unit of the inspection device 13 according to the embodiment.
FIG. 9 describes a processing example in which the relative distance between the object candidate and the specific part is calculated from the distances between each pixel detected as the object candidate and each pixel forming the specific part. One way to calculate a relative distance as the relative relationship would be the Euclidean distance between the center position of the object candidate and the center position of the specific part. However, a center-to-center distance is strongly affected by the shape of the object candidate or the specific part. For example, if the region of the object candidate (or the specific part) is L-shaped, the center of that region may not lie on the object candidate (or the specific part) at all. For this reason, the processing example shown in FIG. 9 calculates the relative distance between the object candidate and the specific part from the distances between each pixel of the object candidate and each pixel of the specific part.
That is, the processor 31 of the inspection device 13 acquires information indicating the region of the object candidate T and the region of the specific part W detected in the photographed image of the baggage M to be inspected (step S301). Here, the processor 31 acquires the image region of the object candidate T and the image region of the specific part W for which the relative relationship is to be calculated. If a plurality of object candidates or a plurality of specific parts exist in the photographed image, the processor 31 calculates the relative relationship (relative distance) by executing steps S301 to S307 for every combination of object candidate and specific part.
Upon acquiring the information indicating the image region of the specific part W in the photographed image, the processor 31 sets the total number of pixels forming the image of the specific part W as the loop count for the specific part (step S302). Likewise, upon acquiring the information indicating the image region of the object candidate T in the photographed image, the processor 31 sets the total number of pixels forming the image of the object candidate T as the loop count for the object candidate (step S303).
The processor 31 calculates the distance between each pixel of the object candidate and each pixel of the specific part, and stores the calculated pixel-to-pixel distances in the RAM 33 or the storage unit 34 as distance information for calculating the relative distance between the specific part W and the object candidate T (step S304). The processor 31 repeats the process of calculating the distance from one pixel of the object candidate to each pixel of the specific part for the number of pixels of the specific part (the loop count for the specific part) (step S305).
When the distances from one pixel of the object candidate to every pixel of the specific part have been calculated, the processor 31 performs the same calculation for the next pixel selected from the object candidate. The processor 31 repeats this process, for pixels selected in turn from the object candidate, for the total number of pixels of the object candidate (the loop count for the object candidate) (step S306). Through steps S302 to S306, the processor 31 calculates the distance from every pixel of the object candidate to every pixel of the specific part.
When the calculation of the distances between each pixel of the object candidate T and each pixel of the specific part W is complete, the processor 31 calculates the relative distance, as the relative relationship between the object candidate T and the specific part W, from the pixel-to-pixel distances stored in the RAM 33 or the storage unit 34 (step S307). For example, the processor 31 may take, as the relative relationship, the minimum of the distances calculated over every combination of a pixel of the object candidate T and a pixel of the specific part W.
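The double loop of steps S302 to S306 amounts to an all-pairs distance computation, which can be written compactly with array broadcasting. The following is a minimal sketch under the assumption of a boolean-mask interface, not the patented implementation; because np.argwhere handles arrays of any dimensionality, the same code also covers the voxel case discussed later.

```python
import numpy as np

def relative_distance(cand_mask, part_mask):
    """Minimum pixel-to-pixel Euclidean distance between the object-candidate
    region T and the specific-part region W (cf. steps S301 to S307)."""
    t = np.argwhere(cand_mask).astype(float)   # (Nt, ndim) coordinates of T
    w = np.argwhere(part_mask).astype(float)   # (Nw, ndim) coordinates of W
    # all Nt x Nw distances, i.e. the double loop of FIG. 9 vectorized
    d = np.linalg.norm(t[:, None, :] - w[None, :, :], axis=-1)
    return d.min()                             # minimum as the relative distance
```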
In practice, however, noise may be mistakenly detected as a specific part, and such noise may appear at a position close to an object candidate. In that case, if there is even a small amount of noise closer to the object candidate than the actual specific part, the minimum distance for that object candidate will be calculated as a value smaller than the distance to the actual specific part.
For this reason, after calculating the distances between each pixel detected as the object candidate and each pixel of the specific part, the processor 31 may take a percentile value rather than the minimum as the relative relationship (relative distance) between the object candidate and the specific part. By expressing the pixel-to-pixel distances as a percentile value, the likelihood that an object candidate is the specific object can be evaluated robustly even when noise is detected as a specific part.
Note that in the processing example shown in FIG. 9, the distance to every pixel of the specific part is calculated for every pixel of the object candidate; however, to reduce the amount of computation, distances need not be calculated for all combinations. For example, the processor 31 may reduce the number of pixel combinations by calculating distances only between pixels sampled from the object candidate and pixels sampled from the specific part. Reducing the number of pixel combinations in the processing example of FIG. 9 reduces the amount of computation in the relative relationship calculation process; a variant combining the percentile measure with such sampling is sketched below.
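A hedged sketch of that variant, with the percentile q, the sampling cap max_pts, and the seed all chosen for illustration only:

```python
import numpy as np

def robust_relative_distance(cand_mask, part_mask, q=5.0, max_pts=2000, seed=0):
    """Percentile/sampling variant of the relative-distance calculation."""
    rng = np.random.default_rng(seed)
    t = np.argwhere(cand_mask).astype(float)
    w = np.argwhere(part_mask).astype(float)
    if len(t) > max_pts:                       # subsample object-candidate pixels
        t = t[rng.choice(len(t), max_pts, replace=False)]
    if len(w) > max_pts:                       # subsample specific-part pixels
        w = w[rng.choice(len(w), max_pts, replace=False)]
    d = np.linalg.norm(t[:, None, :] - w[None, :, :], axis=-1)
    return np.percentile(d, q)                 # low percentile instead of the minimum
```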
In the relative relationship calculation example described above, the processing target is a two-dimensional image and pixel-to-pixel distances are calculated; if the photographed image is three-dimensional data, the distances between each voxel of the object candidate and each voxel of the specific part may be calculated instead. This makes it possible to calculate the relative distance as the relative relationship between the object candidate and the specific part even when the photographed image is three-dimensional data.
Next, the process by which the inspection device 13 according to the embodiment reports detection results based on the relative relationship will be described in detail.
The processor 31 of the inspection device 13 executes a notification process that reports, via an output device, the detection result for the specific object based on the relative relationship. For example, the processor 31 of the inspection device 13 causes the display device 14 to display an inspection result screen on which the image region suspected of being the specific object (the image region detected as the specific object) in the photographed image of the baggage M is highlighted.
FIG. 10 is a diagram illustrating a display example in which the inspection device 13, as the notification process, displays on the display device 14 the inspection result for the specific object in the baggage M.
The display example shown in FIG. 10 presents the photographed image of the baggage M taken by the photographing device 12, and highlights on that image the image region of an object candidate judged, from its relative relationship with a specific part, to be suspected of being the specific object (the image region detected as the specific object).
The image region suspected of being the specific object may be highlighted in any manner that allows it to be visually distinguished from the other image regions. For example, it may be highlighted by displaying it in a color different from the other regions, or by enclosing it in a rectangle, an ellipse, or the like.
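Either style of emphasis can be rendered with a few lines of plotting code; the following is a minimal sketch using matplotlib (an assumption of this example, not something named in the source) that tints the suspected region and draws a bounding rectangle around it.

```python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.patches as patches

def show_detection(image, suspect_mask):
    """Display the photographed image with the suspected region tinted red
    and enclosed in a rectangular guide line."""
    fig, ax = plt.subplots()
    ax.imshow(image, cmap="gray")
    overlay = np.zeros(image.shape + (4,))         # RGBA layer, transparent by default
    overlay[suspect_mask] = (1.0, 0.0, 0.0, 0.4)   # semi-transparent red tint
    ax.imshow(overlay)
    ys, xs = np.nonzero(suspect_mask)
    if len(ys):                                    # bounding rectangle of the region
        ax.add_patch(patches.Rectangle((xs.min(), ys.min()),
                                       xs.max() - xs.min(), ys.max() - ys.min(),
                                       fill=False, edgecolor="red"))
    plt.show()
```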
Further, the inspection result that the inspection device 13 displays on the display device 14 in the notification process may include not only the image region of the object candidate judged to be suspected of being the specific object, but also information indicating the specific part, the object candidates, or the relative relationships.
FIG. 11 is a diagram showing a display example in which guidance indicating the object candidates detected in the object candidate detection process is displayed together with the image region judged to be suspected of being the specific object in the photographed image (the detection result for the specific object).
In the display example shown in FIG. 11, the image regions of the object candidates other than the one judged to be suspected of being the specific object are enclosed in elliptical guide lines so that the inspector can visually recognize them. The regions detected as object candidates may instead be enclosed in rectangles or the like, or displayed in a specific color assigned to object candidate regions.
FIG. 12 is a diagram illustrating a display example in which guidance indicating the specific part detected by the specific part detection process is displayed together with the image region judged to be suspected of being the specific object in the photographed image.
The display example shown in FIG. 12 encloses the image region detected as the specific part in rectangular guide lines so that the inspector can visually recognize it. The image region detected as the specific part may instead be enclosed in an ellipse or the like, or displayed in a specific color assigned to specific part regions.
FIG. 13 is a diagram illustrating a display example in which information indicating the relative relationship (relative distance) of each object candidate to the specific part is displayed together with the image region judged to be suspected of being the specific object in the photographed image.
In the display example shown in FIG. 13, information (a number of pixels) indicating the relative distance of each object candidate to the region detected as the specific part is displayed. The object candidates and specific parts may also be shown in the photographed image, with information such as the density and effective atomic number of their image regions displayed alongside them (or overlaid on them).
FIG. 14 is a diagram illustrating a display example in which the detection result for the specific object is displayed together with buttons for instructing the display of the object candidates and the specific part.
In the display example shown in FIG. 14, a button for displaying all object candidates, a button for displaying the specific part, and buttons for selecting individual object candidates are provided. When the button for displaying the object candidates is operated via the operating device 15, the display device 14 displays the images of all detected object candidates, as shown in FIG. 14.
When the button for displaying the specific part is operated via the operating device 15, the display device 14 displays an image of the region detected as the specific part. When a button for selecting an individual object candidate is operated via the operating device 15, the display device 14 displays an image of the selected object candidate together with information about that candidate (for example, information indicating its physical properties or its relative relationship with the specific part).
If the photographed image is three-dimensional data, the processor 31 of the inspection device 13 may render the photographed image on the display device 14 in 3D. In this case as well, the processor 31 may display the region of the object candidate detected as the specific object in a color different from the other regions. Even when displaying the photographed image in 3D, the processor 31 may display the object candidates, the specific parts, and the like on the photographed image in response to the inspector's instructions. When displaying the photographed image as a 3D image on the display device 14, the processor 31 may also rotate, enlarge, reduce, or translate the 3D image in response to the inspector's operations.
Furthermore, when the photographed image is three-dimensional data, the processor 31 may display on the display device 14 a two-dimensional image sliced at a location (plane) designated by the inspector.
FIG. 15 is a diagram schematically showing an example of displaying a two-dimensional image obtained by slicing a three-dimensional photographed image at a designated location.
According to the example shown in FIG. 15, the inspection device 13 displays on the display device 14 the photographed image as a two-dimensional image obtained by slicing the three-dimensional photographed image at the designated location, and further displays on that image the object candidates, the specific part, the region detected as the specific object, and so on. The location at which the three-dimensional photographed image is sliced may be a predetermined location set in advance, or a location designated by the inspector using the operating device 15.
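If the volume and its overlay masks are numpy arrays, extracting an aligned cross-section for this display is a one-liner per array; a minimal sketch with an interface assumed for illustration:

```python
import numpy as np

def slice_for_display(volume, masks, axis, index):
    """Slice the 3D image and every overlay mask (object candidates,
    specific parts, detection result) at the same designated plane."""
    img = np.take(volume, index, axis=axis)
    cuts = {name: np.take(m, index, axis=axis) for name, m in masks.items()}
    return img, cuts
```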
In the description above, an operation example of the inspection system 1 in which the photographing device 12 and the inspection device 13 are configured separately has been described; however, the processing of the inspection device 13 described above may instead be executed by the photographing device 12. That is, in the inspection system 1 described above, the photographing device 12 and the inspection device 13 may be realized as a single integrated device.
In the example described above, the operation of the inspection device 13 has mainly been explained; in the information management system 100 shown in FIG. 2, however, the upper management device 101 may collect information such as the various processing results and inspection work results of the inspection device 13 described above. That is, information indicating the various processing results and inspection work results of the inspection device 13 may be transmitted from the inspection device 13 to the upper management device 101 and stored there. The upper management device 101 can then analyze the information collected from the inspection devices 13 of the inspection systems 1 at various locations, and change the parameters of the various processes executed by each inspection device 13 according to the analysis results.
As a specific example, the upper management device may collect from each inspection device information indicating the parts and objects in which specific objects were actually found during inspection work, and set the parameters of the specific part detection process in each inspection device so that those parts and objects are detected as specific parts. The upper management device may also calculate, from the information collected from each inspection device, statistics such as the probability (accuracy) with which an object candidate detected as a specific object is confirmed to be a specific object in actual inspection work, and adjust, according to those statistics, the setting values used to judge an object candidate as a specific object from its relative relationship with the specific part; one possible adjustment rule is sketched below.
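The source does not specify the adjustment rule, so the following is only an illustrative policy: if reported candidates are confirmed more often than a target rate, the distance threshold is loosened to catch more; if they are confirmed less often, it is tightened to cut false alarms. Every name and number here is an assumption.

```python
def adjust_distance_threshold(current_px, confirm_rate, target=0.5, step=5):
    """Nudge the relative-distance threshold (in pixels) from the
    confirmation rate collected by the upper management device."""
    if confirm_rate > target:
        return current_px + step               # under-reporting: widen the net
    if confirm_rate < target:
        return max(step, current_px - step)    # too many false alarms: tighten
    return current_px
```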
As described above, the inspection device according to the embodiment acquires photographed image data including a photographed image taken by irradiating the inspection target with electromagnetic waves and physical property information for each part of the photographed image. The inspection device detects object candidates and specific parts in the photographed image, and reports object candidates suspected of being the specific object based on the relative relationship between the object candidates and the specific parts.
Thus, the inspection device according to the embodiment detects specific parts where a specific object is likely to be placed, and can identify regions suspected of being the specific object from the relative relationship between the specific parts and the object candidates. As a result, object candidates that are difficult to distinguish from the specific object using only the physical property values obtained by irradiating the inspection target with electromagnetic waves can be narrowed down by their relative relationship with the specific parts, and regions suspected of being the specific object can be detected with high accuracy.
Although several embodiments of the invention have been described, these embodiments are presented by way of example and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included within the scope and gist of the invention, as well as within the scope of the invention described in the claims and its equivalents.

Claims (12)

1.  An inspection device comprising:
     an image acquisition unit that acquires photographed image data including a photographed image taken by irradiating an inspection target with electromagnetic waves and physical property information indicating physical properties of each part of the photographed image; and
     a processor that detects an object candidate that is a candidate for a specific object in the inspection target based on the physical property information included in the photographed image data, detects a specific part that is a specific part or object in the inspection target, calculates a relative relationship between the object candidate and the specific part, and reports an object candidate for which the calculated relative relationship satisfies a predetermined condition.
2.  The inspection device according to claim 1, wherein the processor calculates, as the relative relationship, whether the object candidate and the specific part are adjacent or in an inclusion relationship, and reports an object candidate that is adjacent to or included in the specific part.
3.  The inspection device according to claim 1, wherein the processor calculates a relative distance between the object candidate and the specific part as the relative relationship, and reports an object candidate whose relative distance to the specific part is equal to or less than a predetermined threshold.
4.  The inspection device according to claim 3, wherein the processor calculates the distances between each constituent unit of the object candidate and each constituent unit of the specific part, and takes the minimum of the calculated distances as the relative distance between the object candidate and the specific part.
5.  The inspection device according to claim 4, wherein the processor calculates the distances between each constituent unit of the object candidate and each constituent unit of the specific part, and takes a predetermined percentile value of the calculated distances as the relative distance between the object candidate and the specific part.
6.  The inspection device according to claim 1, wherein the processor causes a display device to display, together with the photographed image, the position of an object candidate whose relative relationship with the specific part satisfies the predetermined condition.
7.  The inspection device according to claim 6, wherein the processor further causes the display device to display information indicating all object candidates detected from the photographed image.
8.  The inspection device according to claim 6, wherein the processor further causes the display device to display information indicating the specific part detected from the photographed image.
9.  The inspection device according to any one of claims 1 to 8, wherein the photographed image is an X-ray image obtained by photographing the inspection target with X-rays.
10.  The inspection device according to claim 9, wherein the photographed image is three-dimensional data obtained by photographing the inspection target using an X-ray CT device, and the processor calculates the relative relationship between the object candidate and the specific part in the three-dimensional data serving as the photographed image.
11.  An inspection system comprising a photographing device and an inspection device, wherein
     the photographing device comprises:
     an imaging unit that captures a photographed image by irradiating an inspection target with electromagnetic waves;
     a processing unit that generates photographed image data including the photographed image and physical property information indicating physical properties of each part of the photographed image; and
     an output unit that outputs the photographed image data; and
     the inspection device comprises:
     an image acquisition unit that acquires the photographed image data output from the photographing device; and
     a processor that detects an object candidate that is a candidate for a specific object in the inspection target based on the physical property information included in the photographed image data, detects a specific part that is a specific part or object in the inspection target, calculates a relative relationship between the object candidate and the specific part, and reports an object candidate for which the calculated relative relationship satisfies a predetermined condition.
12.  An inspection method for inspecting a specific object in an inspection target, the method comprising:
     acquiring photographed image data including a photographed image taken by irradiating the inspection target with electromagnetic waves and physical property information indicating physical properties of each part of the photographed image;
     detecting an object candidate that is a candidate for the specific object in the inspection target based on the physical property information included in the photographed image data;
     detecting a specific part that is a specific part or object in the inspection target;
     calculating a relative relationship between the object candidate and the specific part; and
     reporting an object candidate for which the calculated relative relationship satisfies a predetermined condition.

PCT/JP2023/029874 2022-08-22 2023-08-18 Inspection device, inspection system, and inspection method WO2024043190A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022131720A JP2024029451A (en) 2022-08-22 2022-08-22 Inspection equipment, inspection system and inspection method
JP2022-131720 2022-08-22

Publications (1)

Publication Number Publication Date
WO2024043190A1 true WO2024043190A1 (en) 2024-02-29

Family

ID=90013309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/029874 WO2024043190A1 (en) 2022-08-22 2023-08-18 Inspection device, inspection system, and inspection method

Country Status (2)

Country Link
JP (1) JP2024029451A (en)
WO (1) WO2024043190A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6117147A (en) * 1984-07-04 1986-01-25 Nec Corp Pattern checking method of photomask
JP2006084275A (en) * 2004-09-15 2006-03-30 Hitachi Ltd Method and device for detecting explosive substance, or the like
JP2006518039A (en) * 2003-02-13 2006-08-03 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Object inspection method and apparatus
JP2006214952A (en) * 2005-02-07 2006-08-17 Hitachi Ltd Apparatus and method for detecting explosives
CN101592579A (en) * 2009-07-03 2009-12-02 公安部第一研究所 The method and the device that utilize multi-view X ray that explosive substances in luggage is surveyed automatically
US20140376686A1 (en) * 2011-02-18 2014-12-25 Smiths Heimann Gmbh System and method for multi-scanner x-ray inspection
KR102293548B1 (en) * 2021-03-11 2021-08-25 대한민국 Dangerous substance detection system and method using artificial intelligence


Also Published As

Publication number Publication date
JP2024029451A (en) 2024-03-06


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23857302

Country of ref document: EP

Kind code of ref document: A1