WO2014141534A1 - Système de comparaison, dispositif terminal, dispositif serveur, procédé de comparaison, et programme - Google Patents


Info

Publication number
WO2014141534A1
WO2014141534A1 (PCT/JP2013/080637)
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
input
displayed
unit
Prior art date
Application number
PCT/JP2013/080637
Other languages
English (en)
Japanese (ja)
Inventor
陽三 平木
正 安達
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to JP2015505229A priority Critical patent/JP6123881B2/ja
Priority to CN201380074556.9A priority patent/CN105008251B/zh
Publication of WO2014141534A1 publication Critical patent/WO2014141534A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/24: Aligning, centring, orientation detection or correction of the image
    • G06V 10/245: Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries

Definitions

  • the present invention relates to a collation system, a terminal device, a server device, a collation method, and a program.
  • Steel materials include, for example: slabs (processed into thin plates), billets (cylindrical or prismatic), blooms (shallow-shaped), and beam blanks (a shape close to an H-shape).
  • These steel materials such as slabs are stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address.
  • The manager manages the storage state of a plurality of steel materials using storage information indicating which steel material (identified by, e.g., its lot number) is stored at which address and at which stage.
  • the storage information is updated when an event occurs in which a new steel material is added to the storage location, the stored steel material is shipped, or the storage address is moved to another address. If such storage information is used, when a steel material having certain identification information is shipped, the storage location of the steel material can be easily specified.
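  • As a concrete illustration of the storage information and the three update events described above (addition, shipment, and moving to another address), the following is a minimal sketch assuming a simple in-memory table; the class and method names (`StorageInfo`, `add`, `ship`, `move`, `lookup`) are illustrative and do not appear in the patent.

```python
class StorageInfo:
    """Maps (address, stage) -> identification information (e.g. lot number)."""

    def __init__(self):
        self._table = {}  # {(address, stage_from_bottom): identification_info}

    def add(self, address, stage, ident):
        # Event: a new steel material is added to the storage location.
        self._table[(address, stage)] = ident

    def ship(self, address, stage):
        # Event: a stored steel material is shipped.
        return self._table.pop((address, stage))

    def move(self, src_addr, src_stage, dst_addr, dst_stage):
        # Event: a stored steel material is moved to another address.
        self._table[(dst_addr, dst_stage)] = self._table.pop((src_addr, src_stage))

    def lookup(self, address, stage):
        # At shipment time, the storage location identifies the steel material.
        return self._table.get((address, stage))
```

  With such a table, the storage location of a steel material having certain identification information can also be found by a reverse search over the table entries.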
  • At the time of shipment, it is checked whether the identification information attached to the surface of the steel material actually stored at the location specified using the storage information matches the identification information of the steel material to be shipped.
  • In addition, at a predetermined timing, collation work is performed using the identification information (e.g., printed) attached to the surface of the steel material stored at the specified position and the storage information, in order to check that the storage information is correct.
  • Patent Document 1 discloses a certificate-photographing camera for acquiring image data of a certificate submitted by a customer applying for a loan to a loan examination apparatus, characterized by having photographing-frame display means for displaying, on a finder, a photographing frame that matches at least a part of the frame lines dividing the description items in the certificate.
  • collation work at the time of shipment and at a predetermined timing as described above has been performed manually. That is, the operator visually compares storage information with identification information (for example, printed) attached to the surface of a steel material stored at a predetermined position. Such an operation is very troublesome and requires a lot of time. In addition, human error may occur.
  • An object of the present invention is to provide a technique for efficiently performing the collation work of steel materials stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address.
  • According to the present invention, there is provided a collation system comprising: storage means for storing correspondence information; input receiving means for receiving input of the address and stage information of the steel material to be verified; correspondence information search means for referring to the correspondence information and acquiring the identification information associated with the address and stage information received by the input receiving means; a viewfinder; output means for displaying a pre-imaging and/or captured image on the viewfinder and displaying, superimposed on the image, a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image; imaging means for capturing the image displayed on the viewfinder; image recognition means for performing image recognition processing using only the partial image within the specific frame in the image captured by the imaging means, extracting the identification mark written on the surface of each of the plurality of steel materials, and recognizing the identification information using the extracted identification mark; and collation means for determining whether the identification information acquired by the correspondence information search means matches the identification information recognized by the image recognition means.
  • a terminal device including the input receiving unit, the output unit, and the imaging unit included in the verification system.
  • Further, according to the present invention, there is provided a terminal device comprising: a viewfinder; output means for displaying a pre-imaging and/or captured image on the viewfinder and displaying, superimposed on the image, a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image; imaging means for capturing the image displayed on the viewfinder; and transmission means for transmitting only the partial image within the specific frame in the image captured by the imaging means to an external device.
  • Further, according to the present invention, there is provided a terminal device comprising: a viewfinder; output means for displaying a pre-imaging and/or captured image on the viewfinder and displaying, superimposed on the image, a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image; imaging means for capturing the image displayed on the viewfinder; and transmission means for transmitting the captured image to an external device together with information identifying the partial image within the specific frame.
  • a server device comprising the storage means, the correspondence information search means, and the collation means that the collation system has.
  • Further, according to the present invention, there is provided a program for a terminal device provided with imaging means for capturing an image displayed on a viewfinder, the program causing a computer to function as output means for displaying a pre-imaging and/or captured image on the finder and displaying, superimposed on the image, a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image.
  • Further, according to the present invention, there is provided a program for a collation system that collates a plurality of steel materials stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address, the program causing a computer to function as:
  • storage means for storing correspondence information in which the identification information of each of the plurality of stored steel materials, the address of the area in which each steel material is stored, and stage information indicating its position in the steel material group stacked in a plurality of stages are associated with one another;
  • input receiving means for receiving input of the address and stage information of the steel material to be verified; correspondence information search means for referring to the correspondence information and acquiring the identification information associated with the received address and stage information; output means for displaying a pre-imaging and/or captured image on the finder and displaying, superimposed on the image, a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image;
  • imaging means for capturing the image displayed on the viewfinder; image recognition means for performing image recognition processing using only the partial image within the specific frame in the captured image, extracting the identification mark written on the surface of each of the plurality of steel materials, and recognizing the identification information using the extracted mark; and collation means for determining whether the identification information acquired by the correspondence information search means matches the identification information recognized by the image recognition means.
  • Further, according to the present invention, there is provided a collation method in which a computer collates a plurality of steel materials stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address.
  • The method uses correspondence information in which the identification information of each of the plurality of stored steel materials, the address of the area in which each steel material is stored, and stage information indicating its position in the steel material group stacked in a plurality of stages are associated with one another.
  • A pre-imaging and/or captured image is displayed on the finder, and a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image is displayed on the finder so as to overlap the image.
  • Image recognition processing is performed using only the partial image within the specific frame in the image captured in the imaging step, the identification marks written on the surfaces of the plurality of steel materials are extracted, and the identification information is recognized using the extracted marks.
  • The system and apparatus of this embodiment are realized by any combination of hardware and software, centering on a CPU (Central Processing Unit), a memory, a program loaded into the memory (including programs stored in the memory in advance from the stage of shipping the apparatus, and programs downloaded from storage media such as CDs (Compact Discs) or from servers on the Internet), a storage unit such as a hard disk that stores the program, and a network connection interface. It will be understood by those skilled in the art that various modifications can be made to the implementation method and apparatus.
  • In the following description, each device is described as being realized by one apparatus, but the means for realizing it is not limited to this; the configuration may be physically separated or logically separated.
  • the inventors of the present invention have studied a technique for realizing, with a computer, verification work of steel materials stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address.
  • First, storage information indicating which steel material (identified by lot number or the like) is stored at which address and at which stage is managed as electronic data.
  • Image recognition technology is used for the collation work. Specifically, first, the identification mark (for example, printed) attached to the surface of the steel material to be verified is imaged. Then, the identification mark is extracted from the captured image using image recognition technology, and the identification information is recognized using the mark.
  • the storage information is searched using the position of the steel material to be verified (information indicating the address and the number of steps) as a key, and the identification information associated with the key is acquired. Then, it is verified whether the identification information recognized using the image recognition technology matches the identification information acquired from the storage information. According to such a technique, human error can be eliminated. However, in the case of this technique, the following problems peculiar to the collation work of steel materials may occur.
  • Steel imaging needs to be done at the steel storage location. That is, the steel material cannot be moved for imaging.
  • Steel materials may be stored outdoors or indoors.
  • the steel material may be stored in an environment that is not preferable for imaging, such as in a low illuminance environment. In such a case, if the image recognition process is performed using the captured image as it is, there is a possibility that sufficient recognition accuracy cannot be obtained.
  • One conceivable measure is to change the camera-side settings to suit each environment before imaging the steel material. However, busy workers in the field want to avoid such troublesome changes to the camera settings. In addition, changing the camera settings takes time, and the work efficiency of the entire collation operation deteriorates.
  • FIG. 1 shows an example of a functional block diagram of the matching system 1 of the present embodiment.
  • The collation system 1 of the present embodiment includes a storage unit 11, an input reception unit 12, a correspondence information search unit 13, an imaging unit 14, an image recognition unit 15, a collation unit 16, and an output unit 17.
  • The collation system 1 of the present embodiment may be realized by a single device (for example, a mobile terminal device), or by two or more devices configured to communicate with each other by wire and/or wirelessly. That is, one apparatus may include all the units shown in FIG. 1. Alternatively, each of two or more devices may include at least some of the units illustrated in FIG. 1, and the collation system 1 including all the units may be realized by combining them.
  • An embodiment in which the verification system 1 is realized by two or more devices will be described in the following embodiment.
  • the collation system 1 of this embodiment is a system for collating a plurality of steel materials stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address.
  • FIG. 2 shows an example of a plurality of areas each assigned an address.
  • AA1 to AF4 are addresses.
  • the plurality of areas do not necessarily have to be regularly arranged as shown in the figure, and may have a random positional relationship.
  • each of the plurality of areas may be located away from each other.
  • the environments may be different from each other, with some areas being indoors and other areas being outdoors.
  • The shape of the areas shown in the figure is an example, and is not limited to a square.
  • Fig. 3 shows an example of a plurality of steel materials that are stacked and stored in a plurality of stages.
  • plate-like slabs are stacked in five stages.
  • the shape of the steel material is not particularly limited.
  • the shape of the steel material may be a plate shape as shown in the figure, or may be other shapes such as a square shape and a rod shape.
  • the number of steels stacked on top of each other is a design matter.
  • each steel material may be mounted on a predetermined mounting member (for example, mounting table), and a plurality of steel materials may be laminated together with the mounting member.
  • an identification mark indicating each identification information is written on the surface of each steel material.
  • an identification mark may be printed on the surface of each steel material by a machine.
  • Alternatively, a label (for example, a label on which a computer-generated identification mark is printed) may be stuck on the surface of each steel material.
  • Any form that can be recognized by a conventional image recognition technique can be adopted as the identification mark.
  • the identification mark may be identification information itself made up of alphanumeric characters or the like as shown in the figure, or may be a barcode or a two-dimensional code.
  • In the figure, the identification information consisting of alphanumeric characters is written in one row, but it may be written in two or three rows.
  • Such a description form can be changed according to the surface shape of steel materials. However, it is preferable to adopt the same description form for the same type of steel materials (the same shape, size, components, etc.).
  • The storage unit 11 shown in FIG. 1 stores correspondence information in which the identification information of each of the plurality of stored steel materials, the address of the area in which each steel material is stored, and stage information indicating its position in the steel material group stacked in a plurality of stages are associated with one another.
  • the stage information may be, for example, information indicating the number of stages counted from the bottom, or information indicating the number of stages counted from the top. In the following, it is assumed that the stage information is information indicating the number of stages counted from the bottom.
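  • The two counting conventions are interchangeable. As a small sketch (the function name is an illustrative assumption), for a pile of `total` steel materials, a 1-based stage counted from the bottom converts to one counted from the top as follows:

```python
def stage_from_top(stage_from_bottom: int, total: int) -> int:
    # E.g. in a pile of 5, the 2nd stage from the bottom is the 4th from the top.
    return total - stage_from_bottom + 1
```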
  • FIG. 4 shows an example of correspondence information. According to the correspondence information shown in the figure, the steel material with identification information "20130101AB001" is stored at the first stage from the bottom at address AA1.
  • the correspondence information is updated when an event occurs in which a steel material to be stored is newly added, a steel material to be stored is shipped, or an address to be stored is moved to another address.
  • the content of the correspondence information may be updated based on a human input operation.
  • The input receiving unit 12 receives input of the address and stage information of the steel material to be verified. For example, when an operator ships a steel material stored at a certain stage at a certain address, this steel material becomes the verification target, and the worker inputs its address and the stage information indicating the number of stages. In addition, the operator may verify a plurality of steel materials in sequence in order to check whether there is an error in the correspondence information stored in the storage unit 11, at a timing such as stocktaking.
  • FIG. 5 shows an example of a user interface through which the input receiving unit 12 receives input of the address and stage information of the steel material to be verified.
  • an address and stage information can be selected by a pull-down menu.
  • the example of the user interface shown in the figure is merely an example, and the present invention is not limited to this (the premise is the same for all user interfaces shown below).
  • a user interface using other GUI (graphical user interface) components may be used.
  • the means by which the input receiving unit 12 receives an input is not particularly limited, and can be realized using any input device such as a touch panel display, an input button, a microphone, a keyboard, and a mouse.
  • The correspondence information search unit 13 refers to the correspondence information (see FIG. 4) stored in the storage unit 11, and acquires the identification information of the steel material associated with the address and stage information whose input was accepted by the input receiving unit 12.
  • the output unit 17 has a finder.
  • the viewfinder can be composed of a display, for example.
  • the viewfinder may be a touch panel display, for example.
  • An image to be captured by the imaging unit 14 described below (a pre-imaging image) and/or an image already captured by the imaging unit 14 (a captured image) is displayed on the finder.
  • When an imaging instruction is input while an image is displayed on the finder, the image displayed on the finder is captured and the imaging data is stored. Thereafter, the captured image can be displayed on the finder using the stored imaging data.
  • the output unit 17 displays on the viewfinder a specific frame indicating a partial area to be subjected to image recognition processing in the image displayed on the viewfinder.
  • the image recognition process corresponds to an image recognition process executed by the image recognition unit 15 described below.
  • The output unit 17 is configured to be capable of executing at least one of (1) a process of displaying the specific frame superimposed on a pre-imaging image, and (2) a process of displaying the specific frame superimposed on a captured image. In the following, the output unit 17 is described as performing process (1), superimposing the specific frame on the pre-imaging image.
  • FIG. 6 shows an example in which the output unit 17 displays a specific frame superimposed on an image on a display (finder).
  • a part of steel materials (see FIG. 3) stored in a plurality of stages is displayed as an image before imaging. More specifically, the identification mark part of the steel materials stacked and stored in a plurality of stages is displayed.
  • the specific frame F is displayed over the image.
  • When an imaging instruction is input (e.g., a shooting button is touched), the image displayed on the display 100 is captured.
  • the target of the image recognition process is not all the images displayed on the display 100 but only the image in the specific frame F.
  • the shape of the specific frame F is not limited to a square and may be other shapes.
  • At least one of the size and shape of the specific frame F and its display position on the display 100 may be changed according to operator input. For example, on a touch panel display capable of recognizing a plurality of touch points, this can be done by methods such as touching and dragging, or touching and sliding.
  • the imaging unit 14 captures an image displayed on the display 100.
  • the operator inputs an imaging instruction in a state where the identification mark of the steel material to be collated is within the specific frame F. It is preferable to input an imaging instruction in a state where the identification marks of the steel materials other than the verification target are not included in the specific frame F.
  • the imaging unit 14 associates information (specific frame position information) indicating the position of a specific frame at the time of imaging (eg, a position in the image data, a position in the display 100) with the image data of the captured image.
  • the image recognition unit 15 performs image recognition processing using only a partial image within a specific frame in the image captured by the imaging unit 14.
  • the image recognition unit 15 acquires the image data of the image captured by the imaging unit 14, the image recognition unit 15 specifies a partial image in the specific frame F using specific frame position information associated with the image data. Then, image recognition processing is performed using only the image data of the specified image.
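  • The restriction of the image recognition processing to the inside of the specific frame F can be sketched as a simple crop. Here the image is modeled as a list of pixel rows and the specific frame position information as a (left, top, width, height) tuple in image-data coordinates; both representations are assumptions made for illustration.

```python
def crop_to_frame(image, frame):
    """Return only the partial image inside the specific frame F."""
    left, top, width, height = frame
    # Only these pixels are handed to the image recognition processing,
    # which reduces the amount of data to be processed.
    return [row[left:left + width] for row in image[top:top + height]]
```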
  • the image recognition process includes a process of extracting the identification mark written on the surface of the steel material from the image to be processed, and a process of recognizing identification information using the extracted identification mark. Details of the image recognition process are not particularly limited, and any conventional technique can be applied.
  • The image recognition unit 15 holds in advance feature information (feature amounts) indicating the features of the identification mark, and can perform identification mark extraction and recognition processing using this feature information.
  • The image recognition unit 15 can also execute various preprocessing steps such as noise removal, smoothing, sharpening, two-dimensional filtering, binarization, thinning, and normalization (enlargement/reduction, parallel movement, rotation, density change, etc.). Note that it is not always necessary to execute all of the processes exemplified here.
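  • Two of the listed preprocessing steps, binarization and a density change (contrast stretching), can be sketched in a few lines. The grayscale list-of-rows representation and the default threshold are assumptions made for illustration, not details taken from the patent.

```python
def binarize(image, threshold=128):
    # Pixels at or above the threshold become 1 (mark), the rest 0.
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def stretch_contrast(image):
    # Density change: linearly rescale pixel values to span the full 0-255
    # range, which helps with images captured in low-illuminance environments.
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    if hi == lo:
        return [[0 for _ in row] for row in image]
    return [[(px - lo) * 255 // (hi - lo) for px in row] for row in image]
```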
  • the collation unit 16 determines whether the identification information acquired by the correspondence information search unit 13 matches the identification information recognized by the image recognition unit 15.
  • the output unit 17 can display the determination result by the verification unit 16 on the display 100.
  • FIG. 7 shows an example in which the output unit 17 displays the determination result on the display 100.
  • In the example shown in the figure, the address and stage information of the steel material to be verified are displayed on the display 100, together with the determination result (collation result) by the collation unit 16 and the recognition result of the image recognition processing by the image recognition unit 15.
  • The input receiving unit 12 receives input of the address and stage information of the steel material to be verified (S1). For example, the input receiving unit 12 receives the input via a user interface as shown in FIG. 5, displayed on the display 100 by the output unit 17. Here, it is assumed that the input receiving unit 12 receives the input of address "AA1" and the second stage from the bottom.
  • the correspondence information search unit 13 searches the correspondence information stored in the storage unit 11 using the address and step information received by the input reception unit 12 as keys, and acquires the steel material identification information associated with the key. (S2).
  • the correspondence information is searched for the correspondence information shown in FIG. 4 using the combination of “AA1” and “2” as a key, and the identification information “20130101AB002” is acquired.
  • the collation system 1 enters the imaging mode.
  • The order of the transition to the imaging mode and the execution of the processing of S2 is not limited to the order shown in FIG. 8; they may be executed in parallel.
  • the display on the display 100 is switched.
  • the output unit 17 displays the image to be imaged on the display 100 and displays the specific frame F so as to overlap the image (see FIG. 6).
  • the operator adjusts the position, orientation, and the like of the collation system 1 to display the identification mark attached to the surface of the steel material to be collated on the display 100 and place the identification mark in the specific frame F.
  • the worker inputs an imaging instruction (for example, touching the shooting button) while maintaining the state.
  • the imaging unit 14 captures an image displayed on the display 100.
  • the imaging unit 14 then associates information (specific frame position information) indicating the position of the specific frame F at the time of imaging (eg, position in the image data, position in the display 100) with the image data of the captured image ( S3).
  • the imaging unit 14 captures an image in the state illustrated in FIG.
  • the image recognition unit 15 performs image recognition processing using only a partial image within the specific frame F in the image captured by the imaging unit 14.
  • the image recognition unit 15 extracts the identification mark written on the surface of each of the plurality of steel materials by the image recognition process (S4), and recognizes the identification information using the extracted identification mark (S5).
  • the image recognition unit 15 has recognized the identification information “20130101AB002”.
  • the collation unit 16 determines (collation) whether the identification information acquired by the correspondence information search unit 13 in S2 matches the identification information recognized by the image recognition unit 15 in S5 (S6). And the output part 17 outputs the collation result of the collation part 16 in S6 (S7). For example, the output unit 17 outputs a collation result as shown in FIG.
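  • The flow S1 to S7 above can be sketched end to end, with the image recognition step (S4 and S5) stubbed out. Every name here is an illustrative assumption; a real implementation would run character recognition on the partial image inside the specific frame F.

```python
# Correspondence information as in FIG. 4 (excerpt).
correspondence = {("AA1", 1): "20130101AB001", ("AA1", 2): "20130101AB002"}

def recognize_identification(partial_image):
    # Stub for the image recognition unit 15 (S4: extract the identification
    # mark, S5: recognize the identification information).
    return "20130101AB002"

def collate(address, stage, partial_image):
    expected = correspondence.get((address, stage))       # S2: search by key
    recognized = recognize_identification(partial_image)  # S4-S5: recognize
    return "OK" if expected == recognized else "NG"       # S6: collate

print(collate("AA1", 2, partial_image=None))  # S7: output; prints "OK"
```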
  • On the other hand, suppose the recognition result of the image recognition unit 15 in S5 is "20130101AB00?", that is, the last character could not be recognized correctly. In this case, the identification information recognized in S5 does not match the identification information acquired in S2, so the collation result of the collation unit 16 is a mismatch (NG).
  • the output unit 17 outputs a collation result as shown in FIG. According to the display, the operator can recognize that the collation result is NG because the recognition accuracy of the image recognition process is insufficient.
  • In such a case, the system may be configured so that the worker visually reads the identification mark attached to the steel material to be verified (the steel material stored at the predetermined stage at the predetermined address) and inputs the visually read identification information (input receiving unit 12).
  • the collation unit 16 determines (collates) whether the identification information received by the input receiving unit 12 matches the identification information acquired by the correspondence information search unit 13 in S2.
  • touching “to input screen” on the user interface shown in FIG. 9 may cause a transition to the input screen shown in FIG. In this screen, the recognition result “20130101AB00” by the image recognition unit 15 is displayed as an initial value, and the last numeric part that could not be recognized is blank.
  • the input receiving unit 12 may receive an input of identification information from such a user interface, for example.
  • In another example, the collation system 1 accepts a user input for selecting one of already-captured images. That is, the user images the identification mark attached to the surface of the steel material to be verified in advance and stores the image data, and then selects the image in which that identification mark is captured.
  • In this case, instead of shifting to the imaging mode after S2, the collation system 1 executes a process of displaying the selected captured image on the display 100 with the specific frame F superimposed on it.
  • the user changes at least one of the size and shape of the specific frame F and the display position in the display 100 as necessary, and puts the identification mark in the specific frame F.
  • the worker performs imaging input (for example, touching the imaging button) while maintaining the state.
  • the imaging unit 14 creates data in which the image data of the image displayed on the display 100 is associated with information indicating the position of the specific frame F at the time the imaging input was received (eg, the position in the image data or the position on the display 100; the specific frame position information) (imaging processing), and saves it (S3).
  • the processing after S4 is the same as the above example.
  • the process flow is applicable to all the following embodiments.
  • with the collation system of the present embodiment, the collation work for steel materials stacked and stored in a plurality of stages in each of a plurality of areas, each assigned an address, can be realized by computer processing. Human error can therefore be avoided.
  • the collation system 1 of the present embodiment is configured to be able to solve such a problem. That is, the collation system 1 of the present embodiment does not perform image recognition processing using all the captured images, but performs image recognition processing using only the images specified by the specific frame F in the captured images. For this reason, the amount of data to be processed can be reduced. As a result, the processing time can be shortened.
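The data reduction just described can be sketched as a simple crop: only the pixels inside the specific frame F are handed to the image recognition step. The image-as-nested-lists representation and the (left, top, width, height) frame tuple are assumptions made for illustration.

```python
def crop_to_frame(image, frame):
    """Return only the partial image inside the specific frame F.

    `image` is a 2D list of pixel values; `frame` is a hypothetical
    (left, top, width, height) tuple taken from the specific frame
    position information associated with the image data.
    """
    left, top, width, height = frame
    return [row[left:left + width] for row in image[top:top + height]]

# A 4x4 dummy image; only the 2x2 region inside the frame is kept,
# so the image recognition step has far less data to process.
image = [[r * 4 + c for c in range(4)] for r in range(4)]
partial = crop_to_frame(image, (1, 1, 2, 2))
```

Because recognition runs on `partial` rather than `image`, the amount of processed data, and hence the processing time, shrinks roughly in proportion to the frame area.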
  • with the verification system 1 of the present embodiment, the verification work for steel materials stacked and stored in a plurality of stages in each of a plurality of areas, each assigned an address, can be performed efficiently and accurately.
  • the verification system 1 of this embodiment is different from the verification system 1 of the first embodiment in that a plurality of steel materials stored at the same address can be targeted for verification at a time.
  • this embodiment is described below; descriptions overlapping those of the first embodiment are omitted as appropriate.
  • an example of a functional block diagram of the collation system 1 of the present embodiment is shown in FIG. 1, as in the first embodiment.
  • the input receiving unit 12 can receive input of the address of the steel material to be collated, for example via a user interface as shown in FIG.
  • the correspondence information search unit 13 searches the correspondence information stored in the storage unit 11 and acquires the stage information associated with the address whose input the input receiving unit 12 received (the stage information with which the identification information of the steel materials is associated). The output unit 17 then displays a list of the stage information acquired by the correspondence information search unit 13.
  • FIG. 12 shows an example in which the output unit 17 displays a list of stage information on the display 100.
  • five pieces of stage information (circled 1 to circled 5) are displayed.
  • the number of pieces of stage information listed here corresponds to the number of steel materials managed as stored at that address in the correspondence information. That is, in the example of FIG. 12, the correspondence information indicates that five steel materials are stacked and stored in five stages at the AA1 address.
  • the operator can find an error existing in the correspondence information by comparing the number displayed in the list with the number of steel materials actually stored at the address.
  • when the number of stacked steel materials is sufficiently small (e.g., in the single-digit range), confirmation errors by the operator are unlikely to occur.
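The list display of FIG. 12 can be sketched as a lookup over the correspondence information: the entries registered for an address yield both the stage list and its count, which the operator compares against the physical stack. The record layout (`address`, `stage`, `id` fields) is an assumption for illustration.

```python
def stage_list(correspondence, address):
    """Return the stage information registered for an address,
    mirroring the list display of FIG. 12 (field names are
    assumptions, not from the patent)."""
    return sorted(entry["stage"] for entry in correspondence
                  if entry["address"] == address)

# Five steel materials managed at address "AA1", one per stage.
correspondence = [
    {"address": "AA1", "stage": s, "id": f"20130101AB0{s}"}
    for s in range(1, 6)
]
stages = stage_list(correspondence, "AA1")
```

The length of `stages` is the number of steel materials the correspondence information claims are stored at the address; a mismatch with the real stack exposes an error in the correspondence information.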
  • the input receiving unit 12 receives an input selecting one or more pieces of stage information displayed in the list, thereby receiving the stage information of one or more steel materials to be verified.
  • one or a plurality of pieces of stage information can be selected by checking a check box displayed in association with each piece of stage information.
  • the output unit 17 displays the same number (one or more) of specific frames F as the number of stage information received by the input receiving unit 12 on the display 100.
  • the operation of each unit when the input receiving unit 12 receives input of a plurality of pieces of stage information and the output unit 17 displays a plurality of specific frames F on the display 100 will be described.
  • otherwise, the structure of each unit can be the same as in the first embodiment.
  • FIG. 13 shows an example in which the output unit 17 displays a plurality of specific frames F on the display 100.
  • a part of the steel materials (see FIG. 3) stacked and stored in a plurality of stages is displayed.
  • two specific frames F1 and F2 are displayed so as to overlap the image.
  • Each of the plurality of specific frames F1 and F2 displayed on the display 100 is associated with each piece of stage information received by the input receiving unit 12.
  • the output unit 17 may display a plurality of specific frames F1 and F2 so that the associated stage information can be identified.
  • the circled numbers displayed in the upper left corner of each of the specific frames F1 and F2 indicate the stage information associated with each. That is, it can be identified that the specific frame F1 is associated with the stage information of the third stage from the bottom, and the specific frame F2 is associated with the stage information of the fourth stage from the bottom. Using such information, the operator can grasp how many steel material identification marks should be placed in each of the plurality of specific frames F1 and F2.
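The association between selected stage information and displayed specific frames can be sketched as follows; frame names (`F1`, `F2`, …) follow the figures, while the dict layout and function name are assumptions.

```python
def frame_labels(selected_stages):
    """Associate one specific frame with each selected stage, as the
    output unit 17 does, and record the stage label shown in the
    frame's corner (circled numbers approximated by plain digits;
    purely illustrative)."""
    return [{"frame": f"F{i + 1}", "stage": s}
            for i, s in enumerate(sorted(selected_stages))]

# The operator selected the third and fourth stages, so frames F1
# and F2 are displayed, each labelled with its stage information.
labels = frame_labels({4, 3})
```

With such a mapping, the operator can tell which stage's identification mark belongs inside each displayed frame.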
  • FIG. 14 shows another example.
  • auxiliary frames G1 to G3 are displayed above and below the specific frames F1 and F2 in addition to the specific frames F1 and F2.
  • the stage information can be identified by the positions in the plurality of frame groups including the specific frames F1 and F2 and the auxiliary frames G1 to G3.
  • the specific frame F1 shown in FIG. 14 is located at the third level from the bottom in the frame group. From this, it can be seen that the step information associated with the specific frame F1 is the third step from the bottom.
  • the auxiliary frames G1 to G3 may have the same design as the specific frames F1 and F2 and may differ only in shape and size, or may have different designs as shown in FIG.
  • the auxiliary frames G1 to G3 can be made smaller than the specific frames F1 and F2. This reduces the extent to which the auxiliary frames G1 to G3 impair the visibility of the image to be captured on the display 100.
  • the specific frames F1 and F2 displayed on the display 100 may be configured so that at least one of their display position, shape, and size on the display 100 can be changed individually in accordance with user input.
  • the display position, shape, and size of one specific frame F may be similarly changed.
  • when a touch panel type display capable of recognizing a plurality of points is used, the size and display position may be changed by multi-touch operations.
  • when a touch panel type display that can recognize only one point is used, the display size may be changed by, for example, touching and dragging one side of the specific frame or the intersection of two sides, and the display position may be changed by, for example, touching and sliding the frame.
  • information indicating the position of the specific frame F at the time of image capture (eg, the position in the image data or the position on the display 100) is associated with the image data of the captured image.
  • specific frame position information indicating the positions of the plurality of specific frames F is associated with the image data.
  • Each of the plurality of specific frame position information is associated with stage information associated with the specific frame F.
  • when a plurality of pieces of specific frame position information are associated with the image data of the image captured by the imaging unit 14, the image recognition unit 15 performs image recognition processing using only the partial image specified by each piece of specific frame position information, and thereby obtains a plurality of recognition results (identification information). The contents of the image recognition processing are the same as in the first embodiment. Each piece of identification information recognized by the image recognition unit 15 is associated with the stage information associated with the corresponding specific frame position information.
  • the correspondence information search unit 13 searches the correspondence information stored in the storage unit 11 and acquires the plurality of pieces of identification information associated with the address whose input the input receiving unit 12 received and with each of the plurality of pieces of stage information.
  • the collation unit 16 determines whether or not the plurality of pieces of identification information acquired by the correspondence information search unit 13 match the plurality of pieces of identification information recognized by the image recognition unit 15. Specifically, pieces of identification information whose associated stage information matches are compared with each other to determine whether or not they match. The collation unit 16 thereby obtains a determination result for each piece of stage information.
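The per-stage comparison performed by the collation unit 16 can be sketched as matching two mappings keyed by stage information; the dict representation and the "OK"/"NG" strings are assumptions for illustration.

```python
def collate(acquired, recognized):
    """Compare pieces of identification information whose stage
    information matches, as the collation unit 16 does, and return
    an OK/NG result per stage. Both arguments map stage information
    to identification information (layout is an assumption)."""
    return {stage: "OK" if recognized.get(stage) == ident else "NG"
            for stage, ident in acquired.items()}

# Stage 2 was recognized incorrectly, so only that stage is NG,
# matching the situation shown for the second stage in FIG. 15.
acquired = {1: "AB01", 2: "AB02", 3: "AB03"}
recognized = {1: "AB01", 2: "AB0?", 3: "AB03"}
result = collate(acquired, recognized)
```

Keying the comparison by stage information ensures each recognized mark is checked against the record for its own stage, not against the whole list.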
  • FIG. 15 shows an example in which the output unit 17 displays the discrimination result of the collation unit 16 on the display 100.
  • all five steel materials stored at address AA1 are subject to collation, and the respective recognition results and collation results are shown.
  • for the steel material in the second stage from the bottom, the recognition accuracy of the image recognition unit 15 was insufficient, and as a result the collation result is NG.
  • when the “NG” button is touched on the user interface, the screen shown in FIG. 10 may be displayed. Processing using the user interface shown in FIG. 10 is the same as in the first embodiment.
  • the same operational effects as those of the first embodiment can be realized. Moreover, since several steel materials can be collated at once, the working efficiency of the collation processing improves. Furthermore, since the identification marks of the plurality of steel materials are individually captured within the respective specific frames F, the accuracy of the processing that extracts the identification marks from the image can be improved.
  • the verification system 1 of this embodiment is different from the verification system 1 of the first embodiment in that a plurality of steel materials stored at the same address can be targeted for verification at a time.
  • this embodiment is described below; descriptions overlapping those of the preceding embodiments are omitted as appropriate.
  • an example of a functional block diagram of the collation system 1 of the present embodiment is shown in FIG. 1, as in the first embodiment.
  • the input receiving unit 12 can receive input of the address of the steel material to be collated, for example via a user interface as shown in FIG. The correspondence information search unit 13 then searches the correspondence information stored in the storage unit 11.
  • the correspondence information search unit 13 searches the correspondence information stored in the storage unit 11 and counts the number of pieces of steel material identification information associated with the address whose input the input receiving unit 12 accepted, thereby identifying the number of first steel materials stored at that address. The output unit 17 then displays one or several specific frames F on the display 100 accordingly.
  • the operation of each unit when the number of first steel materials is plural will be described.
  • when the number of first steel materials is one, the structure of each unit can be the same as in the first embodiment.
  • FIG. 16 shows an example in which the output unit 17 displays the same number of specific frames F as the number of first steel materials on the display 100.
  • the display 100 displays five stacked steel materials.
  • five specific frames F1 to F5 are displayed on the steel material image.
  • Each of the five specific frames F1 to F5 is associated with stage information.
  • the output unit 17 displays a plurality of specific frames F1 to F5 so that the associated stage information can be identified.
  • the first to fifth stages are associated in order from the bottom according to the arrangement order of the five specific frames F1 to F5. That is, the worker can identify the stage information associated with each of the specific frames F1 to F5 based on the arrangement order of the plurality of specific frames F1 to F5. Using such information, the operator can grasp how many steel material identification marks should be placed in each of the plurality of specific frames F1 to F5.
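The bottom-up assignment of stage numbers by arrangement order can be sketched as sorting the frames by their vertical position on the display; since screen coordinates typically grow downward, the frame with the largest y-coordinate is the lowest one and becomes stage 1. The coordinate convention and data layout are assumptions.

```python
def assign_stages(frames):
    """Assign stage numbers by arrangement order: the lowest frame on
    the display (largest y, assuming screen y grows downward) gets
    stage 1, as in FIG. 16. `frames` maps a frame name to its top
    y-coordinate (hypothetical layout)."""
    ordered = sorted(frames, key=lambda name: frames[name], reverse=True)
    return {name: stage for stage, name in enumerate(ordered, start=1)}

# Five frames stacked on the display; F1 is drawn lowest, F5 highest.
frames = {"F1": 400, "F2": 310, "F3": 220, "F4": 130, "F5": 40}
stages = assign_stages(frames)
```

This lets the operator (and the system) infer stage information purely from where each frame sits on the screen, without explicit per-frame labels.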
  • the stage information associated with each specific frame can also be displayed identifiably using a structure similar to that of the second embodiment.
  • a cross mark M is displayed in association with each specific frame F.
  • when the cross mark M is touched, the specific frame associated with it may disappear, or may change to the auxiliary frame described in the second embodiment. That is, the input receiving unit 12 may receive the stage information of the steel materials to be verified by receiving an input selecting one or more of the plurality of specific frames F1 to F5.
  • the stage information corresponding to the specific frames F whose cross marks M were not touched, in other words the specific frames F remaining on the display 100 when the input receiving unit 12 receives the imaging instruction input, is input as the stage information of the steel materials to be verified.
  • the specific frames F1 to F5 displayed on the display 100 may be configured so that at least one of their display position, shape, and size on the display 100 can be changed individually in accordance with user input.
  • the display position, shape, and size of one specific frame F may be similarly changed.
  • when a touch panel type display capable of recognizing a plurality of points is used, the size and display position may be changed by multi-touch operations.
  • when a touch panel display that can recognize only one point is used, the size may be changed by, for example, touching and dragging one side of the specific frame or the intersection of two sides, and the display position may be changed by, for example, touching and dragging an arbitrary position within the specific frame.
  • for a display that is not a touch panel type, operation via predetermined buttons may be used.
  • information indicating the position of the specific frame F at the time of image capture (eg, the position in the image data or the position on the display 100) is associated with the image data of the captured image.
  • specific frame position information indicating the positions of the plurality of specific frames F is associated with the image data.
  • Each of the plurality of specific frame position information is associated with stage information associated with the specific frame F.
  • when a plurality of pieces of specific frame position information are associated with the image data of the image captured by the imaging unit 14, the image recognition unit 15 performs image recognition processing using only the partial image specified by each piece of specific frame position information, and thereby obtains a plurality of recognition results (identification information). The contents of the image recognition processing are the same as in the first embodiment. Each piece of identification information recognized by the image recognition unit 15 is associated with the stage information associated with the corresponding specific frame position information.
  • the correspondence information search unit 13 searches the correspondence information stored in the storage unit 11 and acquires the plurality of pieces of identification information associated with the address whose input the input receiving unit 12 received and with each of the plurality of pieces of stage information.
  • the collation unit 16 determines whether or not the plurality of pieces of identification information acquired by the correspondence information search unit 13 match the plurality of pieces of identification information recognized by the image recognition unit 15. Specifically, pieces of identification information whose associated stage information matches are compared with each other to determine whether or not they match. The collation unit 16 thereby obtains a determination result for each piece of stage information.
  • FIG. 15 shows an example in which the output unit 17 displays the discrimination result of the collation unit 16 on the display 100.
  • all five steel materials stored at address AA1 are subject to collation, and the respective recognition results and collation results are shown.
  • the steel material in the second level from the bottom has insufficient recognition accuracy by the image recognition unit 15, and as a result, the collation result is NG.
  • when the “NG” button is touched on the user interface, the screen shown in FIG. 10 may be displayed. Processing using the user interface shown in FIG. 10 is the same as in the first embodiment.
  • the same operational effects as those of the first and second embodiments can be realized. Moreover, since several steel materials can be collated at once, the working efficiency of the collation processing improves. Furthermore, since the identification marks of the plurality of steel materials are individually captured within the respective specific frames F, the accuracy of the processing that extracts the identification marks from the image can be improved.
  • the collation system 1 of this embodiment is based on the configuration of the second embodiment and the third embodiment that can target a plurality of steel materials at one time. That is, the collation system 1 of the present embodiment can display a plurality of specific frames F on the display 100.
  • each of a plurality of identification marks can be accommodated in each of a plurality of specific frames F arranged in a line as shown in FIGS.
  • the identification marks are not aligned in a line, and their positions may vary. In such a case, as shown in the diagram on the left side of FIG. 17, it is impossible to fit each of the plurality of identification marks in each of the plurality of specific frames F arranged in a line.
  • the input receiving unit 12 of the present embodiment can individually move a plurality of specific frames F (the diagram on the right side of FIG. 17).
  • the display position of the specific frame F may be moved by touching and sliding the specific frame F.
  • each of the plurality of identification marks can be contained in each of the plurality of specific frames F.
  • the same operational effects as those of the first to third embodiments can be realized. Further, even when the positions of the identification marks of the plurality of steel materials stacked and stored in a plurality of stages are not aligned and vary, each of the plurality of identification marks can be contained in its respective specific frame F.
  • the collation system 1 of this embodiment is based on the configuration of the second embodiment and the third embodiment that can target a plurality of steel materials at one time. That is, the collation system 1 of the present embodiment can display a plurality of specific frames F on the display 100.
  • the collation system 1 of the present embodiment differs in configuration from the fourth embodiment, and can solve the problem that arises when the positions of the identification marks of a plurality of steel materials stacked and stored in a plurality of stages are not aligned in one line (eg, one line in the stacking direction) but vary.
  • the input receiving unit 12 accepts a designation input that designates some of the plurality of specific frames F displayed on the display 100, and an imaging instruction input for capturing an image while those specific frames F are designated.
  • the imaging unit 14 captures an image according to the imaging instruction input received by the input receiving unit 12.
  • the image recognition unit 15 performs image recognition processing using only a partial image in the specific frame F designated at the time of image capturing in the image captured by the image capturing unit 14.
  • the output unit 17 may display the specific frame F that has been imaged in the designated state and the specific frame F that has not been imaged in the designated state in an identifiable manner. This will be described in more detail below using specific examples.
  • FIG. 18 shows a display example by the output unit 17.
  • a part of the plurality of steel materials is displayed on the display 100.
  • the positions of the identification marks displayed on each of the plurality of steel materials are not aligned in one row (eg, one row in the stacking direction), but vary.
  • on the display 100, three specific frames F1 to F3 are displayed, and three symbols are written in the upper left corner of each. In order from the left, they are “information indicating the associated stage information”, “information indicating whether or not the frame has been imaged in the designated state”, and “information indicating whether or not the frame is designated”. “Information indicating the associated stage information” is as described in the second embodiment.
  • “information indicating whether or not the frame has been imaged in the designated state” is the indicator “done” or “not yet”. “Done” indicates that the frame has been imaged in the designated state; “not yet” indicates that it has not. “Information indicating whether or not the frame is designated” is a check box that the operator can set: a checked specific frame F is designated, and an unchecked one is not. Note that a specific frame F that has already been imaged in the designated state cannot be checked.
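The per-frame state just described (stage label, imaged-while-designated status, check box, and the rule that an already-imaged frame cannot be re-checked) can be sketched as a small state object. Class, field, and method names are assumptions made for illustration.

```python
class FrameState:
    """State of one specific frame as in FIG. 18: its stage
    information, whether it was imaged in the designated state,
    and its check box (names are hypothetical)."""

    def __init__(self, stage):
        self.stage = stage
        self.imaged = False      # "not yet" until imaged while designated
        self.designated = False  # the operator's check box

    def check(self):
        # A frame already imaged in the designated state cannot be
        # checked again, per the screen rules described in the text.
        if self.imaged:
            return False
        self.designated = True
        return True

    def capture(self):
        # Imaging while designated marks the frame "done" and clears
        # the check box, as happens to frame F1 after imaging.
        if self.designated:
            self.imaged = True
            self.designated = False

f1 = FrameState(stage=1)
f1.check()
f1.capture()
second_check = f1.check()  # refused: F1 was already imaged
```

Iterating over such states also yields the "which stages are not yet imaged" view the operator uses to find remaining work.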
  • the worker designates only some of the identification marks, that is, checks the corresponding check boxes.
  • one or a plurality of specific frames F are designated, and a predetermined steel material identification mark is placed in the designated specific frames F.
  • an imaging instruction is input (touching the imaging button) while maintaining the state.
  • when the input receiving unit 12 receives the imaging instruction input, the imaging unit 14 captures an image and associates information indicating the position of the specific frame F1 designated at the time of imaging (specific frame position information) with the image data of the captured image.
  • the input receiving unit 12 also receives the stage information associated with the designated specific frame F, and the stage information thus received is associated with the image data of the image.
  • the output unit 17 may continue the screen display illustrated in FIG. 18 even after the imaging unit 14 captures an image in the state illustrated in FIG. 18. However, since the specific frame F1 has been imaged in the designated state, the “not yet” indicator in its upper left corner is replaced with “done”, and, like the specific frame F3, its check box can no longer be selected. By viewing such a display, the operator can see that it is the identification mark of the second-stage steel material that has not yet been imaged.
  • the operator can input an instruction to start the collation process (touch the collation button) after imaging one or a plurality of steel identification marks.
  • when the input receiving unit 12 receives the instruction input to start the collation process, collation processing by the image recognition unit 15, the collation unit 16, the correspondence information search unit 13, and the storage unit 11, and collation result output by the output unit 17, are performed.
  • the contents of collation processing by the image recognition unit 15, collation unit 16, correspondence information retrieval unit 13 and storage unit 11, and collation result output by the output unit 17 are the same as those in the first to fourth embodiments.
  • each of the plurality of identification marks is individually included in each of the plurality of specific frames F.
  • by referring to the “information indicating whether or not the frame has been imaged in the designated state” associated with each specific frame F, the worker can recognize which of the plurality of steel materials stacked and stored in a plurality of stages have not yet been verified.
  • the collation system 1 of this embodiment is different from the first to fifth embodiments in that it includes a terminal device and a server device that are configured to be able to communicate with each other by wire and / or wirelessly.
  • FIG. 19 shows an example of a functional block diagram of the verification system 1 of the present embodiment.
  • the terminal device 2 includes an input reception unit 12, an imaging unit 14, an output unit 17, and a terminal side transmission / reception unit 18.
  • the server device 3 includes a storage unit 11, a correspondence information search unit 13, an image recognition unit 15, a collation unit 16, and a server side transmission / reception unit 19.
  • the terminal device 2 and the server device 3 can communicate with each other via the terminal side transmission / reception unit 18 and the server side transmission / reception unit 19.
  • FIG. 20 shows another example of a functional block diagram of the verification system 1 of the present embodiment.
  • the terminal device 2 includes an input reception unit 12, an imaging unit 14, an image recognition unit 15, an output unit 17, and a terminal side transmission / reception unit 18.
  • the server device 3 includes a storage unit 11, a correspondence information search unit 13, a collation unit 16, and a server side transmission / reception unit 19. The terminal device 2 and the server device 3 can communicate with each other via the terminal side transmission / reception unit 18 and the server side transmission / reception unit 19.
  • the terminal-side transmitting / receiving unit 18 and the server-side transmitting / receiving unit 19 are configured to be able to communicate with each other by wire and / or wirelessly, and can transmit and receive data.
  • the terminal-side transmitting / receiving unit 18 may transmit only the image data of a partial image in the specific frame F in the image captured by the imaging unit 14 to the server device 3 (external device).
  • the terminal-side transmission / reception unit 18 includes information for identifying a partial image in the specific frame F in the image captured by the imaging unit 14 (eg, specific frame position information indicating the position of the specific frame F) and the image Data may be transmitted to the server device 3 (external device).
  • the terminal-side transmitting / receiving unit 18 includes information for identifying a partial image in the specific frame F in the image captured by the imaging unit 14 (for example, specific frame position information indicating the position of the specific frame F) and the image It is assumed that data is transmitted to the server device 3 (external device).
  • the input receiving unit 12 of the terminal device 2 receives input of the address of the steel material to be verified through, for example, a user interface as shown in FIG. 11 (S10). The terminal-side transmission/reception unit 18 of the terminal device 2 then transmits the address information to the server device 3 (S11). The server device 3 receives the address information via the server-side transmission/reception unit 19. The correspondence information search unit 13 of the server device 3 then searches the correspondence information (see FIG. 4) stored in the storage unit 11 using the address as a key, and acquires the stage information associated with that key (stage information with which steel material identification information is associated) (S12). The server-side transmission/reception unit 19 of the server device 3 returns the acquired stage information to the terminal device 2 (S13). The terminal device 2 acquires the stage information via the terminal-side transmission/reception unit 18. The stage information transmitted and received here may be associated with the address received in S11.
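The server-side lookup of step S12 can be sketched as a search over the correspondence information keyed by the received address; the record layout is an assumption made for illustration.

```python
def lookup_stage_info(correspondence, address):
    """Server-side step S12: search the correspondence information
    using the received address as a key and return the associated
    stage information together with each steel material's
    identification information (record layout is an assumption)."""
    return {entry["stage"]: entry["id"]
            for entry in correspondence if entry["address"] == address}

correspondence = [
    {"address": "AA1", "stage": 1, "id": "20130101AB01"},
    {"address": "AA1", "stage": 2, "id": "20130101AB02"},
    {"address": "BB7", "stage": 1, "id": "20130101CD01"},
]
reply = lookup_stage_info(correspondence, "AA1")  # returned in S13
```

The terminal device then lists the keys of `reply` as the stage information shown in FIG. 12.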
  • the output unit 17 of the terminal device 2 displays the acquired stage information on the display 100 as a list as shown in FIG.
  • the input receiving unit 12 receives input designating one or several pieces of stage information from this user interface (S14).
  • the terminal device 2 switches to the imaging mode. That is, the output unit 17 displays an image to be imaged on the display 100 as shown in FIG. Further, the output unit 17 displays the same number of specific frames F1 and F2 as the stage information specified in S14 on the display 100. Note that the circled numbers displayed in the upper left corner of each of the specific frames F1 and F2 indicate the stage information associated with each. Using such information, the operator can grasp how many steel material identification marks should be placed in each of the plurality of specific frames F1 and F2.
  • the operator adjusts the position, orientation, etc. of the verification system 1 to place a predetermined identification mark in each of the displayed one or more specific frames F.
  • the worker inputs an imaging instruction (for example, touching the shooting button) while maintaining the state.
  • the imaging unit 14 captures an image displayed on the display 100 (S15).
  • the imaging unit 14 associates information (specific frame position information) indicating the position of each of the one or more specific frames F displayed at the time of imaging with the image data of the captured image.
  • the stage information associated with each specific frame F is associated with the corresponding specific frame position information.
  • the terminal-side transmission / reception unit 18 of the terminal device 2 transmits the image data of the captured image to the server device 3 together with the specific frame position information and the step information that are associated with each other (S16).
  • the server device 3 receives the image data of the captured image, the specific frame position information, and the step information via the server side transmission / reception unit 19.
  • the image data, the specific frame position information, and the stage information transmitted / received here may be associated with the address associated with the stage information received in S13.
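The S16 upload, image data together with the mutually associated specific frame position information, stage information, and address, can be sketched as a serialized payload. The JSON encoding and all key names are illustrative assumptions; the patent does not specify a wire format.

```python
import json

def build_upload(address, image_data, frames):
    """Assemble a sketch of the S16 payload: image data plus the
    mutually associated specific frame position information and
    stage information (key names and JSON encoding are assumptions,
    not from the patent)."""
    return json.dumps({
        "address": address,
        "image": image_data,
        "frames": [{"position": list(pos), "stage": stage}
                   for pos, stage in frames],
    })

# Frames F1 and F2, associated with stages 3 and 4, with assumed
# (left, top, width, height) positions; image data is a placeholder.
payload = build_upload("AA1", "base64...",
                       [((10, 40, 120, 30), 3), ((10, 80, 120, 30), 4)])
decoded = json.loads(payload)  # what the server device 3 receives
```

On the server side, each entry in `decoded["frames"]` tells the image recognition unit which region to crop and which stage the resulting identification information belongs to.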
  • the image recognition unit 15 of the server device 3 performs image recognition processing using only a partial image within a specific frame in the image captured by the imaging unit 14.
  • the image recognition unit 15 extracts the identification mark written on the surface of each of the one or more steel materials, and recognizes the identification information using the extracted identification mark (S17).
  • the correspondence information search unit 13 searches the correspondence information in the storage unit 11 using the address and step information acquired in S16 as a key, and acquires one or a plurality of identification information.
  • the collation unit 16 determines whether or not the identification information recognized in S17 matches the identification information acquired by the correspondence information search unit 13 (S18). Specifically, pieces of identification information whose associated stage information matches are collated with each other.
  • the server side transmission / reception unit 19 of the server device 3 returns the determination result of S18 to the terminal device 2 (S19).
  • the terminal device 2 acquires the determination result via the terminal side transmission / reception unit 18. Note that the determination result transmitted and received here may be associated with an address and stage information.
  • the output unit 17 of the terminal device 2 creates, for example, a user interface as shown in FIG. 15 and displays it on the display 100 (S20).
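The S16–S19 exchange above can be sketched in outline as follows. This is a minimal illustration, not the patented implementation: all function and variable names are invented, the "image" is a toy dictionary, and the image recognition unit 15 is stubbed out.

```python
# Correspondence information (storage unit 11): (address, stage) -> expected identification info
CORRESPONDENCE = {
    ("A-01", 1): "STEEL-123",
    ("A-01", 2): "STEEL-456",
}

def recognize_in_frame(image, frame):
    # Stand-in for the image recognition unit 15: look only at the region
    # of the specific frame (here a toy image maps frame origins to marks).
    x, y, w, h = frame
    return image.get((x, y))

def server_collate(address, image, frames_with_stages):
    """S17/S18 on the server: recognize each frame, then compare against
    the identification info stored for the same (address, stage) pair."""
    results = {}
    for frame, stage in frames_with_stages:
        recognized = recognize_in_frame(image, frame)       # S17
        expected = CORRESPONDENCE.get((address, stage))      # search unit 13
        results[stage] = (recognized == expected)            # S18
    return results                                           # S19: sent back

# S16: terminal sends image data + specific frame position info + stage info
image = {(0, 0): "STEEL-123", (0, 50): "STEEL-999"}
frames = [((0, 0, 100, 40), 1), ((0, 50, 100, 40), 2)]
print(server_collate("A-01", image, frames))  # {1: True, 2: False}
```

The point mirrored from the text is that recognition runs only on the partial image inside each specific frame, and each frame's result is matched against the identification information stored for the same address and stage.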
  • 1. A collation system comprising: storage means for storing correspondence information; input accepting means for accepting input of the address and the stage information of the steel material to be collated; correspondence information search means for referring to the correspondence information and acquiring the identification information associated with the address and the stage information accepted by the input accepting means; a finder; output means for displaying a pre-capture image and/or a captured image on the finder and for displaying on the finder, overlaid on the image, a specific frame indicating the partial area of the displayed image that is to be subjected to image recognition processing; imaging means for capturing the image displayed on the finder;
  • image recognition means for performing image recognition processing using only the partial image within the specific frame in the image captured by the imaging means, extracting the identification mark written on the surface of each of the plurality of steel materials, and recognizing the identification information using the extracted identification mark; and collation means for determining whether or not the identification information acquired by the correspondence information search means matches the identification information recognized by the image recognition means. 2. The collation system according to 1, wherein the output means outputs the determination result of the collation means. 3. The collation system according to 1 or 2, wherein the output means outputs the recognition result of the image recognition means. 4.
  • the verification system can target a plurality of the steel materials stored at the same address at a time
  • the input accepting means can accept input of the address and a plurality of the step information in which a plurality of the steel materials to be collated are stored
  • the collating system in which the output means displays a plurality of the specific frames on the finder in the same number as the number of the stage information accepted by the input accepting means. 5.
  • each of the plurality of specific frames is associated with each piece of the stage information accepted by the input accepting means, and the collation system is one in which the output means displays the plurality of specific frames so that the associated stage information can be identified. 6.
  • the correspondence information search means acquires the stage information associated with the address at which the input reception means has received an input,
  • the output means displays a list of the stage information acquired by the correspondence information search means,
  • a collation system in which the input accepting means accepts input of the stage information of the steel materials to be collated by receiving an input selecting one or more of the stage information displayed in the list. 7.
  • the verification system can target a plurality of the steel materials stored at the same address at a time,
  • the correspondence information search means acquires the number of first steel materials, which are the steel materials stored at the address whose input the input accepting means has accepted, the number being associated with that address in the correspondence information,
  • a collation system in which the output means displays on the finder the same number of the specific frames as the number of the first steel materials. 8.
  • Each of the specific frames is associated with the step information associated with each of the first steel materials, The collation system, wherein the output means displays a plurality of the specific frames so that the associated stage information can be identified.
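Items 7 and 8 describe deriving the number of specific frames from the correspondence information itself rather than from user input. A minimal sketch under assumed names, with a flat in-memory table standing in for the storage means:

```python
# Correspondence information as (identification info, address, stage) rows
CORRESPONDENCE = [
    ("STEEL-123", "A-01", 1),
    ("STEEL-456", "A-01", 2),
    ("STEEL-789", "B-02", 1),
]

def frames_for_address(address):
    """Create one specific frame per steel material stored at the address,
    each tagged with its stage information so frames can be told apart."""
    stages = [stage for _, addr, stage in CORRESPONDENCE if addr == address]
    return [{"stage": stage} for stage in sorted(stages)]

print(frames_for_address("A-01"))  # [{'stage': 1}, {'stage': 2}]
```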
  • a collation system in which the input accepting means accepts input of the stage information of the steel materials to be collated by receiving an input selecting one or more of the specific frames displayed on the finder. 10.
  • the collation system which can change at least one of the display position, shape, and size in the finder individually for the plurality of specific frames displayed on the finder.
  • the input receiving means accepts a designation input designating some of the plurality of specific frames displayed on the finder, and an imaging instruction input for capturing an image while some of the specific frames are designated,
  • the imaging unit images in accordance with the imaging instruction input received by the input receiving unit,
  • a collation system in which the image recognition means performs image recognition processing using only the partial image within the specific frame that was designated at the time of capture, in the image captured by the imaging means.
  • the output unit displays the specific frame that has been imaged in a specified state and the specific frame that has not been imaged in a specified state in a distinguishable manner.
  • the verification system includes a terminal device configured to be able to communicate with each other, and a server device,
  • the terminal device includes the input receiving unit, the output unit, and the imaging unit.
  • the server device includes the storage unit, the correspondence information search unit, and the collation unit, A verification system in which either the terminal device or the server device includes the image recognition means.
  • a terminal device comprising the input receiving unit, the output unit, and the imaging unit included in the collation system according to any one of 1 to 12. 15.
  • a viewfinder displaying a pre-imaged image and / or an imaged image on the viewfinder, and overlaying the image with a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image
  • Output means for displaying on the viewfinder, Imaging means for capturing the image displayed on the viewfinder;
  • a transmission unit configured to transmit only a partial image within the specific frame in the image captured by the imaging unit to an external device.
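The terminal device described in 15 transmits only the partial image within the specific frame to the external device. Cropping to a frame can be sketched as follows; the frame format and function name are assumptions, and a nested list stands in for real image data:

```python
def crop_to_frame(pixels, frame):
    """pixels: 2-D list of pixel values; frame: (left, top, width, height).
    Returns only the partial image inside the specific frame."""
    left, top, width, height = frame
    return [row[left:left + width] for row in pixels[top:top + height]]

# 4x4 toy image; keep only the 2x2 region whose top-left corner is (1, 1)
pixels = [[0, 1, 2, 3],
          [4, 5, 6, 7],
          [8, 9, 10, 11],
          [12, 13, 14, 15]]
print(crop_to_frame(pixels, (1, 1, 2, 2)))  # [[5, 6], [9, 10]]
```

Transmitting only this cropped region both reduces the payload and restricts the recognition area on the receiving side, which is the effect the claim describes.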
  • a viewfinder displaying a pre-imaged image and / or an imaged image on the viewfinder, and overlaying the image with a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image
  • Output means for displaying on the viewfinder, Imaging means for capturing the image displayed on the viewfinder;
  • a terminal device comprising: transmission means for transmitting the image to an external device together with information for identifying a partial image in the specific frame in the image captured by the imaging means.
  • a server apparatus comprising: the storage unit included in the verification system according to any one of 1 to 12, the correspondence information search unit, and the verification unit. 18. The server apparatus according to claim 17, further comprising the image recognition means included in the collation system according to any one of 1 to 12. 19.
  • a program for a terminal device provided with an imaging means for capturing an image displayed on a viewfinder, Computer A pre-imaged and / or already-captured image is displayed on the finder, and a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image is superimposed on the image and displayed on the finder.
  • a program for a terminal device provided with an imaging means for capturing an image displayed on a viewfinder, Computer A pre-imaged and / or already-captured image is displayed on the finder, and a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image is superimposed on the image and displayed on the finder.
  • a program for a collation system that collates a plurality of steel materials that are stacked and stored in a plurality of stages in each of a plurality of areas each assigned an address, Computer
  • the identification information of each of the plurality of steel materials stored, the address of the area in which each of the steel materials is stored, and stage information indicating the position in the steel material group stacked in a plurality of stages are associated with each other.
  • Storage means for storing correspondence information; An input receiving means for receiving the address of the steel material to be verified and the input of the step information; Correspondence information search means for referring to the correspondence information and acquiring the identification information associated with the address and the stage information received by the input reception means; A pre-imaged and / or already-captured image is displayed on the finder, and a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image is displayed on the finder so as to overlap the image.
  • Output means Imaging means for capturing the image displayed on the viewfinder; Image recognition processing is performed using only a partial image in the specific frame in the image picked up by the image pickup means, and an identification mark written on the surface of each of the plurality of steel materials is extracted and extracted.
  • Image recognition means for recognizing the identification information using the identification mark; Collation means for determining whether or not the identification information acquired by the correspondence information search means matches the identification information recognized by the image recognition means; Program to function as. 21-2. 21.
  • the input receiving means is configured to accept input of the address and a plurality of the step information in which a plurality of the steel materials to be verified are stored, A program for causing the output unit to display the plurality of specific frames on the finder in the same number as the number of the stage information received by the input receiving unit.
  • 21-5. In the program described in 21-4, each of the plurality of specific frames is associated with each piece of the stage information received by the input receiving means, and the program causes the output means to display the plurality of specific frames so that the associated stage information can be identified. 21-6.
  • the computer A program for causing a plurality of the specific frames displayed on the finder to individually function as means for changing at least one of a display position, a shape, and a size in the finder. 21-11.
  • the identification information of each of the plurality of steel materials stored, the address of the area in which each of the steel materials is stored, and stage information indicating the position in the steel material group stacked in a plurality of stages are associated with each other.
  • a pre-imaged and / or already-captured image is displayed on the finder, and a specific frame indicating a partial area to be subjected to image recognition processing in the displayed image is displayed on the finder so as to overlap the image.
  • Image recognition processing is performed using only a partial image in the specific frame in the image captured in the imaging step, and identification marks written on the surfaces of the plurality of steel materials are extracted and extracted.
  • the verification method can target a plurality of the steel materials stored at the same address at a time,
  • in the input receiving step, it is possible to receive input of the address and a plurality of the stage information where the plurality of steel materials to be verified are stored,
  • a collation method in which, in the output step, the same number of the specific frames as the number of the stage information received in the input receiving step are displayed on the finder. 22-5.
  • Each of the plurality of specific frames is associated with each of the step information received in the input receiving step,
  • a plurality of the specific frames are displayed so that the associated step information can be identified. 22-6.
  • in the correspondence information search step, the stage information associated with the address whose input was received in the input receiving step is acquired,
  • in the output step, the stage information acquired in the correspondence information search step is displayed as a list,
  • a collation method in which, in the input receiving step, input of the stage information of the steel material to be collated is received by accepting an input selecting one or more of the stage information displayed in the list. 22-7.
  • the verification method can target a plurality of the steel materials stored at the same address at a time,
  • in the correspondence information search step, the number of first steel materials, which are the steel materials stored at the address whose input was received in the input receiving step, is acquired, this number being associated with the address in the correspondence information.
  • Each of the specific frames is associated with the step information associated with each of the first steel materials
  • a plurality of the specific frames are displayed so that the associated step information can be identified. 22-9.
  • a collation method for accepting an input of the step information of the steel material to be collated by receiving an input for selecting one or a plurality of the specific frames displayed on the finder. 22-10.
  • in the input receiving step, a designation input designating some of the plurality of specific frames displayed on the finder, and an imaging instruction input for capturing an image while some of the specific frames are designated, are accepted,
  • in the imaging step, an image is captured according to the imaging instruction input received in the input receiving step,
  • a collation method in which, in the image recognition step, image recognition processing is performed using only the partial image within the specific frame that was designated at the time of capture, in the image captured in the imaging step. 22-12.
  • in the output step, the specific frame that has been imaged in the designated state and the specific frame that has not been imaged in the designated state are displayed in a distinguishable manner.
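The designation and imaging states described in 22-10 to 22-12 can be sketched as a small state model. All class and attribute names below are illustrative assumptions, not taken from the patent:

```python
class SpecificFrame:
    """One specific frame on the finder, tied to one piece of stage info."""
    def __init__(self, stage):
        self.stage = stage
        self.designated = False  # selected by the designation input
        self.imaged = False      # captured while designated

def capture(frames):
    """Imaging instruction input: only designated frames are captured;
    captured frames are later displayed distinguishably from the rest."""
    captured = [f for f in frames if f.designated]
    for f in captured:
        f.imaged = True
        f.designated = False
    # Recognition would run only on the partial images of these frames
    return [f.stage for f in captured]

frames = [SpecificFrame(1), SpecificFrame(2), SpecificFrame(3)]
frames[0].designated = True
frames[2].designated = True
print(capture(frames))             # [1, 3]
print([f.imaged for f in frames])  # [True, False, True]
```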

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention concerns a comparison system (1) having: a correspondence information search unit (13) which acquires, from correspondence information, identification information associated with the stage information and the address, whose input has been received, of a steel material that is the subject of the comparison; an output unit (17) which displays an image on a finder and displays a specific frame overlaid on it; an image capture unit (14) which captures the image displayed on the finder; an image recognition unit (15) which, using only a partial image within the specific frame in the captured image, performs an image recognition process to extract an identification mark written on the surface of a steel material, and recognizes the identification information using the extracted identification mark; and a comparison unit (16) which determines whether or not the identification information acquired by the correspondence information search unit (13) and the identification information recognized by the image recognition unit (15) match.
PCT/JP2013/080637 2013-03-13 2013-11-13 Système de comparaison, dispositif terminal, dispositif serveur, procédé de comparaison, et programme WO2014141534A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2015505229A JP6123881B2 (ja) 2013-03-13 2013-11-13 照合システム、端末装置、サーバ装置、照合方法及びプログラム
CN201380074556.9A CN105008251B (zh) 2013-03-13 2013-11-13 核对系统、终端装置、服务器装置及核对方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013050843 2013-03-13
JP2013-050843 2013-03-13

Publications (1)

Publication Number Publication Date
WO2014141534A1 true WO2014141534A1 (fr) 2014-09-18

Family

ID=51536221

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/080637 WO2014141534A1 (fr) 2013-03-13 2013-11-13 Système de comparaison, dispositif terminal, dispositif serveur, procédé de comparaison, et programme

Country Status (3)

Country Link
JP (1) JP6123881B2 (fr)
CN (1) CN105008251B (fr)
WO (1) WO2014141534A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018150137A (ja) * 2017-03-13 2018-09-27 日本電気株式会社 物品管理システム、物品管理方法および物品管理プログラム
CN110597165A (zh) * 2019-08-30 2019-12-20 三明学院 一种堆钢监测系统及堆钢监测方法
WO2022114173A1 (fr) * 2020-11-30 2022-06-02 日本製鉄株式会社 Dispositif de suivi, procédé de suivi, structure de données de données de suivi et programme
JP7464460B2 (ja) 2020-06-22 2024-04-09 日本電気通信システム株式会社 情報処理装置、分布状況検知システム、分布状況検知方法およびコンピュータプログラム

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6262809B2 (ja) * 2016-06-28 2018-01-17 新日鉄住金ソリューションズ株式会社 システム、情報処理装置、情報処理方法、及び、プログラム
JP2018056253A (ja) * 2016-09-28 2018-04-05 パナソニックIpマネジメント株式会社 部品管理支援システムおよび部品管理支援方法
CN110573980B (zh) * 2019-07-25 2020-11-06 灵动科技(北京)有限公司 具有rfid读取器和内置打印机的自动驾驶系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5960413A (en) * 1996-03-05 1999-09-28 Amon; James A. Portable system for inventory identification and classification
JP2007326700A (ja) * 2006-06-09 2007-12-20 Nippon Steel Corp 鋼材の管理方法と管理システム
JP2008265909A (ja) * 2007-04-18 2008-11-06 Hitachi-Ge Nuclear Energy Ltd 資材保管位置管理システムおよびその方法
JP2012144371A (ja) * 2010-12-24 2012-08-02 Jfe Steel Corp 物品管理方法
WO2013005445A1 (fr) * 2011-07-06 2013-01-10 株式会社インスピーディア Système de collecte de stock et procédé de collecte de stock

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE602004032488D1 (de) * 2003-02-18 2011-06-16 Canon Kk Photographiervorrichtung mit MItteln zur Aufnahme von Funkinformationen und Steuerungsmethode dafür
EP1452997B1 (fr) * 2003-02-25 2010-09-15 Canon Kabushiki Kaisha Dispositif et procédé pour la gestion d'articles
US7290701B2 (en) * 2004-08-13 2007-11-06 Accu-Assembly Incorporated Gathering data relating to electrical components picked from stacked trays
KR100754656B1 (ko) * 2005-06-20 2007-09-03 삼성전자주식회사 이미지와 관련한 정보를 사용자에게 제공하는 방법 및시스템과 이를 위한 이동통신단말기
JP2011090662A (ja) * 2009-09-25 2011-05-06 Dainippon Printing Co Ltd 帳票受付システム、携帯電話、サーバ、プログラム、及び複写式帳票
CN101853387A (zh) * 2010-04-02 2010-10-06 北京物资学院 立体仓库货物盘点方法及系统
JP2011221860A (ja) * 2010-04-12 2011-11-04 Sanyo Special Steel Co Ltd 鋼材識別システム及びその方法
JP2012074804A (ja) * 2010-09-28 2012-04-12 Promise Co Ltd 証明書類撮影用カメラ、融資審査装置及び融資審査方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5960413A (en) * 1996-03-05 1999-09-28 Amon; James A. Portable system for inventory identification and classification
JP2007326700A (ja) * 2006-06-09 2007-12-20 Nippon Steel Corp 鋼材の管理方法と管理システム
JP2008265909A (ja) * 2007-04-18 2008-11-06 Hitachi-Ge Nuclear Energy Ltd 資材保管位置管理システムおよびその方法
JP2012144371A (ja) * 2010-12-24 2012-08-02 Jfe Steel Corp 物品管理方法
WO2013005445A1 (fr) * 2011-07-06 2013-01-10 株式会社インスピーディア Système de collecte de stock et procédé de collecte de stock

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018150137A (ja) * 2017-03-13 2018-09-27 日本電気株式会社 物品管理システム、物品管理方法および物品管理プログラム
CN110597165A (zh) * 2019-08-30 2019-12-20 三明学院 一种堆钢监测系统及堆钢监测方法
JP7464460B2 (ja) 2020-06-22 2024-04-09 日本電気通信システム株式会社 情報処理装置、分布状況検知システム、分布状況検知方法およびコンピュータプログラム
WO2022114173A1 (fr) * 2020-11-30 2022-06-02 日本製鉄株式会社 Dispositif de suivi, procédé de suivi, structure de données de données de suivi et programme
JPWO2022114173A1 (fr) * 2020-11-30 2022-06-02
JP7288231B2 (ja) 2020-11-30 2023-06-07 日本製鉄株式会社 トラッキング装置、トラッキング方法、およびプログラム
KR20230082662A (ko) * 2020-11-30 2023-06-08 닛폰세이테츠 가부시키가이샤 트래킹 장치, 트래킹 방법, 트래킹 데이터의 데이터 구조 및 프로그램
KR102595542B1 (ko) 2020-11-30 2023-10-30 닛폰세이테츠 가부시키가이샤 트래킹 장치, 트래킹 방법, 트래킹 데이터의 데이터 구조 및 프로그램

Also Published As

Publication number Publication date
JPWO2014141534A1 (ja) 2017-02-16
CN105008251A (zh) 2015-10-28
CN105008251B (zh) 2017-10-31
JP6123881B2 (ja) 2017-05-10

Similar Documents

Publication Publication Date Title
JP6123881B2 (ja) 照合システム、端末装置、サーバ装置、照合方法及びプログラム
JP5879725B2 (ja) 板金工程作業支援システム
JP6527410B2 (ja) 文字認識装置、文字認識方法、及びプログラム
JP6712045B2 (ja) 情報処理システムと、その処理方法及びプログラム
WO2018016214A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et programme
JP6531368B2 (ja) 情報処理システム、情報処理装置、処理方法及びプログラム
US20180174324A1 (en) Image processing apparatus for clipping and sorting images from read image according to cards and control method therefor
US20160269586A1 (en) System, control method, and recording medium
JP5454639B2 (ja) 画像処理装置及びプログラム
JP5534207B2 (ja) 情報読取装置及びプログラム
JP6112240B2 (ja) 情報読取装置及びプログラム
JP5130081B2 (ja) 制御装置及びイメージデータの表示方法
WO2019181441A1 (fr) Dispositif de traitement d'informations, procédé de commande et programme
JP2017097859A (ja) 情報処理装置と、その処理方法及びプログラム
CN102496010A (zh) 一种结合预览图像和拍摄图像的名片识别方法
JP6708935B2 (ja) 情報処理装置、その処理方法及びプログラム
KR102273198B1 (ko) 시각적으로 코딩된 패턴 인식 방법 및 장치
WO2021033310A1 (fr) Dispositif de traitement, procédé de traitement et programme
CN114611475A (zh) 信息处理装置、信息处理方法和计算机可读介质
JP6249025B2 (ja) 画像処理装置及びプログラム
JP6875061B2 (ja) 画像判定システム、画像判定方法、画像判定プログラム、画像判定プログラムを記録する記録媒体
US11462014B2 (en) Information processing apparatus and non-transitory computer readable medium
JP6175642B2 (ja) データ変換装置、方法、及びコンピュータプログラム
JP6582875B2 (ja) 検品処理装置、検品システム、検品処理方法及びプログラム
JP2017091252A (ja) 情報入力装置及び情報入力プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13877730

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015505229

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13877730

Country of ref document: EP

Kind code of ref document: A1