WO2020008999A1 - Information processing device, information processing system, and information processing program - Google Patents

Information processing device, information processing system, and information processing program

Info

Publication number
WO2020008999A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
article
information processing
unit
flying object
Prior art date
Application number
PCT/JP2019/025638
Other languages
French (fr)
Japanese (ja)
Inventor
亨 横渡
領 此村
Original Assignee
日本通運株式会社 (Nippon Express Co., Ltd.)
本郷飛行機株式会社 (Hongo Aerospace Inc.)
Application filed by 日本通運株式会社 and 本郷飛行機株式会社
Publication of WO2020008999A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00: Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management

Definitions

  • The present invention relates to an information processing apparatus, an information processing system, and an information processing program for performing information processing relating to article management.
  • WMS: Warehouse Management System.
  • Such an inventory information processing system is disclosed in, for example, Patent Document 1.
  • The inventory information processing system disclosed in Patent Document 1 compares the actual number of stocks measured by the user during inventory work with the number of stocks managed by the master, and notifies the user with an alarm when the comparison result satisfies a predetermined condition. This allows the user to easily identify an article whose stock quantity is abnormal.
  • The present invention has been made in view of such circumstances, and its object is to provide an information processing apparatus, an information processing system, and an information processing program that make it easier to manage articles.
  • An information processing device according to one embodiment of the present invention includes: receiving means for receiving, from a flying object that has flown at the location of an article, imaging information in which an image captured by the flying object is associated with information indicating the position at which the image was captured; detecting means for detecting identification information of the article from the image included in the imaging information; and specifying means for specifying the location of the article from the identification information detected by the detecting means and the position of the flying object included in the imaging information from which the identification information was detected.
  • FIG. 1 is a schematic diagram illustrating an example of the overall configuration of an information processing system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram illustrating an example of a bird's-eye view of the information processing system according to the embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating an example of the configuration of an inventory management device according to the embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating an example of the configuration of a drone according to the embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating an example of the configuration of a base device according to the embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating an example of the configuration of an information processing device according to the embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating the flow of the data matching process in the information processing system according to the embodiment of the present invention. FIG. 8 is an image diagram illustrating an example of the display of the matching result of the data matching process in the information processing system according to the embodiment of the present invention.
  • FIG. 1 is a schematic diagram illustrating an overall configuration of an information processing system S according to the present embodiment.
  • The information processing system S includes an inventory management device 10, a drone 20, a base device 30, and an information processing device 40.
  • FIG. 1 also illustrates a shelf 53 installed in a place (here, as an example, a warehouse) where articles managed by the information processing system S are stored.
  • A manager of the warehouse or the like arranges a plurality of pallets 51, which are storage containers for articles to be managed, in a plurality of stages. Further, a pallet label 52 for identifying the articles in each pallet is attached to each of the pallets 51.
  • The pallet label 52 may be any label; in the present embodiment, however, it is assumed that the pallet label 52 describes, as a two-dimensional code, a management number (hereinafter referred to as "article identification information") for identifying the article in the warehouse. In the drawing, for convenience of illustration, only one pallet 51 and one pallet label 52 are denoted by reference numerals; the other pallets 51 and pallet labels 52 are not.
  • The inventory management device 10 and the information processing device 40 are communicably connected to each other, directly or via a network (not shown).
  • The drone 20 and the base device 30 are communicably connected to each other, directly or via a network.
  • The base device 30 and the information processing device 40 are communicably connected to each other, directly or via a network. Communication between these devices may be performed in accordance with any communication method; the communication method is not particularly limited.
  • The network is realized by, for example, any one of the Internet, a LAN (Local Area Network), and a mobile phone network, or a combination of these.
  • One device may communicate using a plurality of communication methods.
  • For example, the base device 30 may conform to both a communication method for wired communication with the drone 20 and a communication method for wireless communication with the information processing device 40.
  • The inventory management device 10, the base device 30, and the information processing device 40 are each realized by an electronic device having an information processing function, such as a personal computer, a server device, or a device specific to the present embodiment.
  • The drone 20 is realized by a drone having a photographing function.
  • FIG. 2 is a schematic diagram illustrating an example of the information processing system S according to the present embodiment as viewed from above.
  • Shelves 53 are arranged in a plurality of rows (shelves 53a to 53e in the figure).
  • Pallets 51 to which pallet labels 52 are attached are arranged in each of the left and right columns of each shelf 53 in the drawing, along the vertical direction of the drawing. That is, the articles stored on the pallets 51 to which the pallet labels 52 are attached are arranged side by side in the horizontal direction and stacked in the vertical direction.
  • A drone 20 and a base device 30 corresponding to that drone 20 are arranged for each of these left and right columns.
  • For example, the drone 20a and the base device 30a are arranged corresponding to the left column of the shelf 53a in the drawing. Likewise, the drone 20b and the base device 30b are arranged corresponding to the right column of the shelf 53a and the left column of the shelf 53b in the drawing.
  • The information processing system S having such a configuration performs a data matching process.
  • The data matching process is a series of processes that specifies the actual position of the article stored in each pallet 51 based on the photographing by the drone 20, and collates the specified actual position of each article with the managed arrangement position included in the management information.
  • In the data matching process, the drone 20 flies over the section of the warehouse where the shelves 53 are installed and photographs each pallet 51 on each stage of the shelf 53 corresponding to the drone 20. The drone 20 then generates shooting information in which each shot image is associated with the position information at the time of shooting. When the shooting is completed, the drone 20 moves to the installation location of the base device 30 and transmits the shooting information to the base device 30. The base device 30 transmits the received shooting information to the information processing device 40. Upon receiving the shooting information, the information processing device 40 detects the pallet label 52 by image analysis of the received shooting information, and detects the article identification information by decoding the two-dimensional code described on the detected pallet label 52.
  • The information processing device 40 then specifies the actual position of each pallet 51 (of the articles stored in each pallet 51) based on the detected article identification information and the position information included in the shooting information from which it was detected. The information processing device 40 also acquires the inventory information of each pallet 51 (of the articles stored in each pallet 51) managed by the inventory management device 10 by communicating with the inventory management device 10. Finally, the information processing device 40 collates the actual position of each pallet 51 specified based on the shooting information with the arrangement position of the managed articles included in the inventory information, and outputs the result of this collation.
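The data matching flow described above can be illustrated with a small sketch. This is not part of the patent disclosure; all names and data are hypothetical, and the label-detection step is reduced to pre-decoded (article identification information, position) pairs.

```python
# Sketch of the data matching process: tie each decoded article
# identification information to the position at which it was shot,
# then collate against the managed arrangement positions.

def specify_actual_positions(detections):
    """Map article identification information -> actual (shooting) position."""
    return {article_id: position for article_id, position in detections}

def collate(actual, managed):
    """Compare actual positions from photographing with managed locations."""
    results = {}
    for article_id, managed_pos in managed.items():
        actual_pos = actual.get(article_id)
        if actual_pos is None:
            results[article_id] = "not found by photographing"
        elif actual_pos == managed_pos:
            results[article_id] = "match"
        else:
            results[article_id] = f"mismatch: found at {actual_pos}"
    return results

# Hypothetical detections decoded from the shooting information
detections = [("A-001", "shelf53a-1"), ("A-002", "shelf53a-2")]
# Hypothetical arrangement positions managed by the inventory management device 10
managed = {"A-001": "shelf53a-1", "A-002": "shelf53b-1", "A-003": "shelf53c-1"}

results = collate(specify_actual_positions(detections), managed)
```

An article that was photographed somewhere other than its managed location is reported as a mismatch, and an article never photographed is flagged, which mirrors the collation result the embodiment outputs.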
  • By performing the data matching process described above, the present embodiment eliminates the need for manual visual inspection and manual barcode scanning. Moreover, the present embodiment can be applied even when the warehouse is large or its ceiling is high. That is, according to the present embodiment, articles can be managed more easily.
  • The configuration illustrated in FIG. 1 is merely an example of the present embodiment, and the present embodiment is not limited to this configuration.
  • For example, the number of each device and the number of shelves 53 included in the information processing system S are not limited to those illustrated and may be any number.
  • The articles stored in the pallets 51 may be arbitrary.
  • The pallet label 52 may also be attached directly to an article.
  • The inventory management device 10 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a communication unit 14, a storage unit 15, an input unit 16, and a display unit 17. These units are connected by a bus via signal lines and mutually transmit and receive signals.
  • The CPU 11 executes various processes according to a program recorded in the ROM 12 or a program loaded from the storage unit 15 into the RAM 13.
  • The RAM 13 also stores, as appropriate, data and the like necessary for the CPU 11 to execute the various processes.
  • The communication unit 14 performs communication control for allowing the CPU 11 to communicate with the other devices included in the information processing system S.
  • The storage unit 15 is composed of a semiconductor memory such as a DRAM (Dynamic Random Access Memory) and stores various data.
  • The input unit 16 is configured by various buttons and a touch panel, or by an external input device such as a mouse and a keyboard, and inputs various information according to a user's instruction operation.
  • The display unit 17 includes a liquid crystal display or the like and displays an image corresponding to the image data output by the CPU 11.
  • When the inventory management device 10 operates, as shown in FIG. 3, an inventory information management unit 111 functions in the CPU 11, and a stock information storage unit 151 is set in one area of the storage unit 15.
  • The stock information storage unit 151 stores stock information of the articles in the warehouse managed by the information processing system S.
  • This stock information is, for example, the same information as the stock information managed in a general WMS. More specifically, data such as the article identification information, the number of stocked articles, and the identification information (location information) of each position where the articles are arranged are stored as stock information, for example in a table format.
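As one illustration (not taken from the patent; the field names and values are assumptions), the stock information could be held as rows keyed by the article identification information:

```python
# Illustrative stock-information rows, analogous to a WMS table:
# article identification information, stocked quantity, and location
# information of the arrangement position. Field names are assumptions.
stock_information = [
    {"article_id": "A-001", "quantity": 40, "location": "shelf53a-1"},
    {"article_id": "A-002", "quantity": 12, "location": "shelf53b-1"},
]

def find_stock(stock, article_id):
    """Look up the stock row for one article identification number."""
    for row in stock:
        if row["article_id"] == article_id:
            return row
    return None

row = find_stock(stock_information, "A-002")
```

A real WMS would hold this in a database rather than an in-memory list; the sketch only shows which fields the data matching process needs.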
  • The inventory information management unit 111 has a function of managing the inventory information stored in the stock information storage unit 151.
  • Specifically, the inventory information management unit 111 performs this management by updating the stock information stored in the stock information storage unit 151 to the latest content, based on, for example, a user operation received through the input unit 16 or data received via the communication unit 14 from another device (not shown) used by the user. That is, the inventory management device 10 realizes the function of a so-called WMS.
  • When an inventory information request is received, the inventory information management unit 111 reads out the latest inventory information from the stock information storage unit 151 and transmits it to the information processing device 40 as a response to the request.
  • The drone 20 includes a CPU 21, a ROM 22, a RAM 23, a communication unit 24, a storage unit 25, a photographing unit 26, a driving unit 27, a sensor unit 28, and a battery 29. These units are connected by a bus via signal lines and mutually transmit and receive signals.
  • The hardware functions of the CPU 21, the ROM 22, the RAM 23, the communication unit 24, and the storage unit 25 are the same as those of the units of the same names, differing only in reference numerals, provided in the inventory management device 10 described above; duplicate description is therefore omitted.
  • The photographing unit 26 is configured by a camera or the like including an optical lens, an image sensor, and so on, and captures images around the drone 20 (for example, images including the pallets 51 and the pallet labels 52 arranged on the shelf 53).
  • Each image obtained by the photographing unit 26 is converted into a digital signal and output to, for example, the CPU 21.
  • The driving unit 27 is driven using electric power supplied from the battery 29 described later.
  • The drone 20 can fly in space by the driving of the driving unit 27.
  • The driving unit 27 is composed of, for example, sets of a propeller that generates lift and thrust and a motor that rotates the propeller.
  • The sensor unit 28 includes sensors for detecting the distance to other objects (for example, the shelf 53 or the floor or walls of the warehouse). Each distance detected by the sensor unit 28 is converted into a digital signal and output to, for example, the CPU 21.
  • The sensor unit 28 includes, for example, a sensor that detects distance using radio waves in the microwave band and a sensor that detects distance using ultrasonic waves. The sensor unit 28 may further include an acceleration sensor, an angular velocity sensor, and the like for detecting the moving distance, moving direction, and so on of the drone 20.
  • The battery 29 stores electric power and supplies it to the driving unit 27 and the other parts of the drone 20. In addition, the battery 29 stores again the power it has consumed by receiving power supply from the base device 30.
  • When the drone 20 operates, as shown in FIG. 4, a schedule management unit 211, a flight control unit 212, and a shooting information generation unit 213 function in the CPU 21. In one area of the storage unit 25, a schedule storage unit 251, a position detection information storage unit 252, and a shooting information storage unit 253 are set.
  • The schedule storage unit 251 stores a schedule for performing photographing and the like by the drone 20.
  • For example, a time zone during which the drone 20 should take images while flying is stored as the schedule.
  • This time zone can be set arbitrarily, but may be set to a time zone during which no user works in the warehouse, for example at night or in the early morning. This makes it possible to perform photographing and the like by the drone 20 without disturbing users who work in the warehouse.
  • The position detection information storage unit 252 stores information for the drone 20 to detect its own current position.
  • Specifically, the positional relationships between each base device 30, each shelf 53, and markers (for example, markers on a ladder), information indicating the number of stages of each shelf 53, the identification information (location information) of each position in the warehouse, and the like are stored as the information for detecting the current position.
  • These pieces of information are represented by, for example, coordinates in a three-dimensional coordinate system common to all of them.
  • The shooting information storage unit 253 stores the shooting information generated by the shooting information generation unit 213.
  • The shooting information is information in which a shot image is associated with the position information at the time of shooting (for example, information on a two-dimensional or three-dimensional position).
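A shooting-information record as described here is simply an image tied to the position at the time of shooting. A minimal sketch, in which the field names and the choice of a three-dimensional coordinate tuple are assumptions rather than details from the patent:

```python
from dataclasses import dataclass

# One shooting-information record: a shot image associated with the
# position at the time of shooting. The embodiment allows a 2-D or 3-D
# position; a 3-D (x, y, z) tuple is assumed here, and the image is a
# placeholder byte string standing in for encoded image data.
@dataclass
class ShootingInfo:
    image: bytes       # shot image data (placeholder)
    position: tuple    # (x, y, z) at the time of shooting

record = ShootingInfo(image=b"jpeg-bytes", position=(1.5, 0.0, 2.4))
```

Keeping the image and its position in one record is what later lets the information processing device tie a decoded pallet label back to a physical shelf location.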
  • The schedule management unit 211 manages the execution of photographing and the like by the drone 20 based on the schedule stored in the schedule storage unit 251. For example, when the predetermined time zone set in the schedule arrives, the schedule management unit 211 instructs the flight control unit 212 to start flight and photographing by the drone 20. Outside the predetermined time zone, the schedule management unit 211 can stop the flight and the photographing by giving an instruction to the flight control unit 212. In this case, the schedule management unit 211 can instruct the flight control unit 212 to have the drone 20 receive power supply from the base device 30 corresponding to the drone 20 and store again the power consumed from the battery 29.
  • The flight control unit 212 starts the photographing and the like of the drone 20 based on the instruction from the schedule management unit 211.
  • Specifically, the flight control unit 212 causes the drone 20 to fly autonomously around the shelf 53 corresponding to the drone 20 and causes the photographing unit 26 to photograph the shelf 53.
  • The flight control unit 212 realizes the autonomous flight based on the information for detecting the current position stored in the position detection information storage unit 252 and the detection results of the sensor unit 28.
  • Specifically, the flight control unit 212 specifies the current three-dimensional position at which the drone 20 is flying by calculating, based on the detection results of the sensor unit 28, the moving distance and moving direction from the position of the base device 30 corresponding to the drone 20, which is the position where the flight started.
  • The flight control unit 212 then realizes the autonomous flight based on the current three-dimensional position of the drone 20 and the positional relationships between each base device 30 and each shelf 53 stored in the position detection information storage unit 252.
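The dead-reckoning idea above, accumulating moving distance and direction from the base device's position, can be sketched as follows. This is a deliberate simplification: real sensor fusion and filtering are out of scope, and the displacement vectors are made-up values standing in for integrated sensor readings.

```python
# Estimate the drone's current 3-D position by accumulating movement
# deltas (simplified here to pre-integrated displacement vectors)
# starting from the position of the base device 30 where flight began.
def estimate_position(base_position, displacements):
    x, y, z = base_position
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

base = (0.0, 0.0, 0.0)  # position of the base device 30 (flight start)
moves = [(0.0, 2.0, 0.0), (0.0, 0.0, 1.5), (1.0, 0.0, 0.0)]
current = estimate_position(base, moves)
```

Because such dead reckoning drifts over time, the embodiment's marker-based correction (described next) is the natural complement to this estimate.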
  • The flight control unit 212 may also detect a marker placed on the floor of the warehouse or the like by analyzing an image captured by the photographing unit 26 (or a detection result of the sensor unit 28), and correct the current three-dimensional position at which the drone 20 is flying based on the detection result for the marker.
  • The flight control unit 212 controls the photographing unit 26 during the autonomous flight so as to photograph the surroundings of the drone 20 (for example, the surroundings including the pallets 51 and the pallet labels 52 arranged on the shelf 53) at a predetermined cycle. The flight control unit 212 then outputs each photographed image and the information on the position at the time of photographing to the shooting information generation unit 213.
  • The autonomous flight and the photographing are performed until the entire periphery of the shelf 53 corresponding to the drone 20 has been photographed. More specifically, the pallets 51 and the pallet labels 52 are arranged on the shelf 53 side by side in the horizontal direction and stacked in the vertical direction. For this reason, the flight control unit 212, for example, photographs one stage of the shelf 53 at a time; when all the stages have been photographed, it ends the autonomous flight and the photographing and returns to the base device 30 corresponding to the drone 20. The drone 20 then receives power supply from the base device 30 until the predetermined time zone arrives again and there is a new instruction from the schedule management unit 211.
  • The drone 20 may photograph all the stages of the shelf 53 in a single flight, or it may return to the base device 30 to receive power each time one stage has been photographed and then photograph the next stage. The latter makes it possible to reduce the capacity of the battery 29.
  • The shooting information generation unit 213 generates the shooting information by associating each shot image input from the flight control unit 212 with the information on the position at the time of shooting, and stores the generated shooting information in the shooting information storage unit 253.
  • The base device 30 includes a CPU 31, a ROM 32, a RAM 33, a communication unit 34, a storage unit 35, and a power supply unit 36. These units are connected by a bus via signal lines and mutually transmit and receive signals.
  • The hardware functions of the CPU 31, the ROM 32, the RAM 33, the communication unit 34, and the storage unit 35 are equivalent to those of the units of the same names, differing only in reference numerals, provided in the inventory management device 10 and the drone 20 described above; duplicate description is therefore omitted.
  • The power supply unit 36 is a unit that supplies power to the drone 20.
  • For example, the base device 30 obtains electric power from a mains power supply in the warehouse and supplies power to the drone 20 by converting it into a voltage and the like suitable for supplying the drone 20.
  • The power supply to the drone 20 is continued after the autonomous flight and the photographing by the drone 20 are completed, until the autonomous flight and the photographing are performed again (or until the battery 29 is sufficiently charged).
  • When the base device 30 operates, a photographing information relay unit 311 and a power supply control unit 312 function in the CPU 31, and a shooting information storage unit 351 is set in one area of the storage unit 35.
  • The shooting information storage unit 351 stores the shooting information acquired from the drone 20.
  • The photographing information relay unit 311 transmits the shooting information acquired from the drone 20 to the information processing device 40; that is, it relays the shooting information.
  • The relay of the shooting information can be performed at an arbitrary timing.
  • For example, the photographing information relay unit 311 communicates with the drone 20 while the drone 20 is receiving power from the power supply unit 36 after completing the autonomous flight and the photographing.
  • However, the timing is not limited to this; the photographing information relay unit 311 may communicate with the drone 20 while, for example, the drone 20 is performing the autonomous flight and the photographing. Through this communication, the photographing information relay unit 311 acquires the shooting information generated by the drone 20 and stored in the shooting information storage unit 253.
  • The photographing information relay unit 311 stores the acquired shooting information in the shooting information storage unit 351 and transmits the shooting information stored there to the information processing device 40 at a predetermined timing.
  • The predetermined timing may be immediately after the shooting information is acquired from the drone 20, or may be the timing at which the information processing device 40 requests the shooting information.
  • The power supply control unit 312 controls the power supply by the power supply unit 36 described above. Specifically, the power supply control unit 312 controls the conversion of the voltage and the like described above, and the start and end of the power supply.
  • The information processing device 40 includes a CPU 41, a ROM 42, a RAM 43, a communication unit 44, a storage unit 45, an input unit 46, and a display unit 47. These units are connected by a bus via signal lines and mutually transmit and receive signals.
  • The hardware functions of the CPU 41, the ROM 42, the RAM 43, the communication unit 44, the storage unit 45, the input unit 46, and the display unit 47 are equivalent to those of the units of the same names, differing only in reference numerals, provided in the inventory management device 10, the drone 20, and the base device 30 described above; duplicate description is therefore omitted.
  • When the information processing device 40 operates, a photographing information acquisition unit 411, an identification information detection unit 412, an existence position specifying unit 413, an inventory data acquisition unit 414, and a data matching unit 415 function in the CPU 41.
  • In one area of the storage unit 45, a shooting information storage unit 451 and a specific information storage unit 452 are set.
  • The shooting information storage unit 451 stores the shooting information of each drone 20 received from each base device 30.
  • The specific information storage unit 452 stores the actual position of each pallet 51 (of the articles stored in each pallet 51) specified based on the shooting information by the existence position specifying unit 413 described below.
  • the photographing information acquisition unit 411 acquires the photographing information of each drone 20 by receiving it from each base device 30, and stores the acquired photographing information of each drone 20 in the photographing information storage unit 451.
  • the identification information detection unit 412 detects the pallet label 52 from the image by performing image analysis on the imaging information stored in the imaging information storage unit 451 using an existing image analysis technique. Further, the identification information detection unit 412 detects the article identification information by decoding the two-dimensional code described in the detected pallet label 52. Further, the identification information detection unit 412 associates the position information included in the imaging information to be detected with the article identification information. Then, the identification information detection unit 412 outputs the associated position information and article identification information to the existing position identification unit 413.
  • the existence position specifying unit 413 specifies the actual existence position of each of the pallets 51 (the items stored in the pallets 51) based on the item identification information and the position information input from the identification information detecting unit 412. That is, the existence position specifying unit 413 specifies that the article stored in the pallet 51 corresponding to the article identification information exists at the position corresponding to the position information. Further, the existence position specifying unit 413 causes the specific information storage unit 452 to store the actual position of each specified article.
  • the inventory data acquisition unit 414 acquires the inventory information of the articles stored in each pallet 51 managed by the inventory management device 10 by communicating with the inventory management device 10. For example, the inventory data acquisition unit 414 requests the inventory management device 10 for current inventory information of the article. Further, the inventory data acquisition unit 414 outputs the current inventory information of the article returned from the inventory management device 10 to the data matching unit 415 in response to the request. Note that the current stock information of the article may be spontaneously transmitted from the stock management device 10 without requiring a request from the stock data acquisition unit 414.
  • The data collating unit 415 collates the actual existence position of the articles stored in each pallet 51, identified by the existence position specifying unit 413 based on the photographing information, against the managed arrangement positions of the articles included in the inventory information input from the inventory data acquisition unit 414, and outputs the result of the collation.
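A minimal sketch of the collation performed by the data collating unit 415, assuming the article positions are represented as simple strings; the result labels loosely echo those shown in FIG. 8 and are illustrative only:

```python
def collate(actual: dict[str, str], managed: dict[str, str]) -> dict[str, str]:
    """Compare actual article positions (from the drone's photographs)
    against the managed arrangement positions (from the inventory
    management device 10), producing one result label per article."""
    results: dict[str, str] = {}
    for article_id, managed_pos in managed.items():
        if article_id not in actual:
            results[article_id] = "Not Found"          # in inventory, not seen
        elif actual[article_id] == managed_pos:
            results[article_id] = "1: OK"
        else:
            results[article_id] = "2: Wrong Location"
    for article_id in actual.keys() - managed.keys():
        results[article_id] = "Not In Inventory"       # seen, not in inventory
    return results

actual = {"A-0001": "shelf53a-1", "A-0002": "shelf53b-4", "A-0099": "shelf53c-2"}
managed = {"A-0001": "shelf53a-1", "A-0002": "shelf53b-1", "A-0003": "shelf53e-9"}
print(collate(actual, managed))
```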
  • The output of the collation result is realized by, for example, displaying it on the display unit 47, outputting it to another terminal (not shown) used by the user, or printing it on a paper medium using a printer (not shown). A specific example of the output collation result will be described later with reference to FIG. 8.
  • The data matching process refers to a series of processes that specify the actual existence position of the articles stored in each pallet 51 based on the photographing by the drone 20 and collate the specified actual existence positions against the managed arrangement positions and the like included in the management information.
  • The data matching process is performed periodically once the operation of the information processing system S starts. Note that, as a premise of the data matching process, the management of the inventory information by the inventory information management unit 111 is also performed periodically.
  • In step S11, the base device 30 supplies power to the drone 20.
  • In step S12, the drone 20 determines whether a predetermined time zone for performing autonomous flight and photographing has arrived. If the predetermined time zone has arrived, Yes is determined in step S12, and the process proceeds to step S13. On the other hand, if the predetermined time zone has not arrived, No is determined in step S12, and the power supply in step S11 continues.
  • In step S13, the drone 20 generates photographing information by performing autonomous flight and photographing.
  • In step S14, the drone 20 determines whether all the positions of the shelf 53 to be photographed have been photographed and the generation of the photographing information based on the photographing has been completed. If the generation of the photographing information has been completed, Yes is determined in step S14, and the process proceeds to step S15. On the other hand, if the generation of the photographing information has not been completed, No is determined in step S14, and the generation of the photographing information in step S13 continues.
  • In step S15, the drone 20 returns to the base device 30 corresponding to itself and transmits the photographing information generated in step S13 to the base device 30.
  • In step S16, the base device 30 transmits the photographing information received in step S15 to the information processing device 40.
  • In step S17, the base device 30 supplies power to the drone 20 again.
  • In step S18, the information processing device 40 detects the article identification information by performing image analysis and the like on the photographing information received in step S16.
  • In step S19, the information processing device 40 specifies the actual existence position of the articles stored in each pallet 51 based on the article identification information detected in step S18 and the corresponding position information.
  • In step S20, the information processing device 40 requests the inventory management device 10 for the current inventory information of the articles.
  • In step S21, the inventory management device 10 transmits the current article inventory information to the information processing device 40 as a response to the request in step S20.
  • In step S22, the information processing device 40 collates the actual existence position of the articles stored in each pallet 51 specified in step S19 against the managed arrangement positions of the articles included in the article inventory information transmitted from the inventory management device 10 in step S21.
  • In step S23, the information processing device 40 outputs the result of the collation performed in step S22.
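The time-zone check of step S12 might, for illustration, look like the following sketch; the text only speaks of "a predetermined time zone", so the window bounds used here are pure assumptions:

```python
from datetime import time, datetime

# Assumed flight window; the source only says "a predetermined time zone".
FLIGHT_WINDOW = (time(1, 0), time(4, 0))   # e.g. 01:00-04:00, warehouse idle hours

def in_flight_window(now: datetime, window=FLIGHT_WINDOW) -> bool:
    """Return True when the drone should start autonomous flight (step S12)."""
    start, end = window
    t = now.time()
    if start <= end:
        return start <= t < end
    return t >= start or t < end   # window crossing midnight

print(in_flight_window(datetime(2019, 7, 1, 2, 30)))   # inside the assumed window
print(in_flight_window(datetime(2019, 7, 1, 12, 0)))   # outside it
```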
  • By performing the data matching process described above, the present embodiment eliminates the need for labor such as manual visual inspection and manual barcode scanning. Moreover, the present embodiment can be applied even when the warehouse is large or the ceiling of the warehouse is high. Further, in the present embodiment, since the drone 20 performs autonomous flight using sensors and the like, the data matching process can be performed even in an indoor warehouse or the like where positioning by GPS (Global Positioning System) is difficult.
  • Since the pallet label 52, which is generally used in the management of articles in a warehouse, is read and used, it is not necessary to separately prepare a module such as an RFID (Radio Frequency IDentifier) tag. That is, according to the present embodiment, the data matching process can be performed at low cost. Further, in the present embodiment, since a two-dimensional code is used instead of a one-dimensional barcode, which must be photographed with high precision in order to be decoded, decoding can be performed even when an image photographed during flight has not been captured with such high precision.
  • In the present embodiment, since it is not necessary to perform processing such as the detection of the two-dimensional code by image analysis and the decoding of the detected two-dimensional code on the drone 20 itself, the arithmetic processing capability required of the drone 20 can be kept low. Accordingly, the power consumption of the drone 20 can be reduced, and the drone 20 itself can easily be made smaller. That is, according to the present embodiment, the articles can be managed more easily.
  • FIG. 8 is an image diagram showing an example of the display of the result of the data matching process in the information processing system S. As shown in FIG. 8, this display example includes three display areas, a display area AR1, a display area AR2, and a display area AR3.
  • the first operation button is displayed in the display area AR1.
  • the first operation button is a user interface for the user to instruct the printing of the matching result.
  • the user performs an operation of pressing the first operation button when the user wants to print the matching result.
  • a second operation button is displayed in the display area AR2.
  • The second operation button is a user interface for the user to instruct the output of the matching result data in a predetermined format (here, for example, the CSV (comma-separated values) format).
  • the user performs an operation of pressing the second operation button when the user desires to output the result data in a predetermined format.
  • The output destination of the data may be the storage unit 45 included in the information processing device 40, or the data may be output to another terminal (not shown) used by the user, as described above.
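The CSV output instructed via the second operation button could be sketched as follows, using Python's standard csv module; the column names mirror a few of the items of FIG. 8, and the row data are invented samples:

```python
import csv
import io

rows = [
    {"IPA_PRODUCT CODE": "A-0001", "IPA_QUANTITY": 24, "RESULT": "1: OK"},
    {"IPA_PRODUCT CODE": "A-0002", "IPA_QUANTITY": 12, "RESULT": "2: Wrong Location"},
]

def export_matching_results(rows: list[dict]) -> str:
    """Serialize the matching results to CSV text (the 'predetermined
    format' associated with the second operation button)."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(export_matching_results(rows))
```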
  • In the display area AR3, the result of the collation is displayed in a predetermined format (in this example, a table format).
  • The collation result includes, for example, information indicating whether or not the article exists at the position corresponding to the managed arrangement position of the article included in the article inventory information managed by the inventory management device 10, information indicating whether there is an article not included in the inventory information, and information indicating whether an article included in the inventory information does not exist.
  • “item: IPA_PRODUCT CODE” stores the product number of the article. Further, “item: IPA_SUB INVENTORY” stores the status information of the article.
  • “item: IPA_ARRIVED DATE” stores information indicating the date on which the pallet 51 storing the article arrived at the warehouse. Further, “item: IPA_QUANTITY” stores the carton quantity of the article.
  • “item: IPA_PACKAGES” stores the package quantity of the article. Further, “item: IPA_DAMAGE FLAG” stores, as a flag, information indicating whether or not the article has damage such as a scratch or breakage.
  • “item: CHECKED DATE” stores information indicating a date on which the autonomous flight and imaging by the drone 20 were performed.
  • “item: CHECKED TIME” stores information indicating the time zone in which the autonomous flight and photographing by the drone 20 were performed.
  • “item: RESULT” stores information indicating a result of the data matching process.
  • The information indicating the matching result is information indicating whether or not the article exists at the position corresponding to the managed arrangement position of the article included in the article inventory information.
  • “item: RESULT ANALYSIS” stores information indicating a more detailed matching result obtained by analyzing the matching result in the data matching process.
  • The information indicating a more detailed matching result is, for example: the information “1: OK”, indicating that the article exists at a position that matches the managed arrangement position of the article included in the article inventory information; the information “2: Wrong Location”, indicating that the article exists at a position that does not match the managed arrangement position included in the article inventory information; the information “6: Decode Failed”, indicating that the analysis of the two-dimensional code has failed; the information “7: Non-Located”, indicating that the position information has not been correctly assigned; or the information “Found Damaged”, indicating that an article for which damage is stored as a flag has been found.
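For illustration, the quoted result codes can be collected in a small lookup table. Note that the text only quotes some of the codes, so this table is necessarily partial:

```python
# Only the codes actually quoted in the text are listed; the remaining
# values are elided in the source, so this mapping is partial.
RESULT_ANALYSIS_CODES = {
    "1": "OK",              # article at the position matching its managed one
    "2": "Wrong Location",  # article present, but at a non-matching position
    "6": "Decode Failed",   # analysis of the two-dimensional code failed
    "7": "Non-Located",     # position information not correctly assigned
}

def describe(code: str) -> str:
    """Return the human-readable label for a RESULT ANALYSIS code."""
    return RESULT_ANALYSIS_CODES.get(code, "unknown code")

print(describe("2"))
```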
  • The user who refers to the information indicating the collation result of the data matching process can know, for each article, whether or not the article exists at the appropriate position according to the inventory information. Further, even if an article is not located at the appropriate position, the user can know that the article is located at another position. Further, the user can know that there is an article that is not included in the inventory information in the first place. Thereby, the user can take measures such as moving the article in the warehouse to the appropriate position, or correcting the inventory information managed by the inventory management device 10 so that it matches the actual location of the article. In addition, the user can visually confirm the actual article that could not be analyzed.
  • the information processing device 40 may also notify the user of such specific countermeasures in accordance with the result of the data matching process.
  • “item: FINAL RESULT” stores information indicating the collation result obtained when the user visually checks the location of the article. As described above, the user may visually check an article or the like that could not be analyzed in the data matching process. In such a case, the result is stored in “item: FINAL RESULT”.
  • The type of information stored in “item: FINAL RESULT” is the same as the type of information, such as “1: OK”, stored in “item: RESULT ANALYSIS” described above. Therefore, a duplicate description is omitted here.
  • “item: PICTURE” functions as a user interface for displaying the image photographed by the drone 20 in the data matching process.
  • By performing an operation of checking the check box provided in “item: PICTURE”, the user can display the image photographed by the drone 20.
  • Thereby, the user can refer to the images of the pallet 51 and the pallet label 52, visually recognize the state of the pallet 51, and identify the article identification information. Therefore, the trouble of the user visually confirming the actual article can be reduced.
  • In the embodiment described above, the drone 20 realizes autonomous flight by specifying its current position without using GPS.
  • However, the present invention is not limited thereto, and the drone 20 may realize autonomous flight using a positioning result obtained by GPS in an environment where signals from GPS satellites can be received.
  • In the embodiment described above, the drone 20 performs autonomous flight and photographing when the predetermined time zone arrives, and thereby generates the photographing information.
  • However, the present invention is not limited to this, and the drone 20 may generate the photographing information by performing the flight and the photographing in response to an instruction from the user. Further, the drone 20 may fly in response to a user operation instead of flying autonomously.
  • the photographing information relay unit 311 of the base device 30 transmits the photographing information acquired from the drone 20 to the information processing device 40. That is, the photographing information relay unit 311 relays the photographing information.
  • the invention is not limited thereto, and the drone 20 and the information processing device 40 may be communicably connected to each other directly or via a network. Then, the drone 20 may transmit the shooting information to the information processing device 40 without passing through the base device 30.
  • the present invention is not limited to the above-described embodiment, and includes modifications and improvements as long as the object of the present invention can be achieved.
  • the series of processes described above can be executed by hardware or can be executed by software.
  • each of the above-described functional blocks may be configured by hardware alone, may be configured by software alone, or may be configured by a combination thereof.
  • The functional configurations illustrated in FIGS. 3, 4, 5, and 6 are merely examples and are not particularly limiting. That is, it is sufficient that the information processing system S as a whole has a function capable of executing the above-described series of processes; what kinds of functional blocks are used to realize this function are not particularly limited to the examples of FIGS. 3, 4, 5, and 6.
  • the functional configuration included in the present embodiment can be realized by a processor that executes arithmetic processing, and a processor that can be used in the present embodiment includes various types of processors such as a single processor, a multiprocessor, and a multicore processor.
  • the present invention also includes a combination of these various processing devices and a processing circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
  • a program constituting the software is installed in a computer or the like from a network or a recording medium.
  • the computer may be a computer embedded in dedicated hardware. Further, the computer may be a computer that can execute various functions by installing various programs, for example, a general-purpose personal computer.
  • The recording medium containing such a program may be distributed separately from the apparatus main body in order to provide the program to the user, or may be provided to the user in a state of being incorporated in the apparatus main body in advance.
  • the storage medium distributed separately from the apparatus main body is composed of, for example, a magnetic disk (including a floppy disk), an optical disk, a magneto-optical disk, or the like.
  • The optical disc is composed of, for example, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), a Blu-ray (registered trademark) Disc, and the like.
  • the magneto-optical disk is composed of an MD (Mini-Disk) or the like.
  • The recording medium provided to the user in a state of being incorporated in the apparatus main body in advance is composed of, for example, the ROM 12 in FIG. 3, the ROM 22 in FIG. 4, the ROM 32 in FIG. 5, and the ROM 42 in FIG. 6, in which the program is recorded.
  • In this specification, the steps describing the program recorded on the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually without necessarily being processed in chronological order.
  • the term “system” refers to an entire device including a plurality of devices and a plurality of means.
  • Reference Signs List 10 inventory management device 20 drone 30 base device 40 information processing device 11, 21, 31, 41 CPU 12, 22, 32, 42 ROM 13, 23, 33, 43 RAM 14, 24, 34, 44 Communication unit 15, 25, 35, 45 Storage unit 16, 46 Input unit 17, 47 Display unit 26 Imaging unit 27 Drive unit 28 Sensor unit 29 Battery 36 Power supply unit 51 Pallet 52 Pallet label 53 Shelf 111 Inventory information management unit 151 Inventory information storage unit 211 Schedule management unit 212 Flight control unit 213 Imaging information generation unit 251 Schedule storage unit 252 Position detection information storage unit 253 Imaging information storage unit 311 Imaging information relay unit 312 Power supply control unit 351 Imaging information Storage unit 411 Photographing information acquisition unit 412 Identification information detection unit 413 Location identification unit 414 Inventory data acquisition unit 415 Data matching unit 451 Photography information storage unit 452 Specific information storage unit S Information processing system


Abstract

The purpose of the present invention is to manage articles more simply. An information processing device 40 comprises an image capturing information acquisition unit 411, an identification information detection unit 412, and an existence position identification unit 413. The image capturing information acquisition unit 411 receives, from a flying object that has flown over a place where an article is placed, image capturing information in which an image captured by the flying object is associated with information indicating the position at which the flying object captured the image. The identification information detection unit 412 detects, in the image included in the image capturing information, identification information related to the article. The existence position identification unit 413 identifies the existence position of the article from the identification information detected by the identification information detection unit 412 and the position at which the flying object captured the image, the position being included in the image capturing information in which the detection was performed.

Description

Information processing apparatus, information processing system, and information processing program
 The present invention relates to an information processing apparatus, an information processing system, and an information processing program that perform information processing relating to article management.
 In recent years, in order to manage articles more efficiently, systems such as a WMS (Warehouse Management System), which manages the inventory of a warehouse or the like using a computer, have been used.
 Such an inventory information processing system is disclosed in, for example, Patent Literature 1. The inventory information processing system disclosed in Patent Literature 1 compares the actual number of stocks measured by the user during inventory work with the number of stocks managed in the master, and when the comparison result satisfies a predetermined condition, notifies the user of an alarm. Thereby, the user can easily identify an article whose stock quantity is abnormal.
JP 2011-197948 A
 By using a general technique such as the technique disclosed in Patent Literature 1 described above, inventory management can be facilitated. However, the measurement of the actual number of items in stock in the warehouse is itself performed manually, by visual observation by a user or by scanning using a barcode scanner, which places a burden on the user. In particular, when the warehouse is large, or when the ceiling of the warehouse is high and articles are stacked in multiple tiers, work at heights becomes necessary, increasing the burden on the user and lengthening the work time.
 The present invention has been made in view of such circumstances, and an object thereof is to provide an information processing apparatus, an information processing system, and an information processing program for managing articles more easily.
 In order to achieve the above object, an information processing device according to one aspect of the present invention includes:
 receiving means for receiving, from a flying object that has flown over a place where an article is placed, photographing information in which an image photographed by the flying object is associated with information indicating the position at which the flying object photographed the image;
 detecting means for detecting identification information of the article from the image included in the photographing information; and
 specifying means for specifying the existence position of the article from the identification information detected by the detecting means and the position at which the flying object photographed the image, the position being included in the photographing information from which the detecting means performed the detection.
 According to the present invention, articles can be managed more easily.
FIG. 1 is a schematic diagram illustrating an example of the overall configuration of an information processing system according to an embodiment of the present invention.
FIG. 2 is a schematic diagram illustrating an example of a bird's-eye view of the configuration of the information processing system according to the embodiment of the present invention.
FIG. 3 is a block diagram illustrating an example of the configuration of an inventory management device according to the embodiment of the present invention.
FIG. 4 is a block diagram illustrating an example of the configuration of a drone according to the embodiment of the present invention.
FIG. 5 is a block diagram illustrating an example of the configuration of a base device according to the embodiment of the present invention.
FIG. 6 is a block diagram illustrating an example of the configuration of an information processing device according to the embodiment of the present invention.
FIG. 7 is a flowchart explaining the flow of the data matching process in the information processing system according to the embodiment of the present invention.
FIG. 8 is an image diagram showing an example of the display of the result of the data matching process in the information processing system according to the embodiment of the present invention.
 Hereinafter, an example of an embodiment of the present invention will be described with reference to the accompanying drawings.
 [System configuration]
 FIG. 1 is a schematic diagram illustrating the overall configuration of the information processing system S according to the present embodiment. As shown in FIG. 1, the information processing system S includes an inventory management device 10, a drone 20, a base device 30, and an information processing device 40. The drawing also illustrates a shelf 53 in a place (here, as an example, a warehouse) where the articles managed by the information processing system S are installed.
 On the shelf 53, a manager of the warehouse or the like arranges pallets 51, which are storage containers for the articles to be managed, in a plurality of tiers, with a plurality of pallets in each tier. A pallet label 52 for identifying the articles in the pallet is attached to each pallet 51. The pallet label 52 may be any label. In the present embodiment, however, it is assumed that the pallet label 52 is one on which a management number for identifying the article within the warehouse (hereinafter referred to as "article identification information") is written as a two-dimensional code.
 In the drawing, for convenience of illustration, only one pallet 51 and one pallet label 52 are denoted by reference numerals, and the reference numerals of the other pallets 51 and pallet labels 52 are omitted.
 The inventory management device 10 and the information processing device 40 are communicably connected to each other, directly or via a network (not shown). Similarly, the drone 20 and the base device 30 are communicably connected to each other, directly or via a network. Similarly, the base device 30 and the information processing device 40 are communicably connected to each other, directly or via a network. The communication between these devices may be performed in accordance with any communication scheme, and the communication scheme is not particularly limited. The network is realized by, for example, any of the Internet, a LAN (Local Area Network), and a mobile phone network, or a combination thereof. One device may perform communication in accordance with a plurality of communication schemes. For example, the base device 30 may conform both to a communication scheme for performing wired communication with the drone 20 and to a communication scheme for performing wireless communication with the information processing device 40.
 The inventory management device 10, the base device 30, and the information processing device 40 are each realized by an electronic device having an information processing function, such as a personal computer, a server device, or a device specific to the present embodiment. The drone 20 is realized by a drone having a photographing function.
 FIG. 2 is a schematic diagram illustrating an example of a bird's-eye view of the configuration of the information processing system S according to the present embodiment. As shown in FIG. 2, the shelves 53 are arranged in a plurality of rows (corresponding to the shelves 53a to 53e in the figure). In the left and right columns of each shelf 53 in the figure, the pallets 51 to which the pallet labels 52 are attached are arranged side by side in the vertical direction of the figure. That is, the articles stored in the pallets 51 to which the pallet labels 52 are attached are arranged side by side in the horizontal direction and stacked in the vertical direction. A drone 20 and a base device 30 corresponding to that drone 20 are arranged corresponding to each of the left and right columns in the figure. For example, the drone 20a and the base device 30a are arranged corresponding to the left column of the shelf 53a in the figure. Also, for example, the drone 20b and the base device 30b are arranged corresponding to the right column of the shelf 53a and the left column of the shelf 53b in the figure.
 The information processing system S having such a configuration performs a data matching process. Here, the data matching process refers to a series of processes that specify the actual existence position of the articles stored in each pallet 51 based on the photographing by the drone 20 and collate the specified actual existence positions against the managed arrangement positions and the like included in the management information.
 Specifically, in the data matching process, the drone 20 flies through the section of the warehouse where the shelves 53 are installed, and photographs each pallet 51 on each tier of the shelf 53 that corresponds to the drone 20 itself. The drone 20 then generates photographing information in which the photographed image is associated with the position information at the time of photographing. When the photographing is completed, the drone 20 moves to the installation location of the base device 30 and transmits the photographing information to the base device 30. The base device 30 transmits the received photographing information to the information processing device 40. Upon receiving the photographing information, the information processing device 40 detects the pallet label 52 by performing image analysis on the received photographing information, and detects the article identification information by decoding the two-dimensional code written on the detected pallet label 52. The information processing device 40 also specifies the actual existence position of (the articles stored in) each pallet 51 based on the detected article identification information and the position information included in the photographing information from which the detection was performed.
 Further, the information processing device 40 acquires, by communicating with the inventory management device 10, the inventory information of (the articles stored in) each pallet 51 managed by the inventory management device 10. Finally, the information processing device 40 collates the actual existence position of (the articles stored in) each pallet 51 specified based on the photographing information against the managed arrangement positions and the like of the articles included in the inventory information, and outputs the result of this collation.
 By performing the data matching process described above, the present embodiment eliminates the labor of manual visual inspection and manual barcode scanning. Moreover, the present embodiment can be applied even when the warehouse is large in scale or has a high ceiling.
 That is, according to the present embodiment, articles can be managed more easily.
 Note that the configuration shown in FIG. 1 is merely an example of the present embodiment, and the present embodiment is not limited to this configuration. For example, the number of each device and of the shelves 53 included in the information processing system S is not limited to that illustrated and may be any number. Also, for example, the articles stored on the pallets 51 may be arbitrary. Furthermore, for example, a pallet label 52 may be attached directly to an article.
 Next, the configuration of each of the devices described above with reference to FIGS. 1 and 2 will be explained in detail.
 [Configuration of Inventory Management Device 10]
 Next, the configuration of the inventory management device 10 will be described with reference to the block diagram of FIG. 3. As shown in FIG. 3, the inventory management device 10 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a communication unit 14, a storage unit 15, an input unit 16, and a display unit 17. These units are connected to one another by a bus via signal lines, and mutually transmit and receive signals.
 The CPU 11 executes various kinds of processing according to a program recorded in the ROM 12 or a program loaded from the storage unit 15 into the RAM 13.
 The RAM 13 also stores, as appropriate, data and the like necessary for the CPU 11 to execute the various kinds of processing.
 The communication unit 14 performs communication control for the CPU 11 to communicate with the other devices included in the information processing system S.
 The storage unit 15 is configured by a semiconductor memory such as a DRAM (Dynamic Random Access Memory) and stores various kinds of data.
 The input unit 16 is configured by various buttons and a touch panel, or by external input devices such as a mouse and a keyboard, and inputs various kinds of information in response to the user's instruction operations.
 The display unit 17 is configured by a liquid crystal display or the like and displays images corresponding to the image data output by the CPU 11.
 When the inventory management device 10 operates, an inventory information management unit 111 functions in the CPU 11, as shown in FIG. 3.
 In addition, an inventory information storage unit 151 is set in one area of the storage unit 15.
 The inventory information storage unit 151 stores the inventory information of the articles in the warehouse that the information processing system S manages. This inventory information is, for example, the same kind of information as the inventory information managed by a general WMS (Warehouse Management System). More specifically, data such as the article identification information, the stock quantity of each article, and the identification information (location information) of each position where the articles are arranged is stored as the inventory information, for example in a table format.
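As an illustration, the inventory information described above might be held as records of the following form. The field names and the location format are assumptions made for this sketch, not details taken from the embodiment:

```python
# Minimal sketch of the table-format inventory information held by the
# inventory information storage unit 151. Field names and the location
# string format ("rack-shelf-pallet") are illustrative assumptions.
inventory = [
    {"article_id": "A-001", "quantity": 40, "location": "R01-S2-P3"},
    {"article_id": "A-002", "quantity": 12, "location": "R01-S2-P4"},
]

def location_of(article_id, records):
    """Return the managed location of an article, or None if unknown."""
    for rec in records:
        if rec["article_id"] == article_id:
            return rec["location"]
    return None
```

In this sketch, the location string plays the role of the location information against which the actual existence positions are later collated.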
 The inventory information management unit 111 includes a function of managing the inventory information stored in the inventory information storage unit 151. The inventory information management unit 111 performs this management by updating the inventory information stored in the inventory information storage unit 151 to the latest contents, based on, for example, user operations received through the input unit 16 or data received via the communication unit 14 from other devices (not shown) used by the user. That is, the inventory management device 10 realizes the function of a so-called WMS.
 When the inventory information management unit 111 receives a request for inventory information from the information processing device 40, it reads the latest inventory information from the inventory information storage unit 151. The inventory information management unit 111 then transmits the read latest inventory information to the information processing device 40 as a response to the request.
 [Configuration of Drone 20]
 Next, the configuration of the drone 20 will be described with reference to the block diagram of FIG. 4. As shown in FIG. 4, the drone 20 includes a CPU 21, a ROM 22, a RAM 23, a communication unit 24, a storage unit 25, a photographing unit 26, a drive unit 27, a sensor unit 28, and a battery 29. These units are connected to one another by a bus via signal lines, and mutually transmit and receive signals.
 Here, the hardware functions of the CPU 21, the ROM 22, the RAM 23, the communication unit 24, and the storage unit 25 are equivalent to the hardware functions of the identically named units of the inventory management device 10 described above, differing only in their reference numerals. Duplicate description is therefore omitted.
 The photographing unit 26 is configured by a camera or the like including an optical lens, an image sensor, and so forth, and photographs the surroundings of the drone 20 (for example, surroundings including the pallets 51 and the pallet labels 52 arranged on the shelf 53). An image acquired by the photographing of the photographing unit 26 is converted into a digital signal and output to, for example, the CPU 21.
 The drive unit 27 is driven using electric power supplied from the battery 29 described later. By the driving of the drive unit 27, the drone 20 can fly through space. The drive unit 27 is configured by, for example, sets each consisting of a propeller that generates lift and thrust and a motor that rotates the propeller.
 The sensor unit 28 is a sensor for detecting the distance to other objects (for example, the shelf 53 or the floor and walls of the warehouse). The distance detected by the sensor unit 28 is converted into a digital signal and output to, for example, the CPU 21. The sensor unit 28 is configured by, for example, a sensor that detects distance using microwave-band radio waves or a sensor that detects distance using ultrasonic waves. The sensor unit 28 may also include an acceleration sensor, an angular velocity sensor, and the like for detecting the movement distance, movement direction, and so forth of the drone 20.
 The battery 29 stores electric power and supplies this power to the drive unit 27 and the other units of the drone 20. The battery 29 also restores the consumed power by receiving a power supply from the base device 30.
 When the drone 20 operates, a schedule management unit 211, a flight control unit 212, and a photographing information generation unit 213 function in the CPU 21, as shown in FIG. 4.
 In addition, a schedule storage unit 251, a position detection information storage unit 252, and a photographing information storage unit 253 are set in one area of the storage unit 25.
 The schedule storage unit 251 stores a schedule for carrying out the photographing and the like by the drone 20. For example, the time zone during which the drone 20 should fly and photograph is stored as the schedule. This time zone can be set arbitrarily, but is preferably set to a time zone during which no user works in the warehouse, for example at night or in the early morning. This allows the photographing and the like by the drone 20 to be carried out without disturbing the users working in the warehouse.
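The stored schedule reduces to a simple time-window check. The sketch below shows one way such a window could be evaluated, including the case where a night-time window spans midnight; the concrete 22:00 to 05:00 window is an assumed example, not a value from the embodiment:

```python
from datetime import time

def in_schedule(now, start, end):
    """True if `now` falls inside the half-open window [start, end)."""
    if start <= end:
        # Ordinary daytime window, e.g. 09:00 -> 17:00.
        return start <= now < end
    # Window crosses midnight, e.g. 22:00 -> 05:00.
    return now >= start or now < end

# Assumed night-time photographing window with no users in the warehouse.
FLIGHT_START, FLIGHT_END = time(22, 0), time(5, 0)
```

A schedule management component could poll `in_schedule(datetime.now().time(), FLIGHT_START, FLIGHT_END)` to decide whether to start or stop the flight.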
 The position detection information storage unit 252 stores information for the drone 20 to detect its own current position. For example, in the arrangement shown in FIG. 2, information indicating the positional relationships among the base devices 30, the shelves 53, and markers arranged on the warehouse floor and elsewhere (for example, markers on a ladder), information indicating the number of levels of each shelf 53, and the identification information (location information) of each position in the warehouse are stored as the information for detecting the current position. These pieces of information are expressed, for example, as coordinates in a three-dimensional coordinate system common to all of them.
 The photographing information storage unit 253 stores the photographing information generated by the photographing information generation unit 213. Here, as described above, the photographing information is information in which a photographed image is associated with the position information at the time of photographing (for example, two-dimensional position information or three-dimensional position information).
 The schedule management unit 211 manages the execution of the photographing and the like by the drone 20 based on the schedule stored in the schedule storage unit 251. For example, when the predetermined time zone set in the schedule arrives, the schedule management unit 211 instructs the flight control unit 212 to start the flight and photographing by the drone 20.
 At times other than the predetermined time zone set in the schedule, the schedule management unit 211 can stop the flight and photographing by instructing the flight control unit 212. In this case, the schedule management unit 211 can instruct the flight control unit 212 to have the drone 20 receive a power supply from its corresponding base device 30, so that the power consumed by the battery 29 is restored.
 The flight control unit 212 starts the photographing and the like of the drone 20 based on an instruction from the schedule management unit 211. In this case, the flight control unit 212 causes the drone 20 to fly autonomously around its corresponding shelf 53 and photographs this shelf 53 with the photographing unit 26. Here, the flight control unit 212 realizes the autonomous flight based on the information for detecting the drone's own current position stored in the position detection information storage unit 252 and the detection results of the sensor unit 28. Specifically, the flight control unit 212 identifies the current three-dimensional position at which the drone 20 is flying by calculating, from the detection results of the sensor unit 28, the movement distance and movement direction from the position of the base device 30 corresponding to the drone 20, which is the position where the flight starts. The flight control unit 212 then realizes the autonomous flight based on this current three-dimensional position of the drone 20 and the positional relationships between the base devices 30 and the shelves 53 stored in the position detection information storage unit 252. Note that the flight control unit 212 may detect markers arranged on the warehouse floor and elsewhere by analyzing the images photographed by the photographing unit 26 (or by means of the sensor unit 28), and may correct the current three-dimensional position at which the drone 20 is flying based on the detection results for these markers.
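The position identification described above amounts to dead reckoning from the known position of the base device. A minimal sketch, assuming the sensor unit yields per-step displacement vectors in the common three-dimensional coordinate system (the numeric values are illustrative only):

```python
# Dead-reckoning sketch: starting from the known base device position,
# accumulate the displacement estimated from the sensor unit at each
# control step to obtain the drone's current three-dimensional position.
def current_position(base_xyz, displacements):
    x, y, z = base_xyz
    for dx, dy, dz in displacements:  # per-step movement from the sensors
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)

# Example: base device at the origin; the drone climbs 1.5 m, then moves
# 2 m along the x axis.
pos = current_position((0.0, 0.0, 0.0), [(0.0, 0.0, 1.5), (2.0, 0.0, 0.0)])
```

In practice such an estimate drifts, which is why the embodiment allows the position to be corrected against markers detected on the warehouse floor.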
 Further, the flight control unit 212 controls the photographing unit 26 during the autonomous flight so as to photograph, at a predetermined cycle, the surroundings of the drone 20 (for example, surroundings including the pallets 51 and the pallet labels 52 arranged on the shelf 53). The flight control unit 212 then outputs the photographed images and the position information at the time of photographing to the photographing information generation unit 213.
 This autonomous flight and photographing are performed until everything around the shelf 53 corresponding to the drone 20 has been photographed. Specifically, on each level of the shelf 53, the pallets 51 and the pallet labels 52 are arranged side by side in the horizontal direction and stacked in the vertical direction. The flight control unit 212 therefore, for example, has the drone 20 photograph the shelf 53 one level at a time; when all levels have been photographed, the drone 20 ends the autonomous flight and photographing and returns to its corresponding base device 30. The drone 20 then receives a power supply from this base device 30 until the predetermined time zone arrives again and a new instruction is issued by the schedule management unit 211. Note that the drone 20 may photograph all levels of the shelf 53 in a single flight, or it may return to the base device 30 each time one level has been photographed and, after receiving a power supply, photograph the next level. In the latter case, the capacity of the battery 29 can be kept low.
 The photographing information generation unit 213 generates the photographing information by associating the photographed images input from the flight control unit 212 with the position information at the time of photographing. The photographing information generation unit 213 then stores the generated photographing information in the photographing information storage unit 253.
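The pairing performed here could be modeled as a simple record; the structure and field names below are illustrative assumptions rather than details of the embodiment:

```python
from dataclasses import dataclass

# Sketch of one photographing information record: a photographed image
# paired with the position at the time of photographing. The field names
# and the (x, y, z) position convention are assumptions.
@dataclass
class ShootingInfo:
    image: bytes      # encoded photographed image data
    position: tuple   # (x, y, z) position at the time of photographing

def make_shooting_info(image, position):
    """Associate a photographed image with its photographing position."""
    return ShootingInfo(image=image, position=position)

info = make_shooting_info(b"\x89PNG...", (2.0, 0.5, 1.5))
```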
 [Configuration of Base Device 30]
 Next, the configuration of the base device 30 will be described with reference to the block diagram of FIG. 5. As shown in FIG. 5, the base device 30 includes a CPU 31, a ROM 32, a RAM 33, a communication unit 34, a storage unit 35, and a power supply unit 36. These units are connected to one another by a bus via signal lines, and mutually transmit and receive signals.
 Here, the hardware functions of the CPU 31, the ROM 32, the RAM 33, the communication unit 34, and the storage unit 35 are equivalent to the hardware functions of the identically named units of the inventory management device 10 and the drone 20 described above, differing only in their reference numerals. Duplicate description is therefore omitted.
 The power supply unit 36 is a part that supplies power to the drone 20. The base device 30 obtains electric power from a household power supply in the warehouse, converts this power into a voltage and the like suitable for supplying the drone 20, and thereby supplies power to the drone 20. The power supply to the drone 20 is continued from the end of the drone 20's autonomous flight and photographing until the autonomous flight and photographing are performed again (or until the battery 29 is sufficiently charged).
 When the base device 30 operates, a photographing information relay unit 311 and a power supply control unit 312 function in the CPU 31, as shown in FIG. 5.
 In addition, a photographing information storage unit 351 is set in one area of the storage unit 35.
 The photographing information storage unit 351 stores the photographing information acquired from the drone 20.
 The photographing information relay unit 311 transmits the photographing information acquired from the drone 20 to the information processing device 40. That is, the photographing information relay unit 311 relays the photographing information. This relaying can be performed at any timing. As an example, the photographing information relay unit 311 communicates with the drone 20 while the drone 20 is receiving a power supply from the power supply unit 36 after finishing its autonomous flight and photographing. However, without being limited to this, the photographing information relay unit 311 may communicate with the drone 20, for example, while the drone 20 is performing the autonomous flight and photographing.
 Through this communication, the photographing information relay unit 311 acquires the photographing information generated by the drone 20 and stored in the photographing information storage unit 253. The photographing information relay unit 311 stores the acquired photographing information in the photographing information storage unit 351, and transmits the photographing information stored in the photographing information storage unit 351 to the information processing device 40 at a predetermined timing. The predetermined timing may be immediately after the photographing information is acquired from the drone 20, or may be, for example, the timing at which the information processing device 40 requests the photographing information.
 The power supply control unit 312 is a part that controls the power supply by the power supply unit 36 described above. Specifically, the power supply control unit 312 controls the conversion of the voltage and the like described above, as well as the start and end of the power supply.
 [Configuration of Information Processing Device 40]
 Next, the configuration of the information processing device 40 will be described with reference to the block diagram of FIG. 6. As shown in FIG. 6, the information processing device 40 includes a CPU 41, a ROM 42, a RAM 43, a communication unit 44, a storage unit 45, an input unit 46, and a display unit 47. These units are connected to one another by a bus via signal lines, and mutually transmit and receive signals.
 Here, the hardware functions of the CPU 41, the ROM 42, the RAM 43, the communication unit 44, the storage unit 45, the input unit 46, and the display unit 47 are equivalent to the hardware functions of the identically named units of the inventory management device 10, the drone 20, and the base device 30 described above, differing only in their reference numerals. Duplicate description is therefore omitted.
 When the information processing device 40 operates, a photographing information acquisition unit 411, an identification information detection unit 412, an existence position identification unit 413, an inventory data acquisition unit 414, and a data matching unit 415 function in the CPU 41, as shown in FIG. 6.
 In addition, a photographing information storage unit 451 and a specific information storage unit 452 are set in one area of the storage unit 45.
 The photographing information storage unit 451 stores the photographing information of each drone 20 received from each base device 30.
 The specific information storage unit 452 stores the actual existence positions of (the articles stored in) the pallets 51 identified, based on the photographing information, by the existence position identification unit 413 described later.
 The photographing information acquisition unit 411 acquires the photographing information of each drone 20 by receiving it from each base device 30, and stores the acquired photographing information in the photographing information storage unit 451.
 The identification information detection unit 412 detects the pallet labels 52 within the images by analyzing the photographing information stored in the photographing information storage unit 451 using existing image analysis techniques. The identification information detection unit 412 also detects the article identification information by decoding the two-dimensional codes printed on the detected pallet labels 52. Furthermore, the identification information detection unit 412 associates the position information included in the analyzed photographing information with the article identification information, and outputs the associated position information and article identification information to the existence position identification unit 413.
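The embodiment does not specify the payload format of the two-dimensional code, and actual label detection would use an image analysis library. Assuming, purely for illustration, a decoded payload of the form "ITEM:&lt;article_id&gt;", the post-decoding step could be sketched as:

```python
# Hypothetical sketch of the step after the two-dimensional code has been
# decoded: extract the article identification information from the payload
# and associate it with the photographing position. The "ITEM:" payload
# format is an assumption made for this example.
def parse_label_payload(payload):
    prefix, _, article_id = payload.partition(":")
    if prefix != "ITEM" or not article_id:
        raise ValueError("unrecognized pallet label payload: %r" % payload)
    return article_id

def attach_position(payload, photo_position):
    """Pair the detected article ID with the position at photographing time."""
    return (parse_label_payload(payload), photo_position)
```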
 The existence position identification unit 413 identifies the actual existence position of (the articles stored in) each pallet 51 based on the article identification information and the position information input from the identification information detection unit 412. That is, the existence position identification unit 413 determines that the articles stored in the pallet 51 corresponding to the article identification information exist at the position corresponding to the position information. The existence position identification unit 413 also stores the identified actual existence position of each article in the specific information storage unit 452.
 The inventory data acquisition unit 414 acquires the inventory information of the articles stored in each pallet 51 managed by the inventory management device 10 through communication with the inventory management device 10. For example, the inventory data acquisition unit 414 requests the current inventory information of the articles from the inventory management device 10, and outputs the current inventory information returned by the inventory management device 10 in response to the request to the data matching unit 415. Note that the current inventory information of the articles may be transmitted spontaneously by the inventory management device 10 without requiring a request from the inventory data acquisition unit 414.
 The data matching unit 415 collates the actual existence positions of the articles stored in the pallets 51, identified from the photographing information by the existence position identification unit 413, against the managed arrangement positions and the like of the articles included in the inventory information input from the inventory data acquisition unit 414, and outputs the collation result. The output of the collation result is realized, for example, by display on the display unit 47, by transmission to another terminal (not shown) used by the user, or by printing on a paper medium using a printer (not shown). A specific example of the output collation result will be described later with reference to FIG. 8.
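The collation itself can be sketched as follows, assuming both the identified and the managed positions are expressed as mappings from article identification information to location information; the status labels are illustrative, not taken from the embodiment:

```python
# Hedged sketch of the collation: compare the actual positions identified
# from the photographs with the managed positions in the inventory
# information, classifying each article. Status labels are assumptions.
def collate(actual, managed):
    """Both arguments map article_id -> location string."""
    result = {}
    for article_id, managed_loc in managed.items():
        actual_loc = actual.get(article_id)
        if actual_loc is None:
            result[article_id] = ("missing", managed_loc, None)
        elif actual_loc == managed_loc:
            result[article_id] = ("match", managed_loc, actual_loc)
        else:
            result[article_id] = ("misplaced", managed_loc, actual_loc)
    # Articles photographed in the warehouse but absent from management data.
    for article_id in actual.keys() - managed.keys():
        result[article_id] = ("unmanaged", None, actual[article_id])
    return result
```

Each entry of the result pairs a status with the managed and actual locations, which is the kind of discrepancy listing the collation result output would contain.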
 [Data Matching Process]
 Next, the flow of the data matching process executed by the information processing system S will be described with reference to the sequence diagram of FIG. 7. As described above, the data matching process refers to a series of processes in which the actual existence position of the articles stored in each pallet 51 is identified based on the photographing performed by the drone 20, and the identified actual existence position of the articles is collated against the managed arrangement positions and the like included in the management information.
 The data matching process is performed periodically once the information processing system S starts operating. As a premise of the data matching process, the management of the inventory information by the inventory information management unit 111 is also performed periodically.
 In step S11, the base device 30 supplies power to the drone 20.
 In step S12, the drone 20 determines whether the predetermined time zone for performing the autonomous flight and photographing has arrived. If the predetermined time zone has arrived, the determination in step S12 is Yes and the process proceeds to step S13. On the other hand, if the predetermined time zone has not arrived, the determination in step S12 is No and the power supply in step S11 continues.
 In step S13, the drone 20 generates the photographing information by performing the autonomous flight and photographing.
 In step S14, the drone 20 determines whether all the positions of the shelf 53 that it should photograph have been photographed and the generation of the photographing information based on this photographing has been completed. If the generation of the photographing information has been completed, the determination in step S14 is Yes and the process proceeds to step S15. On the other hand, if the generation of the photographing information has not been completed, the determination in step S14 is No and the generation of the photographing information in step S13 continues.
In step S15, the drone 20 returns to its corresponding base device 30 and transmits the photographing information generated in step S13 to the base device 30.
In step S16, the base device 30 transmits the photographing information received in step S15 to the information processing device 40.
In step S17, the base device 30 supplies power to the drone 20 again.
In step S18, the information processing device 40 detects article identification information by performing image analysis or the like on the photographing information received in step S16.
In step S19, the information processing device 40 specifies the actual positions of the articles stored in the pallets 51 based on the article identification information detected in step S18 and the corresponding position information.
In step S20, the information processing device 40 requests the current inventory information of the articles from the inventory management device 10.
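Steps S18 and S19 can be illustrated with a minimal sketch. The data structures and field names below are hypothetical; the patent does not prescribe any particular implementation.

```python
# Hypothetical sketch of steps S18-S19: each photographing-information record
# pairs the article identification information decoded from a two-dimensional
# code (S18) with the position at which the drone captured the image, so the
# actual position of each article can be specified (S19).

def specify_actual_positions(photographing_info):
    """Return a mapping of article identification information -> actual position."""
    actual_positions = {}
    for record in photographing_info:
        article_id = record["article_id"]   # detected by image analysis (S18)
        actual_positions[article_id] = record["position"]
    return actual_positions

photographing_info = [
    {"article_id": "PALLET-0001", "position": "LOC-A-01"},
    {"article_id": "PALLET-0002", "position": "LOC-A-02"},
]
print(specify_actual_positions(photographing_info))
# {'PALLET-0001': 'LOC-A-01', 'PALLET-0002': 'LOC-A-02'}
```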
In step S21, as a response to the request in step S20, the inventory management device 10 transmits the current inventory information of the articles to the information processing device 40.
In step S22, the information processing device 40 collates the actual positions of the articles stored in the pallets 51, as specified in step S19, against the management arrangement positions of the articles included in the inventory information transmitted from the inventory management device 10 in step S21.
In step S23, the information processing device 40 outputs the result of the collation performed in step S22.
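The collation of step S22 can be sketched as follows. This is a simplified illustration only; the result labels and data layout are assumptions, not the patent's prescribed implementation.

```python
# Simplified sketch of step S22: collate the actual positions specified from
# the drone's photographing against the management arrangement positions
# contained in the inventory information. Labels are illustrative only.

def collate(actual_positions, managed_positions):
    results = {}
    for article_id, managed_pos in managed_positions.items():
        actual_pos = actual_positions.get(article_id)
        if actual_pos is None:
            results[article_id] = "Missing"         # managed but not found
        elif actual_pos == managed_pos:
            results[article_id] = "OK"              # found where expected
        else:
            results[article_id] = "Wrong Location"  # found at another position
    for article_id in actual_positions:
        if article_id not in managed_positions:
            results[article_id] = "Non-Stocked"     # found but not managed
    return results

actual = {"P1": "L1", "P2": "L9", "P4": "L4"}
managed = {"P1": "L1", "P2": "L2", "P3": "L3"}
print(collate(actual, managed))
# {'P1': 'OK', 'P2': 'Wrong Location', 'P3': 'Missing', 'P4': 'Non-Stocked'}
```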
By performing the data matching process described above, the present embodiment eliminates the need for labor-intensive manual visual inspection and manual barcode scanning. Moreover, the present embodiment can be applied even when the warehouse is large or its ceiling is high.
Furthermore, in the present embodiment, since the drone 20 flies autonomously using sensors and the like, the data matching process can be performed even in an indoor warehouse or the like where positioning by GPS (Global Positioning System) is difficult.
Furthermore, in the present embodiment, since the pallet labels 52 that are generally used for managing articles in warehouses are read and used, there is no need to separately prepare modules such as RFID (Radio Frequency IDentifier) tags. That is, according to the present embodiment, the data matching process can be performed at low cost. In addition, the present embodiment uses two-dimensional codes rather than one-dimensional barcodes, which must be photographed with high precision to be decoded; therefore, decoding is possible even when the images captured during flight are not photographed with particularly high precision.
Furthermore, in the present embodiment, since the drone 20 does not need to perform processing such as detecting two-dimensional codes by image analysis and decoding the detected two-dimensional codes, the arithmetic processing capability required of the drone 20 can be kept low. Accordingly, the power consumption of the drone 20 can be reduced, and the drone 20 itself can easily be made smaller.
That is, according to the present embodiment, articles can be managed more easily.
[Output example of the result of the data matching process]
Next, an example of the output of the result of the data matching process will be described with reference to FIG. 8. FIG. 8 is a conceptual diagram showing an example of the display of the result of the data matching process in the information processing system S.
As shown in FIG. 8, this display example includes three display areas: display area AR1, display area AR2, and display area AR3.
A first operation button is displayed in the display area AR1. The first operation button is a user interface with which the user instructs printing of the matching result. When the user wants to print the matching result, the user presses the first operation button.
A second operation button is displayed in the display area AR2. The second operation button is a user interface with which the user instructs output of the matching-result data in a predetermined format (here, CSV (comma-separated values) format, as an example). When the user wants the matching-result data output in the predetermined format, the user presses the second operation button. The output destination of the data may be the storage unit 45 included in the information processing device 40, or, as described above, the data may be output to another terminal (not shown) used by the user.
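A CSV export of the kind triggered by the second operation button might look like the following sketch. The column subset and exact schema are assumptions for illustration; the patent only specifies CSV as one example format.

```python
# Hypothetical sketch of the CSV output selected with the second operation
# button. Only a few items of the matching-result table are shown.
import csv
import io

def export_matching_results_csv(rows):
    """Serialize matching-result rows to CSV text (illustrative schema)."""
    fieldnames = ["DRONE_LOCATION NO.", "DRONE_PALLET NO.", "RESULT"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = [
    {"DRONE_LOCATION NO.": "LOC-A-01",
     "DRONE_PALLET NO.": "PALLET-0001",
     "RESULT": "OK"},
]
print(export_matching_results_csv(rows))
```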
The matching result is displayed in the display area AR3 in a predetermined format (here, a table format, as an example). The matching result includes, for example, information indicating whether an article exists at a position that matches the management arrangement position of the article included in the inventory information managed by the inventory management device 10, information indicating whether an article not included in the inventory information exists, and information indicating whether an article included in the inventory information is absent.
Specifically, "item: DRONE_LOCATION NO." is information indicating the actual position of an article as specified from the drone 20's photographing in the data matching process. Likewise, "item: DRONE_PALLET NO." is information indicating the article identification information specified from the drone 20's photographing in the data matching process.
In contrast, "item: IPA_LOCATION NO." is the management arrangement position of the article included in the inventory information. "Item: IPA_PALLET NO." is information indicating the article identification information, as managed, included in the inventory information of the article. Naturally, since the matching is performed on the same article, "item: DRONE_PALLET NO." and "item: IPA_PALLET NO." store the article identification information of the same article (that is, the same article identification information).
"Item: IPA_PRODUCT CODE" stores the product number of the article.
"Item: IPA_SUB INVENTORY" stores the status information of the article.
"Item: IPA_ARRIVED DATE" stores information indicating the date on which the pallet 51 storing the article arrived at the warehouse.
"Item: IPA_QUANTITY" stores the carton quantity of the article.
"Item: IPA_PACKAGES" stores the package quantity of the article.
"Item: IPA_DAMAGE FLAG" stores, as a flag, information indicating whether the article has damage such as scratches or breakage.
"Item: CHECKED DATE" stores information indicating the date on which the autonomous flight and photographing by the drone 20 were performed.
"Item: CHECKED TIME" stores information indicating the time period during which the autonomous flight and photographing by the drone 20 were performed.
"Item: RESULT" stores information indicating the result of the data matching process. The information indicating the matching result is information indicating whether the article was present at a position matching the management arrangement position included in the inventory information of the article.
"Item: RESULT ANALYSIS" stores information indicating a more detailed matching result obtained by analyzing the result of the data matching process. The information indicating a more detailed matching result is, for example: "1: OK", indicating that the article exists at a position matching the management arrangement position included in the inventory information; "2: Wrong Location", indicating that the article exists at a position that does not match that management arrangement position; "3: Missing", indicating that a managed article included in the inventory information could not be found; "4: Non-Stocked", indicating that an article not included in the inventory information was found; "5: Non-Stocked (Decode Failed)", indicating that an article not included in the inventory information was found, but what kind of article it is could not be analyzed; "6: Decode Failed", indicating that analysis of the two-dimensional code failed; "7: Non-Located", indicating that location information has not been correctly assigned; "Found Hold", indicating that an article placed on hold exists at a position that does not match the management arrangement position included in the inventory information; or "Found Damaged", indicating that an article whose damage flag is set exists at a position that does not match that management arrangement position.
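The numbered categories of "item: RESULT ANALYSIS" can be collected into a simple lookup table. This is a sketch only: the numbering follows the description, while "Found Hold" and "Found Damaged" appear in the description without numbers, and the helper function is an assumed illustration.

```python
# "item: RESULT ANALYSIS" categories as enumerated in the description.
RESULT_ANALYSIS = {
    1: "OK",
    2: "Wrong Location",
    3: "Missing",
    4: "Non-Stocked",
    5: "Non-Stocked (Decode Failed)",
    6: "Decode Failed",
    7: "Non-Located",
}

# Categories listed in the description without numbers.
UNNUMBERED = ["Found Hold", "Found Damaged"]

def describe(code):
    """Return the label for a numbered RESULT ANALYSIS code (assumed helper)."""
    return RESULT_ANALYSIS.get(code, "Unknown")

print(describe(2))  # Wrong Location
```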
By referring to the information indicating the result of the data matching process, the user can know, for each article, whether the article exists at the appropriate position indicated by the inventory information. Even if an article is not at the appropriate position, the user can learn that it exists at another position. Furthermore, the user can learn, for example, that there are articles not included in the inventory information at all. This allows the user to take actions such as moving articles in the warehouse to the appropriate positions, or correcting the inventory information managed by the inventory management device 10 so that it matches the actual positions of the articles. The user can also go and visually check the actual articles that could not be analyzed. The information processing device 40 may also notify the user of such specific countermeasures according to the result of the data matching process.
"Item: FINAL RESULT" stores information indicating the matching result in the case where the user has visually confirmed the position of the article. As described above, the user may visually confirm articles or the like that could not be analyzed in the data matching process; in such a case, information is stored in "item: FINAL RESULT". The types of information stored here are the same as the types of information, such as "1: OK", stored in "item: RESULT ANALYSIS" described above, so a duplicate description is omitted.
"Item: PICTURE" functions as a user interface for displaying the images captured by the drone 20 in the data matching process. By checking the check box provided in "item: PICTURE", the user can display an image captured by the drone 20. This allows the user to refer to images of the pallet 51 and the pallet label 52, and to visually grasp the state of the pallet 51 or identify the article identification information. This reduces the user's effort of visually checking the actual articles on site.
Although some embodiments of the present invention have been described above, these embodiments are merely examples and do not limit the technical scope of the present invention. The present invention can take various other embodiments, and various changes such as omissions and substitutions can be made without departing from the gist of the present invention. These embodiments and their modifications are included in the scope and gist of the invention described in this specification and the like, and are included in the inventions described in the claims and their equivalents.
For example, the embodiments of the present invention may be modified as in the following modifications.
<First Modification>
In the embodiment described above, the drone 20 realizes autonomous flight by specifying its current position without using GPS. The present invention is not limited to this; in an environment where signals from GPS satellites can be received, the drone 20 may realize autonomous flight using GPS positioning results.
<Second Modification>
In the embodiment described above, the drone 20 performs autonomous flight and photographing, and generates the photographing information, when a predetermined time period arrives. The present invention is not limited to this; the drone 20 may perform autonomous flight and photographing, and generate the photographing information, in response to an instruction from the user. Furthermore, instead of flying autonomously, the drone 20 may fly in accordance with the user's operation.
<Third Modification>
In the embodiment described above, the photographing information relay unit 311 of the base device 30 transmits the photographing information acquired from the drone 20 to the information processing device 40; that is, the photographing information relay unit 311 relays the photographing information. The present invention is not limited to this; the drone 20 and the information processing device 40 may be communicably connected to each other directly or via a network, and the drone 20 may transmit the photographing information to the information processing device 40 without going through the base device 30.
The present invention is not limited to the embodiments described above, and includes modifications, improvements, and the like within a range in which the object of the present invention can be achieved.
For example, the series of processes described above can be executed by hardware or by software. Each of the functional blocks described above may be configured by hardware alone, by software alone, or by a combination thereof.
In other words, the functional configurations illustrated in FIGS. 3, 4, 5, and 6 are merely examples and are not particularly limiting. That is, it suffices that the information processing system S has a function capable of executing the series of processes described above as a whole, and the functional blocks used to realize this function are not limited to the examples illustrated in FIGS. 3, 4, 5, and 6.
For example, the functional configurations included in the present embodiment can be realized by a processor that executes arithmetic processing. The processors usable in the present embodiment include not only those configured by a single processing device such as a single processor, a multiprocessor, or a multi-core processor, but also those in which such processing devices are combined with processing circuits such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
When the series of processes is executed by software, a program constituting the software is installed on a computer or the like from a network or a recording medium.
The computer may be a computer embedded in dedicated hardware. The computer may also be a computer capable of executing various functions by installing various programs, for example, a general-purpose personal computer.
A recording medium containing such a program may be provided to the user by being distributed separately from the device body in order to provide the user with the program, or may be provided to the user in a state of being incorporated in the device body in advance. A storage medium distributed separately from the device body is composed of, for example, a magnetic disk (including a floppy disk), an optical disk, a magneto-optical disk, or the like. The optical disk is composed of, for example, a CD-ROM (Compact Disk-Read Only Memory), a DVD (Digital Versatile Disk), a Blu-ray (registered trademark) Disc, or the like. The magneto-optical disk is composed of an MD (Mini-Disk) or the like. The recording medium provided to the user in a state of being incorporated in the device body in advance is composed of, for example, the ROM 12 of FIG. 3, the ROM 22 of FIG. 4, the ROM 32 of FIG. 5, and the ROM 42 of FIG. 6 in which the program is recorded, or an auxiliary storage device such as a hard disk included in the storage unit 15 of FIG. 3, the storage unit 25 of FIG. 4, the storage unit 35 of FIG. 5, or the storage unit 45 of FIG. 6.
In this specification, the steps describing the program recorded on the recording medium include not only processes performed in chronological order along that order, but also processes executed in parallel or individually without necessarily being processed in chronological order.
In this specification, the term "system" means an overall apparatus composed of a plurality of devices, a plurality of means, and the like.
Reference Signs List
10 Inventory management device
20 Drone
30 Base device
40 Information processing device
11, 21, 31, 41 CPU
12, 22, 32, 42 ROM
13, 23, 33, 43 RAM
14, 24, 34, 44 Communication unit
15, 25, 35, 45 Storage unit
16, 46 Input unit
17, 47 Display unit
26 Photographing unit
27 Drive unit
28 Sensor unit
29 Battery
36 Power supply unit
51 Pallet
52 Pallet label
53 Shelf
111 Inventory information management unit
151 Inventory information storage unit
211 Schedule management unit
212 Flight control unit
213 Photographing information generation unit
251 Schedule storage unit
252 Position detection information storage unit
253 Photographing information storage unit
311 Photographing information relay unit
312 Power supply control unit
351 Photographing information storage unit
411 Photographing information acquisition unit
412 Identification information detection unit
413 Position specification unit
414 Inventory data acquisition unit
415 Data matching unit
451 Photographing information storage unit
452 Specific information storage unit
S Information processing system

Claims (8)

1.  An information processing device comprising:
    receiving means for receiving, from a flying object that has flown over an arrangement place of articles, photographing information in which an image captured by the flying object is associated with information indicating a position at which the flying object performed the capture;
    detecting means for detecting identification information of an article from the image included in the photographing information; and
    specifying means for specifying a position of the article from the identification information detected by the detecting means and the position captured by the flying object, the position being included in the photographing information that the detecting means took as a detection target.
2.  The information processing device according to claim 1, further comprising:
    acquiring means for acquiring, from managing means, management information on the articles at the arrangement place of the articles; and
    matching means for collating the management information acquired by the acquiring means against the position of the article specified by the specifying means, and outputting a result of the collation.
3.  The information processing device according to claim 2, wherein the result of the collation includes at least one of: information indicating whether the article exists at a position matching the management information; information indicating whether an article not included in the management information exists; and information indicating whether an article included in the management information is absent.
4.  The information processing device according to any one of claims 1 to 3, wherein the identification information of the article is given, as a two-dimensional code, to a storage container of the article or to the article.
5.  The information processing device according to any one of claims 1 to 4, wherein, at the arrangement place of the articles, the articles are arranged side by side in a horizontal direction and stacked in a vertical direction, and
    the information indicating the position at which the flying object performed the capture is information indicating a two-dimensional position captured by the flying object.
6.  An information processing system comprising the information processing device according to any one of claims 1 to 5 and the flying object, wherein the flying object comprises:
    flight control means for controlling flight of the flying object at the arrangement place of the articles;
    photographing means for capturing images during flight; and
    generating means for generating photographing information in which an image captured by the photographing means is associated with information indicating a position at which the flying object performed the capture.
7.  The information processing system according to claim 6, wherein the arrangement place of the articles is divided into predetermined sections, and the flying object is arranged for each of the predetermined sections.
8.  An information processing program for causing a computer to realize:
    an acquiring function of acquiring, from a flying object that has flown over an arrangement place of articles, photographing information in which an image captured by the flying object is associated with information indicating a position at which the flying object performed the capture;
    a detecting function of detecting identification information of an article from the image included in the photographing information; and
    a specifying function of specifying a position of the article from the identification information detected by the detecting function and the position captured by the flying object, the position being included in the photographing information that the detecting function took as a detection target.
PCT/JP2019/025638 2018-07-06 2019-06-27 Information processing device, information processing system, and information processing program WO2020008999A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018128788A JP2020009083A (en) 2018-07-06 2018-07-06 Apparatus, system and program for processing information
JP2018-128788 2018-07-06

Publications (1)

Publication Number Publication Date
WO2020008999A1 true WO2020008999A1 (en) 2020-01-09

Family

ID=69059758

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/025638 WO2020008999A1 (en) 2018-07-06 2019-06-27 Information processing device, information processing system, and information processing program

Country Status (2)

Country Link
JP (1) JP2020009083A (en)
WO (1) WO2020008999A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011150460A (en) * 2010-01-20 2011-08-04 Hitachi Information & Control Solutions Ltd System and method for managing warehousing/shipping
JP2017050718A (en) * 2015-09-02 2017-03-09 株式会社東芝 Wearable terminal, method, and system
JP2017218325A (en) * 2016-06-03 2017-12-14 裕之 本地川 Information collection device, article management system using the same, and winding device
JP2018043815A (en) * 2016-09-12 2018-03-22 ユーピーアール株式会社 Load monitoring system in warehouse using drone
WO2018100676A1 (en) * 2016-11-30 2018-06-07 株式会社オプティム Camera control system, camera control method, and program


Also Published As

Publication number Publication date
JP2020009083A (en) 2020-01-16

Similar Documents

Publication Publication Date Title
US20200118064A1 (en) Product Status Detection System
US9984354B1 (en) Camera time synchronization system
EP3343481A1 (en) In-vehicle package location identification at load and delivery times
US10223670B1 (en) Bin content determination using flying automated aerial vehicles for imaging
JP2006221529A (en) Material management system
US9892378B2 (en) Devices, systems and methods for tracking and auditing shipment items
JP2018116482A (en) Logistics management apparatus, logistics management method, and program
US20140222709A1 (en) Method and apparatus for updating detailed delivery tracking
JP2018147138A (en) Information processing system, information processing apparatus, information processing method, and information processing program
US20200074676A1 (en) Management system, storage medium, position calculation method, and management apparatus
US20210065585A1 (en) Automated user interfaces for efficient packaging of objects
JP2017214197A (en) Management system, management method and information processing device
JP6728995B2 (en) Automated warehouse system and automated warehouse management method
KR101826193B1 (en) Unmanned aerial vehicle, computer program and distribution center managing system
WO2020008999A1 (en) Information processing device, information processing system, and information processing program
JP2019104625A (en) Arrangement support system, arrangement support method and program
JP6750315B2 (en) Management system, management method, and transportation system
US20190114582A1 (en) Delivery management system and method for outputting location and environmental information
JP6690411B2 (en) Management system, management method, and transportation system
JP4593165B2 (en) Product identification device and product management system
US20220318529A1 (en) Error correction using combination rfid signals
JP2018092434A (en) Management system, information processing device, program, and management method
US20190075009A1 (en) Control method and apparatus in a mobile automation system
WO2022107000A1 (en) Automated tracking of inventory items for order fulfilment and replenishment
JP4860978B2 (en) Container management system and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19830999

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19830999

Country of ref document: EP

Kind code of ref document: A1