WO2020244592A1 - Item pick-and-place detection system, method, and device - Google Patents

Item pick-and-place detection system, method, and device

Info

Publication number
WO2020244592A1
Authority
WO
WIPO (PCT)
Prior art keywords
pick
place
marker
item
detection
Prior art date
Application number
PCT/CN2020/094433
Other languages
English (en)
French (fr)
Inventor
解松霖
马强
朱镇峰
王靖雄
毛慧
浦世亮
Original Assignee
杭州海康威视数字技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司 filed Critical 杭州海康威视数字技术股份有限公司
Priority to EP20817655.2A priority Critical patent/EP3982291A4/en
Priority to US17/616,810 priority patent/US20220309444A1/en
Publication of WO2020244592A1 publication Critical patent/WO2020244592A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/20Point-of-sale [POS] network systems
    • G06Q20/202Interconnection or interaction of plural electronic cash registers [ECR] or to host computer, e.g. network details, transfer of information from host to ECR or from ECR to ECR
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V10/225Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on a marking or identifier characterising the area
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/245Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/10Coin-freed apparatus for hiring articles; Coin-freed facilities or services for means for safe-keeping of property, left temporarily, e.g. by fastening the property
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F9/00Details other than those peculiar to special kinds or types of apparatus
    • G07F9/009User recognition or proximity detection

Definitions

  • the embodiments of the present application relate to the field of image processing technology, and in particular to an article picking and placing detection system, method, and device.
  • Item pick-and-place cabinets have gradually come into wide use.
  • the item pick-and-place cabinet can be used to store items, and when there is a need for pick-and-place, the items can be taken out of the item pick-and-place cabinet or placed in the item pick-and-place cabinet. Therefore, how to implement item pick-and-place inspection is a key issue.
  • a deep learning network is used to recognize user action gestures or detect target items frame by frame to achieve target item tracking, and achieve the purpose of item picking and placement detection.
  • this type of method uses a deep learning network to analyze video data, which requires a large amount of computation, so the computing resources are often deployed on cloud servers.
  • the large volume of video data to be transmitted results in low detection efficiency, high transmission bandwidth requirements, and high deployment costs.
  • the embodiments of the present application provide an article pick-and-place detection system, method, and device, which can be used to solve problems in related technologies.
  • the technical solution is as follows:
  • an embodiment of the present application provides an item pick-and-place detection system, the system includes: an item pick-and-place cabinet, an image acquisition unit, and a pick-and-place detection unit;
  • the edge of the entrance and exit of the article pick-and-place cabinet is provided with a marker, and the image acquisition unit is used to collect images of the entrance and exit of the article pick-and-place cabinet;
  • the pick-and-place detection unit is connected to the image acquisition unit, and is configured to perform item pick-and-place detection based on the marker information in the image collected by the image acquisition unit.
  • the marker includes one or more of a line feature marker, a barcode marker, and a checkerboard marker.
  • the plurality of markers form a feature array, and the interval between each two markers is smaller than the width of the smallest item among the items taken and placed in the item pick-and-place cabinet.
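The spacing constraint above (every gap smaller than the smallest item's width, so that any item passing through must occlude at least one marker) can be sketched as a simple check. This is an illustrative sketch only; the function name, the 1-D marker layout, and the millimetre units are assumptions, not part of the publication:

```python
def validate_marker_spacing(marker_positions_mm, min_item_width_mm):
    """Check that the gap between every two adjacent markers is smaller
    than the width of the smallest item handled by the cabinet."""
    positions = sorted(marker_positions_mm)
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return all(gap < min_item_width_mm for gap in gaps)
```

For example, markers every 20 mm satisfy the rule for a smallest item 30 mm wide, while markers every 50 mm do not.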
  • the image acquisition unit includes a camera, and the monitoring area of the one camera covers the entire entrance and exit of the article pick-and-place cabinet.
  • the image acquisition unit includes a plurality of cameras, and the monitoring area of each camera covers a part of the entrance and exit of the article pick-and-place cabinet, and the monitoring areas of the multiple cameras cover the entire entrance and exit of the article pick-and-place cabinet.
  • the system further includes a light source for supplementing light with the marker.
  • one side of the edge of the marker is made of light-absorbing material, and the other side is made of diffusely reflecting material.
  • An article pick-and-place detection method is also provided, the method is applied to any of the above systems, and the method includes:
  • performing item pick-and-place detection based on the detection result includes:
  • if the detection result is that marker information is detected, matching the detected marker information with reference marker information, where the reference marker information includes the marker information when the marker at the edge of the entrance and exit of the article pick-and-place cabinet is not blocked;
  • the marker information is used to indicate the location and characteristics of the marker.
  • performing item pick-and-place detection based on the matching result includes:
  • performing item pick-and-place detection based on the detection result includes:
  • if the detection result is that no marker information is detected, it is determined that there is an item pick-and-place operation.
  • the method further includes:
  • the area where the marker is blocked is determined, and a pick-and-place signal is output according to changes in the area where the marker is blocked, and the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
  • An article pick-and-place detection device is also provided, the device is applied to any of the above systems, and the device includes:
  • the acquisition module is used to acquire the current image of the entrance and exit of the item pick-and-place cabinet;
  • the first detection module is configured to detect marker information in the current image;
  • the second detection module is used to perform item pick-and-place detection based on the detection result.
  • the second detection module is configured to, if the detection result is that marker information is detected, match the detected marker information with reference marker information, where the reference marker information includes the marker information when the markers on the edge of the entrance and exit of the item pick-and-place cabinet are not blocked, and to carry out item pick-and-place detection based on the matching result.
  • the marker information is used to indicate the location and characteristics of the marker
  • the second detection module is configured to determine whether the position and characteristics of the marker indicated by the detected marker information respectively match the position and characteristics of the marker indicated by the reference marker information; if they match, it is determined that there is no item pick-and-place operation, and otherwise that there is an item pick-and-place operation;
  • the second detection module is configured to, if the detection result is that the marker information is not detected, determine that there is an item pick-and-place operation.
  • the device further includes:
  • the output module is used to determine the area where the marker is blocked after the item pick-and-place operation is determined, and to output a pick-and-place signal according to changes in the area where the marker is blocked, where the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
  • a computer device is also provided. The computer device includes a processor and a memory, and at least one instruction is stored in the memory. When the at least one instruction is executed by the processor, the item pick-and-place detection method described above is implemented.
  • a computer-readable storage medium is also provided, and at least one instruction is stored in the computer-readable storage medium; when executed, the at least one instruction implements the item pick-and-place detection method described above.
  • By detecting the marker information in the image of the entrance and exit of the item pick-and-place cabinet and performing item pick-and-place detection based on the detection result, there is no need to detect or track target items, which reduces the requirements on the scene and on the target items; the approach can cope with frequently changing scenes and with picking and placing at any speed, and has high detection robustness.
  • this application uses marker information to perform pick-and-place detection, which improves the authenticity and accuracy of the pick-and-place detection operation.
  • FIG. 1 is a schematic structural diagram of an article pick-and-place detection system provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of markers provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an article pick-and-place detection system provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an article picking and placing detection system provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an article pick-and-place detection system provided by an embodiment of the present application.
  • FIG. 6 is a flowchart of an article picking and placing detection method provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an article picking and placing detection process provided by an embodiment of the present application.
  • FIG. 8 is a flowchart of an article picking and placing detection method provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an output process of a pick-and-place signal provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an article picking and placing detection process provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an article pick-and-place detection process provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of an article pick-and-place detection device provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an article pick-and-place detection device provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • the embodiment of the present application provides an item picking and placing detection system.
  • the system includes: an item picking and placing cabinet 11, an image acquisition unit 12, and a picking and placing detection unit 13;
  • the pick-and-place detection unit 13 is connected to the image acquisition unit 12, and is used to perform item pick-and-place detection based on the marker information in the image collected by the image acquisition unit 12.
  • the article pick-and-place cabinet 11 is used to store articles.
  • the embodiment of the present application does not limit the product form of the article pick-and-place cabinet 11, nor the types, sizes, and quantities of articles stored in it. When items are taken from the item pick-and-place cabinet 11, part of the area of its entrance and exit will be blocked. Therefore, in the embodiment of the present application, a marker 111 is provided on the edge of the entrance and exit of the item pick-and-place cabinet 11, and whether there is an item pick-and-place operation can be detected based on the occlusion of the marker 111.
  • the pick-and-place detection unit 13 may be connected to the image acquisition unit 12 for remote communication via a network. That is to say, the pick-and-place detection unit 13 may not be arranged in the same area as the image acquisition unit 12, the article pick-and-place cabinet 11, etc., to realize remote pick-and-place detection.
  • the pick-and-place detection unit 13 can serve multiple item pick-and-place detection systems at the same time; that is, one pick-and-place detection unit 13 can communicate with the image acquisition units in multiple item pick-and-place detection systems at the same time, so as to perform pick-and-place detection for each system based on the images collected by its image acquisition unit.
  • the pick-and-place detection unit 13 can be implemented by a computer device, for example, a server is used as the pick-and-place detection unit 13, and the server can be deployed in the cloud.
  • the marker 111 includes, but is not limited to, one or more of a line feature marker, a barcode marker, and a checkerboard marker.
  • multiple line features are encoded to form markers of the line feature.
  • the marker of the line feature is a vertical gradient encoding type, and the marker of the vertical gradient encoding type is gradient-encoded in a direction perpendicular to the pick-and-place boundary (ie, the edge of the entrance and exit).
  • the gradient exists in the direction perpendicular to the boundary, and the marker interval of this encoding method can be infinitesimally small.
  • the line feature marker may be a black and white stripe marker.
  • Barcodes and checkerboards can be coded as two-dimensional coding type markers, and the two-dimensional coding type markers are coded in both the vertical and horizontal directions of the pick-and-place boundary.
  • Common two-dimensional coding types include barcodes in the form of a two-dimensional code and checkerboards, such as the barcode in the form of a two-dimensional code shown in Figure 2(b) and the checkerboard shown in Figure 2(c).
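A gradient-encoded line-feature marker of the kind described above can be read by scanning for intensity jumps along the pick-and-place boundary. The following is a minimal illustrative sketch, not the publication's algorithm; a 1-D grayscale profile and the minimum edge gradient of 10 mentioned elsewhere in the description are assumed:

```python
def find_marker_edges(profile, min_gradient=10):
    """Locate candidate marker edges in a 1-D grayscale profile taken
    along the pick-and-place boundary: an edge is any position where the
    intensity jump between neighbouring pixels meets the minimum gradient."""
    edges = []
    for i in range(1, len(profile)):
        diff = profile[i] - profile[i - 1]
        if abs(diff) >= min_gradient:
            edges.append((i, 1 if diff > 0 else -1))  # (position, gradient sign)
    return edges
```

A black-and-white stripe then appears as a pair of opposite-signed edges, whose positions and signs can serve as the marker's feature.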
  • the multiple markers 111 form a feature array.
  • the interval between every two markers 111 is smaller than the width of the smallest item among the items picked and placed from the item pick-and-place cabinet 11.
  • the markers 111 can be arranged continuously around the full perimeter of the edge of the entrance and exit of the article pick-and-place cabinet 11, with the interval between every two markers 111 smaller than the width of the smallest item among the articles picked and placed in the article pick-and-place cabinet 11, thereby avoiding missed detection and further improving the accuracy of pick-and-place detection.
  • the edge gradient of the marker 111 is guaranteed to be above 10, that is, the difference between the pixel values (or pixel grayscale values) on the two sides of the marker 111's edge is greater than 10, to ensure accurate feature extraction of the marker 111.
  • one side of the edge of the marker 111 is made of a light-absorbing material, and the other side is made of a diffuse reflection material.
  • for the materials on the two sides of the edge of the marker 111, one side is often made of a material with strong light absorption, such as light-absorbing photographic cloth, printing ink, or rubber, and the other side of a material with strong diffuse reflection, such as printing paper or PET (polyethylene terephthalate) diffuse-reflective material.
  • the embodiment of the present application does not limit the material of the marker 111, as long as the feature can be extracted.
  • the marker 111 is black and white; a paper marker 111 printed in black and white can be affixed to the edge of the entrance and exit of the item pick-and-place cabinet 11, for example on the ring around the inner cavity of the cabinet reserved for affixing the marker 111.
  • the graphite in the black part of the marker 111 has good light absorption, while the white part of the printing paper has good diffuse reflection, ensuring that the grayscale difference between black and white in the grayscale image of the marker 111 is more than 10, for example 100.
  • the image acquisition unit 12 is used to acquire images of the entrance and exit of the article taking and placing cabinet 11, and the image acquisition unit 12 may include a camera, and the monitoring area of the camera covers the entire entrance and exit of the article taking and placing cabinet 11. Therefore, the entire entrance and exit of the article pick-and-place cabinet 11 can be photographed by one camera, so as to avoid inaccurate pick-and-place detection due to missing detection of a certain marker 111.
  • a circle of markers 111 are continuously arranged, and the detection camera can detect the characteristics of the marker 111 while monitoring the entrance.
  • the camera's angle of view can cover the entire entrance and exit, so as to ensure that any position of the pick-and-place operation can be displayed by the collected image, thereby avoiding missed detection.
  • the image acquisition unit 12 includes multiple cameras; the monitoring area of each camera covers a part of the entrance and exit of the item access cabinet 11, and the monitoring areas of the multiple cameras together cover the entire entrance and exit of the item access cabinet 11.
  • the number of cameras is determined according to the size of the entrance and exit of the item pick-and-place cabinet 11 and the viewing angle range of the camera to ensure that the sum of the monitoring areas of the cameras used for detection can cover the entire entrance and exit of the item pick and place cabinet 11.
  • each camera transmits the current collected image to the pick-and-place detection unit 13.
  • the images collected by each camera need to be synchronized, so that the current image acquired by the pick-and-place detection unit 13 is an image at the same moment; the current image can then reflect the situation at the entrance and exit of the item pick-and-place cabinet 11 at that moment, improving the accuracy of the detection results.
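The synchronization requirement above can be sketched as a timestamp check across per-camera frame buffers. This is only an illustration; the buffer layout (lists of timestamp/frame pairs) and the 20 ms tolerance are assumptions, not values from the publication:

```python
def pick_synchronized_frames(camera_buffers, tolerance_s=0.02):
    """Given per-camera lists of (timestamp, frame) pairs, pick the most
    recent frame from each camera and accept the set only if all
    timestamps fall within the given tolerance of each other."""
    latest = [buf[-1] for buf in camera_buffers]  # newest frame per camera
    stamps = [ts for ts, _ in latest]
    if max(stamps) - min(stamps) <= tolerance_s:
        return [frame for _, frame in latest]
    return None  # frames not close enough in time; wait for the next capture
```

When the set is accepted, the frames can be stitched or evaluated jointly as the "current image" of the whole entrance.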
  • the embodiment of the present application only takes the case where the image acquisition unit 12 is connected to the article access cabinet 11 as an example for description.
  • the image acquisition unit 12 can be set within a certain range of the entrance and exit of the article access cabinet 11, so as to ensure that images of the entrance and exit can be collected.
  • the image acquisition unit 12 can also be arranged separately from the item pick-and-place cabinet 11.
  • the image acquisition unit 12 is arranged opposite the article pick-and-place cabinet 11, directly facing the entrance and exit of the article pick-and-place cabinet 11 so that it can acquire images of the entrance and exit.
  • the embodiment of the present application does not limit the specific number and location of the image acquisition unit 12.
  • the image acquisition unit 12 includes a camera, and markers can be set on the edge of the entrance and exit of the article pick-and-place cabinet 11.
  • a camera can be set at the upper right corner of the entrance to detect the entrance and collect images of the entire entrance.
  • alternatively, cameras can be set at both the upper right corner and the upper left corner of the entrance, so that together they detect the entire entrance and collect images of the entire entrance.
  • the system also includes a light source 14 for supplementing light to the marker 111, as shown in FIG. 5.
  • the marker 111 is supplemented by the light source 14 so as to ensure that the gray scale of the characteristic image of the marker 111 does not change with changes in the light conditions of the external environment, thereby ensuring the accuracy of pick and place detection.
  • the embodiment of the present application does not limit it, as long as the marker 111 can be supplemented with light.
  • the light source 14 may be arranged directly opposite to the item taking and placing cabinet 11 to face the edge of the entrance and exit of the item taking and placing cabinet 11.
  • the number of the light sources 14 may be one or more, and the embodiment of the present application also does not limit the number of the light sources 14 and does not limit the type of the light sources 14.
  • the system may also include a control device for controlling the light source 14 to be turned on and off.
  • the control device can control the turning on and off of the light source 14 based on the brightness of the environment where the item picking and placing cabinet 11 is located.
  • an embodiment of the present application provides an item pick-and-place detection method, which is applied to the pick-and-place detection unit 13 in the foregoing item pick-and-place detection system. As shown in Figure 6, the method includes the following steps.
  • step 601 the current image of the entrance and exit of the item pick-and-place cabinet is acquired.
  • the current image of the entrance and exit of the article pick-and-place cabinet 11 is collected by the image acquisition unit 12.
  • the image acquisition unit 12 transmits the collected current image to the pick-and-place detection unit 13.
  • each camera transmits the current collected image to the pick-and-place detection unit 13.
  • the images collected by each camera need to be synchronized, so that the current image acquired by the pick-and-place detection unit 13 is an image at the same moment; the current image can then reflect the situation at the entrance and exit of the article pick-and-place cabinet 11 at that moment, improving the accuracy of the detection results.
  • step 602 the marker information is detected in the current image.
  • After the pick-and-place detection unit 13 acquires the current image of the entrance and exit of the article pick-and-place cabinet 11, it detects marker information in the current image.
  • the marker information is used to indicate the location and characteristics of the marker 111, and the characteristics may be features of the location where the marker 111 is located.
  • the marker information when the marker 111 on the edge of the entrance and exit of the article picking and placing cabinet 11 is not blocked may be acquired first.
  • the item pick-and-place detection system needs to be initialized once, using the reference image provided by the image acquisition unit 12 when no pick-and-place operation is performed, to detect the position and corresponding feature of the marker 111 therein.
  • the marker feature is obtained by extracting the gradient position and direction in the reference image.
  • the two-dimensional code-encoded marker is recognized by the two-dimensional code recognition algorithm and the location and internal code of the two-dimensional code are obtained.
  • the checkerboard corner points are obtained by the checkerboard corner extraction algorithm, and the positions and characteristics of all the markers in the reference image will be obtained.
  • the above-mentioned method of detecting the marker information in the reference image can be used; details are not repeated here.
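The initialization described above amounts to building a lookup table of marker positions and features from the unoccluded reference image. A minimal sketch under assumptions: the detection tuples and identifiers below are hypothetical, and a real system would obtain them from, e.g., a two-dimensional-code recognizer or a checkerboard-corner extractor:

```python
def build_reference(detections):
    """Initialisation step: store, for each marker detected in the
    unoccluded reference image, its position and feature (e.g. a QR
    code's internal code or a checkerboard-corner descriptor)."""
    return {marker_id: {"position": pos, "feature": feat}
            for marker_id, pos, feat in detections}
```

The resulting table is the "reference marker information" that later frames are matched against.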
  • step 603 item pick-and-place detection is performed based on the detection result.
  • performing item pick-and-place detection based on the detection result includes: if the detection result is that marker information is detected, matching the detected marker information with the reference marker information, where the reference marker information includes the marker information when the marker at the edge of the entrance and exit of the item pick-and-place cabinet is not blocked; and performing item pick-and-place detection based on the matching result.
  • the reference marker information includes the marker information when the marker at the edge of the entrance and exit of the item pick-and-place cabinet is not blocked. Therefore, after the marker information is detected, the detected marker information can be matched with the reference marker information, so as to determine according to the matching result whether the marker is occluded, and thereby whether a pick-and-place operation has occurred.
  • the item pick-and-place detection is performed based on the matching result, including but not limited to the following two situations:
  • In the first case, if the position and characteristics of the marker indicated by the detected marker information respectively match the position and characteristics of the marker indicated by the reference marker information, it is determined that there is no item pick-and-place operation.
  • In the second case, if the position and characteristics of the marker indicated by the detected marker information do not match the position and characteristics of the marker indicated by the reference marker information, it is determined that there is an item pick-and-place operation.
  • the matching degree threshold may be determined based on the scene, or may be set based on experience, and may also be adjusted appropriately during the application process, which is not limited in the embodiment of the present application.
  • performing an item pick-and-place detection based on the detection result includes: if the detection result is that no marker information is detected, determining that there is an item pick-and-place operation.
  • the method provided in the embodiment of the present application further includes: determining the area where the marker is blocked, and outputting a pick-and-place signal according to the change in the area where the marker is blocked; the pick-and-place signal is used to indicate the status of the item pick-and-place operation.
  • Since the image acquisition unit 12 collects a current image of the entrance and exit of the article pick-and-place cabinet 11 at each moment, each current image can be used to determine whether the marker 111 is blocked and the area in which the marker 111 is blocked.
  • the pick-and-place signal is output according to the change of the blocked area of the marker 111, and the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
  • the pick-and-place signal is output by comparing changes in the number of occluded areas between consecutive frames in the time domain.
  • the pick and place signal is used to indicate the status of the item pick and place operation.
  • the pick-and-place signal includes an entry signal and an exit signal.
  • for example, when the number of occluded areas increases, an entry signal can be triggered, which indicates entering the item pick-and-place cabinet to perform a pick-and-place operation; when the number of occluded areas decreases, a leave signal can be triggered, which indicates leaving the item pick-and-place cabinet, although the pick-and-place operation has not yet ended.
  • the pick-and-place signal may also include start and end state signals of the entire pick-and-place process obtained by analyzing the number of occluded areas. For example, when the number of occluded areas changes from 0 to 1, the start signal is triggered, which is used to indicate the start of entering the item pick-and-place cabinet to perform pick-and-place operations; when the number of occluded areas changes from 1 to 0, the end signal is triggered, which is used to indicate completely leaving the item pick-and-place cabinet and completing the pick-and-place operation.
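The signal rules above can be sketched as a small mapping from changes in the occluded-region count to signal names. This is an illustrative simplification (signal names are assumptions, and a 0-to-k increase is mapped only to "start" here, while the description allows start and entry to be distinguished in other ways):

```python
# Sketch: derive pick-and-place signals from the change in the number of
# occluded regions between consecutive frames, following the rules in the
# text (increase from zero -> start, other increase -> enter, decrease to
# zero -> end, other decrease -> leave). Signal names are illustrative.

def pick_place_signals(prev_count, curr_count):
    """Map a change in occluded-region count to a list of signal names."""
    signals = []
    if curr_count > prev_count:
        signals.append("start" if prev_count == 0 else "enter")
    elif curr_count < prev_count:
        signals.append("end" if curr_count == 0 else "leave")
    return signals


# Occluded-region counts over time: idle, one hand enters, a second hand
# enters, one hand leaves, the last hand leaves.
counts = [0, 1, 2, 1, 0]
events = []
for prev, curr in zip(counts, counts[1:]):
    events.extend(pick_place_signals(prev, curr))
print(events)  # ['start', 'enter', 'leave', 'end']
```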
  • for example, two-dimensional code markers are black and white, and a paper two-dimensional code printed in black and white is affixed to the edge of the entrance and exit of the item pick-and-place cabinet.
  • the marker can be supplemented with light by a light source to reduce illumination changes, thereby reducing the impact on the feature extraction of the two-dimensional code.
  • the graphite in the black part of the two-dimensional code has good light absorption performance, and the white part of the printing paper has good diffuse reflection performance, ensuring that the grayscale difference between black and white in the grayscale image of the two-dimensional code is more than 100.
  • the method provided in the embodiment of the present application first uses the image acquisition unit 12 to collect a reference image at a time when no pick-and-place operation is performed, and the pick-and-place detection unit 13 then identifies all two-dimensional codes in the image.
  • from the continuous two-dimensional code sequence located at the edge of the entrance and exit of the item pick-and-place cabinet 11, the positions of all two-dimensional codes and their internal coding vectors are obtained as the marker features used during pick-and-place detection, yielding the reference marker information for subsequent detection.
  • then, the image acquisition unit 12 collects the current image of the entrance and exit of the item pick-and-place cabinet 11 in real time.
  • the current image collected by the image acquisition unit 12 is checked for occlusion.
  • since the pick-and-place operation blocks the two-dimensional codes located at the edge of the entrance and exit, the blocked two-dimensional codes are shown as the shaded area in Figure 7(b).
  • specifically, the two-dimensional codes are detected in the current image according to the positions of the two-dimensional codes given by the reference marker information, and the internal coding vector of each two-dimensional code is extracted.
  • if a two-dimensional code cannot be detected at an expected position, or the extracted coding vector does not match the reference marker information, the two-dimensional code at that position is blocked, and it is determined that there is a pick-and-place operation.
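One simple way to decide whether the code at a known reference position is occluded follows from the contrast property described above (the gray difference between the black ink and white paper is designed to exceed 100): a visible printed code shows strong contrast inside its region, while a hand or item covering it usually does not. The sketch below illustrates this idea with a toy grayscale image; region sizes and values are assumptions, and a real system would instead run a two-dimensional code recognition algorithm at each reference position.

```python
# Sketch: flag occluded two-dimensional codes by checking the black/white
# contrast inside each reference region of a 2D grayscale image (plain
# nested lists). Threshold follows the >100 gray-difference design.

CONTRAST_THRESHOLD = 100  # from the black-ink / white-paper marker design


def region(img, x, y, w, h):
    """Collect the gray values of a w-by-h region with top-left (x, y)."""
    return [img[r][c] for r in range(y, y + h) for c in range(x, x + w)]


def code_visible(img, x, y, w, h, threshold=CONTRAST_THRESHOLD):
    """A printed code shows strong contrast; a uniform occluder does not."""
    pixels = region(img, x, y, w, h)
    return max(pixels) - min(pixels) >= threshold


def detect_occluded_codes(img, ref_positions):
    """Return reference regions whose code cannot be detected (occluded)."""
    return [p for p in ref_positions if not code_visible(img, *p)]


# Toy 4x8 image: left code intact (black 20 / white 230), right code
# covered by a uniform gray occluder (value 120).
img = [
    [20, 230, 20, 230, 120, 120, 120, 120],
    [230, 20, 230, 20, 120, 120, 120, 120],
    [20, 230, 20, 230, 120, 120, 120, 120],
    [230, 20, 230, 20, 120, 120, 120, 120],
]
refs = [(0, 0, 4, 4), (4, 0, 4, 4)]
print(detect_occluded_codes(img, refs))  # [(4, 0, 4, 4)] -> operation detected
```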
  • the process of determining different pick-and-place signals can be seen in FIG. 9.
  • when a pick-and-place operation is detected, the occluded areas are analyzed to obtain their positions and number.
  • the dotted line indicates the pick and place operation.
  • for example, there are two occluded areas in Figure 7; using the occluded-area information, the pick-and-place signal is output by comparing the changes in the number of occluded areas between consecutive frames in the time domain.
  • that is, the pick-and-place signal is output according to the change in the number of occluded areas obtained by comparing the marker information in the current images collected at different times. For example, when the number of occluded regions in the time domain increases from zero, a start signal is output; when it increases but not from zero, an entry signal is output; when it decreases to zero, an end signal is output; when it decreases but not to zero, a leave signal is output.
  • for the content of the pick-and-place signal, reference may also be made to the description following step 603 above, which is not repeated here.
  • the item pick-and-place detection method based on this marker is similar to the processes shown in Figures 7 and 8.
  • the difference lies in the coding type of the marker.
  • the marker shown in Figure 10(a) is a continuous strip printed with horizontal black and white stripes (i.e., with vertical gradients).
  • a black-and-white printed paper marker strip is affixed to the edge of the entrance and exit of the item pick-and-place cabinet.
  • the camera's angle of view is adjusted so that the marker strip in the current image of the entrance and exit captured by the camera is as parallel as possible to the horizontal axis of the image. Since the marker is continuous, each column of the marker in the camera image is used as a feature description unit. Taking the line-encoded marker strip shown in Figure 10(a) as an example, each column of the strip contains two vertical gradients: one downward gradient, in which the gray scale increases from top to bottom, and one upward gradient, in which the gray scale decreases from top to bottom.
  • the estimated position of each gradient edge can be specified manually by drawing a line on the reference image when no pick-and-place operation is performed.
  • the method provided by the embodiment of the present application first uses the image acquisition unit to collect a reference image at a time when no pick-and-place operation is performed; the pick-and-place detection unit then searches a neighborhood of each estimated position in the vertical direction and takes the pixel position with the largest gradient in that neighborhood as the accurate gradient position, thereby obtaining all gradient positions and corresponding gradient directions in each column of the marker in the reference image as the reference marker information.
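The neighborhood search described above can be sketched as follows: for a manually estimated gradient row in one column of the stripe marker, the row with the strongest vertical gray difference within a small radius is taken as the accurate gradient position, together with its direction. Pure-Python illustration; the search radius and the sample values are assumptions.

```python
# Sketch: refine a manually estimated gradient position by searching a
# vertical neighborhood for the pixel with the largest gradient magnitude,
# and record its direction (+1: gray increases downward, i.e. a downward
# gradient; -1: gray decreases downward, i.e. an upward gradient).

def refine_gradient(column, est_row, radius=2):
    """Return (row, direction) of the strongest gradient near est_row.

    `column` is one image column as a list of gray values, top to bottom.
    """
    lo = max(0, est_row - radius)
    hi = min(len(column) - 2, est_row + radius)
    best_row = max(range(lo, hi + 1),
                   key=lambda r: abs(column[r + 1] - column[r]))
    diff = column[best_row + 1] - column[best_row]
    return best_row, (1 if diff > 0 else -1)


# One column of a stripe marker: a white->black edge near row 3 and a
# black->white edge near row 7 (values include mild noise).
col = [230, 228, 231, 30, 28, 25, 27, 225, 229, 230]
print(refine_gradient(col, est_row=2))  # (2, -1): gray drops after row 2
print(refine_gradient(col, est_row=6))  # (6, 1): gray rises after row 6
```

Running this over every estimated edge in every column yields the gradient positions and directions stored as reference marker information.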
  • then, the image acquisition unit collects the current image of the entrance and exit of the item pick-and-place cabinet in real time.
  • gradients are extracted from the current image according to the gradient positions in the reference marker information. If a gradient cannot be extracted at some position, or the direction of the extracted gradient does not match the features in the reference marker information, a pick-and-place operation exists in that area and the marker is blocked.
  • the blocked marker is shown as the shaded area in Figure 10(b).
  • the dotted line is the occlusion area where the pick and place operation exists.
  • the occlusion-region information is used to compare the change in the number of occlusion regions between consecutive frames in the time domain and output the pick-and-place signal. That is, the pick-and-place signal is output according to the change in the number of occluded areas obtained by comparing the marker information in the current images collected at different times.
  • for the content of the pick-and-place signal, reference may be made to FIG. 9 above and the description following step 603, which is not repeated here.
  • checkerboard markers are black and white, and paper checkerboard markers printed in black and white are affixed to the edge of the entrance and exit of the item pick-and-place cabinet.
  • the marker can be supplemented with light by a light source to reduce illumination changes, thereby reducing the impact on the extraction of checkerboard features.
  • the graphite in the black part of the checkerboard has good light absorption performance, while the white part of the printing paper has good diffuse reflection performance, ensuring that the grayscale difference between black and white in the checkerboard is more than 100 in the grayscale image.
  • the method provided by the embodiment of the present application first uses an image acquisition unit to collect a reference image at a time when no pick-and-place operation is performed, and the pick-and-place detection unit then identifies all checkerboard corners in the image. As shown in Figure 11(a), the continuous checkerboard corner sequence at the edge of the entrance and exit of the item pick-and-place cabinet is obtained, and all checkerboard corner positions are obtained as the marker features used during item pick-and-place detection, yielding the reference marker information for subsequent detection.
  • then, the image acquisition unit collects the current image of the entrance and exit of the item pick-and-place cabinet in real time.
  • since the pick-and-place operation blocks the checkerboard corners at the edge of the entrance and exit, the blocked checkerboard corners are shown as the shaded area in Figure 11(b).
  • specifically, checkerboard corners are extracted from the current image based on the corner positions given by the reference marker information. If a checkerboard corner can be extracted in the current image and matches the corresponding reference corner position, the corner at that position is not blocked. If a checkerboard corner cannot be extracted at a reference position, the corner at that position is blocked, and it is determined that there is a pick-and-place operation.
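A checkerboard corner can be verified with a simple quadrant test: around a true corner, diagonally opposite quadrants have similar gray values while adjacent quadrants contrast strongly, and an occluder destroys this pattern. The sketch below is a heuristic illustration with assumed window sizes and thresholds; a production system would typically rely on a library checkerboard-corner detector instead.

```python
# Sketch: verify a checkerboard corner at a reference position (x, y) in a
# 2D grayscale image by checking the 2x2 quadrant pattern around it.
# Window size `half` and `contrast` threshold are illustrative.

def corner_present(img, x, y, half=2, contrast=100):
    """True if the quadrant pattern of a checkerboard corner is visible."""
    def mean(x0, y0):
        vals = [img[r][c]
                for r in range(y0, y0 + half)
                for c in range(x0, x0 + half)]
        return sum(vals) / len(vals)

    tl = mean(x - half, y - half)   # top-left quadrant
    tr = mean(x, y - half)          # top-right quadrant
    bl = mean(x - half, y)          # bottom-left quadrant
    br = mean(x, y)                 # bottom-right quadrant
    diag_similar = abs(tl - br) < contrast / 2 and abs(tr - bl) < contrast / 2
    adj_contrast = abs(tl - tr) >= contrast and abs(tl - bl) >= contrast
    return diag_similar and adj_contrast


# 4x4 toy neighborhood of a corner at (2, 2): black/white quadrants.
corner_img = [
    [20, 20, 230, 230],
    [20, 20, 230, 230],
    [230, 230, 20, 20],
    [230, 230, 20, 20],
]
occluded_img = [[120] * 4 for _ in range(4)]  # covered by a uniform occluder
print(corner_present(corner_img, 2, 2))    # True: corner extracted
print(corner_present(occluded_img, 2, 2))  # False: pick-and-place operation
```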
  • similarly, the pick-and-place signal is output by comparing the changes in the number of occlusion areas between consecutive frames in the time domain. That is, the pick-and-place signal is output according to the change in the number of occluded areas obtained by comparing the marker information in the current images collected at different times.
  • for the content of the pick-and-place signal, reference may be made to the description above, which is not repeated here.
  • the embodiment of the present application provides an article pick-and-place detection device, which is applied to the above-mentioned article pick-and-place detection system.
  • the article pick-and-place detection device shown in Fig. 12 can execute the above item pick-and-place detection method.
  • the device includes:
  • the acquiring module 1201 is used to acquire the current image of the entrance and exit of the item picking and placing cabinet;
  • the first detection module 1202 is used to detect marker information in the current image;
  • the second detection module 1203 is configured to perform an item pick-and-place detection based on the detection result.
  • the second detection module 1203 is configured to, if the detection result is that the marker information is detected, match the detected marker information with the reference marker information, and the reference marker information includes the edge of the entrance and exit of the item pick-and-place cabinet. The information of the marker when the marker is not obscured; based on the matching result, the item pick-and-place detection is performed.
  • the marker information is used to indicate the location and characteristics of the marker
  • the second detection module 1203 is used for determining that there is no item pick-and-place operation if the location and feature of the marker indicated by the detected marker information respectively match the location and feature of the marker indicated by the reference marker information ;
  • the second detection module 1203 is configured to, if the detection result is that the marker information is not detected, determine that there is an item picking and placing operation.
  • the device further includes:
  • the output module 1204 is used to determine the area where the marker is blocked after determining that there is an item pick-and-place operation, and output a pick-and-place signal according to the change of the area where the marker is blocked.
  • the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
  • the current image acquired by the above-mentioned acquisition module 1201 can be implemented by the image acquisition unit 12 shown in FIG. 1, FIG. 3, and FIG. 5.
  • the functions of the acquisition module 1201, the first detection module 1202, and the second detection module 1203 can be implemented by the image acquisition unit 12 shown in FIG. 3 and the pick-and-place detection unit 13 shown in FIG. 3 and FIG. 5.
  • a computer device is also provided.
  • the computer device includes a processor 141 and a memory 142, and the memory 142 stores at least one instruction.
  • the at least one instruction is configured to be executed by one or more processors 141, so as to implement any one of the aforementioned item picking and placing detection methods.
  • Fig. 15 is a schematic structural diagram of a computer device provided by an embodiment of the present invention.
  • the device can be a terminal, for example: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • the terminal may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal includes a processor 1501 and a memory 1502.
  • the processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1501 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1501 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the wake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1501 may be integrated with a GPU (Graphics Processing Unit, image processor), and the GPU is responsible for rendering and drawing content that needs to be displayed on the display screen.
  • the processor 1501 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1502 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1502 is used to store at least one instruction, and the at least one instruction is executed by the processor 1501 to implement the item pick-and-place detection method provided in the method embodiments of the present application.
  • the terminal may optionally further include: a peripheral device interface 1503 and at least one peripheral device.
  • the processor 1501, the memory 1502, and the peripheral device interface 1503 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1503 through a bus, a signal line or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1504, a touch display screen 1505, a camera 1506, an audio circuit 1507, a positioning component 1508, and a power supply 1509.
  • the peripheral device interface 1503 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1501 and the memory 1502.
  • in some embodiments, the processor 1501, the memory 1502, and the peripheral device interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral device interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1504 is used to receive and transmit RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1504 communicates with the communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1504 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on.
  • the radio frequency circuit 1504 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: metropolitan area network, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or WiFi (Wireless Fidelity, wireless fidelity) network.
  • the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
  • the display screen 1505 is used to display a UI (User Interface, user interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1505 also has the ability to collect touch signals on or above the surface of the display screen 1505.
  • the touch signal may be input to the processor 1501 as a control signal for processing.
  • the display screen 1505 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments, there may be one display screen 1505, which is provided on the front panel of the terminal; in other embodiments, there may be at least two display screens 1505, which are respectively provided on different surfaces of the terminal or adopt a folding design;
  • the display screen 1505 may be a flexible display screen, which is arranged on a curved surface or a folding surface of the terminal.
  • the display screen 1505 can also be set as a non-rectangular irregular pattern, that is, a special-shaped screen.
  • the display screen 1505 may be made of materials such as LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode, organic light emitting diode).
  • the camera assembly 1506 is used to collect images or videos.
  • the camera assembly 1506 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1506 may also include a flash.
  • the flash can be a single-color flash or a dual-color flash. Dual color temperature flash refers to a combination of warm light flash and cold light flash, which can be used for light compensation under different color temperatures.
  • the audio circuit 1507 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1501 for processing, or input to the radio frequency circuit 1504 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be multiple microphones, which are respectively set in different parts of the terminal.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert the electrical signal from the processor 1501 or the radio frequency circuit 1504 into sound waves.
  • the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can not only convert the electrical signal into sound waves audible to humans, but also convert the electrical signal into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1507 may also include a headphone jack.
  • the positioning component 1508 is used to locate the current geographic location of the terminal to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1509 is used to supply power to various components in the terminal.
  • the power source 1509 may be alternating current, direct current, disposable batteries or rechargeable batteries.
  • the rechargeable battery may support wired charging or wireless charging.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal further includes one or more sensors 1510.
  • the one or more sensors 1510 include, but are not limited to: an acceleration sensor 1511, a gyroscope sensor 1512, a pressure sensor 1513, a fingerprint sensor 1514, an optical sensor 1515, and a proximity sensor 1516.
  • the acceleration sensor 1511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal. For example, the acceleration sensor 1511 can be used to detect the components of the gravitational acceleration on three coordinate axes.
  • the processor 1501 may control the touch screen 1505 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1511.
  • the acceleration sensor 1511 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1512 can detect the body direction and rotation angle of the terminal, and the gyroscope sensor 1512 can cooperate with the acceleration sensor 1511 to collect the user's 3D actions on the terminal. Based on the data collected by the gyroscope sensor 1512, the processor 1501 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1513 may be arranged on the side frame of the terminal and/or the lower layer of the touch screen 1505.
  • the processor 1501 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 1513.
  • the processor 1501 operates according to the user's pressure on the touch display screen 1505 to control the operability controls on the UI interface.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1514 is used to collect the user's fingerprint.
  • the processor 1501 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user's identity according to the collected fingerprint.
  • when the user's identity is recognized as a trusted identity, the processor 1501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1514 may be provided on the front, back or side of the terminal. When a physical button or a manufacturer logo is provided on the terminal, the fingerprint sensor 1514 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1515 is used to collect the ambient light intensity.
  • the processor 1501 may control the display brightness of the touch screen 1505 according to the intensity of the ambient light collected by the optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is decreased.
  • the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 according to the ambient light intensity collected by the optical sensor 1515.
  • the proximity sensor 1516, also called a distance sensor, is usually set on the front panel of the terminal.
  • the proximity sensor 1516 is used to collect the distance between the user and the front of the terminal.
  • when the proximity sensor 1516 detects that the distance between the user and the front of the terminal gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the screen-on state to the screen-off state; when the proximity sensor 1516 detects that the distance between the user and the front of the terminal gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the screen-off state to the screen-on state.
  • FIG. 15 does not constitute a limitation on the terminal, and may include more or fewer components than shown in the figure, or combine some components, or adopt different component arrangements.
  • in an exemplary embodiment, a computer-readable storage medium is also provided, and at least one instruction is stored in the storage medium. When the at least one instruction is executed by a processor of a computer device, any of the foregoing item pick-and-place detection methods is implemented.
  • the aforementioned computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.


Abstract

An item pick-and-place detection system, method, device, equipment and storage medium. The item pick-and-place detection system includes: an item pick-and-place cabinet (11), an image acquisition unit (12) and a pick-and-place detection unit (13). The edge of the entrance and exit of the item pick-and-place cabinet has markers (111); the image acquisition unit (12) is used to collect images of the entrance and exit of the item pick-and-place cabinet (11); the pick-and-place detection unit (13) is connected to the image acquisition unit (12) and is used to perform item pick-and-place detection based on marker information in the images collected by the image acquisition unit (12). By arranging markers (111) at the edge of the entrance and exit of the item pick-and-place cabinet (11), detecting marker information in images of the entrance and exit of the item pick-and-place cabinet (11), and performing item pick-and-place detection based on the detection result, no detection or tracking of the target item is required. This lowers the requirements on the scene and the target item, copes with frequently changing scenes and pick-and-place at arbitrary speed, and achieves high detection robustness. In addition, performing pick-and-place detection through marker information improves the authenticity and accuracy of detecting pick-and-place operations.

Description

Item pick-and-place detection system, method and device
This application claims priority to Chinese patent application No. 201910492987.6, entitled "Item pick-and-place detection system, method and device", filed on June 6, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of image processing technology, and in particular to an item pick-and-place detection system, method and device.
Background
With the development of artificial intelligence, item pick-and-place cabinets have gradually come into wide use. For example, an item pick-and-place cabinet can be used to store items; when there is a pick-and-place demand, items are taken out of or placed into the cabinet. Therefore, how to implement item pick-and-place detection is a key problem.
In the related art, a deep learning network is used to recognize user action postures, or the target item is detected frame by frame to achieve target item tracking, so as to accomplish item pick-and-place detection.
However, such methods use deep learning networks for video data analysis, which is computationally expensive; moreover, the computing resources are often deployed on cloud servers and the amount of video data to be transmitted is large, resulting in low detection efficiency, high transmission bandwidth requirements, and high deployment costs.
Summary
The embodiments of the present application provide an item pick-and-place detection system, method and device, which can be used to solve the problems in the related art. The technical solutions are as follows:
In one aspect, an embodiment of the present application provides an item pick-and-place detection system, which includes: an item pick-and-place cabinet, an image acquisition unit and a pick-and-place detection unit;
the edge of the entrance and exit of the item pick-and-place cabinet has markers, and the image acquisition unit is used to collect images of the entrance and exit of the item pick-and-place cabinet;
the pick-and-place detection unit is connected to the image acquisition unit and is used to perform item pick-and-place detection based on marker information in the images collected by the image acquisition unit.
Optionally, the markers include one or more of line-feature markers, barcode markers and checkerboard markers.
Optionally, there are multiple markers, the multiple markers form a feature array, and the interval between every two markers is smaller than the width of the smallest item among the items picked from and placed into the item pick-and-place cabinet.
Optionally, the image acquisition unit includes one camera, and the monitoring area of the one camera covers the entire entrance and exit of the item pick-and-place cabinet.
Optionally, the image acquisition unit includes multiple cameras, the monitoring area of each camera covers a part of the entrance and exit of the item pick-and-place cabinet, and the monitoring areas of the multiple cameras cover the entire entrance and exit of the item pick-and-place cabinet.
Optionally, the system further includes a light source for supplementing light for the markers.
Optionally, one side of the edge of the marker is made of a light-absorbing material, and the other side is made of a diffusely reflective material.
An item pick-and-place detection method is also provided, which is applied to any of the systems described above and includes:
acquiring a current image of the entrance and exit of an item pick-and-place cabinet;
detecting marker information in the current image;
performing item pick-and-place detection based on the detection result.
Optionally, performing item pick-and-place detection based on the detection result includes:
if the detection result is that marker information is detected, matching the detected marker information with reference marker information, where the reference marker information includes the marker information when the markers at the edge of the entrance and exit of the item pick-and-place cabinet are not occluded;
performing item pick-and-place detection based on the matching result.
Optionally, the marker information is used to indicate the position and features of the marker;
performing item pick-and-place detection based on the matching result includes:
if the position and features of the marker indicated by the detected marker information respectively match the position and features of the marker indicated by the reference marker information, determining that there is no item pick-and-place operation;
if the position and features of the marker indicated by the detected marker information respectively do not match the position and features of the marker indicated by the reference marker information, determining that there is an item pick-and-place operation.
Optionally, performing item pick-and-place detection based on the detection result includes:
if the detection result is that no marker information is detected, determining that there is an item pick-and-place operation.
Optionally, after determining that there is an item pick-and-place operation, the method further includes:
determining the area where the marker is occluded, and outputting a pick-and-place signal according to the change of the area where the marker is occluded, where the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
An item pick-and-place detection device is also provided, which is applied to any of the systems described above and includes:
an acquisition module, configured to acquire a current image of the entrance and exit of an item pick-and-place cabinet;
a first detection module, configured to detect marker information in the current image;
a second detection module, configured to perform item pick-and-place detection based on the detection result.
Optionally, the second detection module is configured to: if the detection result is that marker information is detected, match the detected marker information with reference marker information, where the reference marker information includes the marker information when the markers at the edge of the entrance and exit of the item pick-and-place cabinet are not occluded; and perform item pick-and-place detection based on the matching result.
Optionally, the marker information is used to indicate the position and features of the marker;
the second detection module is configured to determine that there is no item pick-and-place operation if the position and features of the marker indicated by the detected marker information respectively match the position and features of the marker indicated by the reference marker information;
and to determine that there is an item pick-and-place operation if the position and features of the marker indicated by the detected marker information respectively do not match the position and features of the marker indicated by the reference marker information.
Optionally, the second detection module is configured to determine that there is an item pick-and-place operation if the detection result is that no marker information is detected.
Optionally, the device further includes:
an output module, configured to, after it is determined that there is an item pick-and-place operation, determine the area where the marker is occluded and output a pick-and-place signal according to the change of the area where the marker is occluded, where the pick-and-place signal is used to indicate the state of the item pick-and-place operation.
A computer device is also provided, including a processor and a memory, where at least one instruction is stored in the memory, and the at least one instruction, when executed by the processor, implements any of the item pick-and-place detection methods described above.
A computer-readable storage medium is also provided, in which at least one instruction is stored, and the at least one instruction, when executed, implements any of the item pick-and-place detection methods described above.
The technical solutions provided by the embodiments of the present application bring at least the following beneficial effects:
By arranging markers at the edge of the entrance and exit of the item pick-and-place cabinet, detecting marker information in images of the entrance and exit, and performing item pick-and-place detection based on the detection result, no detection or tracking of the target item is required. This lowers the requirements on the scene and the target item, copes with frequently changing scenes and pick-and-place at arbitrary speed, and achieves high detection robustness. In addition, performing pick-and-place detection through marker information improves the authenticity and accuracy of detecting pick-and-place operations.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an item pick-and-place detection system provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of markers provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of an item pick-and-place detection system provided by an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an item pick-and-place detection system provided by an embodiment of the present application;
Fig. 5 is a schematic structural diagram of an item pick-and-place detection system provided by an embodiment of the present application;
Fig. 6 is a flowchart of an item pick-and-place detection method provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of an item pick-and-place detection process provided by an embodiment of the present application;
Fig. 8 is a flowchart of an item pick-and-place detection method provided by an embodiment of the present application;
Fig. 9 is a schematic diagram of the output process of a pick-and-place signal provided by an embodiment of the present application;
Fig. 10 is a schematic diagram of an item pick-and-place detection process provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of an item pick-and-place detection process provided by an embodiment of the present application;
Fig. 12 is a schematic structural diagram of an item pick-and-place detection device provided by an embodiment of the present application;
Fig. 13 is a schematic structural diagram of an item pick-and-place detection device provided by an embodiment of the present application;
Fig. 14 is a schematic structural diagram of a computer device provided by an embodiment of the present application;
Fig. 15 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the drawings.
An embodiment of the present application provides an item pick-and-place detection system. As shown in Fig. 1, the system includes: an item pick-and-place cabinet 11, an image acquisition unit 12 and a pick-and-place detection unit 13;
the edge of the entrance and exit of the item pick-and-place cabinet 11 has markers 111, and the image acquisition unit 12 is used to collect images of the entrance and exit of the item pick-and-place cabinet 11;
the pick-and-place detection unit 13 is connected to the image acquisition unit 12 and is used to perform item pick-and-place detection based on marker information in the images collected by the image acquisition unit 12.
The item pick-and-place cabinet 11 is used to store items. The embodiments of the present application do not limit the product form of the item pick-and-place cabinet 11, nor the type, size or quantity of the items stored in it. Since picking items from or placing items into the item pick-and-place cabinet 11 occludes part of the area of its entrance and exit, the embodiments of the present application arrange markers 111 at the edge of the entrance and exit, so that whether there is an item pick-and-place operation can be detected based on the occlusion of the markers 111.
In addition, the pick-and-place detection unit 13 may be remotely connected to the image acquisition unit 12 through a network. That is, the pick-and-place detection unit 13 need not be arranged in the same area as the image acquisition unit 12 and the item pick-and-place cabinet 11, thereby realizing remote pick-and-place detection. Optionally, one pick-and-place detection unit 13 can serve multiple item pick-and-place detection systems at the same time, i.e., it can be communicatively connected with the image acquisition units of multiple item pick-and-place detection systems and perform pick-and-place detection for each system based on the images collected by that system's image acquisition unit. The pick-and-place detection unit 13 can be implemented by a computer device, for example a server, which may be deployed in the cloud.
Optionally, the markers 111 include, but are not limited to, one or more of line-feature markers, barcode markers and checkerboard markers.
In an optional embodiment of the present application, multiple line features are encoded to form a line-feature marker. For example, a line-feature marker is of the vertical-gradient coding type, in which gradients are encoded in the direction perpendicular to the pick-and-place boundary (i.e., the edge of the entrance and exit). For the line-feature marker shown in Fig. 2(a), the gradients exist in the direction perpendicular to the boundary, and the marker interval of this coding scheme is infinitesimal. For example, a line-feature marker may be a black-and-white striped marker.
Barcodes and checkerboards can be encoded as two-dimensional coded markers, which are encoded in both the vertical and horizontal directions of the pick-and-place boundary. Common two-dimensional codes include the two-dimensional code and checkerboard forms, such as the barcode in two-dimensional code form shown in Fig. 2(b) and the checkerboard shown in Fig. 2(c).
Whichever type of marker 111 is used, in the system provided by the embodiments of the present application there are multiple markers 111, which form a feature array. In addition, the interval between every two markers 111 is smaller than the width of the smallest item picked from or placed into the item pick-and-place cabinet 11. For example, the markers 111 can be arranged continuously around the edge of the entrance and exit of the item pick-and-place cabinet 11, with the interval between every two markers 111 smaller than the width of the smallest item, so as to avoid missed detections and further improve the accuracy of pick-and-place detection.
In an optional embodiment of the present application, on the basis of the arranged markers 111, the edge gradient of a marker 111 is guaranteed to be above 10, i.e., the difference between the pixel values (or pixel gray values) of the areas on the two sides of the marker edge is greater than 10, so as to ensure the accuracy of feature extraction for the markers 111. To ensure the markers 111 have significant edge gradients, optionally, one side of the marker edge is made of a light-absorbing material and the other side of a diffusely reflective material. That is, for the materials on the two sides of the marker edge, one side usually uses a material with strong light absorption, such as light-absorbing photography cloth, printing ink or rubber, and the other side uses a material with strong diffuse reflection, such as printing paper or PET (Polyethylene Terephthalate) diffuse-reflection material. The embodiments of the present application do not limit the material of the markers 111, as long as features can be extracted.
For example, the markers 111 are black and white; paper markers 111 printed in black and white can be affixed to the edge of the entrance and exit of the item pick-and-place cabinet 11, e.g., the inner cavity of the cabinet has a ring-shaped area around it for affixing markers 111. The graphite in the black part of a marker 111 has good light-absorbing performance, while the printing paper of the white part has good diffuse-reflection performance, ensuring that the gray difference between black and white in the grayscale image of the marker 111 is above 10, for example 100.
Optionally, the image acquisition unit 12 is configured to capture images of the access opening of the article pick-and-place cabinet 11, and the image acquisition unit 12 may include a single camera whose monitored area covers the entire access opening of the cabinet 11. A single camera can therefore photograph the entire access opening, avoiding inaccurate pick-and-place detection caused by missing a marker 111. For example, the markers 111 are arranged continuously around the inner cavity at the edge of the access opening, and the detection camera captures the features of the markers 111 while monitoring the opening. Since the camera's field of view covers the entire opening, a pick-and-place operation at any position is rendered in the captured images, avoiding missed detection.
Optionally, instead of a single camera, referring to FIG. 3, the image acquisition unit 12 includes multiple cameras, where the monitored area of each camera covers part of the access opening and the monitored areas of the multiple cameras together cover the entire access opening of the cabinet 11. For example, the number of cameras is determined by the size of the access opening and the cameras' fields of view, ensuring that the union of the monitored areas of the cameras used for detection covers the entire access opening.
It should be noted that if the image acquisition unit 12 of the article pick-and-place detection system includes multiple cameras, each camera transmits its captured current image to the pick-and-place detection unit 13. Moreover, the cameras must capture images synchronously, so that the current images obtained by the pick-and-place detection unit 13 were captured at the same time and therefore reflect the state of the access opening at a single moment, improving the accuracy of the detection result.
In addition, both FIG. 1 and FIG. 3 merely take the case where the image acquisition unit 12 is attached to the article pick-and-place cabinet 11 as an example; the image acquisition unit 12 may be arranged within a certain range of the access opening so that images of the opening can be captured. Optionally, the image acquisition unit 12 may also be arranged separately from the cabinet 11. For example, the image acquisition unit 12 may be placed opposite the cabinet 11, facing its access opening, as long as it can capture images of the opening. The embodiments of the present application do not limit the number or position of image acquisition units 12.
For ease of understanding, the present application takes the schematic diagram shown in FIG. 4 as an example. As shown in FIG. 4(a), the image acquisition unit 12 includes one camera; markers may be arranged on the edge of the access opening of the cabinet 11, and a camera may be mounted at the upper-right corner of the opening to monitor it and capture images of the entire opening. As shown in FIG. 4(b), one camera may also be mounted at each of the upper-right and upper-left corners of the opening to monitor the entire opening and capture images of it.
Optionally, considering that changes in the lighting of the environment where the cabinet 11 is located may affect the clarity of the images captured by the image acquisition unit 12 and hence marker recognition, the system further includes a light source 14 for providing supplementary lighting for the markers 111, as shown in FIG. 5. Supplementary lighting from the light source 14 keeps the grayscale of the marker 111 features in the image from varying with the ambient lighting conditions, ensuring the accuracy of pick-and-place detection.
The embodiments of the present application do not limit the position of the light source 14, as long as it can illuminate the markers 111. For example, the light source 14 may be placed directly opposite the cabinet 11, facing the edge of its access opening. The number of light sources 14 may be one or more; the embodiments of the present application likewise do not limit the number or the type of the light sources 14. Optionally, the system may further include a control apparatus for switching the light source 14 on and off. For example, the control apparatus may switch the light source 14 on and off based on the ambient light level of the environment where the cabinet 11 is located.
Based on the above article pick-and-place detection system, when a pick-and-place operation is performed, the object reaching into the cabinet occludes the markers 111, and the operation can be detected accurately by checking the occlusion state of the markers 111. To this end, referring to FIG. 6, an embodiment of the present application provides an article pick-and-place detection method applied to the pick-and-place detection unit 13 of the above system. As shown in FIG. 6, the method includes the following steps.
In step 601, a current image of the access opening of the article pick-and-place cabinet is acquired.
For example, based on the above system, the image acquisition unit 12 captures a current image of the access opening of the article pick-and-place cabinet 11 and transmits it to the pick-and-place detection unit 13.
It should be noted that if the image acquisition unit 12 of the system includes multiple cameras, each camera transmits its captured current image to the pick-and-place detection unit 13. Moreover, the cameras must capture images synchronously, so that the current images obtained by the pick-and-place detection unit 13 were captured at the same time and reflect the state of the access opening at a single moment, improving the accuracy of the detection result.
In step 602, marker information is detected in the current image.
After obtaining the current image of the access opening of the cabinet 11, the pick-and-place detection unit 13 detects marker information in the current image. Optionally, the marker information indicates the position and feature of a marker 111, where the feature may be the feature at the marker's position. Optionally, before detecting marker information in the current image, marker information of the markers 111 on the edge of the access opening in the unoccluded state may first be obtained.
For example, the article pick-and-place detection system needs to be initialized once: using a reference image provided by the image acquisition unit 12 while no pick-and-place operation is in progress, the positions and corresponding features of the markers 111 in it are detected. For the line-feature marker shown in FIG. 2(a), the marker features are obtained by extracting the gradient positions and directions in the reference image. For the QR-code marker shown in FIG. 2(b), a QR code recognition algorithm recognizes the QR code and obtains its position and internal code. For the checkerboard marker shown in FIG. 2(c), a checkerboard corner extraction algorithm obtains the corner positions. This yields the positions and features of all markers in the reference image.
Similarly, marker information in the current image can be detected in the same way as in the reference image, which is not repeated here.
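The one-time initialization described above can be sketched with the marker detector abstracted as a callable, so the same sketch covers the line-feature, QR-code, and checkerboard cases. `build_reference_info` and `detect_markers` are illustrative names introduced here for the sketch, not an API from the patent.

```python
def build_reference_info(image, detect_markers):
    """Initialization sketch: run the marker detector on a reference
    image captured while no pick-and-place operation is in progress,
    and record each marker's feature (e.g. a QR code's internal code
    vector) keyed by its position. `detect_markers` is any callable
    returning (position, feature) pairs -- an assumption for
    illustration."""
    return {pos: feat for pos, feat in detect_markers(image)}
```

The same detector would then be applied to each current image, and its output compared position-by-position against this reference dictionary.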
In step 603, article pick-and-place detection is performed based on the detection result.
Optionally, performing article pick-and-place detection based on the detection result includes: if the detection result is that marker information is detected, matching the detected marker information against reference marker information, the reference marker information including the marker information of the markers on the edge of the access opening of the cabinet in the unoccluded state; and performing article pick-and-place detection based on the matching result.
If no pick-and-place operation occurs, the markers are not occluded; since the reference marker information includes the marker information of the unoccluded markers on the edge of the access opening, the detected marker information can be matched against the reference marker information after detection, and the matching result indicates whether the markers are occluded and hence whether a pick-and-place operation has occurred.
Optionally, performing article pick-and-place detection based on the matching result includes, but is not limited to, the following two cases:
Case 1: if the positions and features of the markers indicated by the detected marker information respectively match the positions and features of the markers indicated by the reference marker information, it is determined that no article pick-and-place operation exists.
In this case, since the positions and features of the markers indicated by the marker information detected in the current image respectively match those indicated by the reference marker information, the markers are not occluded, so it can be determined that no article pick-and-place operation exists.
Case 2: if the positions and features of the markers indicated by the detected marker information do not match the positions and features of the markers indicated by the reference marker information, it is determined that an article pick-and-place operation exists.
It should be noted that in the above matching process, marker features at the same position are matched, and the two are considered to match only when they match exactly or their matching degree reaches a matching-degree threshold; otherwise they are considered not to match. The matching-degree threshold may be determined by the scenario, set empirically, or adjusted appropriately during use; the embodiments of the present application do not limit it.
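The position-wise matching rule above can be sketched as follows. The predicate `match(a, b)` stands for "exact match or matching degree above the threshold", and all names are illustrative, not from the patent.

```python
def occluded_markers(reference, detected, match):
    """Per the matching rule above: a marker counts as occluded if it
    is missing from the current image, or if the feature detected at
    its position fails to match the reference feature at that same
    position. `reference` and `detected` map positions to features;
    returns the positions of occluded markers."""
    occluded = []
    for pos, ref_feat in reference.items():
        feat = detected.get(pos)
        if feat is None or not match(ref_feat, feat):
            occluded.append(pos)
    return occluded
```

An empty result corresponds to Case 1 (no pick-and-place operation); a non-empty result corresponds to Case 2.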
Optionally, the above describes the case where the detection result is that marker information is detected. Sometimes, however, the markers 111 may be completely occluded during a pick-and-place operation, so no marker information is detected in the current image. Accordingly, optionally, performing article pick-and-place detection based on the detection result includes: if the detection result is that no marker information is detected, determining that an article pick-and-place operation exists.
After it is determined that an article pick-and-place operation exists, in order to monitor the operation in real time, the method provided by the embodiments of the present application further includes: determining the regions where the markers are occluded, and outputting a pick-and-place signal according to the change of the occluded regions, the pick-and-place signal indicating the state of the article pick-and-place operation.
Optionally, when determining the occluded regions of the markers 111: if no marker information is detected, all markers 111 are considered occluded, so the entire marker area is taken as the occluded region. If marker information is detected, since the marker information indicates the positions of the markers 111, the marker information of the current image can be compared against the reference marker information to determine which markers 111 were not matched, and the regions of the unmatched markers 111 are taken as occluded regions. Since the image acquisition unit 12 can capture a current image of the access opening at every moment, each current image allows determining whether and where the markers 111 are occluded. Comparing images captured at different times therefore yields the change of the occluded regions, and a pick-and-place signal indicating the state of the article pick-and-place operation is output according to that change.
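For the region analysis above, the occluded marker positions found in one frame can be grouped into contiguous occlusion regions and counted. This grouping is a minimal sketch under the assumption that markers are indexed in order along the opening edge; the patent does not prescribe this exact grouping, and the names are illustrative.

```python
def count_regions(occluded_indices, gap=1):
    """Group occluded marker indices (ordered along the edge of the
    access opening) into contiguous occlusion regions: indices at most
    `gap` apart belong to the same region. Returns the region count
    used by the time-domain signal analysis described below."""
    regions = 0
    prev = None
    for i in sorted(occluded_indices):
        if prev is None or i - prev > gap:
            regions += 1
        prev = i
    return regions
```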
For example, based on the determined occluded-region information, a pick-and-place signal is output by comparing, in the time domain, the change in the number of occluded regions between consecutive frames. The pick-and-place signal indicates the state of the article pick-and-place operation; for example, it includes an entry signal and a departure signal. When the number of occluded regions changes from n to n+1 (n = 0, 1, 2, ...), an entry signal can be triggered, indicating that something has reached into the cabinet to perform a pick-and-place operation; when the number of occluded regions changes from n+1 to n (n = 0, 1, 2, ...), a departure signal can be triggered, indicating withdrawal from the cabinet, although the pick-and-place operation is not yet finished at that point.
Further, the pick-and-place signal may also include start and end state signals of the whole pick-and-place process, obtained by analyzing the number of occluded regions. For example, when the number of occluded regions changes from 0 to 1, a start signal is triggered, indicating the beginning of reaching into the cabinet to perform a pick-and-place operation; when the number changes from 1 to 0, an end signal is triggered, indicating complete departure from the cabinet and completion of the pick-and-place operation.
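The signal logic above (and its restatement in the FIG. 9 discussion later: an increase from zero is a start, any other increase is an entry; a decrease to zero is an end, any other decrease is a departure) can be sketched as a small function of two consecutive region counts. The function name and string labels are illustrative.

```python
def pick_place_signal(prev_count, curr_count):
    """Map the change in occlusion-region count between consecutive
    frames to the pick-and-place signals described above:
    0 -> 1 start, n -> n+1 (n >= 1) enter,
    1 -> 0 end,   n+1 -> n (n >= 1) leave.
    Returns None when the count is unchanged."""
    if curr_count == prev_count:
        return None
    if curr_count > prev_count:
        return "start" if prev_count == 0 else "enter"
    return "end" if curr_count == 0 else "leave"
```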
Next, the above article pick-and-place detection method is illustrated with markers of several different encoding types.
First, the QR-code marker shown in FIG. 7(a) is taken as an example. The QR-code marker is black and white, and paper QR codes printed in black and white are affixed to the edge of the access opening of the article pick-and-place cabinet. Supplementary lighting from a light source can reduce illumination changes and thus their impact on QR-code feature extraction. The graphite in the black part of the QR code absorbs light well, while the printing paper of the white part reflects diffusely, ensuring that the grayscale difference between black and white in the grayscale image is above 100.
Before pick-and-place detection, the method provided by the embodiments of the present application first uses the image acquisition unit 12 to capture a reference image at a moment when no pick-and-place operation is in progress, after which the pick-and-place detection unit 13 recognizes all QR codes in the image. For the continuous QR code sequence on the edge of the access opening of the article pick-and-place cabinet 11 shown in FIG. 7(a), the positions and internal code vectors of all QR codes are obtained as the marker features for pick-and-place detection, yielding the reference marker information for subsequent detection.
Afterwards, the image acquisition unit 12 captures the current image of the access opening of the article pick-and-place cabinet 11 in real time. The detection process is shown in FIG. 8: occlusion detection is performed on the current image captured by the image acquisition unit 12. When a pick-and-place operation exists, the operation occludes the QR codes on the edge of the access opening; the occluded QR codes are shown as the shaded regions in FIG. 7(b). At the QR-code positions derived from the reference marker information, QR codes are detected in the current image and their internal code vectors are extracted. If no QR code is detected at a position in the current image, or the extracted internal code vector cannot be matched against the internal code vector of the reference QR code at that position, the QR code at that position is occluded, and it is determined that a pick-and-place operation exists.
Further, when a pick-and-place operation exists, the process of determining the different pick-and-place signals is shown in FIG. 9. After every QR code is recognized in this way, pick-and-place handling is triggered and the occluded regions are analyzed to obtain their positions and number. As shown in FIG. 7(c), the dashed parts are the occluded regions where a pick-and-place operation exists; there are two occluded regions in FIG. 7(c). Using the occluded-region information, a pick-and-place signal is output by comparing, in the time domain, the change in the number of occluded regions between consecutive frames, i.e., by comparing the number of occluded regions derived from the marker information of current images captured at different times. For example, when the number of occluded regions in the time domain increases from zero, a start signal is output; when it increases but not from zero, an entry signal is output; when it decreases to zero, an end signal is output; when it decreases but not to zero, a departure signal is output. For the content of the pick-and-place signals, reference may also be made to the description under step 603 above, which is not repeated here.
Next, the method provided by the embodiments of the present application is applied to the line-feature marker shown in FIG. 10(a). The article pick-and-place detection method based on this marker differs from the process shown in FIG. 7 and FIG. 8 only in the encoding type of the marker. The marker shown in FIG. 10(a) is a continuous strip printed with horizontal black-and-white stripes (i.e., vertical gradients). When deploying the marker, the black-and-white printed paper strip is affixed to the edge of the access opening of the article pick-and-place cabinet. The camera angle is then adjusted so that, in the current image of the access opening captured by the camera, the strip is as parallel as possible to the horizontal axis of the image. Since the marker is continuous, one column of the marker in the detection camera's image is taken as a feature description unit. Taking the line-encoded strip shown in FIG. 10(a) as an example, each column of the strip has two vertical gradients: one downward gradient, where grayscale increases from top to bottom, and one upward gradient, where grayscale decreases from top to bottom.
Before pick-and-place detection, the estimated position of each gradient edge may be given manually by drawing lines on a reference image captured while no pick-and-place operation is in progress. The method provided by the embodiments of the present application first uses the image acquisition unit to capture such a reference image, and the pick-and-place detection unit searches a neighborhood in the vertical direction of each estimated position. The pixel position with the largest gradient in the neighborhood is found by the search and taken as the accurate gradient position, yielding all gradient positions and corresponding gradient directions in each marker column of the reference image as the reference marker information.
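The neighborhood search above, which refines a hand-annotated estimated edge position to the largest-gradient pixel in one image column, can be sketched as follows. The function name, `radius` parameter, and sign convention are illustrative assumptions, not from the patent.

```python
import numpy as np

def refine_gradient_position(column, estimate, radius=5):
    """Search the vertical neighborhood of an estimated edge position
    for the pixel with the largest absolute vertical gradient, as in
    the line-feature initialization above. `column` is one image
    column (1-D grayscale array); returns (row, sign), where sign > 0
    means intensity increases downward (a downward gradient)."""
    grad = np.diff(column.astype(float))
    lo = max(estimate - radius, 0)
    hi = min(estimate + radius, len(grad))
    idx = lo + int(np.argmax(np.abs(grad[lo:hi])))
    return idx, 1 if grad[idx] > 0 else -1
```

Applying this to every column of the strip in the reference image yields the gradient positions and directions that make up the reference marker information.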
Afterwards, the image acquisition unit captures the current image of the access opening of the article pick-and-place cabinet in real time. When a pick-and-place operation exists, gradients are extracted from the current image at the gradient positions given by the reference marker information. If, at some position, no gradient can be extracted or the direction of the extracted gradient does not match the feature in the reference marker information, a pick-and-place operation exists in that region and a marker is occluded there; the occluded markers are shown as the shaded regions in FIG. 10(b).
After every marker is recognized in this way, the positions and number of the occluded regions are obtained. As shown in FIG. 10(c), the dashed parts are the occluded regions where a pick-and-place operation exists; there are two occluded regions in FIG. 10(c). Using the occluded-region information, a pick-and-place signal is output by comparing, in the time domain, the change in the number of occluded regions between consecutive frames, i.e., by comparing the number of occluded regions derived from the marker information of current images captured at different times. For the content of the pick-and-place signals, reference may be made to FIG. 9 and the description under step 603 above, which is not repeated here.
Next, the checkerboard marker shown in FIG. 11(a) is taken as an example. The checkerboard marker is black and white, and paper checkerboard markers printed in black and white are affixed to the edge of the access opening of the article pick-and-place cabinet. Supplementary lighting from a light source can reduce illumination changes and thus their impact on checkerboard feature extraction. The graphite in the black part of the checkerboard absorbs light well, while the printing paper of the white part reflects diffusely, ensuring that the grayscale difference between black and white in the grayscale image is above 100.
Before pick-and-place detection, the method provided by the embodiments of the present application first uses the image acquisition unit to capture a reference image at a moment when no pick-and-place operation is in progress, after which the pick-and-place detection unit recognizes all checkerboard corners in the image. For the continuous checkerboard corner sequence on the edge of the access opening of the article pick-and-place cabinet shown in FIG. 11(a), the positions of all checkerboard corners are obtained as the marker features for article pick-and-place detection, yielding the reference marker information for subsequent detection.
Afterwards, the image acquisition unit captures the current image of the access opening of the article pick-and-place cabinet in real time. When a pick-and-place operation exists, the operation occludes the checkerboard corners on the edge of the access opening; the occluded corners are shown as the shaded regions in FIG. 11(b). Checkerboard corners are extracted from the current image at the corner positions given by the reference marker information. If a corner can be extracted in the current image and matches the reference corner position, the corner at that position is not occluded; if no corner can be extracted, the corner at that position is occluded, and it is determined that a pick-and-place operation exists.
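The per-corner occlusion test above can be sketched by comparing the patch around a reference corner position in the current image against the reference patch. Normalized correlation is used here as an illustrative stand-in for a full checkerboard corner detector; the function, parameters, and threshold are assumptions for the sketch, not from the patent.

```python
import numpy as np

def corner_visible(image, ref_patch, pos, size=5, threshold=0.8):
    """Cut the patch around a reference checkerboard-corner position
    from the current image and compare it to the reference patch by
    normalized correlation. A low score means the corner cannot be
    extracted at that position, i.e. it is occluded."""
    r, c = pos
    patch = image[r:r + size, c:c + size].astype(float)
    a = patch - patch.mean()
    b = ref_patch.astype(float) - ref_patch.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:
        return False  # flat patch: no corner structure, treat as occluded
    return (a * b).sum() / denom >= threshold
```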
After every checkerboard is recognized in this way, the positions and number of the occluded regions are obtained. As shown in FIG. 11(c), the dashed parts are the occluded regions where a pick-and-place operation exists; there are two occluded regions in the figure. Using the occluded-region information, a pick-and-place signal is output by comparing, in the time domain, the change in the number of occluded regions between consecutive frames, i.e., by comparing the number of occluded regions derived from the marker information of current images captured at different times. For the content of the pick-and-place signals, reference may be made to the above description, which is not repeated here.
An embodiment of the present application provides an article pick-and-place detection apparatus applied to the above article pick-and-place detection system. Based on the modules shown in FIG. 12, the article pick-and-place detection apparatus shown in FIG. 12 can perform the above article pick-and-place detection method. Referring to FIG. 12, the apparatus includes:
an acquisition module 1201, configured to acquire a current image of the access opening of the article pick-and-place cabinet;
a first detection module 1202, configured to detect marker information in the current image;
a second detection module 1203, configured to perform article pick-and-place detection based on the detection result.
Optionally, the second detection module 1203 is configured to: if the detection result is that marker information is detected, match the detected marker information against reference marker information, the reference marker information including the marker information of the markers on the edge of the access opening of the cabinet in the unoccluded state; and perform article pick-and-place detection based on the matching result.
Optionally, the marker information indicates the positions and features of the markers;
the second detection module 1203 is configured to determine that no article pick-and-place operation exists if the positions and features of the markers indicated by the detected marker information respectively match the positions and features of the markers indicated by the reference marker information;
and to determine that an article pick-and-place operation exists if the positions and features of the markers indicated by the detected marker information do not match those indicated by the reference marker information.
Optionally, the second detection module 1203 is configured to determine that an article pick-and-place operation exists if the detection result is that no marker information is detected.
Optionally, referring to FIG. 13, the apparatus further includes:
an output module 1204, configured to, after it is determined that an article pick-and-place operation exists, determine the regions where the markers are occluded and output a pick-and-place signal according to the change of the occluded regions, the pick-and-place signal indicating the state of the article pick-and-place operation.
The current image acquired by the acquisition module 1201 may be provided by the image acquisition unit 12 shown in FIG. 1, FIG. 3, and FIG. 5, and the functions of the acquisition module 1201, the first detection module 1202, and the second detection module 1203 may be implemented by the pick-and-place detection unit 13 shown in FIG. 1, FIG. 3, and FIG. 5.
It should be noted that when the apparatus provided by the above embodiments implements its functions, the division into the above functional modules is merely an example; in practical applications, the above functions may be assigned to different functional modules as needed, i.e., the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided by the above embodiments belongs to the same concept as the method embodiments; for its specific implementation process, refer to the method embodiments, which is not repeated here.
In an exemplary embodiment, a computer device is further provided. Referring to FIG. 14, the computer device includes a processor 141 and a memory 142, and at least one instruction is stored in the memory 142. The at least one instruction is configured to be executed by one or more processors 141 to implement any of the above article pick-and-place detection methods.
FIG. 15 is a schematic structural diagram of a computer device provided by an embodiment of the present invention. The device may be a terminal, for example a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer. The terminal may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
Generally, the terminal includes a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 1501 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the screen. In some embodiments, the processor 1501 may further include an AI (Artificial Intelligence) processor for handling machine-learning computations.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1502 stores at least one instruction, which is executed by the processor 1501 to implement the article pick-and-place detection method provided by the method embodiments of the present application.
In some embodiments, the terminal optionally further includes a peripheral interface 1503 and at least one peripheral. The processor 1501, the memory 1502, and the peripheral interface 1503 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 1503 by a bus, a signal line, or a circuit board. Specifically, the peripherals include at least one of a radio-frequency circuit 1504, a touch display screen 1505, a camera 1506, an audio circuit 1507, a positioning component 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one I/O (Input/Output) peripheral to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, the memory 1502, and the peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The radio-frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio-frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission or converting received electromagnetic signals into electrical signals. Optionally, the radio-frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. The radio-frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol, including but not limited to metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio-frequency circuit 1504 may also include NFC (Near Field Communication) circuitry, which is not limited by the present application.
The display screen 1505 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to collect touch signals on or above its surface, which may be input to the processor 1501 as control signals for processing. In this case, the display screen 1505 may also provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, arranged on the front panel of the terminal; in other embodiments, there may be at least two display screens 1505, arranged on different surfaces of the terminal or in a folding design; in still other embodiments, the display screen 1505 may be a flexible display screen arranged on a curved or folding surface of the terminal. The display screen 1505 may even be shaped as a non-rectangular irregular figure, i.e., a shaped screen. The display screen 1505 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, the front camera is arranged on the front panel of the terminal and the rear camera on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize background blurring, the main camera and the wide-angle camera are fused to realize panoramic and VR (Virtual Reality) shooting, or other fused shooting functions are realized. In some embodiments, the camera assembly 1506 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm flash and a cold flash and can be used for light compensation at different color temperatures.
The audio circuit 1507 may include a microphone and a loudspeaker. The microphone collects sound waves of the user and the environment and converts the sound waves into electrical signals input to the processor 1501 for processing, or to the radio-frequency circuit 1504 for voice communication. For stereo capture or noise reduction, there may be multiple microphones arranged at different parts of the terminal. The microphone may also be an array microphone or an omnidirectional microphone. The loudspeaker converts electrical signals from the processor 1501 or the radio-frequency circuit 1504 into sound waves. The loudspeaker may be a traditional membrane loudspeaker or a piezoelectric ceramic loudspeaker; a piezoelectric ceramic loudspeaker can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic position of the terminal for navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1509 supplies power to the components of the terminal. The power supply 1509 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charging technology.
In some embodiments, the terminal further includes one or more sensors 1510, including but not limited to an acceleration sensor 1511, a gyroscope sensor 1512, a pressure sensor 1513, a fingerprint sensor 1514, an optical sensor 1515, and a proximity sensor 1516.
The acceleration sensor 1511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal; for example, the acceleration sensor 1511 can detect the components of gravitational acceleration on the three axes. The processor 1501 may control the touch display screen 1505 to display the user interface in landscape or portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used to collect motion data of games or of the user.
The gyroscope sensor 1512 can detect the body orientation and rotation angle of the terminal, and the gyroscope sensor 1512 may cooperate with the acceleration sensor 1511 to collect the user's 3D actions on the terminal. Based on the data collected by the gyroscope sensor 1512, the processor 1501 can implement functions such as motion sensing (e.g., changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1513 may be arranged on the side frame of the terminal and/or the lower layer of the touch display screen 1505. When the pressure sensor 1513 is arranged on the side frame, it can detect the user's grip on the terminal, and the processor 1501 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is arranged on the lower layer of the touch display screen 1505, the processor 1501 controls the operable controls on the UI according to the user's pressure operation on the touch display screen 1505. The operable controls include at least one of button controls, scroll-bar controls, icon controls, and menu controls.
The fingerprint sensor 1514 collects the user's fingerprint; the processor 1501 identifies the user's identity from the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user's identity from the collected fingerprint. When the user's identity is recognized as trusted, the processor 1501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, payment, changing settings, and the like. The fingerprint sensor 1514 may be arranged on the front, back, or side of the terminal. When the terminal has a physical button or a manufacturer's logo, the fingerprint sensor 1514 may be integrated with the physical button or the manufacturer's logo.
The optical sensor 1515 collects the ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the touch display screen 1505 according to the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when it is low, the display brightness is decreased. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 according to the ambient light intensity collected by the optical sensor 1515.
The proximity sensor 1516, also called a distance sensor, is usually arranged on the front panel of the terminal. The proximity sensor 1516 collects the distance between the user and the front of the terminal. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front of the terminal gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the screen-on state to the screen-off state; when the proximity sensor 1516 detects that the distance gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the screen-off state to the screen-on state.
Those skilled in the art can understand that the structure shown in FIG. 15 does not constitute a limitation on the terminal, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
In an exemplary embodiment, a computer-readable storage medium is further provided, in which at least one instruction is stored; when executed by a processor of a computer device, the at least one instruction implements any of the above article pick-and-place detection methods.
In a possible implementation of the present application, the above computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It should be understood that "multiple" herein means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
The serial numbers of the above embodiments of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
The above are merely exemplary embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included in the protection scope of the present application.

Claims (20)

  1. An article pick-and-place detection system, wherein the system comprises: an article pick-and-place cabinet, an image acquisition unit, and a pick-and-place detection unit;
    markers are arranged on an edge of an access opening of the article pick-and-place cabinet, and the image acquisition unit is configured to capture images of the access opening of the article pick-and-place cabinet;
    the pick-and-place detection unit is connected to the image acquisition unit and is configured to perform article pick-and-place detection based on marker information in the images captured by the image acquisition unit.
  2. The system according to claim 1, wherein the markers comprise one or more of line-feature markers, barcode markers, and checkerboard markers.
  3. The system according to claim 1, wherein there are multiple markers, the multiple markers form a feature array, and the spacing between every two markers is smaller than the width of the smallest article picked from or placed into the article pick-and-place cabinet.
  4. The system according to any one of claims 1-3, wherein the image acquisition unit comprises one camera, and the monitored area of the one camera covers the entire access opening of the article pick-and-place cabinet.
  5. The system according to any one of claims 1-3, wherein the image acquisition unit comprises multiple cameras, the monitored area of each camera covers part of the access opening of the article pick-and-place cabinet, and the monitored areas of the multiple cameras cover the entire access opening of the article pick-and-place cabinet.
  6. The system according to any one of claims 1-3, wherein the system further comprises a light source for providing supplementary lighting for the markers.
  7. The system according to any one of claims 1-3, wherein one side of the marker edge uses a light-absorbing material and the other side uses a diffusely reflective material.
  8. The system according to any one of claims 1-3, wherein, in the images captured by the image acquisition unit, the difference between the pixel values or pixel grayscale values of the regions on the two sides of the marker edge is greater than 10.
  9. An article pick-and-place detection method, wherein the method is applied to an article pick-and-place cabinet, and the method comprises:
    acquiring a current image of an access opening of the article pick-and-place cabinet;
    detecting marker information in the current image;
    performing article pick-and-place detection based on a detection result.
  10. The method according to claim 9, wherein performing article pick-and-place detection based on the detection result comprises:
    if the detection result is that marker information is detected, matching the detected marker information against reference marker information, the reference marker information comprising marker information of markers on the edge of the access opening of the article pick-and-place cabinet in an unoccluded state;
    performing article pick-and-place detection based on a matching result.
  11. The method according to claim 10, wherein the marker information indicates positions and features of the markers;
    performing article pick-and-place detection based on the matching result comprises:
    if the positions and features of the markers indicated by the detected marker information respectively match the positions and features of the markers indicated by the reference marker information, determining that no article pick-and-place operation exists;
    if the positions and features of the markers indicated by the detected marker information do not match the positions and features of the markers indicated by the reference marker information, determining that an article pick-and-place operation exists.
  12. The method according to claim 9, wherein performing article pick-and-place detection based on the detection result comprises:
    if the detection result is that no marker information is detected, determining that an article pick-and-place operation exists.
  13. The method according to claim 11 or 12, wherein, after determining that an article pick-and-place operation exists, the method further comprises:
    determining regions where the markers are occluded, and outputting a pick-and-place signal according to a change of the regions where the markers are occluded, the pick-and-place signal indicating a state of the article pick-and-place operation.
  14. An article pick-and-place detection apparatus, wherein the apparatus is applied to an article pick-and-place cabinet, and the apparatus comprises:
    an acquisition module, configured to acquire a current image of an access opening of the article pick-and-place cabinet;
    a first detection module, configured to detect marker information in the current image;
    a second detection module, configured to perform article pick-and-place detection based on a detection result.
  15. The apparatus according to claim 14, wherein the second detection module is configured to: if the detection result is that marker information is detected, match the detected marker information against reference marker information, the reference marker information comprising marker information of markers on the edge of the access opening of the article pick-and-place cabinet in an unoccluded state; and perform article pick-and-place detection based on a matching result.
  16. The apparatus according to claim 15, wherein the marker information indicates positions and features of the markers;
    the second detection module is configured to: if the positions and features of the markers indicated by the detected marker information respectively match the positions and features of the markers indicated by the reference marker information, determine that no article pick-and-place operation exists;
    if the positions and features of the markers indicated by the detected marker information do not match the positions and features of the markers indicated by the reference marker information, determine that an article pick-and-place operation exists.
  17. The apparatus according to claim 14, wherein the second detection module is configured to determine that an article pick-and-place operation exists if the detection result is that no marker information is detected.
  18. The apparatus according to claim 16 or 17, wherein the apparatus further comprises:
    an output module, configured to, after it is determined that an article pick-and-place operation exists, determine regions where the markers are occluded and output a pick-and-place signal according to a change of the regions where the markers are occluded, the pick-and-place signal indicating a state of the article pick-and-place operation.
  19. A computer-readable storage medium, wherein the computer-readable storage medium stores at least one instruction which, when executed, implements the article pick-and-place detection method according to any one of claims 9-13.
  20. A computer device, wherein the computer device comprises:
    a processor and a memory;
    the memory is configured to store at least one instruction executable by the processor;
    the processor is configured to execute the instruction to implement the article pick-and-place detection method according to any one of claims 9-13.
PCT/CN2020/094433 2019-06-06 2020-06-04 Article pick-and-place detection system, method and apparatus WO2020244592A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20817655.2A EP3982291A4 (en) 2019-06-06 2020-06-04 SYSTEM, METHOD AND DEVICE FOR RECOGNIZING OBJECT MOUNTING
US17/616,810 US20220309444A1 (en) 2019-06-06 2020-06-04 System, method and apparatus for detecting article store or retrieve operations

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910492987.6 2019-06-06
CN201910492987.6A CN112052701B (zh) 2019-06-06 Article pick-and-place detection system, method and apparatus

Publications (1)

Publication Number Publication Date
WO2020244592A1 true WO2020244592A1 (zh) 2020-12-10

Family

ID=73608691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/094433 WO2020244592A1 (zh) Article pick-and-place detection system, method and apparatus

Country Status (4)

Country Link
US (1) US20220309444A1 (zh)
EP (1) EP3982291A4 (zh)
CN (1) CN112052701B (zh)
WO (1) WO2020244592A1 (zh)


Also Published As

Publication number Publication date
US20220309444A1 (en) 2022-09-29
CN112052701B (zh) 2022-08-05
EP3982291A4 (en) 2022-07-27
EP3982291A1 (en) 2022-04-13
CN112052701A (zh) 2020-12-08


Legal Events

- 121 — EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20817655; Country of ref document: EP; Kind code of ref document: A1)
- NENP — Non-entry into the national phase (Ref country code: DE)
- ENP — Entry into the national phase (Ref document number: 2020817655; Country of ref document: EP; Effective date: 20220107)