CN112767597A - Intelligent device for displaying articles and method of article monitoring - Google Patents


Info

Publication number
CN112767597A
CN112767597A (application CN202110025033.1A)
Authority
CN
China
Prior art keywords
target
ultrasonic sensor
access port
determining
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110025033.1A
Other languages
Chinese (zh)
Inventor
李默
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110025033.1A
Publication of CN112767597A
Legal status: Pending

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F 11/00 Coin-freed apparatus for dispensing, or the like, discrete articles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S 15/04 Systems determining presence of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S 15/06 Systems determining the position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V 7/00 Measuring gravitational fields or waves; Gravimetric prospecting or detecting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/35 Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V 20/36 Indoor scenes
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F 9/00 Details other than those peculiar to special kinds or types of apparatus
    • G07F 9/02 Devices for alarm or indication, e.g. when empty; Advertising arrangements in coin-freed apparatus
    • G07F 9/026 Devices for alarm or indication, e.g. when empty; Advertising arrangements in coin-freed apparatus for alarm, monitoring and auditing in vending machines or means for indication, e.g. when empty

Abstract

The intelligent device for displaying articles and the article-monitoring method described here use a first ultrasonic sensor and a second ultrasonic sensor to monitor the access port of a carrier. From changes in the two sensors' monitoring results, the device determines the position at which a target device passes through the access port, and hence the position of the target article being loaded or unloaded, so that the target article's attribute information can be identified quickly and accurately. Because the access position is measured with ultrasound, the measurement is free of ambient-light interference and the ranging is more accurate. The ultrasound can also track and position the target device with high precision while it is in the sensing area; the computation is simple and fast, saving computing cost. The device and method can therefore identify and count the target articles loaded or unloaded by the target device quickly and accurately, greatly improving working efficiency and reducing cost.

Description

Intelligent device for displaying articles and method of article monitoring
Technical Field
The present description relates to the field of intelligent management devices, and more particularly, to an intelligent device for displaying items and a method of item monitoring.
Background
In the retail, logistics, warehousing, and similar industries, a large number of display devices are often needed to display various types of articles. For example, in supermarkets and shopping malls, merchants typically display items on display devices for customers to choose and purchase. In logistics or warehouse facilities, managers typically store articles on display equipment by category and retrieve them as needed. For management purposes, the merchant or manager usually needs to label the different items on the display device and have staff periodically count them. When items on the display device run out of stock or go missing, staff cannot discover it in time. In addition, manual management is limited by staffing levels and individual ability, so management efficiency is low and operating cost is high. Managing items with radio-frequency identification tags or visual monitoring, on the other hand, incurs high equipment cost. In particular, with the rise of unmanned retail and smart containers, the "take goods first, settle later" unmanned selling mode is becoming popular. In this mode, accurately determining the type and position of each commodity is essential for correct identification and settlement.
Therefore, there is a need for an intelligent article-display device and an article-monitoring method that can quickly detect changes in the articles on the display device and accurately identify the changed articles, while reducing cost.
Disclosure of Invention
This specification provides an intelligent device for displaying articles and a method of monitoring articles that can quickly detect changes in the articles on the display device and accurately identify the changed articles, while reducing cost.
In a first aspect, this specification provides a smart device for displaying articles, comprising a carrier, a first ultrasonic sensor, a second ultrasonic sensor, and a computing device. The carrier is configured to carry articles and includes an access port; articles are loaded onto and unloaded from the carrier through the plane in which the access port lies. The first ultrasonic sensor is arranged on a first side of the access port and, in operation, monitors the plane of the access port. The second ultrasonic sensor is likewise arranged on the first side of the access port and, in operation, monitors the plane of the access port. Together, the two sensors monitor whether a target device passes through the access port, where the target device comprises equipment that loads a target article onto, or unloads it from, the carrier, and the articles include the target article. The computing device is, in operation, communicatively connected to both ultrasonic sensors and, based on changes in their monitoring results, determines the position of the target device as it passes through the access port, and thereby the position of the target article.
In some embodiments, said determining the location of the target item comprises: determining that the target device passes through the access port at the current moment; determining a first distance between the target device and the first ultrasonic sensor at the current moment according to a first monitoring result of the first ultrasonic sensor at the current moment; determining a second distance between the target device and the second ultrasonic sensor at the current moment according to a second monitoring result of the second ultrasonic sensor at the current moment; determining an access position of the target device passing through the access port at the current moment according to the first distance and the second distance; and determining the position of the target object at the current moment according to the access position.
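As an illustrative sketch only (not claim language), the access-position step above amounts to a two-circle intersection: each ultrasonic range reading constrains the target device to a circle around its sensor within the access-port plane, assuming the two sensors sit a known baseline apart along one edge of the port. The function name and geometry below are hypothetical:

```python
import math

def access_position(d1: float, d2: float, baseline: float):
    """Locate the target device in the access-port plane from two
    ultrasonic range readings.

    The first sensor is taken to sit at (0, 0) and the second at
    (baseline, 0) along one edge of the access port; d1 and d2 are
    their measured distances to the target device.  Standard
    two-circle intersection gives the in-plane coordinates
    (x along the baseline, y into the port).
    """
    x = (d1**2 - d2**2 + baseline**2) / (2 * baseline)
    y_sq = d1**2 - x**2
    if y_sq < 0:
        raise ValueError("inconsistent readings: circles do not intersect")
    return x, math.sqrt(y_sq)

# A device 30 cm from sensor 1 and 40 cm from sensor 2,
# with the sensors 50 cm apart: classic 3-4-5 geometry.
x, y = access_position(0.3, 0.4, 0.5)
```

With the example readings above, the device is located at x = 0.18 m along the baseline and y = 0.24 m into the port.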
In some embodiments, said determining that the target device passes through the access port at the current time comprises: determining a first difference value between the first monitoring result of the first ultrasonic sensor at the current moment and the monitoring result of the first ultrasonic sensor when the target device does not pass through the access port; determining a second difference value between the second monitoring result of the second ultrasonic sensor at the current moment and the monitoring result of the second ultrasonic sensor when the target device does not pass through the access port; determining that at least one of the first difference and the second difference exceeds a preset threshold; and determining that the target device passes through the access port at the current moment.
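The pass-through decision above can be sketched as follows — a minimal illustration (names and the idle-baseline convention are assumptions of this example, not prescribed by the patent) in which each sensor's current reading is compared against the reading it reports when nothing is in the port:

```python
def target_passing(r1_now: float, r2_now: float,
                   r1_idle: float, r2_idle: float,
                   threshold: float) -> bool:
    """Decide whether a target device is crossing the access port.

    r1_idle and r2_idle are the baseline range readings recorded when
    nothing is in the port; the port counts as occupied when either
    sensor's reading deviates from its baseline by more than the
    preset threshold.
    """
    first_difference = abs(r1_now - r1_idle)
    second_difference = abs(r2_now - r2_idle)
    return first_difference > threshold or second_difference > threshold
```

For example, with idle readings of 0.8 m and a 5 cm threshold, a hand 0.25 m from the first sensor trips the test even if the second sensor's reading is unchanged.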
In some embodiments, the smart device further comprises a visual detection device installed on the smart device and, in operation, communicatively connected to the computing device; it captures an image of the articles and transmits the image to the computing device, and the computing device determines a change in the target article from the image, the change comprising: the target article being loaded on the carrier; or the target article being unloaded from the carrier.
In some embodiments, the computing device stores attribute information of each of the articles in advance, wherein the computing device obtains the attribute information of the target article according to the position of the target article and the image.
In some embodiments, the smart device further includes a pressure sensing device installed on the carrying device and in communication with the computing device during operation, for measuring the pressure carried on the carrying device, wherein the computing device determines the change of the target object according to the change of the measurement result of the pressure sensing device, and the change of the target object includes: the target item is loaded on the carrier; or the target item is unloaded from the carrier.
In some embodiments, the computing device stores attribute information of each of the items and display rules of the items in advance, and the computing device acquires the attribute information of the target item according to the position of the target item and the display rules of the items.
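The position-plus-display-rules lookup described above can be sketched as follows — a minimal illustration in which the display rules are assumed to partition the carrier into slots along the X axis; the rule table and attribute fields are invented for the example:

```python
# Hypothetical display rules: each carrier is divided into slots along
# the X axis, and each rule records which item occupies the slot
# [x_lo, x_hi) together with its pre-stored attribute information.
DISPLAY_RULES = [
    (0.0, 0.2, {"name": "cola", "price": 3.0}),
    (0.2, 0.4, {"name": "chips", "price": 5.0}),
    (0.4, 0.6, {"name": "cookies", "price": 4.0}),
]

def attributes_at(x: float):
    """Return the attribute record for the slot containing the
    measured access position x, or None if x falls outside all slots."""
    for x_lo, x_hi, attrs in DISPLAY_RULES:
        if x_lo <= x < x_hi:
            return attrs
    return None
```

An access position of x = 0.3 m would resolve to the "chips" record under these invented rules.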
In some embodiments, the smart device further comprises: and the display device is in communication connection with the computing device during operation and is used for displaying information, wherein the computing device generates related information related to the target object according to the attribute information of the target object and sends the related information to the display device.
In some embodiments, the related information comprises at least one of: attribute information of the target item; and recommendation information related to the target item.
In a second aspect, the present specification provides a method of item monitoring for an intelligent apparatus for displaying items as described in the first aspect of the specification, the method comprising, by the computing device: acquiring a first monitoring result of the first ultrasonic sensor and a second monitoring result of the second ultrasonic sensor; and determining the position of the target device when the target device passes through the access opening based on the change of the first monitoring result and the second monitoring result, thereby determining the position of the target object.
In some embodiments, said determining the location of the target item comprises: determining that the target device passes through the access port at the current moment; determining a first distance between the target device and the first ultrasonic sensor at the current moment according to the first monitoring result at the current moment; determining a second distance between the target device and the second ultrasonic sensor at the current moment according to the second monitoring result at the current moment; determining an access position of the target device passing through the access port at the current moment according to the first distance and the second distance; and determining the position of the target object at the current moment according to the access position.
In some embodiments, said determining that the target device passes through the access port at the current time comprises: determining a first difference value between the first monitoring result at the current moment and a monitoring result of a first ultrasonic sensor when the target device does not pass through the access port; determining a second difference value between the second monitoring result at the current moment and the monitoring result of a second ultrasonic sensor when the target device does not pass through the access port; determining that at least one of the first difference and the second difference exceeds a preset threshold; and determining that the target device passes through the access port at the current moment.
In some embodiments, the method further comprises, by the computing device: determining a change in the target article from the image captured by the visual detection device, the change in the target article including the target article being loaded on the carrier or the target article being unloaded from the carrier.
In some embodiments, the computing device has previously stored therein attribute information for each of the items, the method further comprising, by the computing device: and acquiring the attribute information of the target object according to the position of the target object and the image.
In some embodiments, the method further comprises, by the computing device: determining a change in the target item based on a change in the pressure-sensing device measurement, the change in the target item including the target item being loaded on or unloaded from the carrier.
In some embodiments, the computing device has pre-stored therein display rules for the item, the method further comprising, by the computing device: and acquiring the attribute information of the target item according to the position of the target item and the display rule of the item.
In some embodiments, the method further comprises, by the computing device: determining the target articles changed within a preset time window, wherein the target articles changed within the preset time window comprise the articles loaded on the carrying device by the target device within the preset time window and the quantity and attribute information of the articles unloaded from the carrying device by the target device.
In some embodiments, the method further comprises, by the computing device: and settling accounts according to the attribute information and the quantity of the target articles unloaded from the bearing device by the target device in the preset time window.
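A minimal sketch of the time-window statistics and settlement described in the two embodiments above, assuming load/unload events have already been attributed to specific articles (the event format, price table, and the cancellation of re-shelved items are assumptions of this example, not claim language):

```python
from collections import Counter

def settle(events, prices, window_start, window_end):
    """Aggregate load/unload events inside a time window and price the
    net unloaded articles.

    events: iterable of (timestamp, action, item_id) tuples, where
    action is "load" or "unload"; prices maps item_id to unit price.
    An article put back ("load") inside the window cancels an earlier
    "unload" of the same article, so only net quantities are billed.
    """
    net = Counter()
    for timestamp, action, item_id in events:
        if window_start <= timestamp <= window_end:
            net[item_id] += 1 if action == "unload" else -1
    return sum(prices[i] * n for i, n in net.items() if n > 0)

# One cola taken; chips taken but put back inside the window.
total = settle(
    [(1, "unload", "cola"), (2, "unload", "chips"), (3, "load", "chips")],
    {"cola": 3.0, "chips": 5.0},
    window_start=0, window_end=10,
)
```

Only the cola is billed here, because the chips were returned within the same window.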
In some embodiments, the method further comprises, by the computing device: and generating related information related to the target object according to the attribute information of the target object, and sending the related information to the display device.
In some embodiments, the related information comprises at least one of: attribute information of the target item; and recommendation information related to the target item.
In summary, the intelligent device for displaying articles and the article-monitoring method provided in this specification monitor signals at the access port of the carrier with a first ultrasonic sensor and a second ultrasonic sensor. From changes in the two sensors' monitoring results, the computing device determines that a target device is passing through the access port and computes the distance from the target device to each sensor, thereby determining the position at which the target device crosses the access port, the position of the target article being loaded or unloaded, and, quickly and accurately, the target article's attribute information. Because the access position is measured with ultrasound, a mechanical wave unaffected by ambient light, the ranging is more accurate. The two sensors can track the target device from the moment it enters the sensing area, providing high-precision positioning and tracking throughout the user's selection of a target article; the computation is simple and fast, saving computing cost. The device and method can therefore identify and count the articles on the carrier quickly and accurately, greatly improving working efficiency and reducing cost.
Additional functions of the intelligent device for displaying articles and the article-monitoring method provided in this specification are set forth in part in the description that follows, and will be readily apparent to those of ordinary skill in the art from that description. The inventive aspects of the devices and methods provided herein can be fully explained by practicing or using the methods, apparatuses, and combinations described in the detailed examples below.
Drawings
To illustrate the technical solutions in the embodiments of this specification more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are clearly only some embodiments of this specification; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 illustrates a schematic structural diagram of a smart device for displaying an item provided in accordance with an embodiment of the present description;
FIG. 2A illustrates a forward view of a first ultrasonic sensor and a second ultrasonic sensor provided in accordance with embodiments of the present description;
FIG. 2B illustrates a side view of a first ultrasonic sensor and a second ultrasonic sensor provided in accordance with embodiments of the present description;
FIG. 3A illustrates a schematic diagram of a display device provided in accordance with an embodiment of the present description;
FIG. 3B illustrates a schematic diagram of another display device provided in accordance with embodiments of the present description;
FIG. 4 illustrates a flow chart of a method of item monitoring provided in accordance with embodiments of the present description;
FIG. 5A illustrates a graph of a range signal for a first ultrasonic sensor provided in accordance with embodiments of the present description;
FIG. 5B illustrates a distance signal plot of a first difference provided in accordance with embodiments of the present description; and
fig. 6 is a schematic diagram illustrating a method for determining an access location of a target device according to an embodiment of the present disclosure.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the present description, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present description. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, as used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, are intended to specify the presence of stated integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features of the present specification, as well as the operation and function of the elements of the structure related thereto, and the combination of parts and economies of manufacture, may be particularly improved upon in view of the following description. Reference is made to the accompanying drawings, all of which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should also be understood that the drawings are not drawn to scale.
The flow diagrams used in this specification illustrate the operation of system implementations according to some embodiments of the specification. It should be clearly understood that the operations of the flow diagrams may be performed out of order. Rather, the operations may be performed in reverse order or simultaneously. In addition, one or more other operations may be added to the flowchart. One or more operations may be removed from the flowchart.
The intelligent device for displaying articles can be used in retail, logistics, and warehousing, as well as in unmanned retail and smart-container applications. It can quickly detect changes in the articles on the display device, accurately identify the changed articles, and reduce cost. In the prior art, articles on display equipment are usually monitored with infrared light, distributed weighing, or dynamic vision. These approaches have high installation cost, require heavy computation, and are easily disturbed by the external environment, so the results are inaccurate.
The intelligent device provided in this specification uses two ultrasonic sensors to monitor the access port of the display device. The sensors measure the position at which a user's hand, or another gripping device, passes through the access port, thereby determining the position of the article being loaded or unloaded and retrieving its attribute information. Ultrasonic signals are not disturbed by the external environment, so the measurement is more accurate; the distance computation places low demands on data transmission and computing power, and the ultrasonic sensors themselves consume little power, which reduces running cost.
Fig. 1 shows a schematic structural diagram of a smart device 001 (hereinafter smart device 001) for displaying an article 100. The smart device 001 may be used to display and store articles 100. An article 100 may be any discrete object that can exist on its own, such as a bottle of beverage, a packet of snacks, or a screw. The smart device 001 may include a computing apparatus 200, a first ultrasonic sensor 310, a second ultrasonic sensor 320, and a carrier 400. In some embodiments, the smart device 001 may further include a rack 600, a pressure-sensing apparatus 700, a visual detection apparatus 800, a display apparatus 900, and/or a speaker 500. For convenience of illustration, we define the X direction as "right" and its reverse as "left", the Y direction as "front" and its reverse as "rear", and the Z direction as "up" and its reverse as "down".
The rack 600 may be a support base for the smart device 001.
The carrier 400 may be mounted on the rack 600 and is used to carry the articles 100. The smart device 001 may include a plurality of carriers 400. Each carrier 400 may include an access port 420 for loading and unloading articles 100: the user may load an article 100 onto the carrier 400, or unload one from it, through the access port 420. As shown in Fig. 1, the pressure of the article 100 on the carrier 400 may be along the Z direction. Of course, the article 100 may also be held on the carrier 400 by an adsorption force, in which case the pressure may lie in other directions, such as the Y direction, the X direction, or even an oblique direction. For convenience, this specification takes the pressure of the article 100 on the carrier 400 along the Z direction as its running example; embodiments with pressure along other directions are analogous and can be derived by those skilled in the art, so they are not repeated here. The access port 420 may be an access plane: using a target device 002, the user loads articles 100 onto, or unloads them from, the carrier 400 through the plane in which the access port 420 lies. The target device 002 may be a user's hand, a robot, or any of various gripping apparatuses such as a gripper; any equipment that can load an article 100 onto the carrier 400 or unload one from it may serve as the target device 002. For convenience of description, we define the article 100 being loaded onto or unloaded from the carrier 400 at the current moment as the target article 120; the articles 100 include the target article 120.
The access port 420 may be any flat surface. As shown in fig. 1, access port 420 may be an XZ plane. Of course, the access port 420 may be a YZ plane, an XY plane, or other planes. For convenience of illustration, the access port 420 will be described as an XZ plane in this specification.
The first ultrasonic sensor 310 may be mounted on the smart device 001 and disposed on a first side of the access port 420; the second ultrasonic sensor 320 may be mounted on the smart device 001 and disposed on the same first side. Both sensors may be mounted on the rack 600 of the smart device 001 or on the carrier 400, on one side of the access port 420, to monitor the access port 420. In operation, the two sensors monitor whether the target device 002 passes through the access port 420 and, if so, at what position. When the target device 002 loads or unloads the target article 120 through the access port 420, the monitoring results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 change. As shown in Fig. 1, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 are located above the access port 420; of course, they may instead be located to its side, for example to its left or right, or below it, and so on. The two sensors may sit at different positions on the same side of the access port 420, or at positions on different sides. The relative position between the first ultrasonic sensor 310 and the second ultrasonic sensor 320 may be stored in the computing device 200 in advance.
The first ultrasonic sensor 310 and the second ultrasonic sensor 320 detect objects in space with ultrasonic waves: a wave propagating in air is reflected when it meets an object, and the reflected wave is received by the sensor. The sensors measure the distance between the detected object and themselves from the time the ultrasonic wave travels in the air. The first ultrasonic sensor 310 may be an ultrasonic transceiver, or a combination of an ultrasonic receiver and an ultrasonic transmitter; that is, it can both transmit and receive ultrasonic waves. The second ultrasonic sensor 320 may be an ultrasonic transceiver, a combination of an ultrasonic receiver and an ultrasonic transmitter, or an ultrasonic receiver only; that is, it may only receive ultrasonic waves, or may both receive and transmit them.
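The time-of-flight ranging described above reduces to a one-line conversion — a sketch assuming the speed of sound in air at roughly room temperature; the echo's round trip covers the distance twice:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees Celsius

def range_from_echo(round_trip_s: float) -> float:
    """Convert an ultrasonic echo's round-trip time to a one-way
    distance: the pulse travels to the object and back, so the
    distance is half of speed times time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0
```

A 2 ms round trip, for example, corresponds to an object about 0.343 m from the sensor.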
FIG. 2A illustrates a forward view of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 provided in accordance with embodiments of the present description; FIG. 2B illustrates a side view of the same sensors. As shown in FIG. 2A and FIG. 2B, the range bounded by the solid lines in the drawings is the field of view of the first ultrasonic sensor 310 and the second ultrasonic sensor 320, that is, the range within which the sensor can detect with ultrasonic waves. An ultrasonic sensor can detect an object in a specific direction and compute the distance between that object and the sensor from the time difference between transmitting an ultrasonic wave and receiving its echo; it can likewise detect objects within a specific range and compute the distance of each detected object from the same time difference. As previously described, the access port 420 may be a flat surface. As shown in FIG.
2A and 2B, in order to ensure that the first ultrasonic sensor 310 and the second ultrasonic sensor 320 can detect the object in the plane of the access port 420, the field of view of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 in the X direction should be as large as possible to cover the range of the access port 420 in the X direction, so as to ensure the accuracy of the measurement result; the field of view of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 in the Y direction should be as small as possible to reduce the detection range of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 in the Y direction, so as to ensure that the first ultrasonic sensor 310 and the second ultrasonic sensor 320 detect the object in the plane where the access port 420 is located, and avoid that the first ultrasonic sensor 310 and the second ultrasonic sensor 320 detect the object outside the plane where the access port 420 is located. Therefore, the influence of the external environment on the detection results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 is avoided, and the accuracy of the detection results is improved. For example, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 should avoid detecting the article 100 on the carrying device 400, and prevent the article 100 from affecting the detection results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320. For example, when the user approaches the access port 420 and does not pass through the access port 420, in order to ensure the accuracy of the detection results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320, the fields of view of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 in the Y direction should be as small as possible, so as to avoid being interfered by the external environment.
The smart device 001 provided by this specification uses ultrasonic sensors for monitoring and distance measurement. Because ultrasonic waves are mechanical waves, they are not easily interfered with by the external environment, so the measurement precision is high. The transducer horns of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 in this specification are optimally designed so that the field of view in the X direction is as large as possible and the field of view in the Y direction is as small as possible. Accurate positioning in the plane of the access port 420 can thus be achieved while avoiding interference from objects outside that plane, improving the detection accuracy of the first ultrasonic sensor 310 and the second ultrasonic sensor 320.
Computing device 200 may store data or instructions for performing the methods of item monitoring described herein, and may execute those data and/or instructions. The method of item monitoring will be described in detail below; for example, FIGS. 4-6 illustrate the item monitoring method in detail.
The computing device 200 may be a smart mobile device, such as a smart phone, a tablet computer, or a notebook computer; it may also be a personal computer or even a server. The computing device 200 is operatively coupled in communication with the first ultrasonic sensor 310 and the second ultrasonic sensor 320, and may determine the location of the target item 120 based on changes in their monitoring results. In this specification, a communication connection refers to any form of connection capable of receiving information directly or indirectly. For example, the computing device 200 may exchange data with the first ultrasonic sensor 310 and the second ultrasonic sensor 320 over a wireless connection; it may exchange data with them over a direct wired connection; or it may exchange data with them indirectly, through a wired connection to other circuitry.
As described above, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 may monitor the plane of the access port 420 to detect whether the target device 002 passes through the access port 420. When the target device 002 passes through the access port 420, the monitoring results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 change. From that change, the computing device 200 may calculate the position of the target device 002 on the access port 420 at the moment the change occurs, and thereby the position on the carrier device 400 of the target item 120 loaded or unloaded by the target device 002 at that moment. For convenience of description, the moment at which the monitoring results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320 change is defined as the current time. The current time may be the moment at which the target device 002 passes through the access port 420; it may also be the moment at which a target item 120 is loaded on the carrier 400 or unloaded from the carrier 400.
As previously described, the computing device 200 may determine the location of the target item 120 based on changes in the monitoring results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320. However, from those changes alone, the computing device 200 cannot determine whether, at the current time, the target item 120 is loaded on the carrier device 400 by the target device 002 or unloaded from the carrier device 400 by the target device 002. The computing device 200 therefore needs other devices to make that determination.
In some embodiments, the smart device 001 may include a visual detection device 800, as shown in FIG. 1. The visual detection device 800 may be installed on the smart device 001 and may be in operative communication with the computing device 200, capturing images of the items 100 and transmitting them to the computing device 200. The computing device 200 may determine a change in the target item 120 from the images: the target item 120 being loaded on the carrier 400, or the target item 120 being unloaded from the carrier 400. That is, by recognizing the images captured by the visual detection device 800, the computing device 200 may determine whether the target item 120 at the current time is loaded on the carrier 400 by the target device 002 or unloaded from the carrier 400 by the target device 002. In particular, the computing device 200 may apply multi-frame differencing segmentation to a sequence of images over continuous time; multi-frame differencing segmentation is known in the prior art and is not described again here. In summary, the smart device 001 provided in this specification can combine the first ultrasonic sensor 310, the second ultrasonic sensor 320, and the visual detection device 800 to quickly and accurately identify both the position on the carrier 400 at which the target item 120 changed at the current time and whether the target item 120 was loaded or unloaded.
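The specification cites multi-frame differencing segmentation without detail. A minimal two-frame sketch of the idea is shown below; the frame representation, function names, and threshold value are illustrative assumptions, and a real system would operate on camera frames with a more robust multi-frame scheme:

```python
def frame_diff_mask(prev_frame, curr_frame, threshold=25):
    """Per-pixel absolute difference between two grayscale frames.

    Frames are lists of rows of integer intensities; returns a binary mask
    in which 1 marks a pixel whose intensity changed by more than threshold.
    """
    return [
        [1 if abs(p - c) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

def changed_fraction(mask):
    """Fraction of pixels flagged as changed; a large value suggests an
    article was loaded or unloaded in the camera's view."""
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)
```
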
Compared with equipment monitored only by the visual detection device 800, the smart device 001 provided by this specification combines the ultrasonic sensors with the visual detection device 800, which greatly reduces the amount of computation required when the computing device 200 recognizes images, reduces the cost of the equipment, and at the same time improves the accuracy of monitoring.
In some embodiments, the smart device 001 may include a pressure sensing device 700, as shown in FIG. 1. The pressure sensing device 700 may be mounted on the carrier 400 and may be in operative communication with the computing device 200, measuring the pressure carried on the carrier 400. The pressure sensing device 700 may be connected between the carrier 400 and the rack 600 of the smart device 001 and measure the total pressure carried by the carrier 400; the number of pressure sensing devices 700 may be one or more. The computing device 200 may determine a change in the target item 120 based on a change in the measurements of the pressure sensing device 700. The change in the target item 120 may include the target item 120 being loaded on the carrier 400, or the target item 120 being unloaded from the carrier 400. That is, the computing device 200 may determine from the pressure change on the carrier device 400 whether, at the current time, the target item 120 is loaded on the carrier device 400 by the target device 002 or unloaded from it. If the pressure on the carrier 400 measured by the pressure sensing device 700 decreases at the current time, the target item 120 has been unloaded from the carrier 400 by the target device 002; if the measured pressure increases, the target item 120 has been loaded on the carrier 400 by the target device 002.
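The pressure-based decision above can be sketched as a simple classifier. This is an illustrative sketch; the noise margin and function names are assumptions, not values from the embodiments:

```python
def classify_change(prev_total_pressure: float,
                    curr_total_pressure: float,
                    noise_margin: float = 0.05) -> str:
    """Classify the event at the current time from the carrier's total pressure.

    A decrease means the target item was unloaded, an increase means it was
    loaded; changes inside the (assumed) noise margin count as no event.
    """
    delta = curr_total_pressure - prev_total_pressure
    if delta > noise_margin:
        return "loaded"
    if delta < -noise_margin:
        return "unloaded"
    return "none"
```
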
In summary, the smart device 001 provided in this specification can combine the first ultrasonic sensor 310, the second ultrasonic sensor 320, and the pressure sensing device 700 to quickly and accurately identify the position on the carrying device 400 at which the target item 120 changed at the current time, and whether the target item 120 was loaded or unloaded, thereby reducing equipment and computation costs and improving monitoring accuracy.
In some embodiments, the computing device 200 may store attribute information for each of the articles 100 in advance. The attribute information may be category information of the article 100, such as its beverage brand; weight and/or volume information of the article 100, such as 300 ml or 500 ml; or price information of the article 100, such as 4.5, 5, or 10. The attribute information may also be quantity information of the article 100, such as 2 cans of cola or 3 cans of Sprite. The attribute information may further include inventory information, origin information, promotion information, and even ingredient information of the article 100, and so on.
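One way such pre-stored attribute information could be organized is a table keyed by item identifier. The identifiers, field names, and values below are invented for illustration only:

```python
# Hypothetical pre-stored attribute table; the fields mirror the kinds of
# attribute information listed above (category, volume, price, stock).
ITEM_ATTRIBUTES = {
    "cola-500": {"category": "cola", "volume_ml": 500, "price": 5.0, "stock": 12},
    "sprite-300": {"category": "sprite", "volume_ml": 300, "price": 4.5, "stock": 8},
}

def get_attributes(item_id: str) -> dict:
    """Look up pre-stored attribute information for an item; empty if unknown."""
    return ITEM_ATTRIBUTES.get(item_id, {})
```
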
The computing device 200 may obtain the attribute information of the target item 120 based on the position of the target item 120 and the images captured by the visual detection device 800. In particular, the computing device 200 may use an image recognition model (such as a convolutional neural network) to recognize the captured image of the target item 120 and thereby obtain its attribute information. Alternatively, the computing device 200 may apply multi-frame differencing segmentation to a sequence of images over continuous time to identify the target item 120; this technique is known in the prior art and is not described again here. The computing device 200 may match the images captured by the visual detection device 800 and the computed position of the target item 120 against the attribute information of the articles 100 stored in advance, and take the attribute information of the matching article as the attribute information of the target item 120. Compared with equipment monitored only by the visual detection device 800, the smart device 001 provided by this specification improves detection accuracy: when articles 100 on the carrier 400 are frequently loaded or unloaded within a short time, equipment monitored only by a visual detection device is prone to miscounting.
When the smart device 001 provided in this specification is used for monitoring, each time the target device 002 loads or unloads an article 100 on the carrying device 400, the computing device 200 can accurately identify the position of the changed target item 120. Combined with the images captured by the visual detection device 800, the computing device 200 can then accurately obtain the attribute information of each changed target item 120, and thereby quickly and accurately tally the attribute information of the articles changed on the carrying device 400 over a period of time, improving monitoring accuracy.
The visual detection device 800 may be an ordinary camera. It may be installed at the top of the rack 600 of the smart device 001, with a shooting range covering all the articles 100 on all the carrying devices 400 of the entire smart device 001; or it may be installed at the top of each carrier 400, with a shooting range covering all the articles 100 currently on that carrier. The visual detection device 800 may use a wide-angle camera, for example one with a 160-degree shooting angle, or a fisheye camera.
In some embodiments, display rules for the articles 100 may be pre-stored in the computing device 200. A display rule may divide the carrier 400 into several sections, with different sections loaded with articles 100 of different attributes; the position coordinates of each section and the attributes of the articles displayed there are stored in the computing device 200. The computing device 200 may then obtain the attribute information of the target item 120 from the position of the target item 120 and the display rules of the articles 100. Thus, each time the target device 002 loads or unloads an article 100 on the carrier 400, the computing device 200 can accurately identify the position of the changed target item 120 and, combined with the display rules, accurately obtain its attribute information, quickly and accurately tallying the attribute information of the articles changed on the carrier 400 over a period of time and improving monitoring accuracy. In this way, the smart device 001 can accurately identify both the changes of the articles 100 on the carrying device 400 and the attribute information of the changed articles using only the first ultrasonic sensor 310, the second ultrasonic sensor 320, the visual detection device 800, and the computing device 200, so statistics can be compiled quickly and accurately, improving identification accuracy while reducing cost.
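The display rule described above amounts to a position-to-section lookup. The sketch below illustrates this; the section boundaries (coordinates along the access port) and the attribute values are invented for illustration:

```python
# Hypothetical display rule: the carrier is divided into sections, each
# loaded with articles of one attribute set.
SECTIONS = [
    # (x_min, x_max, attribute information of the items displayed there)
    (0.0, 0.2, {"category": "cola", "price": 5.0}),
    (0.2, 0.4, {"category": "sprite", "price": 4.5}),
    (0.4, 0.6, {"category": "water", "price": 2.0}),
]

def attributes_at(x: float) -> dict:
    """Map the computed position of the target item to its section's attributes."""
    for x_min, x_max, attrs in SECTIONS:
        if x_min <= x < x_max:
            return attrs
    return {}
```
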
The computing device 200 can identify and count the objects 100 changing on the carrying device 400 in continuous time according to the attribute information of the target object 120, thereby improving the management efficiency, saving time and cost, and improving the accuracy of statistics. The items 100 that change in continuous time may include the number and attribute information of the items 100 loaded on the carriers 400 and the items 100 unloaded from the carriers 400 within the continuous time window.
In some embodiments, the smart device 001 may further include a display device 900, as shown in FIG. 1. The display device 900 may be in operative communication with the computing device 200 for displaying information. The computing device 200 may generate information related to the target item 120 based on its attribute information and transmit that information to the display device 900 for display. The related information may include at least one of the attribute information of the target item 120 and recommendation information related to the target item 120. The attribute information of the target item 120 may be its price, origin, capacity, or ingredient information, and so on; the recommendation information may be other items associated with the target item 120. The display device 900 may be installed on the rack 600 of the smart device 001 and display information related to all the articles 100 on the smart device 001, or a display device 900 may be mounted on each carrier 400 to display information related to the articles 100 currently on that carrier. FIG. 3A illustrates a schematic diagram of a display device 900 provided in accordance with an embodiment of the present description; FIG. 3B illustrates a schematic diagram of another display device 900 provided in accordance with embodiments of the present description. As shown in FIGS. 3A and 3B, the computing device 200 may determine the location of the target item 120 from the position at which the target device 002 passes through the access opening 420, and send display information to the display device 900 according to the attribute information of the target item 120. When the target item 120 differs, the information displayed on the display device 900 differs accordingly.
In summary, through the display device 900, the smart device 001 provided in this specification offers an interactive window between the smart device 001 and the user. The smart device 001 can interact with the user while the user is picking up goods, learn the user's intent during selection, and provide better service. At the same time, the smart device 001 can deliver targeted advertisements to the user, improving the user experience.
In some embodiments, the smart device 001 may also include a speaker 500, as shown in FIG. 1. The speaker 500 may be in operative communication with the computing device 200 for audibly conveying information related to the target item 120 to the user. The speaker 500 may be mounted on the rack 600 of the smart device 001, broadcasting information related to all the items 100 on the smart device 001, or a speaker 500 may be mounted on each carrier 400 to broadcast information related to the items 100 currently on that carrier.
In summary, the smart device 001 may monitor the access opening 420 of the carrying device 400 through the first ultrasonic sensor 310 and the second ultrasonic sensor 320, and determine the position at which the target device 002 passes through the access opening 420 from the change in their monitoring results, thereby determining the position of the target item 120 loaded or unloaded by the target device 002. The smart device 001 may determine whether the target item 120 is loaded on the carrier 400 by the target device 002 or unloaded from the carrier 400 by the target device 002 from the images captured by the visual detection device 800 or the measurements of the pressure sensing device 700, thereby identifying and counting the target items 120 that vary over continuous time. The smart device 001 may further obtain the attribute information of the target item 120 from the images captured by the visual detection device 800 or from the display rules of the articles 100 pre-stored in the computing device 200, and present the attribute information of the target item 120 and related recommendation information on the display device 900, promoting interaction between the user and the smart device 001 and improving the user experience. The computing device 200 may also identify and count the articles 100 that change on the carrier 400 over a continuous period of time, determine their number and attribute information, and perform management on that basis, for example settlement according to the quantity and price of the changed articles 100, or inventory checking according to their number and category.
The smart device 001 can thus monitor the articles 100 on the carrying device 400 conveniently and quickly, and identify and count changes in the articles 100, so that the articles 100 are managed and monitored automatically rather than manually, improving monitoring accuracy while reducing equipment cost.
Fig. 4 shows a flowchart of a method P100 of item monitoring provided according to an embodiment of the present description. The method P100 is suitable for the smart device 001 provided in this specification. The method P100 may include performing, by the computing device 200:
S120: a first monitoring result of the first ultrasonic sensor 310 and a second monitoring result of the second ultrasonic sensor 320 are acquired.
Specifically, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 monitor the plane of the access port 420, emitting ultrasonic signals and receiving the signals reflected by objects. The first monitoring result may include the distance between an object in the field of view of the first ultrasonic sensor 310 and the first ultrasonic sensor 310, together with the intensity of the reflection signal received from that object; when multiple objects are present in the field of view, the first monitoring result includes such a distance and reflection intensity for each of them, each distance corresponding to its reflection intensity. The second monitoring result is defined analogously for the second ultrasonic sensor 320. The first ultrasonic sensor 310 and the second ultrasonic sensor 320 transmit the first monitoring result and the second monitoring result to the computing device 200.
The computing device 200 may generate a distance signal graph of distance versus reflected signal strength from the first monitoring result and the second monitoring result. The horizontal axis represents the distance from the detected object to the first ultrasonic sensor 310 or the second ultrasonic sensor 320, and the vertical axis represents the amplitude of the reflection signal received from that object. The closer the detected object is to the sensor, the stronger the received reflection signal.
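A distance signal graph of this kind can be represented as a discretized profile built from raw echoes. The sketch below is an illustration under assumed names and bin width, not the patent's actual data structure:

```python
def distance_signal_profile(echoes, bin_width=0.05):
    """Build a discretized distance-vs-amplitude profile from raw echoes.

    `echoes` is a list of (distance_m, amplitude) pairs; amplitudes whose
    distances fall into the same bin are accumulated, mirroring the graph
    described above (x-axis: distance bin, y-axis: reflected amplitude).
    Bins are integer indices: bin i covers [i * bin_width, (i+1) * bin_width).
    """
    profile = {}
    for distance, amplitude in echoes:
        idx = int(distance // bin_width)
        profile[idx] = profile.get(idx, 0.0) + amplitude
    return profile
```
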
S140: determining a location of the target item 120 based on a change in the first monitoring result and the second monitoring result.
Step S140 may include performing, by the computing device 200:
S142: it is determined that the target device 002 passes through the access port 420 at the current time.
The computing device 200 determines whether the target device 002 passes through the access port 420 at the current time according to the first monitoring result and the second monitoring result. As described above, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 monitor the plane of the access port 420. When no target device 002 passes through the access port 420, the computing device 200 may generate a base distance signal map for the first ultrasonic sensor 310 and a base distance signal map for the second ultrasonic sensor 320. When the target device 002 passes through the access port 420, the first monitoring result of the first ultrasonic sensor 310 and the second monitoring result of the second ultrasonic sensor 320 at that moment change relative to the monitoring results when no target device 002 is passing through. The computing device 200 may thus determine that the target device 002 passes through the access port 420 at the current time from the changes of the first monitoring result and the second monitoring result relative to the base distance signal maps.
FIG. 5A illustrates a distance signal diagram of the first ultrasonic sensor 310 provided in accordance with an embodiment of the present description. As shown in FIG. 5A, curve 1 represents the distance signal plot of the first ultrasonic sensor 310 when the target device 002 passes through the access port 420, and curve 2 represents the base distance signal plot of the first ultrasonic sensor 310. When the target device 002 passes through the access port 420, the distance signal plot of the first ultrasonic sensor 310 deviates from the base distance signal plot.
Specifically, step S142 may include performing, by computing device 200:
determining a first difference between the first monitoring result at the current time and the monitoring result of the first ultrasonic sensor 310 when the target device 002 does not pass through the access port 420;
determining a second difference between the second monitoring result at the current time and the monitoring result of the second ultrasonic sensor 320 when the target device 002 does not pass through the access port 420;
determining that at least one of the first difference and the second difference exceeds a preset threshold;
determining that the target device 002 passes through the access port 420 at the current time.
FIG. 5B illustrates a distance signal graph of the first difference provided according to an embodiment of the present description. The computing device 200 obtains the distance signal map corresponding to the first difference by subtracting curve 2 from curve 1. The second difference is determined analogously and is not described again here. As shown in FIG. 5B, the dashed line 3 indicates the preset threshold line; the preset threshold may be obtained through machine learning. As described above, when the target device 002 approaches the access port 420 without passing through it, its proximity to the access port 420 may cause fluctuations in the monitoring results of the first ultrasonic sensor 310 and the second ultrasonic sensor 320. The threshold is preset in the computing device 200 to improve the monitoring accuracy of the smart device 001. When neither the first difference nor the second difference exceeds the threshold, the computing device 200 determines that no target device 002 passes through the access port 420 and that the fluctuation of the distance signal curve is caused by environmental disturbance. When at least one of the first difference and the second difference exceeds the threshold, the computing device 200 determines that the target device 002 passes through the access port 420. The computing device 200 may also optimize the distance signal map through a MASK algorithm.
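The baseline-difference-and-threshold test in steps S142 above can be sketched as follows, using the discretized profile representation as an assumed stand-in for the distance signal maps:

```python
def passage_detected(current, baseline, threshold):
    """Check whether the current distance-signal profile deviates from the
    baseline (no-passage) profile by more than the preset threshold.

    Profiles are dicts mapping distance bins to amplitudes. Returns the
    detection flag and the per-bin absolute difference (the analogue of the
    difference curve in FIG. 5B).
    """
    bins = set(current) | set(baseline)
    diff = {b: abs(current.get(b, 0.0) - baseline.get(b, 0.0)) for b in bins}
    return max(diff.values(), default=0.0) > threshold, diff
```

Applying the same test to each sensor's profile and OR-ing the results mirrors the "at least one of the first difference and the second difference exceeds the threshold" condition.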
S144: according to the first monitoring result at the current time, a first distance from the target device 002 at the current time to the first ultrasonic sensor 310 is determined.
S146: according to the second monitoring result at the current time, a second distance from the target device 002 to the second ultrasonic sensor 320 at the current time is determined.
S148: and determining the access position of the target device 002 passing through the access port 420 at the current moment according to the first distance and the second distance.
FIG. 6 is a schematic diagram illustrating a method for determining the access position of the target device 002 according to an embodiment of the present disclosure. As shown in FIG. 6, the computing device 200 may calculate the first distance a between the target device 002 and the first ultrasonic sensor 310 from the distance signal map corresponding to the first difference: when the target device 002 passes through the access port 420, the first distance a is the distance corresponding to the position where the signal amplitude changes most sharply. Similarly, the computing device 200 may calculate the second distance b between the target device 002 and the second ultrasonic sensor 320 from the distance signal map corresponding to the second difference: the second distance b is the distance corresponding to the position where the signal amplitude changes most sharply. As shown in FIG. 6, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 are separated by a relative distance c. The coordinates of the first ultrasonic sensor 310 and the second ultrasonic sensor 320, and the relative distance c between them, are stored in the computing device 200 in advance. The computing device 200 may then calculate the access position at which the target device 002 passes through the access port 420 by triangulation.
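The triangulation step can be sketched as standard two-circle intersection, placing the first ultrasonic sensor 310 at the origin and the second ultrasonic sensor 320 at (c, 0). This is a hypothetical illustration of the geometry, not code from the embodiments:

```python
import math

def access_position(a: float, b: float, c: float):
    """Locate the crossing point in the plane of the access port.

    a and b are the measured distances from the two sensors to the target
    device; c is their known separation. Intersecting the circle of radius a
    around (0, 0) with the circle of radius b around (c, 0) gives
    x = (a^2 + c^2 - b^2) / (2c), y = sqrt(a^2 - x^2); the intersection with
    y >= 0 (inside the access port) is returned.
    """
    x = (a * a + c * c - b * b) / (2.0 * c)
    y_sq = a * a - x * x
    if y_sq < 0:
        raise ValueError("inconsistent distances: the circles do not intersect")
    return x, math.sqrt(y_sq)
```

For example, with a = 3, b = 4 and a sensor separation c = 5, the crossing point is at (1.8, 2.4).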
S149: and determining the position of the target item 120 at the current moment according to the access position.
The computing device 200 may determine the location of the target item 120 at the current time based on the access position at which the target device 002 passes through the access port 420 at the current time.
It should be noted that, because there may be errors in the measurement process or in the installation accuracy of the smart device 001, the calculation result of the computing device 200 needs to be corrected in some cases. The correction parameters may be determined empirically, or obtained by training a neural network algorithm. Any correction of the calculation result by the computing device 200 falls within the scope of the present disclosure.
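An empirical correction of the kind mentioned above could be as simple as a linear gain and offset applied to the computed position. The linear form and parameter values here are assumptions for illustration; the specification leaves the correction method open:

```python
def correct(measured: float, gain: float = 1.0, offset: float = 0.0) -> float:
    """Apply an empirically determined linear correction to a computed position.

    gain and offset would come from calibration measurements (or, as the text
    notes, from training); with the defaults the value passes through unchanged.
    """
    return gain * measured + offset
```
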
The method P100 may further include performing, by the computing device 200:
S160: determine a change in the target item 120.
As previously described, the change in the target item 120 may include the target item 120 being loaded onto the carrier device 400 by the target device 002, or the target item 120 being unloaded from the carrier device 400 by the target device 002. The computing device 200 may determine which of the two occurred at the current time by recognizing the image of the item 100 captured by the visual inspection device 800. The computing device 200 may also make this determination based on the change in the pressure carried by the carrier device 400 as measured by the pressure sensing device 700.
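The pressure-based determination can be sketched as a simple sign test on the weight delta. The deadband value is a hypothetical parameter, not taken from the patent:

```python
def classify_change(prev_weight, curr_weight, noise=5.0):
    """Classify a shelf event from the change in the pressure (weight)
    reading of the carrier device.
      noise -- assumed deadband (same unit as the readings) below which
               the change is treated as sensor jitter.
    Returns 'loaded', 'unloaded', or None.
    """
    delta = curr_weight - prev_weight
    if delta > noise:
        return "loaded"      # weight increased: item placed on the carrier
    if delta < -noise:
        return "unloaded"    # weight decreased: item removed from the carrier
    return None              # within the noise band: no change
```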
The method P100 may further include performing, by the computing device 200:
S170: acquire attribute information of the target item 120.
As described above, the computing device 200 may store the attribute information of each item 100 in advance. The computing device 200 may obtain the attribute information corresponding to the target item 120 according to the position of the target item 120 and the image information captured by the visual inspection device 800. Alternatively, the display rules of the items 100 may be stored in the computing device 200 in advance, and the computing device 200 may obtain the attribute information of the target item 120 based on the position of the target item 120 and the display rules of the items 100.
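A display-rule lookup of the kind described above can be sketched as a mapping from shelf position ranges to item attributes. The layout, names, and prices below are hypothetical placeholders:

```python
# Hypothetical display rules: x-range (cm) along the shelf -> item attributes.
DISPLAY_RULES = {
    (0, 20): {"name": "cola", "price": 3.0},
    (20, 40): {"name": "juice", "price": 5.0},
}

def attributes_at(x):
    """Look up the attribute information of the item at access position x,
    using the pre-stored display rules (which slot holds which product)."""
    for (lo, hi), attrs in DISPLAY_RULES.items():
        if lo <= x < hi:
            return attrs
    return None  # position does not fall in any known display slot
```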
The method P100 may further include performing, by the computing device 200:
S180: determine the target items 120 that change within a preset time window.
The preset time window may be a fixed time window, for example, an hour, a day, a week, a month, or a year. It may also be a non-fixed time window; for example, in a vending machine, it may run from the moment the cabinet door is opened to the moment it is next closed. The target items 120 that change within the preset time window may include the quantity and attribute information of the target items 120 loaded onto the carrier device 400 by the target device 002, and of the target items 120 unloaded from the carrier device 400 by the target device 002, within that window. As previously described, the computing device 200 may calculate the position and attribute information of the target item 120 each time the target device 002 passes through the access port 420. The computing device 200 can therefore count the items 100 that change within the preset time window and calculate their quantity and attribute information, improving management efficiency, saving time and cost, and improving the accuracy of the statistics.
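The per-window tally described above can be sketched as filtering an event log by timestamp. The event-tuple shape is an assumption for illustration:

```python
from collections import Counter

def summarize_window(events, start, end):
    """Tally item changes within the half-open window [start, end).
      events -- iterable of (timestamp, action, sku) tuples, where
                action is 'loaded' or 'unloaded' (assumed log format).
    Returns (loaded, unloaded) as Counters of SKU -> quantity.
    """
    loaded, unloaded = Counter(), Counter()
    for ts, action, sku in events:
        if start <= ts < end:
            (loaded if action == "loaded" else unloaded)[sku] += 1
    return loaded, unloaded
```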
The method P100 may further include performing, by the computing device 200:
S190: perform settlement according to the attribute information and quantity of the target items 120 unloaded from the carrier device 400 by the target device 002 within the preset time window.
In some embodiments, the computing device 200 may perform statistical settlement according to the attribute information of the items 100 that change within the preset time window. For example, in a vending machine, the attribute information may be the price corresponding to each item 100. The computing device 200 may count the items 100 that change between the opening of the cabinet door and its next closing, obtain their corresponding prices, and calculate the total price of the changed items 100. The smart device 001 and the method P100 provided in this specification can quickly and accurately identify the position and attribute information of the target item 120, and can perform statistical settlement on the target items 120 that change within the preset time window. Compared with traditional equipment that counts and settles using a visual inspection device alone, they reduce the amount of computation, reduce the difficulty of image recognition for the visual inspection device 800, effectively shorten settlement time, improve working efficiency, and improve user experience.
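The settlement arithmetic above reduces to a weighted sum of unloaded quantities by unit price. A minimal sketch, with hypothetical SKUs and prices:

```python
def settle(unloaded, price_table):
    """Total price of the items unloaded within the window.
      unloaded    -- mapping SKU -> quantity unloaded
      price_table -- mapping SKU -> unit price (part of the pre-stored
                     attribute information)
    """
    return sum(price_table[sku] * qty for sku, qty in unloaded.items())
```

For example, two colas at 3.0 and one juice at 5.0 settle to a total of 11.0.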
The method P100 may further include performing, by the computing device 200:
S210: generate related information about the target item 120 based on the attribute information of the target item 120, and send it to the display device 900.
As previously described, the related information may include the attribute information of the target item 120 and recommendation information related to the target item 120. Based on the attribute information of the target item 120, the computing device 200 may determine, by similarity calculation, the several items 100 most similar to the target item 120 as the recommendation information, and send them to the display device 900 for display. Through the display device 900, the smart device 001 provides an interactive window between itself and the user. The smart device 001 can interact with the user while the user is picking up goods, learn the user's intent during item selection, and thereby provide better service. Meanwhile, the smart device 001 can deliver targeted advertisements to the user, improving user experience.
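The patent does not specify the similarity calculation; one common choice is cosine similarity over attribute feature vectors. A sketch under that assumption, with a hypothetical catalog:

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target_vec, catalog, k=3):
    """Return the k catalog items most similar to the target item.
      catalog -- mapping item name -> attribute feature vector
                 (assumed encoding of the stored attribute information)
    """
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine(target_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]
```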
In summary, in the smart device 001 provided herein, the first ultrasonic sensor 310 and the second ultrasonic sensor 320 monitor the access port 420, and the computing device 200 generates distance signal maps from their monitoring results. The distance signal maps change when a target device 002 passes through the access port 420. The computing device 200 may determine the access position at which the target device 002 passes through the access port 420 based on the monitoring results of the two sensors, and further determine the position of the target item 120. The computing device 200 may also obtain the attribute information of the target item 120 based on the position of the target item 120 and either the display rules of the items 100 or the image captured by the visual inspection device 800. The computing device 200 may further count and settle the target items 120 loaded or unloaded by the target device 002 over a continuous period of time. The smart device 001 and the method P100 provided in this specification can conveniently and rapidly identify and count the target items 120, replace manual labor in managing and monitoring the items 100, effectively reduce equipment and management costs, improve monitoring accuracy, and improve user experience.
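The crossing-detection step summarized above — a distance signal map that changes when a target device passes through the access port — can be sketched as a deviation test against a no-target baseline. The sample layout and threshold are assumptions for illustration:

```python
def passing_event(baseline, current, threshold=0.2):
    """Detect whether a target device is crossing the access port.
      baseline -- (sensor1_samples, sensor2_samples) with no target present
      current  -- (sensor1_samples, sensor2_samples) at the current moment
      threshold -- assumed deviation (in distance units) above which a
                   crossing is declared, per the difference-exceeds-
                   threshold criterion.
    Fires if either sensor's maximum absolute deviation from its
    baseline exceeds the threshold.
    """
    def max_dev(base, cur):
        return max(abs(c - b) for b, c in zip(base, cur))
    return (max_dev(baseline[0], current[0]) > threshold or
            max_dev(baseline[1], current[1]) > threshold)
```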
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In conclusion, upon reading the present detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure can be presented by way of example only, and not limitation. Those skilled in the art will appreciate that the present specification contemplates various reasonable variations, enhancements and modifications to the embodiments, even though not explicitly described herein. Such alterations, improvements, and modifications are intended to be suggested by this specification, and are within the spirit and scope of the exemplary embodiments of this specification.
Furthermore, certain terminology has been used in this specification to describe embodiments of the specification. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the specification.
It should be appreciated that in the foregoing description of embodiments of the specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more features. This is not to be taken as implying that the combination of features is required; one skilled in the art may, upon reading this specification, extract some of the features as separate embodiments. That is, embodiments in this specification may also be understood as an integration of multiple sub-embodiments, and each sub-embodiment remains valid with fewer than all features of a single foregoing disclosed embodiment.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, specifications, publications, documents, and the like, cited herein is hereby incorporated herein by reference, except for any prosecution file history associated with the same, any of the same that is inconsistent with or in conflict with this document, or any of the same that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term in any incorporated material and in this document, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the present specification. Other modified embodiments are also within the scope of this description. Accordingly, the disclosed embodiments are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the applications in this specification in alternative configurations according to the embodiments in this specification. Therefore, the embodiments of the present description are not limited to the embodiments described precisely in the application.

Claims (20)

1. A smart device for displaying items, comprising:
the carrying device is used for carrying articles and comprises an access port, and the articles are loaded onto and unloaded from the carrying device through the plane of the access port;
the first ultrasonic sensor is arranged on the first side of the access port and monitors the plane of the access port during operation;
a second ultrasonic sensor mounted on the first side of the access port for monitoring the plane of the access port during operation,
wherein the first ultrasonic sensor and the second ultrasonic sensor operate to monitor whether a target device passes through the access port, the target device comprising equipment for loading a target article onto, or unloading a target article from, the carrying device, and the articles comprising the target article; and
a computing device, in communicative connection with the first ultrasonic sensor and the second ultrasonic sensor during operation, configured to determine the position of the target device when the target device passes through the access port based on changes in the monitoring results of the first ultrasonic sensor and the second ultrasonic sensor, thereby determining the position of the target item.
2. The smart device of claim 1, wherein the determining the location of the target item comprises:
determining that the target device passes through the access port at the current moment;
determining a first distance between the target device and the first ultrasonic sensor at the current moment according to a first monitoring result of the first ultrasonic sensor at the current moment;
determining a second distance between the target device and the second ultrasonic sensor at the current moment according to a second monitoring result of the second ultrasonic sensor at the current moment;
determining an access position of the target device passing through the access port at the current moment according to the first distance and the second distance; and
determining the position of the target item at the current moment according to the access position.
3. The smart device of claim 2, wherein said determining that the target device passes through the access port at the current time comprises:
determining a first difference value between the first monitoring result of the first ultrasonic sensor at the current moment and the monitoring result of the first ultrasonic sensor when the target device does not pass through the access port;
determining a second difference value between the second monitoring result of the second ultrasonic sensor at the current moment and the monitoring result of the second ultrasonic sensor when the target device does not pass through the access port;
determining that at least one of the first difference and the second difference exceeds a preset threshold; and
determining that the target device passes through the access port at the current moment.
4. The smart device of claim 1, further comprising:
a visual detection device, communicatively coupled to the computing device during operation, mounted on the smart device, to capture an image of the item and transmit the image to the computing device,
wherein the computing device determines a change in the target item from the image, the change in the target item comprising:
the target item is loaded on the carrier; or
The target item is unloaded from the carrier.
5. The smart device of claim 4, wherein the computing apparatus has attribute information of each of the items pre-stored therein,
wherein the computing device obtains attribute information of the target item according to the position of the target item and the image.
6. The smart device of claim 1, further comprising:
a pressure sensing device mounted on the carrying device and in communication with the computing device during operation for measuring the pressure carried on the carrying device,
wherein the computing device determines a change in the target item from a change in the pressure-sensing device measurement, the change in the target item comprising:
the target item is loaded on the carrier; or
The target item is unloaded from the carrier.
7. The smart device according to claim 6, wherein the computing means stores attribute information of each of the items and display rules of the items in advance, and acquires the attribute information of the target item according to the position of the target item and the display rules of the items.
8. The smart device of any of claims 5 and 7, further comprising:
a display device in operative communication with the computing device for displaying information,
wherein the computing device generates related information about the target item according to the attribute information of the target item and sends the related information to the display device.
9. The smart device of claim 8, wherein the related information comprises at least one of:
attribute information of the target item; and
recommendation information related to the target item.
10. A method of item monitoring, a smart device for displaying items, the smart device comprising:
the carrying device is used for carrying articles and comprises an access port, and the articles are loaded onto and unloaded from the carrying device through the plane of the access port;
the first ultrasonic sensor is arranged on the first side of the access port and monitors the plane of the access port during operation;
a second ultrasonic sensor mounted on the first side of the access port for monitoring the plane of the access port during operation,
wherein the first ultrasonic sensor and the second ultrasonic sensor operate to monitor whether a target device passes through the access port, the target device comprising equipment for loading a target article onto, or unloading a target article from, the carrying device, and the articles comprising the target article; and
a computing device in operative communication with the first ultrasonic sensor and the second ultrasonic sensor;
the method includes, by the computing device:
acquiring a first monitoring result of the first ultrasonic sensor and a second monitoring result of the second ultrasonic sensor; and
determining the position of the target device when the target device passes through the access port based on changes in the first monitoring result and the second monitoring result, thereby determining the position of the target item.
11. The method of claim 10, wherein said determining the location of the target item comprises:
determining that the target device passes through the access port at the current moment;
determining a first distance between the target device and the first ultrasonic sensor at the current moment according to the first monitoring result at the current moment;
determining a second distance between the target device and the second ultrasonic sensor at the current moment according to the second monitoring result at the current moment;
determining an access position of the target device passing through the access port at the current moment according to the first distance and the second distance; and
determining the position of the target item at the current moment according to the access position.
12. The method of claim 11, wherein said determining that the target device passes through the access port at the current time comprises:
determining a first difference value between the first monitoring result at the current moment and a monitoring result of a first ultrasonic sensor when the target device does not pass through the access port;
determining a second difference value between the second monitoring result at the current moment and the monitoring result of a second ultrasonic sensor when the target device does not pass through the access port;
determining that at least one of the first difference and the second difference exceeds a preset threshold; and
determining that the target device passes through the access port at the current moment.
13. The method of claim 10, wherein the smart device further comprises:
the visual detection device is in communication connection with the computing device during operation, is installed on the intelligent equipment, shoots an image of the article and transmits the image to the computing device;
the method further comprises, by the computing device:
determining a change in the target item from the image, the change in the target item including the target item being loaded on the carrier or the target item being unloaded from the carrier.
14. The method of claim 13, wherein the computing device has pre-stored therein attribute information for each of the items, the method further comprising, by the computing device:
acquiring the attribute information of the target item according to the position of the target item and the image.
15. The method of claim 10, wherein the smart device further comprises:
a pressure sensing device mounted on the carrying device and in communication with the computing device during operation for measuring the pressure carried on the carrying device,
the method further comprises, by the computing device:
determining a change in the target item based on a change in the pressure-sensing device measurement, the change in the target item including the target item being loaded on or unloaded from the carrier.
16. The method of claim 15, wherein the computing device has pre-stored therein display rules for the item, the method further comprising, by the computing device:
acquiring the attribute information of the target item according to the position of the target item and the display rules of the item.
17. The method of any of claims 14 and 16, further comprising, by the computing device:
determining the target articles that change within a preset time window, wherein the target articles that change within the preset time window comprise the quantity and attribute information of the articles loaded onto the carrying device by the target device, and of the articles unloaded from the carrying device by the target device, within the preset time window.
18. The method of claim 17, further comprising, by the computing device:
performing settlement according to the attribute information and quantity of the target articles unloaded from the carrying device by the target device within the preset time window.
19. The method of any of claims 14 and 16, wherein the smart device further comprises:
a display device in operative communication with the computing device for displaying information,
the method further comprises, by the computing device:
generating related information about the target item according to the attribute information of the target item, and sending the related information to the display device.
20. The method of claim 19, wherein the related information comprises at least one of:
attribute information of the target item; and
recommendation information related to the target item.
CN202110025033.1A 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring Pending CN112767597A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110025033.1A CN112767597A (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110025033.1A CN112767597A (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring
CN202010756096.XA CN111738665B (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010756096.XA Division CN111738665B (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring

Publications (1)

Publication Number Publication Date
CN112767597A true CN112767597A (en) 2021-05-07

Family

ID=72656700

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110025033.1A Pending CN112767597A (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring
CN202010756096.XA Active CN111738665B (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010756096.XA Active CN111738665B (en) 2020-07-31 2020-07-31 Intelligent device for displaying articles and method of article monitoring

Country Status (1)

Country Link
CN (2) CN112767597A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116052062A (en) * 2023-03-07 2023-05-02 深圳爱莫科技有限公司 Robust tobacco display image processing method and device

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN112200631B (en) * 2020-10-12 2022-06-24 支付宝(杭州)信息技术有限公司 Industry classification model training method and device

Citations (11)

Publication number Priority date Publication date Assignee Title
US20140367401A1 (en) * 2011-10-21 2014-12-18 Sca Hygiene Products Ab Product Level Sensor for a Product Dispenser
JP2017142171A (en) * 2016-02-10 2017-08-17 株式会社Soken Object detection device
CN107862360A (en) * 2017-11-01 2018-03-30 北京旷视科技有限公司 Destination object and the correlating method of merchandise news, apparatus and system
CN107968988A (en) * 2017-11-30 2018-04-27 北京旷视科技有限公司 Monitoring device and intelligent commodity shelf
CN108492457A (en) * 2018-03-12 2018-09-04 远瞳(上海)智能技术有限公司 Automatic vending device and method
CN109031205A (en) * 2018-07-18 2018-12-18 北京进化者机器人科技有限公司 Robotic positioning device, method and robot
CN110161510A (en) * 2019-06-27 2019-08-23 北京智行者科技有限公司 Barrier localization method and device based on ultrasonic radar
EP3557542A1 (en) * 2018-04-19 2019-10-23 Laservideo S.R.L. Automatic items vending machine
CN110766860A (en) * 2018-07-27 2020-02-07 威海新北洋数码科技有限公司 Automatic vending machine
CN110879395A (en) * 2019-12-03 2020-03-13 北京百度网讯科技有限公司 Obstacle position prediction method and device and electronic equipment
CN110895747A (en) * 2018-09-13 2020-03-20 阿里巴巴集团控股有限公司 Commodity information identification, display, information association and settlement method and system

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN205942895U (en) * 2016-08-12 2017-02-08 黄薇 Vending machine
CN110260796A (en) * 2019-06-04 2019-09-20 上海追月科技有限公司 Kinds of goods sensory perceptual system, kinds of goods cognitive method and electronic equipment


Non-Patent Citations (1)

Title
张璠: "《物联网技术基础》", 31 May 2018, 航空工业出版社 *

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN116052062A (en) * 2023-03-07 2023-05-02 深圳爱莫科技有限公司 Robust tobacco display image processing method and device

Also Published As

Publication number Publication date
CN111738665B (en) 2020-12-01
CN111738665A (en) 2020-10-02

Similar Documents

Publication Publication Date Title
US8237563B2 (en) Utilization of motion and spatial identification in mobile RFID interrogator
CN107103503B (en) Order information determining method and device
US8421627B2 (en) Method for associating and RFID tag with a known region
US9916561B2 (en) Methods, devices and computer readable storage devices for tracking inventory
CN111738665B (en) Intelligent device for displaying articles and method of article monitoring
KR101994205B1 (en) Smart shopping cart and shopping management system using the same
US10378956B2 (en) System and method for reducing false positives caused by ambient lighting on infra-red sensors, and false positives caused by background vibrations on weight sensors
US11853961B1 (en) Customized neural network for item recognition
US9697397B2 (en) Utilization of motion and spatial identification in mobile RFID interrogator
CN107403332B (en) Goods shelf fetching detection system and method
US10346659B1 (en) System for reading tags
KR20190093733A (en) Items recognition system in unmanned store and the method thereof
US11514766B1 (en) Detecting interactions with storage units based on RFID signals and auxiliary signals
KR20210106967A (en) Purchasing­transaction recognition system and method thereof
CN112907168A (en) Dynamic commodity identification method, unmanned sales counter and sales method thereof
CN108896156A (en) Article monitoring method, apparatus and system
US20130241700A1 (en) Device and method for the automated reading/writing of rfid tags
CN104573589A (en) Commodity moving monitoring method and system based on radio frequency tag signal characteristic detection
US11276107B2 (en) Product management device, control method for product management device, and program
US11195140B1 (en) Determination of untidy item return to an inventory location using weight
CN111513492A (en) Intelligent device for displaying articles and method of article monitoring
CN208444367U (en) Article identification device and automatic vending machine
CN113538784B (en) Intelligent container and article identification method
CN111528632B (en) Intelligent device for displaying articles and method of article monitoring
CN111507702A (en) Self-service shopping method for unmanned supermarket, computer readable storage medium and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination