WO2019154112A1 - Method and apparatus for detecting entry and exit state - Google Patents
- Publication number
- WO2019154112A1 (PCT/CN2019/073120)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- target object
- state
- distance
- image acquisition
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
Definitions
- the present application relates to the field of computer technology, and in particular, to a method and apparatus for detecting an entry and exit state.
- the monitoring system can automatically determine whether the preset object has entered or left the venue, so that the business system can provide an appropriate service according to the entry and/or exit status of the preset object.
- the embodiment of the present application provides a method and device for detecting an entry and exit state, and a corresponding application system, which can automatically detect the entry and exit state of a preset object.
- an embodiment of the present application provides a method for detecting an entry and exit state, which is performed by a monitoring detection system, where the method includes:
- the first preset condition includes at least one of the following:
- the distance between the object and the target object is less than a first preset threshold;
- the statistical value of the distance between the object and the target object in a first preset time period is less than a second preset threshold;
- the change in the distance between the object and the target object within a first preset time interval is greater than a third preset threshold;
- the distance between the object and the target object is greater than a fourth preset threshold;
- the statistical value of the distance between the object and the target object in a second preset time period is greater than a fifth preset threshold.
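The five alternative conditions above can be sketched as a single predicate over a recent window of distance samples. A minimal sketch, assuming hypothetical threshold values, since the embodiment leaves them to the application scenario:

```python
from statistics import mean

# Hypothetical thresholds (metres); the embodiment does not fix them.
FIRST_THRESHOLD = 0.8    # "close enough" to the target object
SECOND_THRESHOLD = 1.0   # windowed statistic, approach case
THIRD_THRESHOLD = 0.3    # change within the time interval
FOURTH_THRESHOLD = 2.5   # "far enough" from the target object
FIFTH_THRESHOLD = 2.0    # windowed statistic, departure case

def first_preset_condition_met(distances):
    """Check the five alternative conditions against a recent window
    of distance samples (oldest first, most recent last)."""
    if not distances:
        return False
    latest = distances[-1]
    window_stat = mean(distances)            # overall measure in window
    change = abs(distances[-1] - distances[0])  # movement in interval
    return (latest < FIRST_THRESHOLD
            or window_stat < SECOND_THRESHOLD
            or change > THIRD_THRESHOLD
            or latest > FOURTH_THRESHOLD
            or window_stat > FIFTH_THRESHOLD)
```

Any one condition being met is enough to send the first instruction; in practice a deployment might combine only a subset, as the later discussion of condition combinations notes.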
- before determining the state of the target object according to the recognition result obtained by performing object recognition on the image in the image collection area, the method further includes:
- the object recognition system is a cloud object identification system.
- the method for detecting the exit state includes at least one of the following:
- the recognition result includes the number of preset objects included in the image in the image acquisition area.
- determining the state of the target object according to the recognition result obtained by performing object recognition on the image in the image collection area includes at least one of the following:
- Determining that the state of the target object is an admission state when the number of the preset objects included in the image in the image collection area is greater than zero;
- after determining the state of the target object, the method further includes:
- a second instruction is sent to the image acquisition system corresponding to the distance detection range to turn off the image acquisition system or to switch the image acquisition system to a standby mode.
- the method for sending the second instruction to the image collection system corresponding to the distance detection range includes:
- the second preset condition includes: a difference between the distance between the object and the target object in the second preset time interval is less than a sixth preset threshold.
- after determining the state of the target object, the method further includes:
- the first preset condition is determined according to a state of the target object.
- the embodiment of the present application provides a method for detecting an entry and exit state, which is performed by an image acquisition system, and the method includes:
- the object is located within a distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- the monitoring detection system determines a state of the target object according to a recognition result obtained by performing object recognition on an image in the image collection area, where the state of the target object includes an entry state and/or an exit state.
- after acquiring the image in the image collection area of the image acquisition system, the method further includes:
- the image acquisition system is turned off or switched to the standby mode.
- the embodiment of the present application provides an entry and exit state detecting device, which is applied to a monitoring detection system, and the device includes:
- a distance monitoring module that monitors the distance between an object within the distance detection range and the target object;
- a first instruction sending module that, when the distance between the object and the target object satisfies a first preset condition, sends a first instruction to the image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image in its image acquisition area;
- a state determining module that determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image capturing area, where the state of the target object includes an entry state and/or an exit state.
- the embodiment of the present application provides an entry and exit state detecting device, which is applied to an image collecting system, and the device includes:
- a first instruction receiving module that receives a first instruction sent by the monitoring detection system when the distance between the object and the target object satisfies the first preset condition; the object is located within a distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- an image acquisition module that acquires an image in the image collection area of the image acquisition system, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image collection area;
- the state of the target object includes an entry state and/or an exit state.
- an electronic device including:
- a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the following operations:
- an embodiment of the present application provides a computer readable storage medium storing one or more programs that, when executed by an electronic device including multiple applications, cause the electronic device to perform the following operations:
- an electronic device including:
- a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the following operations:
- the object is located within a distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- the monitoring detection system determines a state of the target object according to a recognition result obtained by performing object recognition on an image in the image collection area, where the state of the target object includes an entry state and/or an exit state.
- an embodiment of the present application provides a computer readable storage medium storing one or more programs that, when executed by an electronic device including multiple applications, cause the electronic device to perform the following operations:
- the object is located within a distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- the monitoring detection system determines a state of the target object according to a recognition result obtained by performing object recognition on an image in the image collection area, where the state of the target object includes an entry state and/or an exit state.
- an embodiment of the present application provides an application system, including a monitoring detection system, an image collection system, an object recognition system, and a service system, where:
- the monitoring detection system monitors a distance between an object within the distance detection range and the target object; when the distance between the object and the target object satisfies the first preset condition, it sends a first instruction to the image acquisition system corresponding to the distance detection range, to activate the image acquisition system to acquire an image in its image acquisition area; and it further determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state;
- the image acquisition system receives a first instruction, where the first instruction is sent by the monitoring detection system when the distance between the object and the target object satisfies a first preset condition, the object is located within the distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system; the image acquisition system also acquires an image in its image acquisition area, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image collection area, where the state of the target object includes an entry state and/or an exit state;
- the object recognition system receives an image in the image collection area, performs object recognition on the image in the image collection area, and obtains the recognition result; and returns the recognition result;
- the business system receives a status of the target object and determines a business process corresponding to the status of the target object.
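The four-system flow above (monitoring detection → image acquisition → object recognition → business system) can be sketched as one detection cycle. Every component interface below (`read_distance`, `recognize`, `handle`, and so on) is a hypothetical stand-in, not an API defined by the application:

```python
def run_cycle(monitor, camera, recognizer, business):
    """One detection cycle: monitor the distance, conditionally activate
    the camera, recognize objects in the captured image, then report the
    resulting state to the business system."""
    distance = monitor.read_distance()
    if not monitor.first_condition_met(distance):
        return None                        # keep monitoring, camera stays off
    camera.activate()                      # the "first instruction"
    image = camera.capture()
    count = recognizer.recognize(image)    # e.g. number of preset objects
    state = "entry" if count > 0 else "exit"
    business.handle(state)                 # business system picks a process
    return state
```

The recognizer may run locally at the target object or in the cloud; only its result (here, a count of preset objects) matters to the state decision.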
- an image acquisition system may be used to acquire an image in an image collection area, and the collected image is then subjected to object recognition, so that the state of the target object is determined according to the recognition result. Therefore, it is possible to determine more accurately whether the preset object has entered and/or left.
- the image acquisition system is activated to acquire the image in the image acquisition area only when the distance condition is satisfied. Therefore, system power consumption can be effectively reduced to meet application requirements.
- FIG. 1 is a schematic structural diagram of an application system to which an embodiment of the present application is applied;
- FIG. 2 is a schematic flowchart of a method for detecting an entry and exit state performed by a monitoring detection system according to an embodiment of the present application;
- FIG. 3 is a schematic diagram of a self-service restaurant scenario to which an embodiment of the present application is applied;
- FIG. 4 is a schematic flowchart of a method for detecting an entry and exit state performed by an image acquisition system according to an embodiment of the present application;
- FIG. 5 is a schematic structural diagram of an entry and exit state detecting apparatus applied to a monitoring detection system according to an embodiment of the present application;
- FIG. 6 is a schematic structural diagram of an entry and exit state detecting device applied to an image acquisition system according to an embodiment of the present application;
- FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
- FIG. 8 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
- FIG. 9 is a schematic flowchart of a method for detecting an entry and exit state performed by an application system according to an embodiment of the present application.
- FIG. 1 is a schematic diagram showing the architecture of an application system capable of automatically detecting the entry and exit state of a preset object. It can be understood that the application system can be applied to various application scenarios, for example, self-service restaurants, vending cabinets, automatic access control, and more.
- the application system may include a monitoring detection system 100, an image acquisition system 200, and a business system 300.
- the monitoring detection system 100 can monitor the distance between the object entering the distance detection range and the target object 500 to activate the image acquisition system for image acquisition when the distance meets certain conditions.
- the image acquisition system 200 may acquire images within the image acquisition area after being activated to determine whether the preset object is included in the image acquisition area based on the recognition result of the object recognition of the images.
- the monitoring detection system 100 and/or the image acquisition system 200 can transmit images within the image acquisition area to the recognition system 400 for object recognition.
- if the image in the image capturing area includes a preset object, it may be determined that the target object is in the entry state; if the image in the image capturing area does not include the preset object, the target object may be determined to be in the exit state. Based on this, the status information of the target object can be further sent to the service system 300, so that the service system 300 determines the corresponding business process according to the status of the target object.
- the identification system 400 for object recognition of an image may be an identification system installed locally at the target object or a cloud recognition system installed at a remote location.
- the method for detecting the entry and exit state performed by the monitoring detection system may specifically include the following steps:
- S101 Monitor the distance between the object within the distance detection range and the target object.
- the monitoring detection system may use a distance sensing module to detect, in real time, the distance between the target object and an object appearing within the distance detection range of the distance sensing module.
- the distance sensing module may be disposed at the target object, and the distance between the object and the target object is obtained by detecting the distance between the object and the distance sensing module.
- the distance sensing module may adopt one or more of an ultrasonic ranging sensor, a laser ranging sensor, and an infrared ranging sensor, as long as the accuracy of the distance monitoring and the specific requirements of the application scenario can be met.
- the ultrasonic ranging sensor includes a transmitting unit for transmitting ultrasonic waves and a receiving unit for receiving ultrasonic echoes, and the ultrasonic echo ranging principle can be used to detect the distance between two objects.
- the emitted ultrasonic wave bounces back after encountering an occluding object (which may be an object or a human body). Therefore, the ultrasonic ranging sensor can calculate the distance traveled by the ultrasonic wave from the time difference between transmitting the ultrasonic wave and receiving the ultrasonic echo, thereby obtaining the distance to the occluding object.
- Ultrasonic distance measuring sensors have the advantages of small dead zone, accurate measurement, no contact, and low cost.
- the ultrasonic ranging sensor may be disposed on the target object to monitor the distance between the object within the distance detection range and the target object.
- the specific arrangement position and direction of the ultrasonic ranging sensor are adjusted so that the transmitting unit emits ultrasonic waves in a certain direction, and timing starts at the moment of transmission. Because an ultrasonic wave propagating in the air returns immediately upon hitting an obstacle, the receiving unit stops timing when it receives the reflected wave (i.e., the ultrasonic echo).
- if the ultrasonic propagation velocity is v and the time difference between the transmitting unit emitting the ultrasonic wave and the receiving unit receiving the echo is t, the distance to the obstacle is d = v·t/2.
- although the propagation speed of the ultrasonic wave is related to temperature, the speed changes by less than 1% when the temperature changes by 5 degrees Celsius, so the sound velocity can be considered fixed when the temperature does not vary much. This accuracy is usually sufficient for indoor application scenarios such as self-service restaurants and vending cabinets.
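The echo-timing calculation above can be sketched as follows, using the common approximation v ≈ 331.4 + 0.6·T m/s for the speed of sound in air at temperature T °C; the helper names are illustrative:

```python
def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s). Over a 5 degree C change
    this varies by well under 1%, matching the note above."""
    return 331.4 + 0.6 * temp_c

def echo_distance(t_seconds: float, temp_c: float = 20.0) -> float:
    """Distance from sensor to occluding object: the pulse travels there
    and back, so d = v * t / 2."""
    return speed_of_sound(temp_c) * t_seconds / 2.0
```

For example, an echo received 10 ms after transmission at 20 °C corresponds to a distance of about 1.72 m.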
- in addition to the ultrasonic ranging sensor, it is also possible to measure and monitor the distance using a laser ranging sensor and/or an infrared ranging sensor.
- the principle of optical ranging is similar to that of acoustic ranging; the main difference is that the time difference is measured between emitted and received light. Since the light waves on which a laser ranging sensor depends are strongly affected by sunlight and the like, it may be more susceptible to interference during the day. Therefore, the laser ranging sensor is more suitable for use at night, for example, in nighttime self-service access control. When illumination is insufficient, an infrared ranging sensor can also be selected to achieve better ranging accuracy.
- a plurality of ranging sensors can be used in combination to meet the requirements of different measurement precisions and application scenarios, which is not limited in this embodiment of the present application.
- the target object may be a dining table 501 in a restaurant (or a buffet cabinet in a restaurant), as shown in FIG. 3. Whether the corresponding business process is started is determined by detecting whether a human body (i.e., the preset object is a human body) approaches or leaves the dining table.
- the ultrasonic ranging sensor 502 (or another type of ranging sensor) may be disposed on the target object (i.e., the dining table 501), and the direction in which the transmitting unit emits ultrasonic waves is adjusted toward the direction from which a person is most likely to approach or leave the dining table; for example, sensors can be placed around the table.
- the ultrasonic ranging sensor 502 can be mounted on the side of the dining table with the ultrasonic wave emitted in the horizontal direction, so that a human body enters the distance detection range 506 of the sensor when approaching or leaving the table. It can be understood that, to ensure the human body can be monitored when approaching or leaving the table from any direction, ranging sensors can be arranged around the table.
- signals (sound waves or light waves) emitted by multiple ranging sensors may interfere with each other.
- for example, ranging sensors placed on tables on both sides of an aisle may have overlapping distance detection ranges.
- in that case, the ultrasonic waves emitted by one ranging sensor may be received by another ranging sensor, thereby affecting the accuracy of the distance measurement.
- to avoid such interference, any of several methods may be adopted. For example, multiple ranging sensors may be controlled to transmit signals in turn; as another example, when the distance determination is performed, detection values exceeding a certain threshold may be automatically discarded.
- as a further example, the ranging sensor that transmits a signal at the current time can be determined according to the user's (here, specifically the diner's) reservation status. For example, when the user enters the self-service restaurant, a reservation is made by scanning a code, and the table number of the table at which the meal will be prepared is determined (which can be embodied as the ID of the target object); only the ranging sensor of the table corresponding to that table number is then activated to transmit signals.
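The interference-avoidance measures above (transmitting in turn, discarding out-of-range readings, and activating only reserved tables' sensors) could be sketched as follows; the sensor IDs, the cutoff value, and the callable-per-sensor interface are all assumptions:

```python
MAX_VALID_DISTANCE = 5.0  # hypothetical cutoff (metres) for discarding readings

def poll_round_robin(sensors, active_ids=None):
    """Fire ranging sensors one after another rather than simultaneously,
    and discard readings beyond the cutoff (e.g. an echo picked up from a
    sensor across the aisle). `active_ids`, when given, restricts polling
    to the sensors of reserved tables."""
    readings = {}
    for sensor_id, read_fn in sensors.items():
        if active_ids is not None and sensor_id not in active_ids:
            continue  # only the reserved table's sensor transmits
        value = read_fn()  # transmit a pulse, then time the echo
        if value <= MAX_VALID_DISTANCE:
            readings[sensor_id] = value
    return readings
```

Sequential iteration here stands in for the time-multiplexed transmission schedule; real hardware would enforce the turn-taking at the driver level.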
- the distance between the object and the target object can be monitored by performing step S101.
- the condition for activating the image acquisition system is the first preset condition.
- step S107 is performed to determine whether the distance between the object and the target object satisfies the first preset condition. If the first preset condition is met, step S103 is further performed; if not, the process returns to step S101 to continue monitoring.
- the content of the first preset condition may be different in different application scenarios.
- the first preset condition may include at least one of the following:
- this preset condition can be understood as follows: the distance being smaller than the first preset threshold indicates that the object is close enough to the target object, and the monitored object may need to use the service corresponding to the target object.
- taking the application scenario of a self-service restaurant as an example, when an object (which may be a human body, such as a diner, or an object, such as a cart for clearing table leftovers) approaches the target object (here embodied as a dining table), it may be a diner who needs to dine at this table.
- at this time, the monitoring detection system can activate the image acquisition system for image acquisition, and then use the recognition system to perform object recognition on the collected image, thereby determining whether the object close to the table is a preset object (here embodied as a human body). If the object close to the table is a human body, the table may be used by a diner; it can be understood that the target object enters the entry state, and the business system can then proceed to a business process such as ordering. If the object close to the table is not a human body, no diner needs to use the table; it can be understood that the target object remains in the exit state, and there is no need to enter the business system.
- the monitored value of the distance between the object and the target object may contain some glitch signals, which may affect the judgment result. Therefore, a statistical value of the distance between the object and the target object over a certain period of time (for example, the first preset time period) can be calculated; this statistical value reflects an overall measurement of the distance within the time window (i.e., the first preset time period), thereby eliminating the effect of glitch signals on the judgment result.
- the statistical value may be the average value or the median of the distance measurements in the first preset time period.
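The windowed statistic described above might look like this in code; the window size and the `DistanceWindow` class are illustrative choices, not part of the embodiment:

```python
from collections import deque
from statistics import mean, median

class DistanceWindow:
    """Keep the last N distance samples and expose window statistics,
    so that a single glitch reading does not flip the activation decision."""
    def __init__(self, size: int = 10):
        self.samples = deque(maxlen=size)  # old samples drop off automatically

    def add(self, distance: float) -> None:
        self.samples.append(distance)

    def stat(self, kind: str = "median") -> float:
        values = list(self.samples)
        return median(values) if kind == "median" else mean(values)
```

The median is the more robust choice against isolated glitches: one spike in the window shifts the mean but usually leaves the median untouched.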
- the statistical value of the distance in the first preset time period being less than the second preset threshold can be understood as meaning that, within a certain time window, the object is close enough to the target object and may need to use the business corresponding to the target object. Taking the application scenario of a self-service container as an example, when an object (which may be a human body, such as a container manager, or an object, such as a cargo box) approaches the target object (here embodied as a container), it may be necessary to load this container.
- at this time, the monitoring detection system can activate the image acquisition system for image acquisition, and then use the recognition system to perform object recognition on the acquired image, to determine whether the object approaching the container is a preset object (here embodied as a cargo box). If the object close to the container is a cargo box, the container needs to be loaded; it can be understood that the target object is in the entry state, and the business system can then proceed to business processes such as warehousing and loading or unloading. If the object close to the container is not a cargo box, the container does not need to be loaded; it can be understood that the target object remains in the exit state, and there is no need to enter the business system.
- this preset condition can be understood as follows: if the distance between the monitored object (i.e., an object located within the distance detection range) and the target object remains stable over a certain time interval (for example, the first preset time interval), that is, the change within the interval is small enough (for example, not greater than the third preset threshold), the object is likely not moving, or its movement has not reached a preset level.
- for example, after a diner approaches the table and sits down in front of it to dine, the distance between the diner and the table is usually small and basically stable.
- the table will remain in the entry state until the diner finishes eating and leaves it. Therefore, as long as the distance between the diner and the dining table does not change sufficiently, the monitoring detection system does not need to activate the image acquisition system to collect images and re-judge the entry and exit state of the table.
- in practice, this condition can often be combined with other conditions, so as to avoid frequently activating the image acquisition system while the entry and exit state of the target object is unchanged, which helps further reduce system power consumption.
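Combining the stability condition with a proximity condition, as suggested above, could look like the following sketch; both threshold values are hypothetical:

```python
def should_activate(latest: float, change: float,
                    near: float = 0.8, moved: float = 0.3) -> bool:
    """Wake the camera only when the object is close AND has actually
    moved within the interval, so a seated diner whose distance stays
    stable does not retrigger image acquisition."""
    return latest < near and change > moved
```

With this combination, a diner who sat down (close, but stable) keeps the camera off, while a person newly approaching the table (close, and moving) triggers acquisition.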
- the distance between the object and the target object is greater than a fourth predetermined threshold.
- this preset condition can be understood as follows: the distance being greater than the fourth preset threshold indicates that the object is far enough from the target object, and the monitored object may no longer need to use the service corresponding to the target object.
- taking the self-service restaurant as an example again, the object may be a human body, such as a diner, or an object, such as a cart for clearing table leftovers, and the target object is here embodied as a dining table.
- at this time, the monitoring detection system can activate the image acquisition system for image acquisition, and then use the recognition system to perform object recognition on the collected image, thereby determining whether the object far from the table is a preset object (here embodied as a human body). If the object far from the table is a human body, the diner no longer needs to use the table; it can be understood that the target object enters the exit state, and deduction of payment can be further performed according to the business process corresponding to the exit state. On the other hand, if the object far from the table is not a human body, the diner has not left the table, and it can be understood that there is no need to enter the business system to adjust the business process.
- the statistical value of the distance between the object and the target object in the second preset time period is greater than the fifth preset threshold.
- the statistical value of the distance in the second preset time period being greater than the fifth preset threshold can be understood as meaning that, within a certain time window, the object is far enough from the target object and may no longer need to use the business corresponding to the target object. Therefore, the image acquisition system can be activated to perform image acquisition, and object recognition is then performed on the collected image to determine whether a preset object is included in the image collection area: if the image collection area still contains the preset object, it can be understood that the preset object is still relatively close to the target object and the target object is still in the entry state; if no preset object is included in the image collection area, it can be understood that the preset object has moved far away from the target object, and the target object can be considered to be in the exit state.
- the first preset condition used by the monitoring detection system to determine whether to activate the image acquisition system may be a combination of the foregoing conditions: whether the distance changes greatly (the distance difference is greater than a certain threshold), the distance is far (the distance value, or its mean or median in the time window, is greater than a certain threshold), or the distance is close (the distance value, or its mean or median in the time window, is less than a certain threshold), activation of the image acquisition system for image acquisition may be required.
- an image acquisition system can be implemented by using an image capture device such as a camera, an HD camera, or an infrared camera.
- the specific types, specifications, and models may be determined according to actual application scenarios, which are not limited in this embodiment.
- the arrangement of the image acquisition device in the image acquisition system is related to the arrangement of the ranging sensor, and the image acquisition system has a corresponding relationship with the distance detection range of the ranging sensor.
- the image acquisition area of the image acquisition system and the distance detection range of the ranging sensor should overlap as much as possible. The effect achieved is that, when the distance between an object monitored within the distance detection range and the target object satisfies the condition, the image acquisition system corresponding to that distance detection range is activated, so that the image acquisition system can capture images in the image acquisition area.
- the images in the collected image acquisition area are often included. The object being listened to (unless the object being listened to while the image was captured has left the image capture area).
- the distance detection range 506 of the ultrasonic ranging sensor 502 largely overlaps the image acquisition area 505 of the image acquisition system 503 (which may be a camera); the position and angle of the camera should be such that a diner is still inside the image acquisition area after sitting down.
- the correspondence between image acquisition systems and the distance detection ranges of ranging sensors may be one-to-one, one-to-many, or many-to-one.
- the camera used in the image acquisition system may be mounted at a fixed angle, or its angle may be adjusted under the control of the monitoring system. For example, when the ranging sensor detects an object whose distance meets the preset requirement within the distance detection range, the camera is activated and its angle is adjusted until the overlap between the image acquisition area and the distance detection range of the ranging sensor meets the requirement.
- when it is determined that the image acquisition system needs to be activated, the camera may be started directly and controlled to capture an image; alternatively, the camera may be started when one condition is met and remain in standby mode after starting, and only be switched to working mode to capture an image in the image acquisition area when a further condition is met.
- the first instruction sent by the monitoring detection system to the image acquisition system is used to activate the image acquisition system; after activation, the image acquisition system may capture an image in its image acquisition area either directly or once certain conditions are met.
- the image acquisition system may send the captured image directly to the recognition system, or return it to the monitoring detection system, which then forwards the image in the image acquisition area to the recognition system for object recognition.
- the monitoring detection system may send the image in the image acquisition area to the object recognition system, which performs object recognition on the image in the image acquisition area to obtain a recognition result; the monitoring detection system then receives the recognition result returned by the object recognition system and performs step S105.
- the recognition system performing object recognition may be deployed locally at the target object or as a remote cloud recognition system; with a cloud recognition system, multiple target objects can share a common recognition service, which helps reduce the deployment cost of the whole application system.
- the recognition system may use a general-purpose object detection algorithm such as YOLO (You Only Look Once), Fast R-CNN, or SSD; recognition models for different target objects can be trained with different training images, and the model can be built and trained by common methods.
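A detector of the kind named above emits raw detections (class label, confidence, bounding box). As a hedged sketch of how such output might be reduced to the recognition result the monitoring system consumes, the class name `"person"`, the tuple layout, and the confidence cutoff below are all assumptions for illustration:

```python
def recognition_result(detections, preset_class="person", min_conf=0.5):
    """Reduce raw detector output, a list of (class, confidence, bbox)
    tuples, to the two result fields described in the text: whether the
    image contains the preset object, and how many instances it contains."""
    hits = [d for d in detections if d[0] == preset_class and d[1] >= min_conf]
    return {"contains_preset": bool(hits), "count": len(hits)}

dets = [("person", 0.91, (10, 20, 80, 200)),
        ("chair", 0.88, (0, 0, 50, 50)),
        ("person", 0.42, (5, 5, 30, 60))]   # below cutoff, ignored
print(recognition_result(dets))  # {'contains_preset': True, 'count': 1}
```

Either field is enough for the state decision in S105; returning both mirrors the two alternatives the recognition result may include.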
- S105: Determine the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the recognition result may include at least one of the following: a conclusion on whether the image in the image acquisition area contains a preset object; the number of preset objects contained in the image in the image acquisition area.
- depending on the content of the recognition result, determining the state may include at least one of the following: when the image in the image acquisition area contains the preset object, determining that the state of the target object is the entry state; when the image in the image acquisition area does not contain the preset object, determining that the state of the target object is the exit state.
- when performing distance monitoring, the monitoring detection system does not distinguish the specific type of object; it only decides whether to activate the image acquisition system based on the distance between the object and the target object. After object recognition, the state of the target object can be further determined as the entry state or the exit state. On this basis, the state of the target object can be sent to the business system, so that the business system can determine the business process corresponding to that state.
- a second instruction is sent by the monitoring detection system to the image acquisition system corresponding to the distance detection range, so as to turn off the image acquisition system or switch it to standby mode.
- the second instruction may be sent when the monitored distance between the object and the target object has stabilized. Specifically, when the distance between the object and the target object satisfies a second preset condition, the second instruction may be sent to the image acquisition system corresponding to the distance detection range, where the second preset condition includes: the difference between the distances of the object to the target object at a second preset time interval is less than a sixth preset threshold.
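The second preset condition is simply a stability test on the distance; as a hedged sketch, with the tolerance value assumed for illustration:

```python
SIXTH_THRESHOLD = 0.1  # metres; assumed stability tolerance

def should_deactivate(d_now, d_earlier):
    """Second preset condition: the change in distance over the second
    preset time interval is below the sixth preset threshold, i.e. the
    object has settled, so the camera can be turned off or put on standby."""
    return abs(d_now - d_earlier) < SIXTH_THRESHOLD

print(should_deactivate(0.72, 0.70))  # True: the diner has settled
print(should_deactivate(1.90, 0.70))  # False: still moving
```

This complements the activation predicate: the first instruction fires on change or extreme distance, the second on stillness.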
- the state of the target object may also be recorded, and the first preset condition used to decide whether to activate the image acquisition system may be further determined according to the current state of the target object. The monitoring detection system then only needs to check whether an object appears that may change the state of the target object, so the first preset condition can be restricted to just the conditions that may change that state.
- for example, when the target object is currently in the entry state, the first preset condition may be at least one of the following:
- the difference between the distances of the object to the target object at the first preset time interval is greater than a third preset threshold;
- the distance between the object and the target object is greater than a fourth preset threshold
- the statistical value of the distance between the object and the target object in the second preset time period is greater than the fifth preset threshold.
- correspondingly, when the target object is currently in the exit state, the first preset condition may be at least one of the following:
- the distance between the object and the target object is less than a first preset threshold
- the statistical value of the distance between the object and the target object in the first preset time period is less than the second preset threshold
- the difference between the distances of the object to the target object at the first preset time interval is greater than the third preset threshold.
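The state-dependent selection above can be sketched as picking a condition set per recorded state. This is a hypothetical illustration: the condition names, and the pairing of threshold sets with states, are assumptions read off the surrounding text.

```python
def conditions_for(state):
    """Pick the first-preset-condition variants worth checking for the
    current recorded state. In the entry state only departure-style
    conditions (distance growing or large) can change the state; in the
    exit state only approach-style conditions (distance shrinking or
    small) can. The interval-difference check appears in both sets."""
    if state == "entry":
        return ["interval_diff_gt_third", "distance_gt_fourth",
                "window_stat_gt_fifth"]
    return ["distance_lt_first", "window_stat_lt_second",
            "interval_diff_gt_third"]

print(conditions_for("entry"))
print(conditions_for("exit"))
```

Restricting the checks this way is what lets the monitoring detection system ignore objects that cannot change the current state.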
- the image acquisition system can thus be used to capture an image in the image acquisition area, object recognition is performed on the captured image, and the state of the target object is determined from the recognition result; it is therefore possible to determine more accurately whether the preset object has entered and/or left. Moreover, the image acquisition system is activated to capture images only when the distance satisfies the preset condition, so system power consumption can be effectively reduced to meet application requirements.
- an embodiment of the present application further provides an entry and exit state detection method performed by an image acquisition system; the method may include:
- S201: Receive a first instruction, where the first instruction is sent by the monitoring detection system when the distance between an object and the target object satisfies the first preset condition, the object is located within the distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- S203: Acquire an image in the image acquisition area of the image acquisition system, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the image acquisition system may further perform the following steps: receive a second instruction, sent by the monitoring detection system after the state of the target object has been determined, and, according to the second instruction, turn off or switch to standby mode.
- the image acquisition system can thus be used to capture an image in the image acquisition area, object recognition is performed on the captured image, and the state of the target object is determined from the recognition result; it is therefore possible to determine more accurately whether the preset object has entered and/or left. Moreover, the image acquisition system is activated to capture images only when the distance satisfies the preset condition, so system power consumption can be effectively reduced to meet application requirements.
- taking the target object to be a dining table as an example, the business system can be embodied as a multimedia interactive system.
- the interactive system may mainly consist of a motion collector, a data processor, and a display screen. The hardware of the interactive system can be arranged around the dining table for convenient operation and viewing by the diners, or an ordinary dining table can itself serve as the display carrier by deploying a touch screen, a gesture recognition device, and the like on it. The motion collector captures the actions of the users (i.e., the diners), and the table top is used as a screen to display the feedback of the interactive system's data processing, realizing an intelligent dining table through which the diners interact with the business system.
- the business system can enter the ordering process: the menu can be displayed on the touch screen embedded in the table top, and the diners select dishes by tapping the screen, completing self-service operations such as ordering and adding dishes; they can even view the real-time progress of their dishes and watch the cooking process through the screen.
- the intelligent dining table can also record the diners' identification information and frequently ordered dishes, and then provide personalized recommendations for the diners.
- the business system can enter the payment process. Specifically, the touch screen can be turned off, and automatic deduction of the diners' bill amount can be performed according to the identity information they provide (such as account information, an identity ID, and the like).
- the business system can also enter a reminder process, for example reminding service personnel to clean the table.
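The three business flows just described hang off the reported state. As a hedged sketch of that dispatch, with flow names invented for illustration:

```python
def business_process(state):
    """Dispatch the smart dining table's business flow on the reported
    state: entry starts the ordering flow; exit triggers automatic
    payment and a table-cleaning reminder for the staff."""
    if state == "entry":
        return ["show_menu", "take_orders", "track_dish_progress"]
    return ["close_touchscreen", "auto_deduct_bill", "remind_staff_to_clean"]

print(business_process("entry"))
print(business_process("exit"))
```

In a deployment each flow name would correspond to an actual subsystem call rather than a string.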
- an embodiment of the present application further provides an entry and exit state detecting device applied to the monitoring detection system 100; the device includes:
- a distance monitoring module 101, which monitors the distance between an object within the distance detection range and the target object;
- a first instruction sending module 103, which, when the distance between the object and the target object satisfies the first preset condition, sends a first instruction to the image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image in its image acquisition area;
- a state determining module 105, which determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the entry and exit state detecting device in this embodiment corresponds to the entry and exit state detection method performed by the monitoring detection system in the foregoing embodiment; the related content of the foregoing embodiment applies to this embodiment and is not repeated here.
- an embodiment of the present application further provides an entry and exit state detecting device applied to the image acquisition system 200; the device includes:
- a first instruction receiving module 201, which receives the first instruction sent by the monitoring detection system when the distance between an object and the target object satisfies the first preset condition, where the object is located within the distance detection range of the monitoring detection system and the distance detection range corresponds to the image acquisition system;
- an image acquiring module 203, which acquires an image in the image acquisition area of the image acquisition system, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the entry and exit state detecting device in this embodiment corresponds to the entry and exit state detection method performed by the image acquisition system in the foregoing embodiment; the related content of the foregoing embodiment applies to this embodiment and is not repeated here.
- FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
- the electronic device includes a processor and, optionally, an internal bus, a network interface, and a memory.
- the memory may include an internal memory, such as a high-speed random access memory (RAM), and may also include a non-volatile memory, such as at least one disk memory.
- the electronic device may also include hardware required for other services.
- the processor, the network interface, and the memory may be interconnected by an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, or an EISA (Extended Industry Standard Architecture) bus.
- the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one double-headed arrow is shown in Figure 7, but it does not mean that there is only one bus or one type of bus.
- the program can include program code, the program code including computer operating instructions.
- the memory can include both internal memory and non-volatile memory, and provides instructions and data to the processor.
- the processor reads the corresponding computer program from the non-volatile memory into the internal memory and runs it, forming an entry and exit state detecting device at the logical level.
- the processor executes the program stored in the memory and is specifically used to perform the following operations:
- determining the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the method performed by the entry and exit state detecting apparatus disclosed in the embodiment shown in FIG. 2 of the present application may be applied to a processor or implemented by a processor.
- the processor may be an integrated circuit chip with signal processing capabilities.
- each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
- the above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; or a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or any conventional processor.
- the steps of the method disclosed in the embodiments of the present application may be directly implemented by the hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor.
- the software module can be located in a conventional storage medium such as random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, and the like.
- the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
- the electronic device can also perform the method performed by the entry and exit state detecting device in FIG. 1 and realize the functions of the entry and exit state detecting device in the embodiment shown in FIG. 1, which are not repeated here.
- an embodiment of the present application further provides a computer-readable storage medium storing one or more programs, the one or more programs including instructions that, when executed by an electronic device including a plurality of application programs, enable the electronic device to perform the method performed by the entry and exit state detecting device in the embodiment shown in FIG.
- determining the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the image acquisition system can thus be used to capture an image in the image acquisition area, object recognition is performed on the captured image, and the state of the target object is determined from the recognition result; it is therefore possible to determine more accurately whether the preset object has entered and/or left. Moreover, the image acquisition system is activated to capture images only when the distance satisfies the preset condition, so system power consumption can be effectively reduced to meet application requirements.
- FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
- the electronic device includes a processor and, optionally, an internal bus, a network interface, and a memory.
- the memory may include an internal memory, such as a high-speed random access memory (RAM), and may also include a non-volatile memory, such as at least one disk memory.
- the electronic device may also include hardware required for other services.
- the processor, the network interface, and the memory may be interconnected by an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, or an EISA (Extended Industry Standard Architecture) bus.
- the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one double-headed arrow is shown in Figure 8, but it does not mean that there is only one bus or one type of bus.
- the program can include program code, the program code including computer operating instructions.
- the memory can include both internal memory and non-volatile memory, and provides instructions and data to the processor.
- the processor reads the corresponding computer program from the non-volatile memory into the internal memory and runs it, forming an entry and exit state detecting device at the logical level.
- the processor executes the program stored in the memory and is specifically configured to perform the following operations:
- receiving a first instruction, where the object is located within the distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- acquiring an image in the image acquisition area of the image acquisition system, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the method performed by the entry and exit state monitoring device disclosed in the embodiment shown in FIG. 4 of the present application may be applied to a processor or implemented by a processor.
- the processor may be an integrated circuit chip with signal processing capabilities.
- each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
- the above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; or a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or any conventional processor.
- the steps of the method disclosed in the embodiments of the present application may be directly implemented by the hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor.
- the software module can be located in a conventional storage medium such as random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, and the like.
- the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
- the electronic device can also perform the method performed by the entry and exit state monitoring device in FIG. 4 and realize the functions of the entry and exit state monitoring device in the embodiment shown in FIG. 4, which are not repeated here.
- an embodiment of the present application further provides a computer-readable storage medium storing one or more programs, the one or more programs including instructions that, when executed by an electronic device including a plurality of application programs, enable the electronic device to perform the method performed by the entry and exit state monitoring device in the embodiment shown in FIG. 4, and specifically to perform:
- receiving a first instruction, where the object is located within the distance detection range of the monitoring detection system, and the distance detection range corresponds to the image acquisition system;
- acquiring an image in the image acquisition area of the image acquisition system, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state.
- the image acquisition system can thus be used to capture an image in the image acquisition area, object recognition is performed on the captured image, and the state of the target object is determined from the recognition result; it is therefore possible to determine more accurately whether the preset object has entered and/or left. Moreover, the image acquisition system is activated to capture images only when the distance satisfies the preset condition, so system power consumption can be effectively reduced to meet application requirements.
- an embodiment of the present application further provides an application system, including a monitoring detection system, an image acquisition system, an object recognition system, and a business system, where:
- the monitoring detection system monitors the distance between an object within the distance detection range and the target object; when the distance between the object and the target object satisfies the first preset condition, it sends a first instruction to the image acquisition system corresponding to the distance detection range; and it determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state;
- the image acquisition system receives the first instruction, which is sent by the monitoring detection system when the distance between the object and the target object satisfies the first preset condition, where the object is located within the distance detection range of the monitoring detection system and the distance detection range corresponds to the image acquisition system; it also acquires an image in its image acquisition area, so that the monitoring detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image in the image acquisition area, where the state of the target object includes an entry state and/or an exit state;
- the object recognition system receives the image in the image acquisition area, performs object recognition on the image in the image acquisition area to obtain a recognition result, and returns the recognition result;
- the business system receives the state of the target object and determines the business process corresponding to the state of the target object.
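The four cooperating systems above can be wired together in a minimal end-to-end sketch. All components are stubbed with lambdas and an assumed change threshold; a real deployment would use a ranging sensor, a camera, and a trained detector instead.

```python
def monitor_and_detect(distance_samples, capture, recognize, notify):
    """One cycle of the application system: if the distance samples satisfy
    the (simplified) first preset condition, activate the camera, run
    object recognition, derive the state, and hand it to the business
    system via notify. Returns the state, or None if nothing fired."""
    if abs(distance_samples[-1] - distance_samples[0]) > 0.5:  # assumed threshold
        image = capture()                       # first instruction -> image system
        result = recognize(image)               # object recognition system
        state = "entry" if result["count"] > 0 else "exit"
        notify(state)                           # state -> business system
        return state
    return None

events = []
state = monitor_and_detect(
    [2.4, 1.5, 0.6],                 # someone approached the table
    capture=lambda: "frame-001",
    recognize=lambda img: {"count": 1},
    notify=events.append,
)
print(state, events)  # entry ['entry']
```

Passing the downstream systems in as callables keeps the sketch faithful to the patent's separation of monitoring, acquisition, recognition, and business components.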
- these computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction device, the instruction device implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
- these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
- in a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory.
- Memory is an example of a computer readable medium.
- computer-readable media include permanent and non-persistent, removable and non-removable media.
- information storage can be implemented by any method or technology; the information can be computer-readable instructions, data structures, program modules, or other data.
- examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
- as defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
- embodiments of the present application can be provided as a method, a system, or a computer program product. Accordingly, the present application can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the application can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
Claims (19)
- 一种入离场状态检测方法,由监听检测系统执行,所述方法包括:监听距离检测范围内的对象与目标对象之间的距离;当所述对象与目标对象之间的距离满足第一预设条件时,向与所述距离检测范围相对应的图像采集系统发送第一指令,以便激活所述图像采集系统获取其图像采集区域内的图像;根据对所述图像采集区域内的图像进行对象识别得到的识别结果,确定所述目标对象的状态,所述目标对象的状态包括入场状态和/或离场状态。
- 根据权利要求1所述方法,所述第一预设条件包括以下至少一项:所述对象与目标对象之间的距离小于第一预设阈值;所述对象与目标对象之间的距离在第一预设时间段内的统计值小于第二预设阈值;所述对象与目标对象之间的距离在第一预设时间间隔的差值大于第三预设阈值;所述对象与目标对象之间的距离大于第四预设阈值;所述对象与目标对象之间的距离在第二预设时间段内的统计值大于第五预设阈值。
- 根据权利要求1所述方法,在根据对所述图像采集区域内的图像进行对象识别得到的识别结果,确定所述目标对象的状态之前,所述方法还包括:将所述图像采集区域内的图像发送至对象识别系统,供所述对象识别系统对所述图像采集区域内的图像进行对象识别,得到所述识别结果;接收所述对象识别系统返回的所述识别结果。
- 根据权利要求3所述方法,所述对象识别系统为云端对象识别系统。
- 根据权利要求3所述方法,所述识别结果包括以下至少一项:所述图像采集区域内的图像中是否包含预设对象的判断结论;所述图像采集区域内的图像中包含的预设对象的数量。
- 根据权利要求5所述方法,根据对所述图像采集区域内的图像进行对象识别得到的识别结果,确定所述目标对象的状态,包括以下至少一项:当所述图像采集区域内的图像中包含所述预设对象时,确定所述目标对象的状态为入场状态;当所述图像采集区域内的图像中不包含所述预设对象时,确定所述目标对象的状态为离场状态;当所述图像采集区域内的图像中包含的所述预设对象的数量大于零时,确定所述目标对象的状态为入场状态;当所述图像采集区域内的图像中包含的所述预设对象的数量为零时,确定所述目标对象的状态为离场状态。
- 根据权利要求1~6之任一所述方法,在确定所述目标对象的状态之后,所述方法还包括:将所述目标对象的状态发送至业务系统,供所述业务系统确定与所述目标对象的状态相对应的业务流程。
- 根据权利要求1~6之任一所述方法,在确定所述目标对象的状态之后,所述方法还包括:向与所述距离检测范围相对应的图像采集系统发送第二指令,以便关闭所述图像采集系统或者将所述图像采集系统切换为待机模式。
- 根据权利要求8所述方法,向与所述距离检测范围相对应的图像采集系统发送第二指令,包括:当所述对象与目标对象之间的距离满足第二预设条件时,向与所述距离检测范围相对应的图像采集系统发送所述第二指令;其中,所述第二预设条件包括:所述对象与目标对象之间的距离在第二预设时间间隔的差值小于第六预设阈值。
- 根据权利要求1~6之任一所述方法,在确定所述目标对象的状态之后,所述方法还包括:记录所述目标对象的状态;根据所述目标对象的状态,确定所述第一预设条件。
- A method for detecting an active/inactive state, performed by an image acquisition system, the method comprising: receiving a first instruction, the first instruction being sent by a monitoring and detection system when a distance between an object and a target object satisfies a first preset condition, the object being located within a distance detection range of the monitoring and detection system, and the distance detection range corresponding to the image acquisition system; and acquiring an image within an image acquisition region of the image acquisition system, so that the monitoring and detection system determines the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- The method according to claim 11, wherein after acquiring the image within the image acquisition region of the image acquisition system, the method further comprises: receiving a second instruction, the second instruction being sent by the monitoring and detection system after the state of the target object is determined; and according to the second instruction, shutting down the image acquisition system or switching the image acquisition system to a standby mode.
- An active/inactive state detection apparatus, applied to a monitoring and detection system, the apparatus comprising: a distance monitoring module, configured to monitor a distance between an object within a distance detection range and a target object; a first instruction sending module, configured to send, when the distance between the object and the target object satisfies a first preset condition, a first instruction to an image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image within an image acquisition region thereof; and a state determination module, configured to determine the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- An active/inactive state detection apparatus, applied to an image acquisition system, the apparatus comprising: a first instruction receiving module, configured to receive a first instruction sent by a monitoring and detection system when a distance between an object and a target object satisfies a first preset condition, the object being located within a distance detection range of the monitoring and detection system, and the distance detection range corresponding to the image acquisition system; and an image acquisition module, configured to acquire an image within an image acquisition region of the image acquisition system, so that the monitoring and detection system determines the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- An electronic device, comprising: a processor; and a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform the following operations: monitoring a distance between an object within a distance detection range and a target object; when the distance between the object and the target object satisfies a first preset condition, sending a first instruction to an image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image within an image acquisition region thereof; and determining the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- A computer-readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the following operations: monitoring a distance between an object within a distance detection range and a target object; when the distance between the object and the target object satisfies a first preset condition, sending a first instruction to an image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image within an image acquisition region thereof; and determining the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- An electronic device, comprising: a processor; and a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform the following operations: receiving a first instruction, the first instruction being sent by a monitoring and detection system when a distance between an object and a target object satisfies a first preset condition, the object being located within a distance detection range of the monitoring and detection system, and the distance detection range corresponding to the image acquisition system; and acquiring an image within an image acquisition region of the image acquisition system, so that the monitoring and detection system determines the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- A computer-readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the following operations: receiving a first instruction, the first instruction being sent by a monitoring and detection system when a distance between an object and a target object satisfies a first preset condition, the object being located within a distance detection range of the monitoring and detection system, and the distance detection range corresponding to the image acquisition system; and acquiring an image within an image acquisition region of the image acquisition system, so that the monitoring and detection system determines the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state.
- An application system, comprising a monitoring and detection system, an image acquisition system, an object recognition system, and a business system, wherein: the monitoring and detection system monitors a distance between an object within a distance detection range and a target object; when the distance between the object and the target object satisfies a first preset condition, sends a first instruction to the image acquisition system corresponding to the distance detection range, so as to activate the image acquisition system to acquire an image within an image acquisition region thereof; and determines the state of the target object according to a recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state; the image acquisition system receives the first instruction, the first instruction being sent by the monitoring and detection system when the distance between the object and the target object satisfies the first preset condition, the object being located within the distance detection range of the monitoring and detection system, and the distance detection range corresponding to the image acquisition system; and acquires an image within the image acquisition region of the image acquisition system, so that the monitoring and detection system determines the state of the target object according to the recognition result obtained by performing object recognition on the image within the image acquisition region, the state of the target object comprising an active state and/or an inactive state; the object recognition system receives the image within the image acquisition region, performs object recognition on the image within the image acquisition region to obtain the recognition result, and returns the recognition result; and the business system receives the state of the target object and determines a business process corresponding to the state of the target object.
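The interplay of the four subsystems in this claim can be sketched as a single event handler: the monitoring side watches the distance, wakes the camera, forwards the captured image to the recognizer, derives the state, and hands it to the business system. Every class, method name, and threshold below is an illustrative stand-in for interfaces the application leaves unspecified:

```python
from typing import Optional

class ApplicationSystem:
    """Monitoring/detection side of the sketched system, wired to stand-in
    camera, recognizer, and business components."""

    def __init__(self, camera, recognizer, business, first_threshold: float = 1.5):
        self.camera = camera                    # image acquisition system
        self.recognizer = recognizer            # object recognition system
        self.business = business                # business system
        self.first_threshold = first_threshold  # first preset condition (sample value, metres)

    def on_distance_sample(self, distance: float) -> Optional[str]:
        """Handle one distance reading taken within the detection range."""
        if distance >= self.first_threshold:
            return None                          # first preset condition not met
        self.camera.activate()                   # "first instruction": wake the camera
        image = self.camera.capture()            # image of the acquisition region
        count = self.recognizer.recognize(image)  # number of preset objects found
        state = "active" if count > 0 else "inactive"
        self.business.handle_state(state)        # pick the matching business flow
        return state
```

Keeping the camera asleep until the distance condition fires is what yields the power saving the bibliographic classification (H04N 23/651, reducing power consumption) points to.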
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202005455YA SG11202005455YA (en) | 2018-02-08 | 2019-01-25 | Active/inactive state detection method and apparatus |
JP2020536646A JP6916394B2 (ja) | 2018-02-08 | 2019-01-25 | アクティブ/非アクティブ状態検出方法および装置 |
KR1020207018820A KR102366681B1 (ko) | 2018-02-08 | 2019-01-25 | 활성/비활성 상태 검출 방법 및 장치 |
EP19751705.5A EP3716142A4 (en) | 2018-02-08 | 2019-01-25 | INPUT / OUTPUT STATE DETECTION PROCESS AND DEVICE |
US16/889,622 US11102458B2 (en) | 2018-02-08 | 2020-06-01 | Active/inactive state detection method and apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810127142.2A CN108427914B (zh) | 2018-02-08 | 2018-02-08 | Active/inactive state detection method and apparatus |
CN201810127142.2 | 2018-02-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/889,622 Continuation US11102458B2 (en) | 2018-02-08 | 2020-06-01 | Active/inactive state detection method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019154112A1 true WO2019154112A1 (zh) | 2019-08-15 |
Family
ID=63156823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/073120 WO2019154112A1 (zh) | 2018-02-08 | 2019-01-25 | Active/inactive state detection method and apparatus |
Country Status (8)
Country | Link |
---|---|
US (1) | US11102458B2 (zh) |
EP (1) | EP3716142A4 (zh) |
JP (1) | JP6916394B2 (zh) |
KR (1) | KR102366681B1 (zh) |
CN (2) | CN108427914B (zh) |
SG (1) | SG11202005455YA (zh) |
TW (1) | TWI692728B (zh) |
WO (1) | WO2019154112A1 (zh) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108427914B (zh) | 2018-02-08 | 2020-08-18 | 阿里巴巴集团控股有限公司 | Active/inactive state detection method and apparatus |
CN113452926B (zh) * | 2018-10-26 | 2023-01-13 | 创新先进技术有限公司 | Image acquisition device, system, and method |
CN110018646A (zh) * | 2019-04-19 | 2019-07-16 | 北京潞电电气设备有限公司 | Power equipment operation specification monitoring system |
CN110084183A (zh) * | 2019-04-25 | 2019-08-02 | 杭州鸿雁电器有限公司 | Method and system for determining personnel entering and leaving a region |
CN112207812A (zh) * | 2019-07-12 | 2021-01-12 | 阿里巴巴集团控股有限公司 | Device control method, device, system, and storage medium |
CN110427887B (zh) * | 2019-08-02 | 2023-03-10 | 腾讯科技(深圳)有限公司 | Intelligence-based member identity recognition method and apparatus |
CN110661973B (zh) * | 2019-09-29 | 2022-04-22 | 联想(北京)有限公司 | Control method and electronic device |
CN110826506A (zh) * | 2019-11-11 | 2020-02-21 | 上海秒针网络科技有限公司 | Method and apparatus for recognizing target behavior |
CN111507318A (zh) * | 2020-07-01 | 2020-08-07 | 口碑(上海)信息技术有限公司 | Image-recognition-based store departure detection method and apparatus |
CN112906483B (zh) * | 2021-01-25 | 2024-01-23 | 中国银联股份有限公司 | Target re-identification method and apparatus, and computer-readable storage medium |
CN113091730B (zh) * | 2021-03-25 | 2023-07-07 | 杭州海康威视系统技术有限公司 | Trajectory determination method and apparatus |
CN113610004B (zh) * | 2021-08-09 | 2024-04-05 | 上海擎朗智能科技有限公司 | Image processing method, robot, and medium |
CN113701893B (zh) * | 2021-08-30 | 2023-05-02 | 杭州睿影科技有限公司 | Temperature measurement method, apparatus, device, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105915784A (zh) * | 2016-04-01 | 2016-08-31 | 纳恩博(北京)科技有限公司 | Information processing method and device |
CN107278301A (zh) * | 2016-12-30 | 2017-10-20 | 深圳前海达闼云端智能科技有限公司 | Method and device for assisting a user in finding an object |
CN107378949A (zh) * | 2017-07-22 | 2017-11-24 | 深圳市萨斯智能科技有限公司 | Method for a robot to detect an object, and robot |
CN107589707A (zh) * | 2017-08-16 | 2018-01-16 | 深圳市启惠智能科技有限公司 | Monitoring processing method, server, and computer storage medium |
CN108427914A (zh) * | 2018-02-08 | 2018-08-21 | 阿里巴巴集团控股有限公司 | Active/inactive state detection method and apparatus |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000099837A (ja) * | 1998-09-18 | 2000-04-07 | Toshiba Corp | Monitoring system |
JP3992909B2 (ja) * | 2000-07-03 | 2007-10-17 | 富士フイルム株式会社 | Personal image providing system |
US6580360B1 (en) * | 2000-12-13 | 2003-06-17 | Digibot, Inc. | Smart table |
TWI244624B (en) * | 2004-06-04 | 2005-12-01 | Jin-Ding Lai | Device and method for defining an area whose image is monitored |
TWI357582B (en) * | 2008-04-18 | 2012-02-01 | Univ Nat Taiwan | Image tracking system and method thereof |
JP5347549B2 (ja) * | 2009-02-13 | 2013-11-20 | ソニー株式会社 | Information processing apparatus and information processing method |
CN103425443A (zh) * | 2012-05-22 | 2013-12-04 | 联想(北京)有限公司 | Control method, system, and electronic device |
JP6090559B2 (ja) * | 2012-10-04 | 2017-03-08 | 三菱自動車工業株式会社 | Start-up safety device |
JP6044472B2 (ja) * | 2013-06-28 | 2016-12-14 | 富士ゼロックス株式会社 | Information processing apparatus and program |
JP5590193B1 (ja) * | 2013-06-28 | 2014-09-17 | 富士ゼロックス株式会社 | Information processing apparatus and program |
CN103778577B (zh) * | 2013-08-30 | 2017-08-29 | 陈飞 | Method and device for adjusting a dining table according to tableware information and recording dining information |
US20160180712A1 (en) * | 2015-08-27 | 2016-06-23 | Sparkcity.Com Ltd. | Citywide parking reservation system and method |
CN105472231B (zh) * | 2014-09-03 | 2019-03-29 | 联想(北京)有限公司 | Control method, image acquisition device, and electronic device |
CN105100730A (zh) * | 2015-08-21 | 2015-11-25 | 联想(北京)有限公司 | Monitoring method and camera device |
US10043374B2 (en) * | 2015-12-30 | 2018-08-07 | Lenovo (Beijing) Limited | Method, system, and electronic device for monitoring |
KR101815144B1 (ko) * | 2016-07-05 | 2018-01-05 | 이응수 | Photo sharing method based on face recognition, and photo sharing system using the same |
US11311210B2 (en) * | 2016-07-14 | 2022-04-26 | Brightday Technologies, Inc. | Posture analysis systems and methods |
CN107666589A (zh) * | 2016-07-29 | 2018-02-06 | 中兴通讯股份有限公司 | Remote monitoring method and device |
CN106603969A (zh) * | 2016-11-04 | 2017-04-26 | 乐视控股(北京)有限公司 | Video monitoring method, apparatus and system, and detection device |
2018
- 2018-02-08 CN CN201810127142.2A patent/CN108427914B/zh active Active
- 2018-02-08 CN CN202010733220.0A patent/CN111652197B/zh active Active
- 2018-12-20 TW TW107146089A patent/TWI692728B/zh active
2019
- 2019-01-25 EP EP19751705.5A patent/EP3716142A4/en not_active Ceased
- 2019-01-25 JP JP2020536646A patent/JP6916394B2/ja active Active
- 2019-01-25 SG SG11202005455YA patent/SG11202005455YA/en unknown
- 2019-01-25 KR KR1020207018820A patent/KR102366681B1/ko active IP Right Grant
- 2019-01-25 WO PCT/CN2019/073120 patent/WO2019154112A1/zh unknown
2020
- 2020-06-01 US US16/889,622 patent/US11102458B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3716142A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116893384A (zh) * | 2023-09-11 | 2023-10-17 | 南京中旭电子科技有限公司 | Digital Hall sensor monitoring method and platform |
CN116893384B (zh) * | 2023-09-11 | 2023-12-01 | 南京中旭电子科技有限公司 | Digital Hall sensor monitoring method and platform |
Also Published As
Publication number | Publication date |
---|---|
EP3716142A1 (en) | 2020-09-30 |
TW201935309A (zh) | 2019-09-01 |
TWI692728B (zh) | 2020-05-01 |
US11102458B2 (en) | 2021-08-24 |
CN108427914B (zh) | 2020-08-18 |
CN111652197A (zh) | 2020-09-11 |
JP6916394B2 (ja) | 2021-08-11 |
KR102366681B1 (ko) | 2022-03-21 |
EP3716142A4 (en) | 2021-01-20 |
CN111652197B (zh) | 2023-04-18 |
CN108427914A (zh) | 2018-08-21 |
KR20200093016A (ko) | 2020-08-04 |
US20200296335A1 (en) | 2020-09-17 |
JP2021513695A (ja) | 2021-05-27 |
SG11202005455YA (en) | 2020-07-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019154112A1 (zh) | 2019-08-15 | Active/inactive state detection method and apparatus | |
US11069217B2 (en) | Sensor configuration | |
US9946357B2 (en) | Control using movements | |
US9984590B2 (en) | Identifying a change in a home environment | |
CN111788821B (zh) | Method and apparatus for detecting a surface in the vicinity of an electronic device | |
CN106603969A (zh) | Video monitoring method, apparatus and system, and detection device | |
US20230000302A1 (en) | Cleaning area estimation device and method for estimating cleaning area | |
US10475310B1 (en) | Operation method for security monitoring system | |
US10540542B2 (en) | Monitoring | |
CN110602197A (zh) | Internet-of-Things control apparatus and method, and electronic device | |
CN113721232B (zh) | Target object detection method and apparatus, electronic device, and medium | |
CN105807928B (zh) | Arbitrary-wall-surface interactive system and scanning error processing method therefor | |
CN112731364B (zh) | Millimeter-wave radar intelligent toilet-stall management method, system, platform, medium, and device | |
US20240135686A1 (en) | Method and electronic device for training neural network model by augmenting image representing object captured by multiple cameras | |
CN106597455A (zh) | Method and system for avoiding collisions by using ultrasonic ranging | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19751705; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 20207018820; Country of ref document: KR; Kind code of ref document: A; Ref document number: 2020536646; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 2019751705; Country of ref document: EP; Effective date: 20200625 |
NENP | Non-entry into the national phase | Ref country code: DE |