CN117689688A - Motion detection method, unit, intelligent monitoring device and storage medium - Google Patents

Motion detection method, unit, intelligent monitoring device and storage medium

Info

Publication number
CN117689688A
Authority
CN
China
Prior art keywords
image
frame image
current frame
value
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311689535.XA
Other languages
Chinese (zh)
Inventor
李艺峰
涂倩
乐超
何金
汪小勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SmartSens Technology Shanghai Co Ltd
Original Assignee
SmartSens Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SmartSens Technology Shanghai Co Ltd filed Critical SmartSens Technology Shanghai Co Ltd
Priority to CN202311689535.XA priority Critical patent/CN117689688A/en
Publication of CN117689688A publication Critical patent/CN117689688A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Studio Devices (AREA)

Abstract

The application relates to the technical field of image processing and provides a motion detection method, a motion detection unit, an intelligent monitoring device and a storage medium. The motion detection method runs on the image sensor and comprises the following steps: in response to a received device start instruction, acquiring a cached frame image and a current frame image; performing differential processing on the cached frame image and the current frame image to determine a differential image; in response to the device sensitivity configuration of a user, acquiring a noise threshold, a preset neighborhood range and a neighborhood count threshold; and when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, determining that motion of a moving target exists in the preset neighborhood range. Because the scheme runs on the image sensor itself, it improves motion detection accuracy and thereby reduces device power consumption, and because the motion decision is made according to the configured sensitivity it can adapt to different application scenarios.

Description

Motion detection method, unit, intelligent monitoring device and storage medium
Technical Field
The application belongs to the technical field of image processing, and in particular relates to a motion detection method, a motion detection unit, an intelligent monitoring device and a storage medium.
Background
With growing awareness of security and privacy and the continuing trend toward automation and intelligence, users place ever higher demands on the performance and power consumption of motion detection products. To meet these demands, existing motion detection techniques have been continuously improved, for example by using high-resolution cameras and advanced image processing algorithms to improve detection accuracy and reliability; artificial intelligence techniques may further enhance detection so that different types of moving targets and behaviors can be identified. In the prior art, however, detection is easily disturbed by heat sources, light sources, air flow and the like, which produces false triggers; that is, detection is sensitive to environmental changes, and frequent false triggers increase device power consumption and shorten the battery life and service life of the device.
Disclosure of Invention
The embodiments of the application provide a motion detection method, a motion detection unit, an intelligent monitoring device and a storage medium, which can improve motion detection accuracy and reduce device power consumption.
In a first aspect, an embodiment of the present application provides a motion detection method applied to an image sensor, where the image sensor is configured to run the motion detection method to detect a moving target. The motion detection method includes:
in response to a received device start instruction, acquiring a cached frame image and a current frame image;
performing differential processing on the cached frame image and the current frame image to determine a differential image;
in response to a device sensitivity configuration of a user, acquiring a noise threshold, a preset neighborhood range and a neighborhood count threshold;
and when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, determining that motion of the moving target exists in the preset neighborhood range.
Optionally, acquiring the noise threshold, the preset neighborhood range and the neighborhood count threshold in response to the device sensitivity configuration of the user includes:
determining a device sensitivity in response to the device sensitivity configuration of the user;
determining the preset neighborhood range, the neighborhood count threshold and an intensity coefficient corresponding to the preset neighborhood range according to the device sensitivity;
and determining the noise threshold of each pixel according to the intensity coefficient, a noise coefficient, a first candidate value of the noise threshold and the pixel value of each pixel, wherein the noise coefficient is obtained from the register gain of the image sensor.
Optionally, the preset neighborhood range is a rectangular area in the differential image; rectangular areas containing different numbers of pixels correspond to different neighborhood count thresholds and to different intensity coefficients.
Optionally, determining the noise threshold of each pixel according to the intensity coefficient, the noise coefficient, the first candidate value of the noise threshold and the pixel value of each pixel includes:
determining a square root function value from the noise coefficient and the pixel value of the pixel;
determining a second candidate value of the noise threshold according to the intensity coefficient and the square root function value;
when the first candidate value is larger than the second candidate value, determining the first candidate value as the noise threshold of the pixel;
when the second candidate value is larger than the first candidate value, determining the second candidate value as the noise threshold of the pixel;
and thereby determining the noise threshold of each pixel, wherein the first candidate value is the minimum value of the noise threshold.
Optionally, in response to the received device start instruction, acquiring the cached frame image and the current frame image includes:
in response to a received device start instruction, capturing an image to be processed;
downsampling an image to be processed through a pixel merging algorithm to obtain a downsampled image;
reading the automatic exposure state of the image sensor, caching the current downsampled image when the automatic exposure state is stable, and determining the downsampled image as a cached frame image;
And acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
Optionally, acquiring the cached frame image and the current frame image in response to the received device start instruction further includes:
under the condition of fluctuation of exposure values, re-acquiring an image to be processed, and downsampling the image to be processed through a pixel merging algorithm to obtain a new downsampled image;
reading an automatic exposure state of the image sensor;
until the automatic exposure state is stable, caching the current downsampled image, and determining the downsampled image as an updated cached frame image;
and acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
Optionally, the motion detection method further includes:
when motion of the moving target is determined, sending an early warning instruction to the on-chip subsystem, waking up the on-chip subsystem and controlling the image sensor to perform real-time imaging; or,
when the detected movement is determined to be a false trigger, keeping the on-chip subsystem in a standby state.
In a second aspect, an embodiment of the present application provides a motion detection unit configured to be disposed on an image sensor, where the motion detection unit is configured to detect a moving target by using the motion detection method according to the first aspect, and the motion detection unit includes:
the image acquisition module is used for acquiring a cached frame image and a current frame image in response to a received device start instruction;
the difference determining module is used for carrying out difference processing on the cached frame image and the current frame image to determine a difference image;
the threshold value acquisition module is used for responding to the equipment sensitivity configuration of the user to acquire a noise threshold value, a preset neighborhood range and a neighborhood counting threshold value;
the detection module is used for determining that the motion of the moving target exists in the preset neighborhood range when the number of pixels larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold.
In a third aspect, an embodiment of the present application provides an intelligent monitoring device, where the intelligent monitoring device includes an infrared sensing unit, an image sensor, and an on-chip subsystem;
the infrared sensing unit is used for sensing infrared change in the environment and sending a device starting instruction to the image sensor under the condition of sensing the infrared change;
the image sensor is used for acquiring a cached frame image and a current frame image in response to the received device start instruction; performing differential processing on the cached frame image and the current frame image to determine a differential image; acquiring a noise threshold, a preset neighborhood range and a neighborhood count threshold in response to the device sensitivity configuration of the user; and determining that motion of a moving target exists in the preset neighborhood range when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold;
And the on-chip subsystem is used for receiving the early warning instruction of the image sensor and controlling the image sensor to perform real-time imaging.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, the computer program implementing the motion detection method according to the first aspect when executed by a processor.
In a fifth aspect, embodiments of the present application provide a computer program product for causing an image sensor to perform the motion detection method of the first aspect described above when the computer program product is run on the image sensor.
Compared with the prior art, the embodiment of the application has the beneficial effects that:
the motion detection method is applied to an image sensor. After receiving a device start instruction, the image sensor acquires a cached frame image and a current frame image in response to that instruction, performs differential processing on the cached frame image and the current frame image, and determines a differential image. The differential image reflects the degree of difference between the cached frame image and the current frame image, and this degree of difference can be used to decide whether motion of a moving target exists; because this decision also depends on the device sensitivity configuration, the degree of difference alone is not enough. A noise threshold, a preset neighborhood range and a neighborhood count threshold are therefore acquired in response to the device sensitivity configuration of the user, and when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, motion of the moving target is determined to exist in that neighborhood range. Applied on the image sensor, the scheme improves motion detection accuracy and thereby reduces device power consumption, and by making the motion decision according to different sensitivity configurations it can adapt to different application scenarios.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of an intelligent monitoring device according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a prior art intelligent monitoring device;
FIG. 3 is a flow chart illustrating a motion detection method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the acquisition of a buffered frame and a current frame;
fig. 5 is a flowchart of a motion detection method according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of a motion detection unit according to another embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an image sensor according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
It should be understood that the sequence number of each step in this embodiment does not mean the sequence of execution, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
With the popularization of smart home devices, people place ever higher demands on product performance and power consumption. Devices such as smart doorbells and smart cat eyes typically rely on PIR sensors (Passive Infrared Sensor) for detection: by sensing changes in the infrared radiation emitted by the human body, they detect moving targets and actively raise an early warning. Once the device detects a moving target, the SoC (System-on-a-Chip) is started to capture snapshots or record video so that the user is notified at once. However, PIR sensors are susceptible to false triggers caused by heat sources, light sources, air flow and the like, and are sensitive to environmental changes such as wind. If false triggers occur frequently, the main system, which has relatively high power consumption, is woken repeatedly, which affects the battery life and service life of the device.
In view of the above problems, the present application provides a motion detection method that runs on the image sensor itself; carrying the motion detection on the image sensor saves bandwidth, reduces power consumption and improves response speed. In this method, after the image sensor receives a device start instruction, it acquires a buffered frame image and a current frame image in response to that instruction, performs differential processing on the two images, and determines a differential image. The differential image reflects the degree of difference between the buffered frame image and the current frame image, which can be used to decide whether motion of a moving target exists; because this decision also depends on the device sensitivity configuration, a noise threshold, a preset neighborhood range and a neighborhood count threshold are acquired in response to the device sensitivity configuration of the user, and when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, motion of the moving target is determined to exist in that neighborhood range. The method can therefore adapt to different application scenarios through the sensitivity configuration while improving motion detection accuracy and further reducing device power consumption.
The motion detection method, unit, intelligent monitoring device, image sensor, storage medium and computer program provided in the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 shows a schematic structural diagram of an intelligent monitoring device according to an embodiment of the present application. The intelligent monitoring device includes an infrared sensing unit (PIR), an image sensor and an on-chip subsystem (SoC).
When the intelligent monitoring device performs motion detection on the external environment, the infrared sensing unit senses infrared changes in the environment and, upon sensing such a change, sends a device start instruction to the image sensor. The image sensor then acquires a buffered frame image and a current frame image in response to the received device start instruction and performs differential processing on the two images to determine a differential image. At the same time, a noise threshold, a preset neighborhood range and a neighborhood count threshold are acquired in response to the device sensitivity configuration of the user; when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, motion of a moving target is determined to exist in that neighborhood range. Once such motion is determined, an early warning instruction can be sent to the on-chip subsystem, which, after receiving it, controls the image sensor to perform real-time imaging.
For example, as shown in fig. 1, if the PIR is triggered by a moving target or by other factors (such as a heat source, a light source or air flow) during motion detection, the intelligent monitoring device sends a power-on instruction (i.e. a device start instruction) to the image sensor, and the image sensor powers up and starts to acquire images. It should be noted that, to save power, the image sensor may acquire the buffered frame image and the current frame image at a low resolution.
After the buffered frame image and the current frame image are acquired, differential processing is performed on them to obtain a differential image. A noise threshold, a preset neighborhood range and a neighborhood count threshold matching the currently configured device sensitivity are then obtained, and the differential image is analyzed to judge whether motion of a moving target exists. If it does, an early warning instruction is sent to the SoC so that the SoC starts up and controls the image sensor to perform real-time imaging. When the SoC controls the image sensor for real-time imaging, the sensor is switched to a high-resolution mode so that a clearer image or video can be displayed on the SoC side.
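As a rough illustration of this trigger flow, the sketch below strings the stages together; all object and method names are hypothetical and not taken from the patent.

```python
# Hypothetical wiring of the flow in Fig. 1: PIR trigger -> low-resolution
# motion check on the sensor itself -> wake the SoC only if real motion is found.
def on_pir_trigger(sensor, soc):
    sensor.power_up(resolution="low")    # fast-start, low-power capture
    if sensor.detect_motion():           # steps 301-304 described below
        soc.wake()                       # early warning instruction
        sensor.set_resolution("high")    # switch to real-time imaging
        soc.start_streaming(sensor)
    else:
        sensor.power_down()              # false trigger: SoC stays in standby
```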
In the embodiment of the application, the motion detection method is carried in the image sensor, and the on-chip subsystem is started only after motion of a moving target has been confirmed. This saves the resources of the on-chip subsystem and avoids the high processing power consumption caused by frequently starting it on false triggers.
For example, fig. 2 is a schematic structural diagram of an intelligent monitoring device in the prior art. The prior-art device also includes a PIR, a SoC and an image sensor, but there, if the PIR is triggered by a moving target or by other factors (such as a heat source, a light source or air flow), it directly wakes the SoC, which controls the image sensor to capture images in a high-resolution mode and transmits them to the SoC for displaying images or video.
In the embodiment of the application, by contrast, the image sensor makes the motion decision according to the configured sensitivity and can adapt to different application scenarios; when a false trigger is identified, the SoC does not need to be called, which extends the service life of the intelligent monitoring device. Moreover, when the PIR responds to a moving target or other factors, the image sensor is powered up in a low-resolution fast-start mode, which also extends the life of the image sensor. The method therefore improves motion detection accuracy and further reduces device power consumption while adapting to different application scenarios.
Fig. 3 is a flow chart illustrating a motion detection method according to an embodiment of the present application.
Step 301, in response to a received device start instruction, acquiring a buffered frame image and a current frame image.
It should be noted that the motion detection method in the embodiments of the present application may be performed by the motion detection unit in the embodiments of the present application, and that the motion detection unit may be configured in any image sensor to perform the method. For example, the motion detection unit may be configured in the image sensor of a smart cat eye to detect moving targets around the smart cat eye.
An image sensor is an apparatus that captures an optical image and converts it into an electronic signal; in the embodiments of the present application, it carries the motion detection method so as to detect moving targets.
The device start instruction may be a power-on instruction of the image sensor: when the PIR senses an infrared change in the environment, it sends the device start instruction to the image sensor so that the image sensor starts to work. The image sensor may then obtain the buffered frame image and the current frame image in response to receiving the device start instruction.
Wherein the buffered frame image may refer to a previously captured image frame stored inside the image sensor. The buffered frame image may be accessed and processed as needed.
The current frame image may refer to a frame image corresponding to the current time.
In this embodiment of the present application, after receiving an equipment start instruction, the image sensor starts to perform an acquisition operation of a buffered frame image and a current frame image, specifically, may first read an image acquired by the image sensor when an automatic exposure state is stable, as a buffered frame image, and then acquire, in real time, the image acquired by the image sensor corresponding to the current time, as a current frame image.
In one possible implementation, because the brightness difference between adjacent captured frames is large before the automatic exposure state stabilizes, the image frames captured before stabilization are not used as processing objects, so as to guarantee detection accuracy and avoid false triggers.
In one possible implementation manner, in order to reduce power consumption and noise of the device, the collected image may be downsampled to obtain a small-size image with high signal-to-noise ratio, and differential processing is performed according to the small-size image to detect the motion of the moving object. That is, the step 301 may specifically include the following steps:
in response to a received device start instruction, capturing an image to be processed;
downsampling an image to be processed through a pixel merging algorithm to obtain a downsampled image;
reading an automatic exposure state of the image sensor;
when the automatic exposure state reaches stability, caching the downsampled image at the time, and determining the downsampled image as a cached frame image;
and acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
In this embodiment of the present application, after the device start instruction is received, an image to be processed is first captured. The image to be processed is the image directly output by the image sensor; its format is a Raw image and its resolution matches the actual operating resolution of the image sensor, i.e. if the image sensor captures at a lower resolution, the image to be processed also has that lower resolution.
In order to realize a low-power-consumption motion detection method, after the image to be processed is captured it can be downsampled through a pixel merging algorithm to obtain a downsampled image. Downsampling reduces the resolution of the image, and pixel merging (binning) is a downsampling method that merges the pixel values of each small block or region of the image into a single value to obtain the downsampled image.
Specifically, the multiple and direction of downsampling are first determined, for example, a downsampling multiple of 2 times in the horizontal and vertical directions is specified, i.e., a resolution that is reduced by half in each direction is indicated. And traversing the image to be processed, moving in a small block or window mode, and combining the pixel values according to the downsampling multiple in each small block to obtain a new pixel value, wherein the combination of the new pixel values can be performed in a mode of taking an average value, a maximum value or a minimum value and the like. Finally, a new image is created using the combined pixel values, i.e. a downsampled image, the size of which will be reduced according to the multiple of the downsampling.
It should be noted that the size of the downsampled image may be flexibly adapted and is not limited to a specific size.
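The binning step can be illustrated with a short sketch; the block size and the merge rule (mean, maximum or minimum) are parameters here because the text leaves both configurable.

```python
import numpy as np

def bin_downsample(raw: np.ndarray, factor: int = 2, mode: str = "mean") -> np.ndarray:
    """Pixel-binning downsample: merge each factor x factor block into one value.

    A sketch of the binning described above; the patent fixes neither the block
    size nor the merge rule, so both are parameters.
    """
    h, w = raw.shape
    h, w = h - h % factor, w - w % factor                # crop to a multiple of the factor
    blocks = raw[:h, :w].reshape(h // factor, factor, w // factor, factor)
    if mode == "mean":
        return blocks.mean(axis=(1, 3))                  # average value per block
    if mode == "max":
        return blocks.max(axis=(1, 3))                   # maximum value per block
    return blocks.min(axis=(1, 3))                       # minimum value per block
```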
Through this downsampling process, every image to be processed captured by the image sensor yields a corresponding downsampled image. For detection accuracy, the automatic exposure state of the image sensor is read; when the automatic exposure state is stable, the current downsampled image is cached and determined as the buffered frame image. That is, while the automatic exposure state is not yet stable, the downsampled image is not used as the buffered frame image; once the state is stable, the buffered frame image can be compared with all subsequent current frame images.
Likewise, after the automatic exposure state is stable, the current frame to-be-processed image can be acquired, and the current frame to-be-processed image is also downsampled through a pixel merging algorithm to obtain the current frame image.
As shown in the schematic diagram of the acquisition of the buffered frame and the current frame in fig. 4, the average brightness keeps changing before the stabilization moment, so the automatic exposure state of the image sensor is unstable and the images captured in this period cannot be used. When the automatic exposure state stabilizes, the downsampled image at that moment is cached as the buffered frame image, and the downsampled image of the image to be processed captured at each subsequent moment serves as a current frame image.
It will be appreciated that while the automatic exposure state is unstable, the exposure is continuously adjusted so that the average brightness of the image approaches the target value. When the brightness reaches the vicinity of the target value, the automatic exposure enters a steady state; the image at this moment is cached and is not changed afterwards. Each frame after the stabilization moment is taken as the current frame and is differenced against the buffered frame.
It should also be appreciated that when the motion detection unit is applied in the fast-start mode of the image sensor, the whole automatic exposure stabilization process is very short, so the environment is unlikely to change suddenly after stabilization and the buffered frame does not need to be changed. If the method is extended to other application situations, such as a normally-open mode with low frame rate and low power consumption, the automatic exposure state restarts the process from unsteady to steady whenever the environment changes, and the buffered frame is updated at the new stabilization moment.
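The buffering behaviour of Fig. 4 can be summarised in a few lines; frame_stream() and ae_is_stable() are hypothetical placeholders, and bin_downsample() is the binning sketch above.

```python
# Sketch of the Fig. 4 logic: frames captured before the auto-exposure (AE)
# state stabilises are discarded, the first stable frame is cached as the
# buffered frame, and every later frame becomes a current frame.
buffered_frame = None
for raw in frame_stream():               # hypothetical frame source
    small = bin_downsample(raw)
    if not ae_is_stable():               # hypothetical AE-state query
        buffered_frame = None            # AE restarted, e.g. the environment changed
        continue
    if buffered_frame is None:
        buffered_frame = small           # first stable frame: cache it
        continue
    current_frame = small                # later frames are differenced in step 302
```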
And 302, performing differential processing on the cached frame image and the current frame image to determine a differential image.
In this embodiment of the present application, since there may be multiple current frame images, differential processing may be performed sequentially on the buffered frame image and each of the current frame images to obtain multiple differential images.
For example, suppose there are three current frame images: a first, a second and a third current frame image. Performing differential processing sequentially on the buffered frame image and the current frame images then means performing differential processing on the buffered frame image and the first current frame image to obtain a first differential image; performing differential processing on the buffered frame image and the second current frame image to obtain a second differential image; and performing differential processing on the buffered frame image and the third current frame image to obtain a third differential image.
In this embodiment of the present application, performing differential processing on the buffered frame image and the current frame image may specifically refer to subtracting pixel values at corresponding positions in the buffered frame image and the current frame image to obtain a differential image, that is, a pixel value of each pixel point in the differential image represents an absolute value of a difference value between pixel values at corresponding positions in the buffered frame image and the current frame image.
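A minimal sketch of this differencing step, assuming both frames are equal-sized arrays of raw pixel values; the signed cast simply guards against unsigned wrap-around.

```python
import numpy as np

def frame_difference(buffered: np.ndarray, current: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference between the buffered frame and the current frame."""
    # Promote to a signed type first so the subtraction cannot wrap around.
    return np.abs(buffered.astype(np.int32) - current.astype(np.int32)).astype(np.uint16)
```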
In step 303, a noise threshold, a preset neighborhood range, and a neighborhood count threshold are obtained in response to the user's device sensitivity configuration.
In the embodiment of the application, motion is detected by comparing the differences between the two images, and a series of thresholds must be determined so that the detection method remains robust in various scenes. The user can therefore set different device sensitivities for different scenarios. For example, in security monitoring a higher device sensitivity is needed to guarantee safety, while in a home environment a lower device sensitivity can be set to extend device life. Accordingly, for each device sensitivity configuration the corresponding noise threshold, preset neighborhood range and neighborhood count threshold can be obtained.
Sensitivity here refers to the farthest detection distance at which the algorithm responds: the closer the configured distance, the stricter the motion decision and the lower the false-trigger rate; the farther the configured distance, the more accuracy the algorithm sacrifices so that a distant moving target is not missed. Based on the sensitivity, the algorithm determines the corresponding thresholds used in the detection stage.
In the embodiment of the application, the noise threshold, the preset neighborhood range and the neighborhood count threshold may be pre-stored in the image sensor and correspond to the device sensitivity, so that in response to the device sensitivity configuration of the user, the corresponding noise threshold, the preset neighborhood range and the neighborhood count threshold may be directly extracted from the memory of the image sensor.
It should be noted that, considering that human motion generally has temporal and spatial continuity, unlike noise that is always randomly generated at a single point, customizable neighborhood sizes, such as 3x3, 5x5, etc., can be set for multi-point comprehensive motion determination.
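As an illustration only, a sensitivity setting could be mapped to the three parameters roughly as follows; the concrete numbers are assumptions for the sketch (the 5x5 window with a count threshold of 10 matches the example given further below), not values fixed by the patent.

```python
# Illustrative mapping from a user-facing sensitivity setting to the detection
# parameters. Lower sensitivity = shorter detection distance = larger window,
# larger count threshold and smaller intensity coefficient; all numbers are
# assumed for the sketch.
SENSITIVITY_PRESETS = {
    "low":    {"neighborhood": (5, 5), "count_thr": 10, "intensity_k2": 2.0},  # ~1 m range
    "medium": {"neighborhood": (3, 3), "count_thr": 5,  "intensity_k2": 3.0},  # ~2 m range
    "high":   {"neighborhood": (1, 1), "count_thr": 1,  "intensity_k2": 4.0},  # ~3 m range
}
```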
In one possible implementation manner, the step 303 may specifically include the following steps:
determining a device sensitivity in response to a device sensitivity configuration of the user;
Determining a preset neighborhood range, a neighborhood counting threshold value and an intensity coefficient corresponding to the preset neighborhood range according to the sensitivity of the equipment;
and determining the noise threshold value of each pixel point according to the intensity coefficient, the noise coefficient, the first candidate value of the noise threshold value and the pixel value of each pixel point.
The noise coefficient is obtained from the register gain of the image sensor; it should be noted that the noise coefficient is related only to the current gain, while the intensity coefficient is related to the sensitivity and to the preset neighborhood range. In general, the intensity coefficient should be set larger to avoid false recognition, and when the preset neighborhood range is larger the intensity coefficient should be reduced correspondingly, to prevent the number of pixels exceeding the noise threshold from failing to reach the neighborhood count threshold.
It should be understood that, in the case where the preset neighborhood range is a rectangular region in the differential image, if the rectangular region includes different numbers of pixels, the corresponding neighborhood count threshold is different, and the intensity coefficient corresponding to the preset neighborhood range is also different.
In the embodiment of the application, the image comparison must exclude the influence of noise, especially at the high frame rate of the image sensor in fast-start mode, where the upper limit of the exposure time is very short and the gain, and hence the noise, tends to be large. According to the noise model, shot noise follows a Poisson distribution, and the noise variance is related to the gain and signal strength of the image sensor.
For example, for each pixel in the image, the noise threshold may be defined as noise_thr = max(k_2 × sqrt(k_1 × x), min_noise_thr), where x is the pixel value at a given position and noise_thr is the noise threshold of the pixel at that position; k_1 is the noise coefficient related to the gain, k_2 is the intensity coefficient related to the preset neighborhood range, and min_noise_thr is the minimum value of the noise threshold, i.e. the first candidate value of the noise threshold. Determining the noise threshold of each pixel according to the intensity coefficient, the noise coefficient, the first candidate value of the noise threshold and the pixel value of each pixel may therefore include the following steps:
determining a square root function value of the noise coefficient and the pixel value of the pixel point according to the noise coefficient and the pixel value of the pixel point;
determining a second candidate value of the noise threshold according to the intensity coefficient and the square root function value;
when the first candidate value is larger than the second candidate value, determining the first candidate value as the noise threshold value of the pixel point;
when the second candidate value is larger than the first candidate value, determining the second candidate value as a noise threshold value of the pixel point;
and determining a noise threshold value of each pixel point, wherein the first candidate value is the minimum value of the noise threshold value.
In the embodiment of the present application, sqrt() denotes the square root function, k_2 × sqrt(k_1 × x) is the second candidate value of the noise threshold, and max() denotes taking the maximum value.
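Written out as code, the per-pixel threshold above is a single vectorised expression; this is a sketch of the stated formula, with the parameter names kept from the text.

```python
import numpy as np

def noise_threshold(pixel_values: np.ndarray, k1: float, k2: float,
                    min_noise_thr: float) -> np.ndarray:
    """Per-pixel noise threshold: noise_thr = max(k2 * sqrt(k1 * x), min_noise_thr).

    k1 (noise coefficient) comes from the sensor's register gain, k2 (intensity
    coefficient) from the sensitivity / neighborhood setting, and min_noise_thr
    is the first candidate value, i.e. the floor of the threshold.
    """
    second_candidate = k2 * np.sqrt(k1 * pixel_values.astype(np.float32))
    return np.maximum(second_candidate, min_noise_thr)
```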
Step 304, when the number of pixels larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold, determining that the motion of the moving target exists in the preset neighborhood range.
In the embodiment of the present application, whether motion of a moving target exists in the differential image could be determined with a simple per-pixel threshold, i.e. whenever a single pixel difference exceeds the noise threshold, the corresponding area is considered to contain motion. To guarantee the recall rate (the proportion of actually occurring motion events that are successfully detected), the noise threshold cannot be set too high, but then isolated high-intensity noise or environmental disturbances will occasionally exceed the threshold and cause false triggers.
Therefore, to reduce false triggers, a multi-point judgment is used instead: whether motion of the moving target exists in the preset neighborhood range is decided by comparing the number of pixels in that range that exceed the noise threshold with the neighborhood count threshold.
For example, the preset neighborhood range is 5×5, and the corresponding preset neighborhood count threshold is 10, that is, if the number of pixels greater than the noise threshold in the preset neighborhood range reaches 10, it is determined that the motion of the moving object exists in the preset neighborhood range.
It should be noted that the size of the preset neighborhood range may be determined by the detection distance: the closer the detection distance, the more pixels change due to motion, so the larger the preset neighborhood range can be set. For example, 5x5 may be used for a detection distance of 1 m, 3x3 for 2 m and 1x1 for 3 m. Among these, the 3x3 preset neighborhood range is the most balanced and is applicable to all detection distances. The closer the detection distance, the larger the count threshold should be, so as to improve accuracy.
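A straightforward, unoptimised sketch of this multi-point judgment follows; a sensor-side implementation would more likely evaluate the window sums with line buffers, but the decision rule is the same. The default window and count values are illustrative.

```python
import numpy as np

def motion_in_window(diff: np.ndarray, noise_thr: np.ndarray,
                     window: tuple = (3, 3), count_thr: int = 5) -> bool:
    """Report motion only if, inside some window of the difference image,
    at least count_thr pixels exceed their per-pixel noise thresholds."""
    exceeds = (diff > noise_thr).astype(np.uint8)   # 1 where a pixel beats its threshold
    wh, ww = window
    h, w = exceeds.shape
    for y in range(h - wh + 1):                     # slide the window over the image
        for x in range(w - ww + 1):
            if exceeds[y:y + wh, x:x + ww].sum() >= count_thr:
                return True
    return False
```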
In one possible implementation, the method for motion detection may further include:
when motion of the moving target is determined, sending an early warning instruction to the on-chip subsystem, waking up the on-chip subsystem and controlling the image sensor to perform real-time imaging; or,
when the detected movement is determined to be a false trigger, keeping the on-chip subsystem in a standby state.
In the embodiment of the application, when motion of the moving target is determined, the on-chip subsystem can be woken up to control the image sensor to perform real-time imaging so that the user can view the scene. Real-time imaging here means performing a series of algorithmic processing on the captured images and outputting a high-resolution real-time image to be presented to the user, which is different from the low-resolution buffered frame image and current frame image described above.
In the embodiment of the application, when the detected movement is determined to be a false trigger, the on-chip subsystem stays in a standby state; compared with the prior art, in which the on-chip subsystem is woken as soon as an environmental infrared change is detected, this reduces the operating power consumption of the on-chip subsystem.
In the embodiment of the application, after receiving the device start instruction, the image sensor acquires the cached frame image and the current frame image in response to that instruction, performs differential processing on them and determines a differential image. The differential image reflects the degree of difference between the cached frame image and the current frame image, which can be used to decide whether motion of a moving target exists; since this decision also depends on the device sensitivity configuration, a noise threshold, a preset neighborhood range and a neighborhood count threshold are acquired in response to the device sensitivity configuration of the user, and when the number of pixels in the preset neighborhood range of the differential image that exceed the noise threshold reaches the neighborhood count threshold, motion of the moving target is determined to exist in that neighborhood range. Applied in the image sensor, the scheme reduces device power consumption, and by making the motion decision according to different sensitivity configurations it can adapt to different application scenarios.
Fig. 5 is a flowchart illustrating a motion detection method according to another embodiment of the present application.
Step 501, when the exposure value fluctuates, re-capturing the image to be processed, and downsampling the image to be processed through a pixel merging algorithm to obtain a new downsampled image.
In this embodiment of the present application, if the environment changes, the exposure value fluctuates; therefore, when fluctuation of the exposure value is detected, the image to be processed can be captured again and processed to obtain a new downsampled image.
Specifically, the process of downsampling the image to be processed through the pixel merging algorithm to obtain the new downsampled image is the same as the implementation in step 301, which may be referred to; it is not repeated here.
Step 502, the auto-exposure state of the image sensor is read.
In the embodiment of the present application, reading the auto-exposure state of the image sensor generally requires accessing a driver or an API interface of the image sensor, and specifically, the auto-exposure state may be obtained by reading exposure parameters (e.g., average brightness, exposure time, ISO value, aperture size, etc.).
When the exposure parameters are stable, the automatic exposure state can be determined to be stable.
Step 503, until the automatic exposure state is stable, the downsampled image at that time is buffered, and determined as the updated buffered frame image.
In the embodiment of the present application, the updated buffered frame image may not be determined until the automatic exposure state is stable, which may be understood as that the buffered frame image is updated once every time the automatic exposure state is changed from the unsteady state to the steady state.
Step 504, the image to be processed of the current frame is collected, and downsampling is carried out on the image to be processed of the current frame through a pixel merging algorithm, so that the image of the current frame is obtained.
In the embodiment of the present application, the specific implementation process and principle of step 504 may refer to the detailed description of step 301 in the foregoing embodiment, which is not repeated herein.
And 505, performing differential processing on the cached frame image and the current frame image to determine a differential image.
In step 506, in response to the user's device sensitivity configuration, a noise threshold, a preset neighborhood range, and a neighborhood count threshold are obtained.
Step 507, when the number of pixels larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold, determining that the motion of the moving target exists in the preset neighborhood range.
The specific implementation process and principle of the steps 505 to 507 may refer to the detailed description of the embodiments, which is not repeated herein.
In this embodiment of the present application, if the exposure value of the image sensor fluctuates, the image to be processed can be re-captured and an updated buffered frame image and current frame image determined from it, in order to guarantee the accuracy of motion detection; detection accuracy is thus maintained even when the exposure of the image sensor is disturbed.
Fig. 6 is a schematic structural diagram of a motion detection unit according to another embodiment of the present application, corresponding to the motion detection method in the above embodiment. For convenience of explanation, only portions relevant to the embodiments of the present application are shown.
Referring to fig. 6, the motion detection unit 600 includes:
an image obtaining module 601, configured to obtain a buffered frame image and a current frame image in response to a received device start instruction;
the difference determining module 602 is configured to perform difference processing on the buffered frame image and the current frame image, and determine a difference image;
a threshold obtaining module 603, configured to obtain a noise threshold, a preset neighborhood range, and a neighborhood count threshold in response to a device sensitivity configuration of a user;
the detecting module 604 is configured to determine that there is a motion of the moving object in the preset neighborhood range when the number of pixels in the preset neighborhood range of the differential image that is greater than the noise threshold reaches the neighborhood count threshold.
In the embodiment of the present application, the threshold value obtaining module 603 may specifically include the following sub-modules:
a sensitivity determination submodule for determining a device sensitivity in response to a device sensitivity configuration of the user;
the coefficient determination submodule is used for determining a preset neighborhood range, a neighborhood counting threshold value and an intensity coefficient corresponding to the preset neighborhood range according to the equipment sensitivity;
the noise threshold determining submodule is used for determining the noise threshold of each pixel point according to the intensity coefficient, the noise coefficient, the first candidate value of the noise threshold and the pixel value of each pixel point, and the noise coefficient is obtained through the register gain of the image sensor.
In this embodiment of the present application, the preset neighborhood range is a rectangular area in the differential image, where the rectangular area includes different numbers of pixels, corresponding neighborhood count thresholds are different, and intensity coefficients corresponding to the preset neighborhood range are different.
In the embodiment of the present application, the noise threshold determining submodule may specifically be configured to:
determining a square root function value of the noise coefficient and the pixel value of the pixel point according to the noise coefficient and the pixel value of the pixel point;
determining a second candidate value of the noise threshold according to the intensity coefficient and the square root function value;
When the first candidate value is larger than the second candidate value, determining the first candidate value as the noise threshold value of the pixel point;
when the second candidate value is larger than the first candidate value, determining the second candidate value as a noise threshold value of the pixel point;
and determining a noise threshold value of each pixel point, wherein the first candidate value is the minimum value of the noise threshold value.
In the embodiment of the present application, the image acquisition module 601 may specifically include the following sub-modules:
the image acquisition sub-module is used for responding to the received equipment starting instruction and acquiring an image to be processed;
the first pixel merging sub-module is used for downsampling an image to be processed through a pixel merging algorithm to obtain a downsampled image;
a first state reading sub-module for reading the automatic exposure state of the image sensor;
the first buffer sub-module is used for buffering the current downsampled image and determining it as the buffered frame image when the automatic exposure state is stable;
the first current frame acquisition sub-module is used for acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
In the embodiment of the present application, the image acquisition module 601 may specifically further include the following sub-modules:
The second pixel merging sub-module is used for re-collecting the image to be processed under the condition of fluctuation of the exposure value, and downsampling the image to be processed through a pixel merging algorithm to obtain a new downsampled image;
a second state reading sub-module for reading the automatic exposure state of the image sensor;
the second buffer sub-module is used for buffering the current downsampled image until the automatic exposure state is stable and determining the current downsampled image as an updated buffered frame image;
and the second current frame acquisition sub-module is used for acquiring the current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
In this embodiment of the present application, the motion detection unit 600 may specifically further include the following modules:
the imaging control module is used for sending an early warning instruction to the on-chip subsystem when motion of the moving target is determined, waking up the on-chip subsystem and controlling the image sensor to perform real-time imaging; or,
the standby control module is used for keeping the on-chip subsystem in a standby state when the detected movement is determined to be a false trigger.
In practical use, the motion detection unit provided in the embodiments of the present application may be configured in any image sensor to perform the motion detection method described above.
It should be noted that, since the information interaction and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application, their specific functions and technical effects may be found in the method embodiment section and are not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated; in practical application, the above functions may be distributed to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, and the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Referring to fig. 7, a schematic structural diagram of an image sensor according to still another embodiment of the present application is shown. As shown in fig. 7, the image sensor 700 of this embodiment includes: at least one processor 710 (only one is shown in fig. 7), a memory 720, and a computer program 721 stored in the memory 720 and executable on the at least one processor 710, the processor 710 implementing the steps of the above motion detection method embodiments when executing the computer program 721.
The image sensor 700 may be any type of sensor. The image sensor may include, but is not limited to, the processor 710 and the memory 720. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the image sensor 700 and does not constitute a limitation of the image sensor 700, which may include more or fewer components than shown, a combination of certain components, or different components, such as input-output devices, network access devices, etc.
The processor 710 may be a central processing unit (Central Processing Unit, CPU); the processor 710 may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 720 may in some embodiments be an internal storage unit of the image sensor 700, such as a hard disk or a memory of the image sensor 700. In other embodiments, the memory 720 may also be an external storage device of the image sensor 700, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) provided on the image sensor 700. Further, the memory 720 may include both an internal storage unit and an external storage device of the image sensor 700. The memory 720 is used to store an operating system, application programs, a boot loader (Boot Loader), data, and other programs, such as the program code of the computer program. The memory 720 may also be used to temporarily store data that has been output or is to be output.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided herein, it should be understood that the disclosed units/image sensors and methods may be implemented in other ways. For example, the above unit/image sensor embodiments are merely illustrative; e.g., the division of the modules or units is merely a logical functional division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each method embodiment described above may be implemented. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include: any entity or unit capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be added to or removed as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
All or part of the flow of the methods of the above embodiments may also be implemented by a computer program product which, when run on an image sensor, causes the image sensor to perform the steps of the above method embodiments.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features thereof may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included in the protection scope of the present application.

Claims (10)

1. A motion detection method, applied to an image sensor, wherein the image sensor is configured to carry out the motion detection method to detect a moving target, the motion detection method comprising:
responding to a received equipment starting instruction, and acquiring a cached frame image and a current frame image;
performing differential processing on the cached frame image and the current frame image to determine a differential image;
responding to the equipment sensitivity configuration of a user, and acquiring a noise threshold value, a preset neighborhood range and a neighborhood counting threshold value;
and when the number of pixels larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold, determining that the motion of the moving target exists in the preset neighborhood range.
2. The method of claim 1, wherein the obtaining the noise threshold, the preset neighborhood range, and the neighborhood count threshold in response to the device sensitivity configuration of the user comprises:
determining a device sensitivity of a user in response to a device sensitivity configuration of the user;
determining the preset neighborhood range, the neighborhood count threshold and an intensity coefficient corresponding to the preset neighborhood range according to the equipment sensitivity;
and determining the noise threshold value of each pixel point according to the intensity coefficient, the noise coefficient, the first candidate value of the noise threshold value and the pixel value of each pixel point, wherein the noise coefficient is obtained through the register gain of the image sensor.
3. The method of claim 2, wherein the preset neighborhood range is a rectangular area in the differential image; when the rectangular area includes different numbers of pixels, the corresponding neighborhood count thresholds are different, and the intensity coefficients corresponding to the preset neighborhood range are different.
4. The method of claim 2, wherein determining the noise threshold for each pixel based on the intensity coefficient, the noise coefficient, the first candidate value for the noise threshold, and the pixel value for each pixel comprises:
determining a square root function value of the noise coefficient and the pixel value of the pixel point according to the noise coefficient and the pixel value of the pixel point;
determining a second candidate value for the noise threshold based on the intensity coefficient and the square root function value;
when the first candidate value is larger than the second candidate value, determining the first candidate value as a noise threshold value of the pixel point;
when the second candidate value is larger than the first candidate value, determining the second candidate value as a noise threshold value of the pixel point;
and determining a noise threshold value of each pixel point, wherein the first candidate value is the minimum value of the noise threshold value.
5. The motion detection method as claimed in claim 1, wherein the acquiring the buffered frame image and the current frame image in response to the received device start-up instruction comprises:
responding to a received equipment starting instruction, and collecting an image to be processed;
downsampling the image to be processed through a pixel merging algorithm to obtain a downsampled image;
reading an automatic exposure state of the image sensor;
when the automatic exposure state reaches stability, caching the downsampled image at that moment, and determining the downsampled image as the cached frame image;
and acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
6. The method of motion detection according to claim 5, wherein the obtaining the buffered frame image and the current frame image in response to the received device start-up instruction further comprises:
under the condition of fluctuation of exposure values, re-acquiring an image to be processed, and downsampling the image to be processed through a pixel merging algorithm to obtain a new downsampled image;
reading an automatic exposure state of the image sensor;
once the automatic exposure state is stable again, caching the downsampled image at that moment, and determining the downsampled image as the updated cached frame image;
and acquiring a current frame to-be-processed image, and downsampling the current frame to-be-processed image through a pixel merging algorithm to obtain the current frame image.
7. The method of claim 1, further comprising:
when the movement of the moving target is determined to exist, sending an early warning instruction to an on-chip subsystem, waking up the on-chip subsystem and controlling the image sensor to perform real-time imaging; or
when the movement of the moving object is determined to be a false touch, keeping the on-chip subsystem in a standby state.
8. A motion detection unit configured in an image sensor, the motion detection unit being configured to carry out the motion detection method according to any one of claims 1 to 7 to detect a moving target, the motion detection unit comprising:
the image acquisition module is used for responding to the received equipment starting instruction and acquiring a cache frame image and a current frame image;
the difference determining module is used for carrying out difference processing on the cached frame image and the current frame image to determine a difference image;
the threshold value acquisition module is used for responding to the equipment sensitivity configuration of the user to acquire a noise threshold value, a preset neighborhood range and a neighborhood counting threshold value;
and the detection module is used for determining that the motion of the moving target exists in the preset neighborhood range when the number of the pixels which are larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold.
9. An intelligent monitoring device, characterized by comprising an infrared sensing unit, an image sensor and an on-chip subsystem;
the infrared sensing unit is used for sensing infrared change in the environment, and sending a device starting instruction to the image sensor under the condition that the infrared change is sensed;
the image sensor is used for responding to the received equipment starting instruction and acquiring a cache frame image and a current frame image; performing differential processing on the cached frame image and the current frame image to determine a differential image; responding to the equipment sensitivity configuration of a user, and acquiring a noise threshold value, a preset neighborhood range and a neighborhood counting threshold value; when the number of pixels larger than the noise threshold in the preset neighborhood range of the differential image reaches the neighborhood counting threshold, determining that the motion of the moving target exists in the preset neighborhood range;
and the on-chip subsystem is used for receiving the early warning instruction of the image sensor and controlling the image sensor to perform real-time imaging.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 7.
CN202311689535.XA 2023-12-08 2023-12-08 Mobile detection method, unit, intelligent monitoring equipment and storage medium Pending CN117689688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311689535.XA CN117689688A (en) 2023-12-08 2023-12-08 Mobile detection method, unit, intelligent monitoring equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311689535.XA CN117689688A (en) 2023-12-08 2023-12-08 Mobile detection method, unit, intelligent monitoring equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117689688A true CN117689688A (en) 2024-03-12

Family

ID=90133074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311689535.XA Pending CN117689688A (en) 2023-12-08 2023-12-08 Mobile detection method, unit, intelligent monitoring equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117689688A (en)

Similar Documents

Publication Publication Date Title
US11431937B2 (en) Data rate control for event-based vision sensor
US10070053B2 (en) Method and camera for determining an image adjustment parameter
CN106878668B (en) Movement detection of an object
US8792681B2 (en) Imaging system and imaging method
WO2020248543A1 (en) Abnormal target detection method and device, and storage medium
CN108989638B (en) Imaging apparatus, control method thereof, electronic apparatus, and computer-readable storage medium
CN100375530C (en) Movement detecting method
CN112312076A (en) Intelligent mobile detection device
JP2010182021A (en) Image monitor system
KR100513739B1 (en) Motion detecting device using face characteristics and monitoring system adapting it
US9607210B2 (en) Video surveillance system and method for fraud detection
US10944941B2 (en) Smart motion detection device and related determining method
CN117689688A (en) Mobile detection method, unit, intelligent monitoring equipment and storage medium
US10984536B2 (en) Motion detection in digital images and a communication method of the results thereof
JP2022076837A (en) Information processing device, information processing method, and program
KR102275018B1 (en) Surveillance camera apparatus and surveillance system comprising the same
CN111225178A (en) Video monitoring method and system based on object detection
CN114827450B (en) Analog image sensor circuit, image sensor device and method
KR101533338B1 (en) Apparatus and method for detecting object
KR100749457B1 (en) Method and device for perceiving approach of object using camera
Ya-Jun Design of remote motion detection and alarm monitoring system based on the ARM
CN116634280A (en) Intelligent light supplementing method and device, face recognition system and access control system
CN116168415A (en) Animal monitoring and identifying method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination