CN113711272A - Method and system for non-spurious motion detection - Google Patents

Method and system for non-spurious motion detection

Info

Publication number
CN113711272A
CN113711272A (application CN201980095490.9A)
Authority
CN
China
Prior art keywords
motion
frames
scene change
bounding box
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980095490.9A
Other languages
Chinese (zh)
Inventor
张洪伟
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of CN113711272A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/20 — Analysis of motion
    • G06T7/215 — Motion-based segmentation
    • G06T7/254 — Analysis of motion involving subtraction of images


Abstract

A method includes: performing a checking method a first plurality of times, wherein the checking method includes: obtaining a size difference between a first bounding box, which bounds a first detected motion or scene change obtained using a plurality of first frames, and a second bounding box, which bounds a second detected motion or scene change obtained using a plurality of second frames; and updating the plurality of first frames and the plurality of second frames; wherein for each of a second plurality of times among the first plurality of times, the size difference is greater than a predetermined threshold, and the second plurality of times is greater than or equal to a predetermined number of times; and determining, based on the size difference for each of the second plurality of times, that the motion or scene change in the plurality of first frames and in the plurality of second frames for each of the first plurality of times belongs to continuous motion.

Description

Method and system for non-spurious motion detection
Technical Field
The present disclosure relates to the field of motion detection, and in particular, to a method and system for non-spurious motion detection.
Background
Motion detection is a technique for identifying a moving object in a series of frames. In a first series of frames, pixels collocated with each other change because of a moving object. In a second series of frames, pixels collocated with each other change because of camera motion. The first series of frames may be difficult to distinguish from the second series of frames, resulting in false motion detection.
Disclosure of Invention
It is an object of the present disclosure to propose a method and system for non-spurious motion detection.
In a first aspect of the disclosure, a computer-implemented method includes: continuously performing a checking method a first plurality of times, wherein the checking method includes: obtaining a plurality of first frames arranged in sequence; detecting motion or a scene change in the plurality of first frames to generate a first detected motion or scene change; obtaining a first bounding box bounding the first detected motion or scene change; obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames; detecting motion or a scene change in the plurality of second frames to generate a second detected motion or scene change; obtaining a second bounding box bounding the second detected motion or scene change; obtaining a first size difference between the first bounding box and the second bounding box; and updating the plurality of first frames to the plurality of second frames, and the plurality of second frames to a plurality of third frames following the plurality of second frames; wherein for each of a second plurality of times among the first plurality of times, the first size difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined number of times; and determining, based on the first size difference for each of the second plurality of times, that the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion.
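As a minimal sketch of the first-aspect checking loop, the fragment below counts how many consecutive frame-set pairs produce a bounding-box size difference above the threshold. The tuple box format, the `box_size` helper, and the use of area as the size metric are illustrative assumptions; the claim does not fix a specific size measure.

```python
def box_size(box):
    # box = (x1, y1, x2, y2); area is used as the "size" here (an assumption,
    # since the claim does not specify the size metric)
    x1, y1, x2, y2 = box
    return (x2 - x1) * (y2 - y1)

def is_continuous_motion(boxes, size_diff_threshold, min_count):
    """Sketch of the first-aspect check: one bounding box per run of the
    checking method; motion counts as continuous when the size difference
    exceeds the threshold at least `min_count` times."""
    count = 0
    for prev_box, cur_box in zip(boxes, boxes[1:]):
        if abs(box_size(cur_box) - box_size(prev_box)) > size_diff_threshold:
            count += 1
    return count >= min_count
```

For example, a bounding box whose area keeps changing across runs passes the check, while an unchanging box does not.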
In a second aspect of the disclosure, a computer-implemented method includes: continuously performing a checking method a first plurality of times, wherein the checking method includes: obtaining a plurality of first frames arranged in sequence; detecting motion or a scene change in the plurality of first frames to generate a first detected motion or scene change; obtaining a first feature value of all pixels affected by the first detected motion or scene change; obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames; detecting motion or a scene change in the plurality of second frames to generate a second detected motion or scene change; obtaining a second feature value of all pixels affected by the second detected motion or scene change; and obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of a second plurality of times among the first plurality of times, the first feature value difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined number of times; and determining, based on the first feature value difference for each of the second plurality of times, that the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion.
In a third aspect of the disclosure, a system includes at least one memory and at least one processor. The at least one memory is configured to store a plurality of program instructions. The at least one processor is configured to execute the plurality of program instructions, which cause the at least one processor to perform a plurality of steps comprising: continuously performing a checking method a first plurality of times, wherein the checking method includes: obtaining a plurality of first frames arranged in sequence; detecting motion or a scene change in the plurality of first frames to generate a first detected motion or scene change; obtaining a first bounding box bounding the first detected motion or scene change; obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames; detecting motion or a scene change in the plurality of second frames to generate a second detected motion or scene change; obtaining a second bounding box bounding the second detected motion or scene change; obtaining a first size difference between the first bounding box and the second bounding box; and updating the plurality of first frames to the plurality of second frames, and the plurality of second frames to a plurality of third frames following the plurality of second frames; wherein for each of a second plurality of times among the first plurality of times, the first size difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined number of times; and determining, based on the first size difference for each of the second plurality of times, that the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion.
In a fourth aspect of the disclosure, a system includes at least one memory and at least one processor. The at least one memory is configured to store a plurality of program instructions. The at least one processor is configured to execute the plurality of program instructions, which cause the at least one processor to perform a plurality of steps comprising: continuously performing a checking method a first plurality of times, wherein the checking method includes: obtaining a plurality of first frames arranged in sequence; detecting motion or a scene change in the plurality of first frames to generate a first detected motion or scene change; obtaining a first feature value of all pixels affected by the first detected motion or scene change; obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames; detecting motion or a scene change in the plurality of second frames to generate a second detected motion or scene change; obtaining a second feature value of all pixels affected by the second detected motion or scene change; and obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of a second plurality of times among the first plurality of times, the first feature value difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined number of times; and determining, based on the first feature value difference for each of the second plurality of times, that the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion.
Drawings
To describe the embodiments of the present disclosure or the related art more clearly, the drawings used in the description of the embodiments are briefly introduced below. Apparently, the drawings described below illustrate merely some embodiments of the present disclosure, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a block diagram illustrating input, processing and output hardware modules in a terminal according to an embodiment of the disclosure.
Fig. 2 is a flow chart illustrating a slow motion capture method according to an embodiment of the present disclosure.
Fig. 3 is a flow chart illustrating a step of performing non-spurious motion detection to automatically trigger slow motion capture in a slow motion capture method according to an embodiment of the disclosure.
Fig. 4 is a flow chart illustrating a step of detecting motion or a scene change in a current frame set according to an embodiment of the present disclosure.
Fig. 5 is a diagram illustrating images in the step of detecting motion or scene change in a current frame set, where the step is applied to a moving object example, according to an embodiment of the present disclosure.
Fig. 6 is a diagram illustrating images in the step of detecting motion or a scene change in a current frame set, where the step is applied to a flicker example, according to an embodiment of the disclosure.
FIG. 7 is a flow chart illustrating a step of performing an AND operation using a first pixel difference image AND a second pixel difference image according to an embodiment of the present disclosure.
FIG. 8 is a flow chart illustrating a step of performing an erosion operation performed after the step of performing the AND operation according to an embodiment of the present disclosure.
FIG. 9 is a diagram illustrating an image in the step of performing an erosion operation according to an embodiment of the present disclosure.
FIG. 10 is a flow diagram illustrating a step of continuous motion check using a first size difference between a current bounding box and a previous bounding box, wherein the current bounding box defines a currently detected motion or scene change, according to an embodiment of the present disclosure.
Fig. 11 is a diagram illustrating images in the step of continuous motion check using a first size difference between a current bounding box and a previous bounding box, where the motion or the scene change in the current frame set is determined to belong to continuous motion, according to an embodiment of the disclosure.
Fig. 12-13 are diagrams illustrating images in the step of continuous motion check using a first size difference between a current bounding box and a previous bounding box, wherein the motion or the scene change in the current frame set is determined not to belong to continuous motion, according to an embodiment of the present disclosure.
Fig. 14 is a flow diagram illustrating a step of continuous motion check using a first feature value difference between a current feature value and a previous feature value, wherein the current feature value characterizes all pixels affected by a currently detected motion or scene change, according to another embodiment of the present disclosure.
Fig. 15 is a diagram illustrating images in the step of continuous motion check using a first feature value difference between a current feature value and a previous feature value, where the motion or the scene change in the current frame set is determined to belong to continuous motion, according to another embodiment of the present disclosure.
Fig. 16-17 are diagrams illustrating images in the step of continuous motion check using a first feature value difference between a current feature value and a previous feature value, where the motion or the scene change in the current frame set is determined not to belong to continuous motion, according to another embodiment of the present disclosure.
FIG. 18 is a flow chart illustrating a step of continuous motion check using a first feature difference between a current feature and a previous feature in accordance with yet another embodiment of the present disclosure.
FIG. 19 is a flow chart illustrating a step of continuous motion check using a first feature difference between a current feature and a previous feature in accordance with yet another embodiment of the present disclosure.
Figs. 20-21 are flow charts illustrating a step of continuous motion check using a first size difference between a current bounding box and a previous bounding box and a first feature difference between a current feature and a previous feature according to yet another embodiment of the present disclosure.
FIG. 22 is a flow chart illustrating a step of further continuous motion check using a current feature value of all pixels affected by the currently detected motion or scene change according to yet another embodiment of the present disclosure.
Detailed Description
Technical matters, structural features, objects, and effects achieved by the embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In particular, the terms in the embodiments of the present disclosure are used for the purpose of illustrating a certain embodiment only, and are not used to limit the present disclosure.
The same reference numbers in different drawings identify substantially the same elements, and the description of one element applies to the other elements.
As used herein, the term "using" refers to a case where a step is performed directly using an object, or a case where the object is modified through at least one intermediate step and the modified object is used directly to perform the step.
Some implementations are described herein in connection with thresholds. As used herein, the term "greater than" (or similar terms), used to describe a relationship of a value to a threshold, may be used interchangeably with the term "greater than or equal to" (or similar terms). Similarly, the term "less than" (or similar terms), used to describe a relationship of a value to a threshold, may be used interchangeably with the term "less than or equal to" (or similar terms). As used herein, "exceeding" a threshold (or similar terms) may be used interchangeably with "greater than a threshold", "greater than or equal to a threshold", "less than a threshold", or "less than or equal to a threshold", depending on the context in which the threshold is used.
Fig. 1 is a block diagram illustrating input, processing, and output hardware modules in a terminal 100 according to an embodiment of the present disclosure. Referring to fig. 1, the terminal 100 includes a camera module 102, a processor module 104, a memory module 106, a display module 108, a storage module 110, a wired or wireless communication module 112, and a plurality of buses 114. In one embodiment, the terminal 100 may be a mobile phone, a smart phone, a tablet computer, a notebook computer, a desktop computer, or any electronic device with sufficient computing power for motion detection.
The camera module 102 is an input hardware module and is configured to capture a plurality of frames in series for transmission to the processor module 104 via the plurality of buses 114. The processor module 104 uses the frames to generate a preview stream and a video stream, as described with reference to fig. 2. In one embodiment, the camera module 102 captures at a normal frame rate, such as 30 fps, when performing normal capturing, and captures at a higher frame rate, such as 120 fps, 240 fps, or 480 fps, when performing slow motion capturing. In one embodiment, the camera module 102 includes an RGB camera or a grayscale camera. In another embodiment, the plurality of frames may be obtained using another input hardware module, such as the storage module 110 or the wired or wireless communication module 112. The storage module 110 is configured to store the plurality of frames to be transmitted to the processor module 104 via the plurality of buses 114. The wired or wireless communication module 112 is configured to receive the plurality of frames from a network via wired or wireless communication, wherein the plurality of frames are to be transmitted to the processor module 104 via the plurality of buses 114.
The memory module 106 stores a plurality of program instructions that are executed by the processor module 104 to cause the processor module 104 to perform non-spurious motion detection. In one embodiment, the processor module 104 performs a slow motion capture method that uses the non-spurious motion detection to automatically trigger slow motion capture. The slow motion capture method is described with reference to fig. 2. In one embodiment, the memory module 106 may be a transitory or non-transitory computer-readable medium including at least one memory. In one embodiment, the processor module 104 includes at least one processor that sends signals to and/or receives signals from the camera module 102, the memory module 106, the display module 108, the storage module 110, and the wired or wireless communication module 112, directly or indirectly, via the plurality of buses 114. In one embodiment, the at least one processor may be central processing unit(s) (CPU(s)), graphics processing unit(s) (GPU(s)), and/or digital signal processor(s) (DSP(s)). The CPU(s) may send the frames, some program instructions, and other data or instructions to the GPU(s) and/or DSP(s) over the plurality of buses 114.
The display module 108 is an output hardware module for displaying the preview stream and/or the video stream received from the processor module 104 over the plurality of buses 114. In another embodiment, the preview stream and/or the video stream may be output using another output hardware module, such as the storage module 110, or the wired or wireless communication module 112. The storage module 110 is used to store the video streams received from the processor module 104 over the plurality of buses 114. The wired or wireless communication module 112 is used to transmit the preview stream and/or the video stream to the network through the wired or wireless communication, where the preview stream and/or the video stream is received from the processor module 104 through the plurality of buses 114.
The terminal 100 is a computing system with all its components integrated together via the plurality of buses 114. Other types of computing systems, such as a computing system having a remote camera module other than the camera module 102, are also within the intended scope of the present disclosure.
Fig. 2 is a flow chart illustrating a slow motion capture method 200 according to an embodiment of the present disclosure. Referring to fig. 1 and 2, the slow motion capture method 200 includes the following steps.
In a step 202, a plurality of samples are obtained from a gyroscope. The plurality of samples indicate whether rotational motion of the camera module 102 is present. In a step 204, if there is rotational motion of the camera module 102, a step 206 is performed; otherwise, a step 208 is performed. In the step 206, a user is guided to correct a posture. Rotational motion of the camera module 102 may be caused by hand shake of the user, for example, when the user holds the terminal 100 for slow motion capture by the camera module 102. If there is rotational motion of the camera module 102 while slow motion capture is performed, the frames captured by the camera module 102 may contain blurred content. Thus, the user is guided to correct the posture before slow motion capture is triggered in the step 208.
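The steps above can be sketched as a small decision helper. The patent does not specify how the gyroscope samples are evaluated; treating any angular-velocity magnitude above a threshold as rotational motion, and the threshold value itself, are assumptions for illustration only.

```python
def has_rotational_motion(gyro_samples, threshold=0.1):
    """Sketch of the step-204 decision: the camera is treated as rotating
    when any recent angular-velocity sample exceeds a magnitude threshold
    (an assumed criterion; the patent does not fix one)."""
    return any(abs(sample) > threshold for sample in gyro_samples)

def next_action(gyro_samples):
    # steps 204-208: either guide the user to correct the posture, or move
    # on to non-spurious motion detection on the preview stream
    if has_rotational_motion(gyro_samples):
        return "guide user to correct posture"          # step 206
    return "perform non-spurious motion detection"      # step 208
```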
In the step 208, a preview stream is used to perform non-spurious motion detection, which automatically triggers slow motion capture. The step 208 will be described with reference to fig. 3 to 22.
In a step 210, the slow motion capture is performed and reflected in a video stream, and a detection result is indicated in the preview stream. After the slow motion capture is triggered in the step 208, the processor module 104 causes the camera module 102 to perform the slow motion capture and reflects the frames captured by the slow motion capture in the video stream. For example, a portion of the video stream reflecting the slow motion capture may be recorded in the storage module 110. During the slow motion capture, the detection result, i.e. the moving object that triggered the slow motion capture, is indicated in the preview stream. A frame rate of the preview stream is maintained at a normal frame rate, such as 30 fps. The user may view the detection result in real time by viewing the preview stream using the display module 108. In a step 212, if the slow motion capture has continued for a predetermined period of time, a step 214 is performed; otherwise, the method 200 loops back to the step 210. The predetermined period of time may be, for example, 10 seconds. In the step 214, the slow motion capture is caused to end.
Fig. 3 is a flowchart of the step 208 of performing non-spurious motion detection to automatically trigger the slow motion capture in the slow motion capture method 200 according to an embodiment of the disclosure. Referring to fig. 1 to 3, the step 208 includes the following steps.
In a step 302, a current frame set is obtained. The current frame set includes a plurality of frames arranged in sequence. The current frame set is obtained using the preview stream.
In a step 304, motion or a scene change in the current frame set is detected to generate a currently detected motion or scene change. The current frame set includes a plurality of first pixels that are collocated with each other and changing. The plurality of first pixels may comprise motion of a moving object in the current frame set or a scene change in the current frame set. The scene change may be caused by motion of the camera module 102. In one embodiment, the step 304 generates the currently detected motion or scene change, which identifies the motion or the scene change in the current frame set, by a motion detection method such as a pixel difference method. In another embodiment, the currently detected motion or scene change is generated by a motion vector method. The step 304 does not distinguish the motion from the scene change. The step 304 will be further described with reference to fig. 4 to 9. In the embodiments described with reference to fig. 4 to 9, the current frame set has 3 frames arranged in sequence. In another embodiment, the current frame set has 2 frames arranged in sequence.
In a step 306, a continuous motion check is performed using the currently detected motion or scene change. The step 306 distinguishes the motion from the scene change. The step 306 is described with reference to fig. 10 to 22. In a step 308, if continuous motion is detected, the step 210 is performed; otherwise, the step 208 loops back to the step 302 to obtain a current frame set for the next iteration. The step 308 determines whether the continuous motion is detected based on a slow motion capture start flag in a step 1014 in fig. 10, a step 1414 in fig. 14, or a step 2116 in fig. 21.
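The detection loop of Fig. 3 can be sketched as follows. The callables `frame_set_source`, `detect`, and `continuous_check` are hypothetical stand-ins for the steps 302, 304, and 306/308; they are not names from the patent.

```python
def run_motion_detection(frame_set_source, detect, continuous_check):
    """Sketch of step 208: keep pulling frame sets from the preview stream
    until the continuous motion check reports continuous motion."""
    for frame_set in frame_set_source:          # step 302 per iteration
        detected = detect(frame_set)            # step 304
        if continuous_check(detected):          # steps 306/308
            return True                         # trigger slow motion capture (step 210)
    return False                                # stream ended without continuous motion
```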
Fig. 4 is a flow chart illustrating the step 304 of detecting motion or a scene change in the current frame set according to an embodiment of the present disclosure. The step 304 is performed using the pixel difference method and includes the following steps. In a step 402, a first pixel difference image between an earlier frame and a current frame in the current frame set is obtained. In a step 404, a second pixel difference image between a previous frame and the current frame in the current frame set is obtained. In a step 406, an AND operation is performed using the first pixel difference image and the second pixel difference image to obtain a first image, wherein the first image comprises the currently detected motion or scene change.
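A minimal NumPy sketch of steps 402-406 is shown below. Modelling the AND of two grayscale difference images as an elementwise minimum, and the default threshold value, are assumptions; the patent describes the AND operation and the shrink threshold without fixing an implementation.

```python
import numpy as np

def detect_motion_or_scene_change(earlier, previous, current, shrink_threshold=20):
    """Sketch of Fig. 4 on grayscale frames given as 2-D NumPy arrays."""
    diff1 = np.abs(earlier.astype(np.int32) - current.astype(np.int32))   # step 402
    diff2 = np.abs(previous.astype(np.int32) - current.astype(np.int32))  # step 404
    anded = np.minimum(diff1, diff2)                                      # step 406 (AND)
    # binarize with the shrink threshold (Fig. 7, step 704)
    return np.where(anded > shrink_threshold, 255, 0).astype(np.uint8)
```

For a moving bright spot, only the pixels that differ from the current frame in both comparisons survive the AND, which is what lets the later flicker example come out empty.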
Fig. 5 is a diagram illustrating a plurality of images 502-512 in the step 304 of detecting motion or a scene change in a current frame set, wherein the step 304 is applied to a moving object example, according to an embodiment of the present disclosure. Referring to fig. 4 and 5, a current frame set includes an earlier frame 502, a previous frame 504, and a current frame 506 arranged in sequence. The earlier frame 502 includes a plurality of pixels 514 of a moving object at a first location in motion. The previous frame 504 includes a plurality of pixels 516 of the moving object at a second location in motion. The current frame 506 includes a plurality of pixels 518 of the moving object at a third location in motion. The plurality of pixels 514 and the pixels collocated with them in the previous frame 504 and the current frame 506, the plurality of pixels 516 and the pixels collocated with them in the earlier frame 502 and the current frame 506, and the plurality of pixels 518 and the pixels collocated with them in the earlier frame 502 and the previous frame 504 are changing and comprise motion of the moving object in the current frame set.
In the step 402, a first pixel difference image 508 between the earlier frame 502 and the current frame 506 in the current frame set is obtained. The first pixel difference image 508 includes a plurality of pixels, each having a pixel value equal to the absolute value of a difference between a pixel value of a corresponding pixel in the earlier frame 502 and a pixel value of the pixel in the current frame 506 collocated with the corresponding pixel. In one embodiment, all pixel values throughout the present disclosure are luminance values. In another embodiment, all pixel values throughout the present disclosure are color values. Thus, a plurality of pixels 520 in the first pixel difference image 508 display a plurality of pixel differences caused by the motion of the moving object in the earlier frame 502 and the current frame 506.
In the step 404, a second pixel difference image 510 between the previous frame 504 and the current frame 506 in the current frame set is obtained. The second pixel difference image 510 includes a plurality of pixels, each having a pixel value equal to the absolute value of a difference between a pixel value of a corresponding pixel in the previous frame 504 and a pixel value of the pixel in the current frame 506 collocated with the corresponding pixel. Thus, a plurality of pixels 522 in the second pixel difference image 510 display a plurality of pixel differences caused by the motion of the moving object in the previous frame 504 and the current frame 506.
In the step 406, an AND operation is performed using the first pixel difference image 508 and the second pixel difference image 510 to obtain a first image 512, wherein the first image 512 includes the currently detected motion or scene change. The AND operation finds the intersection 524 of the plurality of pixels 520 displaying pixel differences in the first pixel difference image 508 and the plurality of pixels 522 displaying pixel differences in the second pixel difference image 510. The intersection 524 includes the currently detected motion or scene change.
Fig. 6 is a diagram illustrating images 602-612 in the step 304 of detecting motion or a scene change in a current frame set, where the step 304 is applied to a flicker example, according to an embodiment of the disclosure. Referring to fig. 4 and 6, a current frame set includes an earlier frame 602, a previous frame 604, and a current frame 606 arranged in sequence. The earlier frame 602 includes a plurality of pixels of a static scene under illumination at a first light intensity. The previous frame 604 includes a plurality of pixels of the static scene under illumination at a second light intensity. The current frame 606 includes a plurality of pixels of the static scene under illumination at the first light intensity. The first light intensity differs from the second light intensity due to a flicker.
In the step 402, a first pixel difference image 608 between the earlier frame 602 and the current frame 606 in the current frame set is obtained. The first pixel difference image 608 includes a plurality of pixels, each having a pixel value equal to the absolute value of a difference between a pixel value of a corresponding pixel in the earlier frame 602 and a pixel value of the pixel in the current frame 606 collocated with the corresponding pixel. Since the scene in the earlier frame 602 and the current frame 606 is static and under the same light intensity, no pixel in the first pixel difference image 608 shows a pixel difference caused by motion or a scene change in the earlier frame 602 and the current frame 606.
In the step 404, a second pixel difference image 610 between the previous frame 604 and the current frame 606 in the current frame set is obtained. The second pixel difference image 610 includes a plurality of pixels, each having a pixel value equal to an absolute value of a difference between a pixel value of a corresponding pixel in the previous frame 604 and a pixel value of a pixel in the current frame 606 collocated with the corresponding pixel. Since the scenes in the previous frame 604 and the current frame 606 are static but under different light intensity conditions, all pixels in the second pixel difference image 610 show pixel differences, as if caused by motion or a scene change between the previous frame 604 and the current frame 606.
In the step 406, an AND operation is performed using the first pixel difference image 608 and the second pixel difference image 610 to obtain a first image 612, the first image 612 including the currently detected motion or scene change. The AND operation finds the intersection of the pixels in the first pixel difference image 608 and the pixels in the second pixel difference image 610 that show pixel differences. Because no pixel in the first pixel difference image 608 shows a pixel difference, the intersection is empty. Therefore, the pixel differences shown in the second pixel difference image 610, which are caused by the flicker, do not falsely trigger slow motion capture.
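The flicker case can be checked numerically with the same intersection logic. This is a minimal sketch under the assumption that the flicker only shifts the global brightness between frames; names are illustrative.

```python
import numpy as np

def flicker_free_intersection(earlier, previous, current):
    """Same intersection as steps 402-406: because the earlier and current
    frames share the same light intensity, the first difference image is
    empty and the intersection is therefore empty."""
    diff1 = np.abs(earlier.astype(np.int16) - current.astype(np.int16))
    diff2 = np.abs(previous.astype(np.int16) - current.astype(np.int16))
    return np.minimum(diff1, diff2).astype(np.uint8)
```

With a static scene at brightness 100 in the earlier and current frames and 130 in the flickered previous frame, every pixel of the result is zero, so no motion is reported.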
FIG. 7 is a flow chart illustrating the step 406 of performing an AND operation using the first pixel difference image and the second pixel difference image according to an embodiment of the present disclosure. Referring to fig. 7, in a step 702, the AND operation is performed on the first pixel difference image and the second pixel difference image to obtain an AND image. In a step 704, the AND image is binarized using a shrink threshold to obtain a binary image, wherein the binary image is the first image. For each pixel in the AND image, if the pixel value of the pixel is greater than the shrink threshold, a corresponding pixel in the binary image has a pixel value of 255, and if the pixel value of the pixel is less than or equal to the shrink threshold, the corresponding pixel in the binary image has a pixel value of 0. In another embodiment, the first and second pixel difference images are first binarized to generate a plurality of binarized images, and the AND operation is then performed on the plurality of binarized images.
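The step 704 binarization can be sketched directly; the shrink threshold value itself is left to the caller, since the patent does not fix it here.

```python
import numpy as np

def binarize(and_image, shrink_threshold):
    """Step 704 sketch: pixels strictly greater than the shrink threshold
    become 255, all other pixels become 0."""
    return np.where(and_image > shrink_threshold, 255, 0).astype(np.uint8)
```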
FIG. 8 is a flow chart illustrating a step 802 of performing an erosion operation performed after the step 406 of performing the AND operation according to an embodiment of the present disclosure. Referring to fig. 8, the step 406 further includes the step 802. In step 802, an erosion operation is performed on the first image to obtain an erosion image, wherein the erosion image indicates the currently detected motion or scene change.
FIG. 9 is a diagram illustrating an image 902 in the step 802 of performing an erosion operation according to an embodiment of the present disclosure. Referring to figs. 5, 8 and 9, in the step 802, an erosion operation is performed on the first image 512 to obtain an eroded image 902, wherein the eroded image 902 indicates the currently detected motion or scene change 904. In one embodiment, the erosion operation is performed with a cross structure kernel 906. In one embodiment, the cross structure kernel 906 has a size of 3x3. In another embodiment, the cross structure kernel has a size of 5x5, and the center row has pixel values of 1. In yet another embodiment, the cross structure kernel has a size of 5x5, and the center 3 rows have pixel values of 1. The cross structure kernel 906 is used by applying the erosion operation to a region of an image having the same size as the cross structure kernel 906. In the region, the pixel corresponding to the pixel 908 of the cross structure kernel 906 is retained if all pixels corresponding to the pixels of the cross structure kernel 906 having a pixel value of 1 have a pixel value of 255. In the region, if any of the pixels corresponding to the pixels of the cross structure kernel 906 having a pixel value of 1 has a pixel value less than 255, the pixel corresponding to the pixel 908 of the cross structure kernel 906 is not retained. A portion of the detected motion or scene change 904 is magnified in a view 910. In the view 910, each thin dashed box is a pixel. A thick dashed line is a portion of a boundary of the intersection 524 in the first image 512. A thick solid line is a portion of a boundary of the detected motion or scene change 904.
A boundary 916 of a portion of the detected motion or scene change 904 is moved from the thick dashed line to the thick solid line by applying the erosion operation with the cross structure kernel 906 to each of a plurality of regions of the first image 512 whose corresponding centers are moved one by one from a pixel 912 to a pixel 914. The erosion operation thereby keeps only the most significantly changed pixels in the current frame set.
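The 3x3 cross-kernel erosion described above can be sketched without a morphology library. This is an illustrative implementation under the assumption that the binary image uses values 0 and 255 and that border pixels see zero padding; the function name is not from the patent.

```python
import numpy as np

def erode_cross_3x3(binary):
    """Step 802 sketch with a 3x3 cross structure kernel: a pixel is kept
    at 255 only if it and its four edge neighbors (up, down, left, right)
    are all 255; otherwise it becomes 0."""
    p = np.pad(binary, 1, constant_values=0)  # zero padding at the border
    out = np.zeros_like(binary)
    for y in range(binary.shape[0]):
        for x in range(binary.shape[1]):
            cross = [p[y + 1, x + 1],            # center (kernel pixel 908)
                     p[y, x + 1], p[y + 2, x + 1],   # up, down
                     p[y + 1, x], p[y + 1, x + 2]]   # left, right
            out[y, x] = 255 if all(v == 255 for v in cross) else 0
    return out
```

Applied to a 3x3 block of 255s, only the block's center survives, which matches the boundary-shrinking behavior shown in the view 910.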
FIG. 10 is a flow chart illustrating the step 306 of performing a continuous motion check using a first size difference between a current bounding box and a previous bounding box, wherein the current bounding box bounds the currently detected motion or scene change, according to an embodiment of the present disclosure. Referring to figs. 3 and 10, the step 306 includes the following steps. In a step 1002, a current bounding box is obtained, which bounds the currently detected motion or scene change. In a step 1004, a first size difference between the current bounding box and a previous bounding box bounding a previously detected motion or scene change is obtained. In a step 1006, the previous frame set is updated to the current frame set. In a step 1008, if the first size difference is greater than a first predetermined size difference threshold, then go to a step 1010; otherwise, a step 1012 is performed. In the step 1010, a satisfaction time variable is incremented. In the step 1012, if the satisfaction time variable is greater than or equal to a first predetermined plurality of times, proceed to a step 1014; otherwise, the step 308 is performed. In the step 1014, a slow motion capture start flag is asserted. After the step 1014 is performed, the step 308 is performed.
Fig. 11 is a diagram illustrating a plurality of images 1102 and 1104 in step 306 of performing continuous motion check using a first size difference between a current bounding box 1108 and a previous bounding box 1106, wherein the motion or scene change in the current frame set is determined to belong to continuous motion, according to an embodiment of the present disclosure. Fig. 11 is a diagram illustrating an example of the moving object following the example of fig. 5. Referring to fig. 3, 5, 9, 10 and 11, in step 1002, a current bounding box 1108 is obtained, the current bounding box 1108 defining the currently detected motion or scene change 904. An upper boundary of the current bounding box 1108 is defined by the uppermost pixel of the currently detected motion or scene change 904, a right boundary of the current bounding box 1108 is defined by the rightmost pixel of the currently detected motion or scene change 904, a lower boundary of the current bounding box 1108 is defined by the bottommost pixel of the currently detected motion or scene change 904, and a left boundary of the current bounding box 1108 is defined by the leftmost pixel of the currently detected motion or scene change 904. In an embodiment, the current bounding box 1108 has a format of an upper left coordinate defined by the upper boundary and the left boundary and a lower right coordinate defined by the lower boundary and the right boundary.
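The step 1002 bounding box extraction can be sketched as follows; it assumes the detected motion is given as a nonzero mask, and the (top, left, bottom, right) tuple mirrors the upper-left/lower-right coordinate format mentioned above.

```python
import numpy as np

def bounding_box(motion_mask):
    """Step 1002 sketch: the box is defined by the uppermost, rightmost,
    bottommost and leftmost nonzero pixels. Returns (top, left, bottom,
    right), or None when no motion was detected (an empty bounding box)."""
    ys, xs = np.nonzero(motion_mask)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```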
In step 1004, a first size difference between the current bounding box 1108 and a previous bounding box 1106 is obtained, the previous bounding box 1106 defining a previously detected motion or scene change 1110. The previously detected motion or scene change 1110 is a currently detected motion or scene change in a previous iteration that was performed at the step 208 (in fig. 3). The current frame set in the current iteration is subsequent to the current frame set in the previous iteration. That is, if the plurality of frames in the current frame set in the current iteration are at times t-2, t-1, and t, then the plurality of frames in the current frame set in the previous iteration are at times t-5, t-4, and t-3. The previous bounding box 1106 is a current bounding box that bounds the currently detected motion or scene change in the previous iteration. In one embodiment, the step 1004 includes obtaining a first side length difference (a first side length difference) between a side 1112 of the previous bounding box 1106 and a side 1114 of the current bounding box 1108 that is parallel to the side 1112 of the previous bounding box 1106; and obtaining a second side length difference between a side 1116 of the previous bounding box 1106 and a side 1118 of the current bounding box 1108 that is parallel to the side 1116 of the previous bounding box 1106, wherein the side 1112 is perpendicular to the side 1116. The first dimension difference comprises the first side length difference and the second side length difference.
In step 1006, the previous frame set is updated to the current frame set. The step 1006 includes updating a previous bounding box to the current bounding box 1108. Thus, the step 1004 may be performed in the next iteration.
In the step 1008, if the first size difference is greater than the first predetermined size difference threshold, then go to the step 1010; otherwise, the process proceeds to the step 1012. In one embodiment, the first predetermined size difference threshold comprises a first predetermined side length threshold and a second predetermined side length threshold. In one embodiment, the step 1008 includes determining whether the first side length difference is greater than the first predetermined side length threshold and whether the second side length difference is greater than the second predetermined side length threshold. If the first side length difference is greater than the first predetermined side length threshold and the second side length difference is greater than the second predetermined side length threshold, then go to the step 1010; otherwise, the process proceeds to the step 1012. In one embodiment, the first predetermined side length threshold is equal to one tenth of a width of the frame 502 in the current frame set, and the second predetermined side length threshold is equal to one tenth of a height of the frame 502 in the current frame set.
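One reading of the step 1008 side-length check, under the embodiment where both side length differences must exceed one tenth of the frame width and height respectively, can be sketched as below. The (top, left, bottom, right) box format is an assumption carried over from the bounding box description.

```python
def size_difference_exceeds(prev_box, cur_box, frame_width, frame_height):
    """Step 1008 sketch: both absolute side length differences must exceed
    one tenth of the frame width / height (thresholds per one embodiment)."""
    prev_w = prev_box[3] - prev_box[1]
    prev_h = prev_box[2] - prev_box[0]
    cur_w = cur_box[3] - cur_box[1]
    cur_h = cur_box[2] - cur_box[0]
    return (abs(cur_w - prev_w) > frame_width / 10 and
            abs(cur_h - prev_h) > frame_height / 10)
```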
In the above embodiments, the first side length difference and the second side length difference are absolute differences, and therefore the first predetermined side length threshold and the second predetermined side length threshold are real numbers. In another embodiment, the first side length difference and the second side length difference are relative differences in terms of ratios or percentages. In the above embodiments, there are two predetermined side length thresholds. In another embodiment, the first side length difference and the second side length difference share a single predetermined side length threshold. In the above embodiments, both the first side length difference and the second side length difference are required to be greater than the first predetermined side length threshold and the second predetermined side length threshold, respectively, for the first size difference to be greater than the first predetermined size difference threshold. In another embodiment, it is sufficient that at least one of the first side length difference and the second side length difference is greater than its corresponding predetermined side length threshold for the first size difference to be greater than the first predetermined size difference threshold. In the above embodiments, the first size difference comprises the first side length difference and the second side length difference. In another embodiment, the first size difference comprises an area difference between the previous bounding box 1106 and the current bounding box 1108.
In the step 1008, in one embodiment, the first size difference is an absolute value. In another embodiment, the first size difference is a non-absolute value obtained by subtracting a size of the previous bounding box 1106 from a size of the current bounding box 1108. In yet another embodiment, the first size difference is an absolute value, and the step 306 further comprises subtracting a size of the previous bounding box 1106 from a size of the current bounding box 1108 to obtain a second size difference; if the second size difference is greater than zero, the step 1010 is performed. The second size difference is a non-absolute value.
In the example of fig. 11, the first size difference is greater than the first predetermined size difference threshold due to the motion of the moving object in the previous frame set and the current frame set, whether the first size difference is the absolute value or the non-absolute value. In the embodiment that includes the first size difference and the second size difference, the second size difference is greater than zero. Thus, the step 1010 is performed. In the step 1010, the satisfaction time variable is incremented. The satisfaction time variable tracks each time at least one condition in the step 1008 is satisfied.
In the above embodiment, the satisfaction time variable is tracked by incrementing. In another embodiment, the satisfaction time variable is tracked by decrementing, multiplying, or dividing.
In the step 1012, if the satisfaction time variable is greater than or equal to a first predetermined plurality of times, then go to the step 1014; otherwise, the step 308 is performed. In the step 1014, the slow motion capture start flag is asserted. The reason for the value of the first predetermined plurality of times in the following embodiments will be explained in conjunction with fig. 13. In an embodiment, the step 1010 is only performed if the condition in the step 1008 is satisfied. Further, in the embodiment where the first size difference is the absolute value, the first predetermined plurality of times is at least three times. In a first example, due to the motion of the moving object in a first frame set to a fourth frame set in sequential order, at least one condition in the step 1008 is satisfied when the previous frame set and the current frame set are the first frame set and the second frame set, when the previous frame set and the current frame set are the second frame set and the third frame set, and when the previous frame set and the current frame set are the third frame set and the fourth frame set. The first example includes the example in fig. 11, and each of the other times when the at least one condition in the step 1008 is satisfied is similar to the example in fig. 11. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion. The slow motion capture start flag is asserted. The first plurality of times starts when the satisfaction time variable equals zero and ends when the satisfaction time variable equals the first predetermined plurality of times. In an example, the satisfaction time variable is equal to zero before the first frame set and the second frame set are processed. The first plurality of times is three times.
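The counting logic of steps 1008 through 1014 can be sketched as a simple loop over the per-iteration size differences. This is an illustrative reading assuming the satisfaction time variable is tracked by incrementing and is never reset; names are not from the patent.

```python
def slow_motion_start(size_differences, threshold, first_predetermined_times):
    """Sketch of steps 1008-1014: increment a satisfaction time variable
    whenever an iteration's size difference exceeds the threshold, and
    assert the slow motion capture start flag once the variable reaches
    the first predetermined plurality of times."""
    satisfaction_times = 0
    for diff in size_differences:
        if diff > threshold:          # step 1008
            satisfaction_times += 1   # step 1010
        if satisfaction_times >= first_predetermined_times:  # step 1012
            return True               # step 1014: assert the flag
    return False
```

Because the variable is never reset, an iteration where the condition fails (e.g. the object briefly slows down) merely delays the flag rather than canceling it, matching the "remains the same" behavior described above.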
In another example, the motion of the moving object may occasionally, for example, decrease in speed, resulting in the at least one condition in the step 1008 not being satisfied. In this case, the satisfaction time variable remains the same, the first plurality of times is more than three times, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
In the embodiment where the first size difference is the non-absolute value, the first predetermined plurality of times is at least two times. Compared to the first example, in which the at least one condition in the step 1008 is satisfied three times (for the first and second, the second and third, and the third and fourth frame sets), in a second example the at least one condition in the step 1008 is satisfied twice (for the first and second, and the second and third frame sets). Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion. In an example, the satisfaction time variable is equal to zero before the first frame set and the second frame set are processed. The first plurality of times is two times. In another example, the motion of the moving object may occasionally, for example, decrease in speed, resulting in the at least one condition in the step 1008 not being satisfied. In this case, the satisfaction time variable remains the same, the first plurality of times is more than two times, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
In the embodiment that includes the first size difference as the absolute value and the second size difference as the non-absolute value, the first predetermined plurality of times is at least two times. Compared to the first example, in which the at least one condition in the step 1008 is satisfied three times (for the first and second, the second and third, and the third and fourth frame sets), in a third example the at least one condition in the step 1008 is satisfied twice (for the first and second, and the second and third frame sets). Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion. In an example, the satisfaction time variable is equal to zero before the first frame set and the second frame set are processed. The first plurality of times is two times. In another example, the motion of the moving object may occasionally, for example, decrease in speed, resulting in the at least one condition in the step 1008 not being satisfied. In this case, the satisfaction time variable remains the same, the first plurality of times is more than two times, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
Figs. 12-13 are diagrams illustrating images 1302, 1304, 1206, 1208, 1310 and 1312 in the step 306 of performing a continuous motion check using a first size difference between a current bounding box and a previous bounding box, wherein the motion or the scene change in the current frame set is determined not to belong to continuous motion, according to embodiments of the present disclosure. Fig. 13 illustrates an example of the camera module 102 moving from stationary to moving, from moving to moving, and from moving to stationary. Fig. 12 illustrates the moving-to-moving portion of fig. 13 so that the example in fig. 12 may be compared with the example in fig. 11. Compared to the example in fig. 11, the example in fig. 12 has the following differences. In the step 1002, a current bounding box 1212 is obtained, the current bounding box 1212 bounding a currently detected motion or scene change 1216. The currently detected motion or scene change 1216 occupies almost the entire image 1208, because camera motion is a global motion that causes a scene change, whereas in the example of fig. 11 the motion of the moving object is a local motion. Because the current bounding box 1212 bounds the currently detected motion or scene change 1216, the current bounding box 1212 is near a boundary of the image 1208. In the step 1004, a first size difference between the current bounding box 1212 and a previous bounding box 1210 is obtained, the previous bounding box 1210 bounding a previously detected motion or scene change 1214. The previously detected motion or scene change 1214 occupies almost the entire image 1206 due to the camera motion. The previous bounding box 1210 is also close to a boundary of the image 1206. In the step 1008, the first size difference is not greater than the first predetermined size difference threshold because the current bounding box 1212 and the previous bounding box 1210 are both close to the boundaries of the corresponding images 1208 and 1206.
Thus, the motion in the previous frame set and the current frame set in the example of fig. 11 may be distinguished from scene changes in the previous frame set and the current frame set.
In a first example of fig. 13, the first size difference in the step 1008 is the absolute value. Compared to the example in fig. 11, the first example in fig. 13 has the following differences. As the camera module 102 moves from stationary to moving, the scenes in a first frame set are stationary while the scenes in a second frame set have a scene change. As the camera module 102 moves from moving to moving, the scenes in the second frame set have a scene change and the scenes in a third frame set have a scene change. As the camera module 102 moves from moving to stationary, the scenes in the third frame set have a scene change, and the scenes in a fourth frame set are stationary. The first frame set through the fourth frame set are arranged in order. When the previous frame set and the current frame set are the first frame set and the second frame set, a previous bounding box is empty and a current bounding box 1314 is near a boundary of the image 1304, such that a first size difference is greater than the first predetermined size difference threshold. When the previous frame set and the current frame set are the second frame set and the third frame set, the previous bounding box 1210 is near the boundary of the image 1206 and the current bounding box 1212 is near the boundary of the image 1208, resulting in a first size difference less than or equal to the first predetermined size difference threshold. When the previous frame set and the current frame set are the third frame set and the fourth frame set, a previous bounding box 1316 is near the boundary of the image 1310 and a current bounding box is empty, causing the first size difference to be greater than the first predetermined size difference threshold. Thus, at least one condition in the step 1008 is satisfied twice.
In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined plurality of times in the step 1012 needs to be at least three times. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. The step 306 proceeds to the step 308 without asserting the slow motion capture start flag. In an example, the satisfaction time variable is equal to zero before the first frame set and the second frame set are processed. The first plurality of times is three times. In another example, the satisfaction time variable remains the same when the camera module 102 has more frame sets from moving to moving. The first plurality of times is more than three times. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
In a second example of fig. 13, the first size difference in the step 1008 is the non-absolute value. Compared to the first example in fig. 13, the second example in fig. 13 has the following differences. When the previous frame set and the current frame set are the first frame set and the second frame set, a previous bounding box is empty and the current bounding box 1314 is near the boundary of the image 1304, such that a first size difference is greater than the first predetermined size difference threshold. When the previous frame set and the current frame set are the second frame set and the third frame set, the previous bounding box 1210 is near the boundary of the image 1206 and the current bounding box 1212 is near the boundary of the image 1208, resulting in a first size difference less than or equal to the first predetermined size difference threshold. When the previous frame set and the current frame set are the third frame set and the fourth frame set, the previous bounding box 1316 is near the boundary of the image 1310 and the current bounding box is empty, causing a first size difference to be less than or equal to the first predetermined size difference threshold. Thus, at least one condition in the step 1008 is satisfied once. In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined plurality of times needs to be at least two times in the step 1012. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. In an example, the satisfaction time variable is equal to zero before the first frame set and the second frame set are processed. The first plurality of times is three times.
In another example, the satisfaction time variable remains the same when the camera module 102 has more sets of frames from motion to motion. The first plurality is more than three times. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
In a third example of fig. 13, the first size difference and the second size difference in the step 1008 are respectively the absolute value and the non-absolute value. Compared to the first example in fig. 13, the third example in fig. 13 has the following differences. When the previous frame set and the current frame set are the first frame set and the second frame set, the previous bounding box is empty and the current bounding box 1314 is near the boundary of the image 1304, such that a first size difference is greater than the first predetermined size difference threshold and a second size difference is greater than zero. When the previous frame set and the current frame set are the second frame set and the third frame set, the previous bounding box 1210 is near the boundary of the image 1206 and the current bounding box 1212 is near the boundary of the image 1208, such that the first size difference is less than or equal to the first predetermined size difference threshold. When the previous frame set and the current frame set are the third frame set and the fourth frame set, the previous bounding box 1316 is near the boundary of the image 1310 and the current bounding box is empty, causing the first size difference to be greater than the first predetermined size difference threshold and a second size difference to be less than or equal to zero. Thus, at least one condition in the step 1008 is satisfied once. In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined plurality of times needs to be at least two times in the step 1012. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion.
In an example, the satisfaction time variable is equal to zero before the first set of frames and the second set of frames are processed. The first plurality is three times. In another example, the satisfaction time variable remains the same when the camera module 102 has more sets of frames from motion to motion. The first plurality is more than three times. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
Fig. 14 is a flow chart illustrating the step 306 of performing a continuous motion check using a first feature value difference between a current feature value and a previous feature value, wherein the current feature value characterizes all pixels affected by the currently detected motion or scene change, according to an embodiment of the present disclosure. Referring to figs. 3 and 14, the step 306 includes the following steps. In a step 1402, a current feature value of all pixels affected by the currently detected motion or scene change is obtained. In a step 1404, a first feature value difference between the current feature value and a previous feature value of all pixels affected by a previously detected motion or scene change is obtained. In a step 1406, the previous frame set is updated to the current frame set. In a step 1408, if the first feature value difference is greater than a first predetermined feature value threshold, then proceed to a step 1410; otherwise, a step 1412 is performed. In the step 1410, the satisfaction time variable is incremented. In the step 1412, if the satisfaction time variable is greater than or equal to a first predetermined plurality of times, proceed to a step 1414; otherwise, the step 308 is performed. In the step 1414, a slow motion capture start flag is asserted. After the step 1414 is performed, the step 308 is performed.
Fig. 15 is a diagram illustrating a plurality of images 1502 and 1504 in the step 306 of performing a continuous motion check using a first feature value difference between a current feature value and a previous feature value, where the motion or the scene change in the current frame set is determined to belong to continuous motion, according to an embodiment of the present disclosure. Fig. 15 is a diagram illustrating an example of the moving object following fig. 5. Referring to figs. 3, 5, 9, 14 and 15, in the step 1402, a current feature value of all pixels affected by the currently detected motion or scene change 904 is obtained. In one embodiment, the current feature value is the number of all pixels affected by the currently detected motion or scene change. In one embodiment, the current feature value is the number of all pixels 1510 of the currently detected motion or scene change 904. The current feature value is a feature of the currently detected motion or scene change 904, giving an exact count of the pixels of the currently detected motion or scene change 904, which is illustrated by the currently detected motion or scene change 904 being filled with a different pattern than that in fig. 9. In another embodiment, the current feature value is a percentage of the number of all pixels 1510 of the currently detected motion or scene change 904 relative to the total number of pixels of the image 1504.
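The step 1402 feature value, in both the pixel-count and percentage embodiments, can be sketched as follows; the motion is assumed to be given as a nonzero mask.

```python
import numpy as np

def feature_value(motion_mask, as_percentage=False):
    """Step 1402 sketch: the feature value is the number of pixels affected
    by the detected motion, or, in another embodiment, that count as a
    percentage of the total number of pixels of the image."""
    count = int(np.count_nonzero(motion_mask))
    if as_percentage:
        return 100.0 * count / motion_mask.size
    return count
```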
In step 1404, a first feature value difference between the current feature value and a previous feature value of all pixels affected by a previously detected motion or scene change 1508 is obtained. The previously detected motion or scene change 1508 is the currently detected motion or scene change in a previous iteration of step 208 (in fig. 3). The current frame set in the current iteration is subsequent to the current frame set in the previous iteration. That is, if the frames in the current frame set in the current iteration are at times t-2, t-1, and t, then the frames in the current frame set in the previous iteration are at times t-5, t-4, and t-3. The previous feature value is the current feature value of all pixels affected by the currently detected motion or scene change in the previous iteration.
In step 1406, the previous frame set is updated to the current frame set. Step 1406 includes updating the previous feature value to the current feature value, so that step 1404 may be performed in the next iteration.
In step 1408, if the first feature value difference is greater than a first predetermined feature value difference threshold, step 1410 is performed; otherwise, step 1412 is performed. In an embodiment, the first predetermined feature value difference threshold is equal to the width of the frames 502 of the current frame set multiplied by the height of the frames 502 of the current frame set, divided by eight.
In the above embodiment, the first feature value difference is an absolute difference expressed as a real number. In another embodiment, the first feature value difference is a relative difference expressed as a ratio or percentage.
In step 1408, in one embodiment, the first feature value difference is an absolute value. In another embodiment, the first feature value difference is a non-absolute (signed) value obtained by subtracting the previous feature value from the current feature value. In yet another embodiment, the first feature value difference is an absolute value, and step 306 further comprises subtracting the previous feature value from the current feature value to obtain a second feature value difference; if the second feature value difference is greater than zero, step 1410 is performed. The second feature value difference is a non-absolute value.
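The three variants of the difference can be illustrated with assumed pixel counts (these numbers are not from the patent):

```python
# Assumed feature values: 120 affected pixels previously, 80 currently.
prev_feature, curr_feature = 120, 80

first_abs = abs(curr_feature - prev_feature)   # absolute first feature value difference
first_signed = curr_feature - prev_feature     # non-absolute (signed) variant
second = curr_feature - prev_feature           # second feature value difference (signed)

print(first_abs)      # 40
print(first_signed)   # -40
print(second > 0)     # False: in the third variant, step 1410 would not run here
```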
In the example of fig. 15, the first feature value difference is greater than the first predetermined feature value difference threshold due to the motion of the moving object in the previous frame set and the current frame set, whether the first feature value difference is the absolute value or the non-absolute value. In the embodiment that includes both the first feature value difference and the second feature value difference, the second feature value difference is greater than zero. Thus, step 1410 is performed. In step 1410, the satisfaction count variable is incremented. The satisfaction count variable tracks each time at least one condition in step 1408 is satisfied.
In the above embodiment, the satisfaction count variable is tracked by incrementing. In another embodiment, the satisfaction count variable is tracked by decrementing, multiplying, or dividing.
In step 1412, if the satisfaction count variable is greater than or equal to a first predetermined number of times, step 1414 is performed; otherwise, step 308 is performed. In step 1414, the slow motion capture start flag is asserted. The reason for the value of the first predetermined number of times in the following embodiments will be explained in conjunction with fig. 17. In one embodiment, step 1410 is performed only when the condition in step 1408 is satisfied. Further, in the embodiment where the first feature value difference is the absolute value, the first predetermined number of times is at least three. In a first example, due to the motion of the moving object in a first frame set to a fourth frame set arranged in sequence, at least one condition in step 1408 is satisfied when the previous frame set and the current frame set are the first and second frame sets, when they are the second and third frame sets, and when they are the third and fourth frame sets. The first example includes the example in fig. 15, and each of the other times the at least one condition in step 1408 is satisfied is similar to the example in fig. 15. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion, and the slow motion capture start flag is asserted. The first plurality of times starts when the satisfaction count variable equals zero and ends when the satisfaction count variable equals the first predetermined number of times. In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is three.
In another example, the motion of the moving object may occasionally, for example, decrease in speed, causing the at least one condition in step 1408 not to be satisfied. In this case, the satisfaction count variable remains the same, the first plurality of times is more than three, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
In the embodiment where the first feature value difference is the non-absolute value, the first predetermined number of times is at least two. Compared to the first example, in which the at least one condition in step 1408 is satisfied three times, for the first and second frame sets, the second and third frame sets, and the third and fourth frame sets, in a second example the at least one condition in step 1408 is satisfied twice. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion. In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is two. In another example, the motion of the moving object may occasionally, for example, decrease in speed, causing the at least one condition in step 1408 not to be satisfied. In this case, the satisfaction count variable remains the same, the first plurality of times is more than two, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
In the embodiment that includes the first feature value difference as the absolute value and the second feature value difference as the non-absolute value, the first predetermined number of times is at least two. Compared to the first example, in which the at least one condition in step 1408 is satisfied three times, for the first and second frame sets, the second and third frame sets, and the third and fourth frame sets, in a third example the at least one condition in step 1408 is satisfied twice. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times are determined to belong to continuous motion. In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is two. In another example, the motion of the moving object may occasionally, for example, decrease in speed, causing the at least one condition in step 1408 not to be satisfied. In this case, the satisfaction count variable remains the same, the first plurality of times is greater than two, and the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion.
Figs. 16-17 are diagrams illustrating images 1702, 1704, 1606, 1608, 1710 and 1712 in step 306 of the continuous motion check using the first feature value difference between the current feature value and the previous feature value, wherein the motion or the scene change in the current frame set is determined not to belong to continuous motion, according to embodiments of the present disclosure. Fig. 17 illustrates an example where the camera module 102 transitions from stationary to moving, from moving to moving, and from moving to stationary. Fig. 16 illustrates the moving-to-moving portion of fig. 17, for comparison with the example in fig. 15. Compared to the example in fig. 15, the example in fig. 16 has the following differences. In step 1402, a current feature value of all pixels affected by a currently detected motion or scene change 1616 is obtained. The currently detected motion or scene change 1616 occupies almost the entire image 1608 because camera motion is a global motion that causes a scene change, whereas the motion of the moving object in the example in fig. 15 is a local motion. Because the current feature value is the number of all pixels 1612 affected by the currently detected motion or scene change 1616, the current feature value is close to the total number of pixels of the image 1608. In step 1404, a first feature value difference between the current feature value and a previous feature value of all pixels 1610 affected by a previously detected motion or scene change 1614 is obtained. The previously detected motion or scene change 1614 occupies almost the entire image 1606 due to camera motion, and the previous feature value is also close to the total number of pixels of the image 1606.
In the step 1408, the first feature value difference is not greater than the first predetermined feature value difference threshold because the current feature value and the previous feature value are both close to the total number of pixels of the corresponding images 1608 and 1606. Thus, the motion in the previous frame set and the current frame set in the example in fig. 15 can be distinguished from scene changes in the previous frame set and the current frame set.
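This distinction can be made concrete with assumed pixel counts for a 16x16 frame (all numbers here are illustrative, not from the patent): a local moving object changes the affected-pixel count sharply between frame sets, while global camera motion affects nearly every pixel in both frame sets, keeping the difference small.

```python
w = h = 16
threshold = (w * h) / 8          # first predetermined threshold: 32 pixels

local_prev, local_curr = 100, 0      # moving object enters then leaves the frame
global_prev, global_curr = 250, 252  # camera motion: ~all of 256 pixels both times

print(abs(local_curr - local_prev) > threshold)    # True: counted toward continuous motion
print(abs(global_curr - global_prev) > threshold)  # False: treated as a scene change
```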
In a first example in fig. 17, the first feature value difference in step 1408 is the absolute value. Compared to the example in fig. 15, which is included in the first example, the first example in fig. 17 has the following differences. As the camera module 102 transitions from stationary to moving, the scenes in a first frame set are stationary and the scenes in a second frame set have a scene change. As the camera module 102 transitions from moving to moving, the scenes in the second frame set have a scene change and the scenes in a third frame set have a scene change. As the camera module 102 transitions from moving to stationary, the scenes in the third frame set have a scene change and the scenes in a fourth frame set are stationary. The first frame set through the fourth frame set are arranged in sequence. When the previous frame set and the current frame set are the first and second frame sets, the previous feature value is zero and the current feature value is close to the total number of pixels of the image 1704, such that the first feature value difference is greater than the first predetermined feature value difference threshold. When the previous frame set and the current frame set are the second and third frame sets, the previous feature value is close to the total number of pixels of the image 1606 and the current feature value is close to the total number of pixels of the image 1608, such that the first feature value difference is less than or equal to the first predetermined feature value difference threshold. When the previous frame set and the current frame set are the third and fourth frame sets, the previous feature value is close to the total number of pixels of the image 1710 and the current feature value is zero, such that the first feature value difference is greater than the first predetermined feature value difference threshold. Thus, the at least one condition in step 1408 is satisfied twice.
In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined number of times needs to be at least three in step 1412. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. Without the slow motion capture start flag being asserted, step 306 proceeds to step 308. In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is three. In another example, the satisfaction count variable remains the same when the camera module 102 has more moving-to-moving frame sets, and the first plurality of times is more than three. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
In a second example in fig. 17, the first feature value difference in step 1408 is the non-absolute value. Compared to the first example in fig. 17, the second example in fig. 17 has the following differences. When the previous frame set and the current frame set are the first and second frame sets, the previous feature value is zero and the current feature value is close to the total number of pixels of the image 1704, such that the first feature value difference is greater than the first predetermined feature value difference threshold. When the previous frame set and the current frame set are the second and third frame sets, the previous feature value is close to the total number of pixels of the image 1606 and the current feature value is close to the total number of pixels of the image 1608, such that the first feature value difference is less than or equal to the first predetermined feature value difference threshold. When the previous frame set and the current frame set are the third and fourth frame sets, the previous feature value is close to the total number of pixels of the image 1710 and the current feature value is zero, such that the first feature value difference is less than or equal to the first predetermined feature value difference threshold. Thus, the at least one condition in step 1408 is satisfied once. In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined number of times needs to be at least two in step 1412. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is three.
In another example, the satisfaction count variable remains the same when the camera module 102 has more moving-to-moving frame sets, and the first plurality of times is more than three. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
In a third example in fig. 17, the first feature value difference and the second feature value difference in step 1408 are the absolute value and the non-absolute value, respectively. Compared to the first example in fig. 17, the third example in fig. 17 has the following differences. When the previous frame set and the current frame set are the first and second frame sets, the previous feature value is zero and the current feature value is close to the total number of pixels of the image 1704, such that the first feature value difference is greater than the first predetermined feature value difference threshold and the second feature value difference is greater than zero. When the previous frame set and the current frame set are the second and third frame sets, the previous feature value is close to the total number of pixels of the image 1606 and the current feature value is close to the total number of pixels of the image 1608, such that the first feature value difference is less than or equal to the first predetermined feature value difference threshold. When the previous frame set and the current frame set are the third and fourth frame sets, the previous feature value is close to the total number of pixels of the image 1710 and the current feature value is zero, such that the first feature value difference is greater than the first predetermined feature value difference threshold and the second feature value difference is less than or equal to zero. Thus, the at least one condition in step 1408 is satisfied once. In order to determine that the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of a first plurality of times do not belong to continuous motion, the first predetermined number of times needs to be at least two in step 1412. Thus, the motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion.
In an example, the satisfaction count variable is equal to zero before the first frame set and the second frame set are processed, and the first plurality of times is three. In another example, the satisfaction count variable remains the same when the camera module 102 has more moving-to-moving frame sets, and the first plurality of times is more than three. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
Fig. 18 is a flow chart illustrating step 306 of the continuous motion check using the first feature value difference between the current feature value and the previous feature value according to yet another embodiment of the present disclosure. Compared to the embodiment in fig. 14, the embodiment in fig. 18 has steps 1408, 1410, 1802, and 1804 between step 1406 and step 1412. In step 1408, if the first feature value difference is greater than the first predetermined feature value difference threshold, step 1410 is performed; otherwise, step 1802 is performed. In step 1802, if the first feature value difference is less than or equal to zero, step 1804 is performed; otherwise, step 1412 is performed. In step 1804, if the satisfaction count variable is greater than zero, the satisfaction count variable is decremented.
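The fig. 18 branch logic can be sketched as follows; the function name and arguments are assumptions for illustration, not the patent's actual code.

```python
def update_satisfaction_count(first_diff, count, first_threshold):
    """first_diff is the signed (non-absolute) first feature value difference."""
    if first_diff > first_threshold:   # step 1408
        count += 1                     # step 1410
    elif first_diff <= 0:              # step 1802
        if count > 0:                  # step 1804
            count -= 1
    return count
```

A large positive difference increments the count, a non-positive difference decrements it (down to zero), and a small positive difference leaves it unchanged.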
In the embodiment of fig. 18, the first feature value difference is a non-absolute value. In one embodiment, if the condition in step 1408 is satisfied at least a second number of times within a first plurality of times, and the condition in step 1802 is satisfied at least a third number of times within the first plurality of times, the at least second number of times exceeds the at least third number of times by at least a first predetermined number of times, which is two. In contrast to the second example described with reference to fig. 15, where the satisfaction count variable remains the same when the condition in step 1408 is not satisfied, in an example of fig. 18 the satisfaction count variable may be decremented or remain the same based on whether the condition in step 1802 is satisfied. Thus, if the at least third number of times is one due to a decrease in speed of the motion of the moving object, the first plurality of times is four and the at least second number of times is three. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion. In contrast to the second example described with reference to fig. 17, where the satisfaction count variable remains the same when the condition in step 1408 is not satisfied, in an example of fig. 18 the satisfaction count variable may be decremented or remain the same based on whether the condition in step 1802 is satisfied. In this way, if the at least third number of times is two due to the camera module 102 transitioning from moving to moving and from moving to stationary, the first plurality of times is three and the at least second number of times is one.
The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. If the at least third number of times is three when the camera module 102 has more moving-to-moving frame sets, the first plurality of times is four and the at least second number of times is one. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
Fig. 19 is a flow chart illustrating step 306 of the continuous motion check using the first feature value difference between the current feature value and the previous feature value according to yet another embodiment of the present disclosure. Referring to figs. 5 and 19, compared to the embodiment in fig. 18, the embodiment in fig. 19 has a step 1902 instead of step 1802. In step 1902, if the first feature value difference is less than or equal to zero and a second feature value difference is greater than or equal to a second predetermined feature value difference threshold, step 1804 is performed; otherwise, step 1412 is performed. In an embodiment, the second predetermined feature value difference threshold is equal to the width of the frames 502 of the current frame set multiplied by the height of the frames 502 of the current frame set, divided by 1600.
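The fig. 19 variant, where step 1902 replaces step 1802, can be sketched as follows. The count is decremented only when the signed difference is non-positive and its magnitude (the second feature value difference) reaches the second threshold; the function and argument names are assumptions for illustration.

```python
def update_satisfaction_count_v2(first_diff, count, first_threshold, second_threshold):
    """first_diff is signed; abs(first_diff) plays the role of the second
    feature value difference."""
    if first_diff > first_threshold:                               # step 1408
        count += 1
    elif first_diff <= 0 and abs(first_diff) >= second_threshold:  # step 1902
        if count > 0:                                              # step 1804
            count -= 1
    return count

# With an assumed 160x160 frame, second_threshold = 160 * 160 / 1600 = 16
# pixels, so a near-zero difference (camera moving to moving) leaves the
# count unchanged instead of decrementing it.
```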
In the embodiment of fig. 19, the first feature value difference is a non-absolute value, and the second feature value difference is the absolute value of the first feature value difference. In one embodiment, if the condition in step 1408 is satisfied at least a second number of times within a first plurality of times, and the condition in step 1902 is satisfied at least a third number of times within the first plurality of times, the at least second number of times exceeds the at least third number of times by at least the first predetermined number of times, which is two. Compared to the example of fig. 18, in which the satisfaction count variable is decremented due to, for example, a decrease in speed of the moving object regardless of the amount of the decrease, in the example of fig. 19 the satisfaction count variable is decremented only when the decrease in speed exceeds a certain limit. In contrast to the satisfaction count variable being decremented because the camera module 102 transitions from moving to moving in the example in fig. 18, in the example of fig. 19 the satisfaction count variable remains the same if the camera module 102 transitions from moving to moving. In this way, if the camera module 102 transitions from moving to stationary, the at least third number of times is one, the first plurality of times is three, and the at least second number of times is one. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. If the camera module 102 has more moving-to-moving frame sets, the at least third number of times is still one, because only the moving-to-stationary transition of the camera module 102 decrements the count; the first plurality of times is four, and the at least second number of times is one.
The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
In contrast to the second example described with reference to fig. 15, where the satisfaction count variable remains the same when the condition in step 1408 is not satisfied, in fig. 19 the satisfaction count variable may be decremented or remain the same based on whether the condition in step 1902 is satisfied. In this way, if the at least third number of times is one due to a decrease in speed of the motion of the moving object, the first plurality of times is four and the at least second number of times is three. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined to belong to continuous motion. In contrast to the second example described with reference to fig. 17, where the satisfaction count variable remains the same when the condition in step 1408 is not satisfied, in fig. 19 the satisfaction count variable may be decremented or remain the same based on whether the condition in step 1902 is satisfied. In this way, if the camera module 102 transitions from moving to moving and from moving to stationary, the at least third number of times is two, the first plurality of times is three, and the at least second number of times is one. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times are determined not to belong to continuous motion. If the at least third number of times is three when the camera module 102 has more moving-to-moving frame sets, the first plurality of times is four and the at least second number of times is two. The motion or the scene change in the previous frame set and the motion or the scene change in the current frame set for each of the first plurality of times may still be determined not to belong to continuous motion.
Figs. 20-21 are a flow chart illustrating step 306 of the continuous motion check using the first size difference between the current bounding box and the previous bounding box and the first feature value difference between the current feature value and the previous feature value according to yet another embodiment of the present disclosure. In one embodiment, the embodiments in fig. 10 and fig. 14, 18, or 19 are combined. By combining the embodiments of fig. 10 and fig. 14, 18, or 19, the continuous motion check further ensures non-spurious motion detection, such as distinguishing the motion of a small moving object from noise. After combination, the first predetermined number of times is at least three when all conditions in the described embodiments involve absolute values only, and at least two when at least one condition in the described embodiments involves a non-absolute value. Compared to combining the embodiments in figs. 10 and 19 by directly inserting the steps of fig. 10 between step 1406 and step 1412 in fig. 19, the embodiments in figs. 20-21 have the following differences in one branch of step 1008 in fig. 10. In step 1008, if the first size difference is greater than the first predetermined size difference threshold, step 1408 is performed; otherwise, instead of step 1012 in fig. 10, a step 2102 is performed in fig. 21. In step 2102, if the first feature value difference is greater than a second predetermined feature value difference threshold, step 1010 is performed; otherwise, a step 2104 is performed. Step 2002 in fig. 20 is a combination of step 1006 in fig. 10 and step 1406 in fig. 14. Step 2104 is similar to step 1902 of fig. 19, wherein a third predetermined feature value difference threshold in fig. 21 plays the role of the second predetermined feature value difference threshold in fig. 19.
In an embodiment, the second predetermined feature value difference threshold in step 2102 is equal to the width of the frames 502 of the current frame set multiplied by the height of the frames 502 of the current frame set, divided by four. The inclusion of step 2102 allows the motion of a moving object at the same location to be detected and to trigger slow motion capture. Fig. 22 is a flow chart illustrating step 306 of a further continuous motion check using the current feature value of all pixels affected by the currently detected motion or scene change according to yet another embodiment of the present disclosure. Compared with step 306 in figs. 20 and 21, step 306 in fig. 22 further includes the following steps between step 2002 and step 1008. In step 2202, if the current feature value is greater than a predetermined feature value threshold, step 1008 is performed; otherwise, a step 2204 is performed. In step 2204, the slow motion capture start flag is deasserted and the satisfaction count variable is reset. In an embodiment, the predetermined feature value threshold is equal to the minimum of one tenth of the width of the frames 502 of the current frame set and one tenth of the height of the frames 502 of the current frame set.
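The fig. 22 gate (steps 2202 and 2204) can be sketched as follows; the function name, the `state` dictionary, and the returned boolean are illustrative assumptions.

```python
def gate_on_feature_value(current_feature, frame_w, frame_h, state):
    # Predetermined feature value threshold: the minimum of one tenth of the
    # frame width and one tenth of the frame height (one embodiment).
    threshold = min(frame_w / 10, frame_h / 10)
    if current_feature > threshold:      # step 2202: proceed to step 1008
        return True
    state["slow_motion_start"] = False   # step 2204: deassert the start flag
    state["satisfied_count"] = 0         # and reset the satisfaction count
    return False
```

A tiny affected-pixel count (for example, sensor noise) thus cancels any pending slow motion trigger instead of feeding the later checks.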
Some embodiments have one or a combination of the following features and/or advantages. In a related technique, whether a change in a frame set belongs to motion is determined based on whether a bounding box bounding the change has a perimeter greater than a bounding box threshold. In this way, scene changes due to a camera module moving from moving to moving cannot be distinguished from the motion of a moving object. In contrast to the related technique, some embodiments of the present disclosure perform the continuous motion check. Thus, it is possible to distinguish the motion of a moving object from scene changes due to the camera module transitioning from stationary to moving, from moving to moving, and from moving to stationary. In addition, some embodiments of the present disclosure perform an AND operation on a plurality of pixel delta images to generate a detected motion or scene change. Thus, the detected motion or scene change is not caused by tremor. One of ordinary skill in the art will appreciate that each of the various elements, modules, layers, blocks, algorithms, and steps of the system or computer-implemented method described and disclosed in the embodiments of the present disclosure may be implemented using hardware, firmware, software, or combinations thereof. Whether the functionality is implemented as hardware, firmware, or software depends upon the application and design requirements of a particular implementation. Those of ordinary skill in the art may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. It should be understood that the system and computer-implemented method disclosed in the embodiments of the present disclosure may be implemented in other ways. The embodiments described above are merely exemplary.
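The AND operation on pixel delta images mentioned above can be illustrated with small assumed masks: only pixels that changed in every delta image survive, so a one-off change such as tremor is suppressed.

```python
import numpy as np

delta_1 = np.array([[1, 1, 0],
                    [0, 1, 0]], dtype=bool)  # changes between frames t-2 and t-1
delta_2 = np.array([[1, 0, 0],
                    [0, 1, 1]], dtype=bool)  # changes between frames t-1 and t

detected = delta_1 & delta_2                 # detected motion or scene change
print(detected.astype(int))                  # [[1 0 0]
                                             #  [0 1 0]]
```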
The division into modules is based solely on logical functions; other divisions may exist in an actual implementation. A plurality of modules may or may not be a plurality of physical modules: a plurality of modules may be combined or integrated into one physical module, and any module may be divided into a plurality of physical modules. It is also possible to omit or skip certain features. Furthermore, the mutual coupling, direct coupling, or communicative coupling shown or discussed may operate indirectly or communicatively through ports, devices, or modules, whether electrically, mechanically, or otherwise.
The modules illustrated as separate components may or may not be physically separate. The modules may be located at one place or distributed over a plurality of network modules. Some or all of the modules may be selected according to the needs of the embodiments. In addition, if the software functional modules are implemented, used, and sold as a product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions proposed by the present disclosure may be implemented, substantially or in part, in the form of a software product; alternatively, the part of the technical solutions that is advantageous over the prior art may be implemented in the form of a software product. The software product is stored in a computer-readable storage medium and includes instructions for causing a system having at least one processor to execute all or part of the steps disclosed in the embodiments of the present disclosure. The storage medium includes a USB disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a floppy disk, or another medium capable of storing program instructions. While the present disclosure has been described in connection with what is presently considered the most practical and preferred embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments, and is intended to cover various arrangements made without departing from the broadest interpretation of the appended claims.

Claims (156)

1. A computer-implemented method, characterized by: the method comprises the following steps:
continuously performing an inspection method a first plurality of times, wherein the inspection method comprises:
obtaining a plurality of first frames arranged in sequence;
detecting motion or scene changes in the first plurality of frames to generate a first detected motion or scene change;
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames;
detecting motion or scene changes in the second plurality of frames to generate a second detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; and
updating the plurality of first frames to the plurality of second frames and the plurality of second frames to a plurality of third frames subsequent to the plurality of second frames; wherein for each of a second plurality of times of the first plurality of times, the first size difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined plurality of times; and
determining the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times to be continuous motion based on the first size difference for each of the second plurality of times.
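The repeated check recited in claim 1 can be sketched as follows, using the side-length decomposition of claims 2-5. The function name `is_continuous_motion`, the input format (bounding-box sizes as `(width, height)` tuples), and the example values are hypothetical illustrations of the claimed logic, not an implementation from the disclosure:

```python
# Sketch of the continuous-motion determination of claim 1: the check
# runs over successive frame sets, and the motion is deemed continuous
# when the bounding-box size difference exceeds the threshold at least
# a first predetermined number of times.

def is_continuous_motion(bboxes, size_threshold, required_count):
    """bboxes: bounding-box sizes (w, h) for successive frame sets."""
    satisfied = 0
    for (w1, h1), (w2, h2) in zip(bboxes, bboxes[1:]):
        # first size difference between consecutive bounding boxes,
        # decomposed into two side length differences (claims 2-4)
        if (abs(w2 - w1) > size_threshold[0]
                and abs(h2 - h1) > size_threshold[1]):
            satisfied += 1
    # continuous motion when the second plurality of times reaches
    # the first predetermined plurality of times
    return satisfied >= required_count

boxes = [(10, 8), (30, 22), (55, 40), (80, 60)]
# per claim 5: thresholds of one tenth of the frame width and height,
# e.g. for a hypothetical 160x120 frame
print(is_continuous_motion(boxes, (16, 12), 3))  # prints True
```

A steadily growing bounding box, as above, satisfies the check, whereas a box of roughly constant size (e.g. a static scene change) would not.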
2. The computer-implemented method of claim 1, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
3. The computer-implemented method of claim 2, wherein: the first predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
4. The computer-implemented method of claim 3, wherein: the first side length difference is greater than the first predetermined side length threshold, and the second side length difference is greater than the second predetermined side length threshold.
5. The computer-implemented method of claim 3, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
6. The computer-implemented method of claim 1, wherein: the first size difference is an absolute value.
7. The computer-implemented method of claim 6, wherein: the first predetermined plurality of times is at least three times.
8. The computer-implemented method of claim 1, wherein: the first size difference is obtained by subtracting a size of the first bounding box from a size of the second bounding box.
9. The computer-implemented method of claim 8, wherein: the first predetermined plurality of times is at least two times.
10. The computer-implemented method of claim 1, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
11. The computer-implemented method of claim 10, wherein: the first predetermined plurality of times is at least two times.
12. The computer-implemented method of claim 1, wherein: the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion based only on the first size difference for each of the second plurality of times.
13. The computer-implemented method of claim 1, wherein:
the inspection method further includes:
obtaining a first feature value of all pixels affected by the first detected motion or scene change;
obtaining a second feature value of all pixels affected by the second detected motion or scene change; and
obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of the second plurality of times, the first feature value difference is greater than a second predetermined threshold; and
determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first feature value difference for each of the second plurality of times.
14. The computer-implemented method of claim 13, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
15. The computer-implemented method of claim 14, wherein: the second predetermined threshold is equal to a width of one of the plurality of first frames multiplied by one eighth of a height of the one of the plurality of first frames.
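Using the pixel count of claim 14 as the feature value and the threshold of claim 15, the check of claim 13 can be illustrated as follows. The binary masks, the helper name `feature_value`, and the 3x3 frame size are hypothetical example values, not from the disclosure:

```python
# Sketch of claims 13-15: the feature value is the number of pixels
# affected by a detected motion or scene change, and the check compares
# the feature value difference of consecutive detections against a
# threshold of (frame width) * (frame height) / 8.

def feature_value(mask):
    """Pixel count of all pixels affected by a detected change."""
    return sum(sum(row) for row in mask)

first_mask = [[0, 1, 1], [0, 1, 1], [0, 0, 0]]   # 4 affected pixels
second_mask = [[1, 1, 1], [1, 1, 1], [1, 1, 0]]  # 8 affected pixels

difference = abs(feature_value(second_mask) - feature_value(first_mask))
threshold = 3 * 3 / 8  # hypothetical 3x3 frame
print(difference > threshold)  # prints True (4 > 1.125)
```

A growing affected-pixel count across frame sets, as here, contributes one satisfaction toward the continuous motion determination.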
16. The computer-implemented method of claim 13, wherein: the first feature value difference is an absolute value.
17. The computer-implemented method of claim 16, wherein: the first predetermined plurality of times is at least three times.
18. The computer-implemented method of claim 13, wherein: the first feature value difference is obtained by subtracting the first feature value from the second feature value.
19. The computer-implemented method of claim 18, wherein: the first predetermined plurality of times is at least two times.
20. The computer-implemented method of claim 13, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of the second plurality of times, the second feature value difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second feature value difference for each of the second plurality of times.
21. The computer-implemented method of claim 20, wherein: the first predetermined plurality of times is at least two times.
22. The computer-implemented method of claim 1, wherein:
the inspection method further includes:
obtaining a first feature value for all pixels affected by the second detected motion or scene change, wherein the first feature value is greater than a second predetermined threshold for each of the second plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belongs to the continuous motion further based on the first feature value for each of the second plurality of times.
23. The computer-implemented method of claim 22, wherein: the first feature value is a pixel number of all pixels affected by the second detected motion or scene change.
24. The computer-implemented method of claim 23, wherein: the second predetermined threshold is equal to a minimum of a tenth of a width of one of the plurality of first frames and a tenth of a height of one of the plurality of first frames.
25. The computer-implemented method of claim 1, wherein: the plurality of first frames include a third frame, a fourth frame and a fifth frame arranged in sequence.
26. The computer-implemented method of claim 25, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change comprises:
obtaining a first pixel difference image between the third frame and the fifth frame;
obtaining a second pixel difference image between the fourth frame and the fifth frame; and
performing an AND operation using the first pixel difference image and the second pixel difference image to obtain a first image including the first detected motion or scene change.
27. The computer-implemented method of claim 26, wherein: the step of performing the AND operation using the first pixel difference image and the second pixel difference image to obtain the first image including the first detected motion or scene change comprises:
performing the AND operation on the first pixel difference image and the second pixel difference image to obtain a second image; and
binarizing the second image to obtain the first image.
28. The computer-implemented method of claim 26, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change further comprises:
performing an erosion operation on the first image to obtain a second image, wherein the first detected motion or scene change is indicated in the second image.
29. The computer-implemented method of claim 28, wherein: the erosion operation is performed with a cross-shaped structuring kernel.
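The erosion of claims 28-29 can be sketched with a hand-rolled cross-shaped kernel, under which a pixel survives only if it and its four edge-adjacent neighbours are all set; in practice a library routine such as OpenCV's `cv2.erode` with a `cv2.MORPH_CROSS` structuring element would typically be used. The `erode_cross` helper and the example mask below are hypothetical illustrations:

```python
# Sketch of erosion with a cross-shaped structuring kernel: a pixel of
# the binary image stays 1 only when the pixel itself and its four
# edge-adjacent neighbours are all 1, removing thin noise speckles.

def erode_cross(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (img[y][x] and img[y - 1][x] and img[y + 1][x]
                    and img[y][x - 1] and img[y][x + 1]):
                out[y][x] = 1
    return out

noisy = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
eroded = erode_cross(noisy)  # only the 2-pixel core of the blob remains
```

The blob's outer ring is stripped away, leaving only pixels whose full cross neighbourhood was set, which is why thin jitter artifacts vanish entirely.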
30. The computer-implemented method of claim 1, wherein: the method further comprises the following steps:
obtaining a plurality of samples from a gyroscope, wherein the plurality of samples indicate that a camera module has no rotational motion;
wherein
performing the inspection method based on the plurality of samples;
obtaining the plurality of first frames using the camera module; and
obtaining the plurality of second frames using the camera module.
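Claim 30 gates the check on gyroscope data: the inspection runs only when the samples indicate that the camera module has no rotational motion. A minimal sketch follows; the sample format (per-axis angular rates) and the threshold value are assumptions for illustration, not from the disclosure:

```python
# Sketch of claim 30: the inspection method runs only when gyroscope
# samples show no rotational motion of the camera module. The angular
# rate threshold (in rad/s) is an assumed illustrative value.

def camera_has_no_rotation(gyro_samples, max_rate=0.02):
    """True when every angular-rate component is below the threshold."""
    return all(abs(axis) < max_rate
               for sample in gyro_samples
               for axis in sample)

still = [(0.001, -0.002, 0.0), (0.0, 0.003, -0.001)]
panning = [(0.001, 0.0, 0.0), (0.5, 0.01, 0.0)]
```

Gating on the gyroscope in this way ensures the bounding-box and feature value comparisons are not applied to frame sets whose changes stem from the camera rotating rather than from a moving object.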
31. The computer-implemented method of claim 1, wherein: the step of determining, based on the first size difference for each of the second plurality of times, that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion comprises:
triggering a slow motion capture of a camera module based on the first size difference for each of the second plurality of times;
wherein
obtaining the plurality of first frames using the camera module; and
obtaining the plurality of second frames using the camera module.
32. A computer-implemented method, characterized by: the method comprises the following steps:
continuously performing an inspection method a first plurality of times, wherein the inspection method comprises:
obtaining a plurality of first frames arranged in sequence;
detecting motion or scene changes in the first plurality of frames to generate a first detected motion or scene change;
obtaining a first feature value of all pixels affected by the first detected motion or scene change;
obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames;
detecting motion or scene changes in the second plurality of frames to generate a second detected motion or scene change;
obtaining a second feature value of all pixels affected by the second detected motion or scene change; and
obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of a second plurality of times of the first plurality of times, the first feature value difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion based on the first feature value difference for each of the second plurality of times.
33. The computer-implemented method of claim 32, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
34. The computer-implemented method of claim 33, wherein: the first predetermined threshold is equal to a width of one of the plurality of first frames multiplied by one eighth of a height of the one of the plurality of first frames.
35. The computer-implemented method of claim 32, wherein: the first feature value difference is an absolute value.
36. The computer-implemented method of claim 35, wherein: the first predetermined plurality of times is at least three times.
37. The computer-implemented method of claim 32, wherein: the first feature value difference is obtained by subtracting the first feature value from the second feature value.
38. The computer-implemented method of claim 37, wherein: the first predetermined plurality of times is at least two times.
39. The computer-implemented method of claim 32, wherein:
for each of at least one of the first plurality of times, the first feature value difference is less than or equal to zero;
said second plurality being at least said first predetermined plurality of times greater than said at least one time;
the first predetermined plurality of times is two times.
40. The computer-implemented method of claim 32, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of the second plurality of times, the second feature value difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second feature value difference for each of the second plurality of times.
41. The computer-implemented method of claim 40, wherein: the first predetermined plurality of times is at least two times.
42. The computer-implemented method of claim 32, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of at least one of the first plurality of times, the first feature value difference is greater than or equal to a second predetermined threshold and the second feature value difference is less than zero;
said second plurality being at least said first predetermined plurality of times greater than said at least one time; and
the first predetermined plurality of times is two times.
43. The computer-implemented method of claim 42, wherein: for each of a third plurality of times of the first plurality of times other than the at least one time and the second plurality of times, the first feature value difference is less than the second predetermined threshold and the second feature value difference is less than zero.
44. The computer-implemented method of claim 32, wherein: the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion based only on the first feature value difference for each of the second plurality of times.
45. The computer-implemented method of claim 32, wherein:
the inspection method further includes:
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; wherein for each of the second plurality of times, the first size difference is greater than a second predetermined threshold; and
determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first size difference for each of the second plurality of times.
46. The computer-implemented method of claim 45, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
47. The computer-implemented method of claim 46, wherein: the second predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
48. The computer-implemented method of claim 47, wherein: the first side length difference is greater than the first predetermined side length threshold, and the second side length difference is greater than the second predetermined side length threshold.
49. The computer-implemented method of claim 47, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
50. The computer-implemented method of claim 45, wherein: the first size difference is an absolute value.
51. The computer-implemented method of claim 50, wherein: the first predetermined plurality of times is at least three times.
52. The computer-implemented method of claim 45, wherein: the first size difference is obtained by subtracting a size of the first bounding box from a size of the second bounding box.
53. The computer-implemented method of claim 52, wherein: the first predetermined plurality of times is at least two times.
54. The computer-implemented method of claim 45, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
55. The computer-implemented method of claim 54, wherein: the first predetermined plurality of times is at least two times.
56. The computer-implemented method of claim 32, wherein:
the inspection method further includes:
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; wherein for at least one of the second plurality of times, the first size difference is less than or equal to a second predetermined threshold; and
determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first size difference for the at least one of the second plurality of times.
57. The computer-implemented method of claim 56, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
58. The computer-implemented method of claim 57, wherein: the first predetermined threshold is equal to a width of one of the plurality of first frames multiplied by one quarter of one percent of a height of the one of the plurality of first frames.
59. The computer-implemented method of claim 56, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
60. The computer-implemented method of claim 59, wherein: the second predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
61. The computer-implemented method of claim 60, wherein: the first side length difference is less than or equal to the first predetermined side length threshold, and the second side length difference is less than or equal to the second predetermined side length threshold.
62. The computer-implemented method of claim 60, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
63. The computer-implemented method of claim 56, wherein: the first size difference is an absolute value.
64. The computer-implemented method of claim 63, wherein: the first predetermined plurality of times is at least three times.
65. The computer-implemented method of claim 56, wherein: the first size difference is obtained by subtracting a size of the first bounding box from a size of the second bounding box.
66. The computer-implemented method of claim 65, wherein: the first predetermined plurality of times is at least two times.
67. The computer-implemented method of claim 56, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
68. The computer-implemented method of claim 67, wherein: the first predetermined plurality of times is at least two times.
69. The computer-implemented method of claim 32, wherein:
the inspection method further includes:
obtaining a first feature value for all pixels affected by the second detected motion or scene change, wherein the first feature value is greater than a second predetermined threshold for each of the second plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belongs to the continuous motion further based on the first feature value for each of the second plurality of times.
70. The computer-implemented method of claim 69, wherein: the first feature value is a pixel number of all pixels affected by the second detected motion or scene change.
71. The computer-implemented method of claim 70, wherein: the second predetermined threshold is equal to a minimum of a tenth of a width of one of the plurality of first frames and a tenth of a height of one of the plurality of first frames.
72. The computer-implemented method of claim 32, wherein: the plurality of first frames include a third frame, a fourth frame and a fifth frame arranged in sequence.
73. The computer-implemented method of claim 72, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change comprises:
obtaining a first pixel difference image between the third frame and the fifth frame;
obtaining a second pixel difference image between the fourth frame and the fifth frame; and
performing an AND operation using the first pixel difference image and the second pixel difference image to obtain a first image including the first detected motion or scene change.
74. The computer-implemented method of claim 73, wherein: the step of performing the AND operation using the first pixel difference image and the second pixel difference image to obtain the first image including the first detected motion or scene change comprises:
performing the AND operation on the first pixel difference image and the second pixel difference image to obtain a second image; and
binarizing the second image to obtain the first image.
75. The computer-implemented method of claim 73, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change further comprises:
performing an erosion operation on the first image to obtain a second image, wherein the first detected motion or scene change is indicated in the second image.
76. The computer-implemented method of claim 75, wherein: the erosion operation is performed with a cross-shaped structuring kernel.
77. The computer-implemented method of claim 32, wherein: the method further comprises the following steps:
obtaining a plurality of samples from a gyroscope, wherein the plurality of samples indicate that a camera module has no rotational motion;
wherein
performing the inspection method based on the plurality of samples;
obtaining the plurality of first frames using the camera module; and
obtaining the plurality of second frames using the camera module.
78. The computer-implemented method of claim 32, wherein: the step of determining, based on the first feature value difference for each of the second plurality of times, that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belong to continuous motion comprises:
triggering slow motion capture of a camera module based on the first feature value difference for each of the second plurality of times;
wherein
obtaining the plurality of first frames using the camera module; and
obtaining the plurality of second frames using the camera module.
79. A system, characterized by: the system comprises:
at least one memory configured to store a plurality of program instructions;
at least one processor configured to execute the plurality of program instructions, the plurality of program instructions causing the at least one processor to perform a plurality of steps comprising:
continuously performing an inspection method a first plurality of times, wherein the inspection method comprises:
obtaining a plurality of first frames arranged in sequence;
detecting motion or scene changes in the first plurality of frames to generate a first detected motion or scene change;
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames;
detecting motion or scene changes in the second plurality of frames to generate a second detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; and
updating the plurality of first frames to the plurality of second frames and the plurality of second frames to a plurality of third frames subsequent to the plurality of second frames; wherein for each of a second plurality of times of the first plurality of times, the first size difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined plurality of times; and
determining the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times to be continuous motion based on the first size difference for each of the second plurality of times.
80. The system of claim 79, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
81. The system of claim 80, wherein: the first predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
82. The system of claim 81, wherein: the first side length difference is greater than the first predetermined side length threshold, and the second side length difference is greater than the second predetermined side length threshold.
83. The system of claim 81, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
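The repeated bounding-box check of claims 79 to 83 can be sketched in plain Python. This is an illustrative reading, not the patent's implementation: the function names, the run count, and the required hit count (`min_hits`) are our assumptions, while the one-tenth-of-frame side-length thresholds follow claim 83 and the both-sides-must-exceed condition follows claim 82.

```python
def side_length_differences(box_a, box_b):
    """Boxes are (x0, y0, x1, y1).  Returns (|dw|, |dh|): the absolute
    differences between parallel sides of the two bounding boxes (claim 80)."""
    dw = abs((box_b[2] - box_b[0]) - (box_a[2] - box_a[0]))
    dh = abs((box_b[3] - box_b[1]) - (box_a[3] - box_a[1]))
    return dw, dh

def is_continuous_motion(boxes, frame_w, frame_h, first_plurality=4, min_hits=3):
    """boxes: one detected-motion bounding box per consecutive frame group.
    The inspection method runs once per adjacent pair of groups; motion is
    declared continuous when the size difference exceeds the claim-83
    thresholds (one tenth of frame width / height) in at least min_hits runs."""
    thr_w, thr_h = frame_w / 10.0, frame_h / 10.0
    runs = list(zip(boxes, boxes[1:]))[:first_plurality]
    hits = 0
    for box_a, box_b in runs:
        dw, dh = side_length_differences(box_a, box_b)
        if dw > thr_w and dh > thr_h:  # claim 82: both side differences must exceed
            hits += 1
    return hits >= min_hits
```

A steadily growing box (an object approaching the camera, say) trips the check on every run, while a static box never does.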
84. The system of claim 79, wherein: the first size difference is an absolute value.
85. The system of claim 84, wherein: the first predetermined plurality of times is at least three times.
86. The system of claim 79, wherein: obtaining the first size difference by subtracting a size of the first bounding box from a size of the second bounding box.
87. The system of claim 86, wherein: the first predetermined plurality of times is at least two times.
88. The system of claim 79, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
89. The system of claim 88, wherein: the first predetermined plurality of times is at least two times.
90. The system of claim 79, wherein: the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion based only on the first size difference for each of the second plurality of times.
91. The system of claim 79, wherein:
the inspection method further includes:
obtaining a first feature value of all pixels affected by the first detected motion or scene change;
obtaining a second feature value of all pixels affected by the second detected motion or scene change; and
obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of the second plurality of times, the first feature value difference is greater than a second predetermined threshold; and
determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first feature value difference for each of the second plurality of times.
92. The system of claim 91, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
93. The system of claim 92, wherein: the second predetermined threshold is equal to a width of one of the plurality of first frames multiplied by one eighth of a height of the one of the plurality of first frames.
94. The system of claim 91, wherein: the first feature value difference is an absolute value.
95. The system of claim 94, wherein: the first predetermined plurality of times is at least three times.
96. The system of claim 91, wherein: obtaining the first feature value difference by subtracting the first feature value from the second feature value.
97. The system of claim 96, wherein: the first predetermined plurality of times is at least two times.
98. The system of claim 91, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of the second plurality of times, the second feature value difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second feature value difference for each of the second plurality of times.
99. The system of claim 98, wherein: the first predetermined plurality of times is at least two times.
100. The system of claim 79, wherein:
the inspection method further includes:
obtaining a first feature value for all pixels affected by the second detected motion or scene change, wherein the first feature value is greater than a second predetermined threshold for each of the second plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belongs to the continuous motion further based on the first feature value for each of the second plurality of times.
101. The system of claim 100, wherein: the first feature value is a pixel number of all pixels affected by the second detected motion or scene change.
102. The system of claim 101, wherein: the second predetermined threshold is equal to a minimum of a tenth of a width of one of the plurality of first frames and a tenth of a height of one of the plurality of first frames.
103. The system of claim 79, wherein: the plurality of first frames include a third frame, a fourth frame and a fifth frame arranged in sequence.
104. The system of claim 103, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change comprises:
obtaining a first pixel difference image between the third frame and the fifth frame;
obtaining a second pixel difference image between the fourth frame and the fifth frame; and
performing an AND operation using the first pixel difference image and the second pixel difference image to obtain a first image including the first detected motion or scene change.
105. The system of claim 104, wherein: the step of performing the AND operation using the first pixel difference image and the second pixel difference image to obtain the first image comprising the first detected motion or scene change comprises:
performing the AND operation on the first pixel difference image and the second pixel difference image to obtain a second image; and
binarizing the second image to obtain the first image.
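Claims 103 to 105 describe a three-frame differencing scheme: two pixel-difference images against the latest frame, an AND combination, then binarization. A NumPy sketch under stated assumptions: the claim's AND operation on grayscale difference images is realized here as a per-pixel minimum (an AND-like combination requiring the change to appear in both differences; a literal bitwise AND is another reading), and the binarization threshold of 25 is our choice, not the patent's.

```python
import numpy as np

def detect_change_mask(frame3, frame4, frame5, thresh=25):
    """Sketch of claims 104-105: difference images between (frame3, frame5)
    and (frame4, frame5), AND-combined, then binarized into a change mask."""
    f3 = frame3.astype(np.int16)
    f4 = frame4.astype(np.int16)
    f5 = frame5.astype(np.int16)
    diff1 = np.abs(f5 - f3)                  # first pixel difference image
    diff2 = np.abs(f5 - f4)                  # second pixel difference image
    second_image = np.minimum(diff1, diff2)  # AND-like combination
    first_image = second_image > thresh      # binarization
    return first_image
```

Requiring the change in both difference images suppresses one-frame flicker (noise, a dropped frame) that would pass a single-difference test.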
106. The system of claim 104, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change further comprises:
performing an erosion operation on the first image to obtain a second image, wherein the first detected motion or scene change is indicated in the second image.
107. The system of claim 106, wherein: performing the erosion operation with a cross-structure kernel.
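Claims 106 and 107 erode the binarized change image with a cross-shaped structuring element to remove isolated noise pixels. A NumPy-only sketch of a 3x3 cross erosion (kernel size is our assumption; OpenCV users would typically reach for `cv2.erode` with a `MORPH_CROSS` element instead):

```python
import numpy as np

def erode_cross(mask):
    """Binary erosion with a 3x3 cross-shaped kernel (claim 107): a pixel
    survives only if it and its four edge-neighbours are all set."""
    padded = np.pad(mask, 1, mode="constant", constant_values=False)
    centre = padded[1:-1, 1:-1]
    up     = padded[:-2, 1:-1]
    down   = padded[2:, 1:-1]
    left   = padded[1:-1, :-2]
    right  = padded[1:-1, 2:]
    return centre & up & down & left & right
```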
108. The system of claim 79, wherein: the system further comprises:
a gyroscope; and
a camera module configured to obtain the first frames and the second frames;
wherein the steps performed by the at least one processor further comprise:
obtaining a plurality of samples from the gyroscope, wherein the plurality of samples indicate that the camera module has no rotational motion; wherein the examination method is performed based on the plurality of samples.
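Claim 108 gates the inspection method on gyroscope samples showing no rotational motion of the camera module, so that panning is not mistaken for subject motion. A minimal sketch; the sample format (angular rates in rad/s) and the rate threshold are our assumptions, not values from the patent.

```python
def camera_is_still(gyro_samples, max_rate_rad_s=0.05):
    """Return True when every gyroscope sample (wx, wy, wz) stays below a
    small angular-rate threshold, i.e. the camera module is not rotating."""
    return all(abs(rate) <= max_rate_rad_s
               for sample in gyro_samples
               for rate in sample)
```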
109. The system of claim 79, wherein: the system further comprises:
a camera module configured to obtain the first frames and the second frames;
wherein the step of determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to be continuous motion based on the first size difference for each of the second plurality of times comprises:
triggering a slow motion capture of the camera module based on the first size difference for each of the second plurality of times.
110. A system, characterized by: the system comprises:
at least one memory configured to store a plurality of program instructions;
at least one processor configured to execute the plurality of program instructions, the plurality of program instructions causing the at least one processor to perform a plurality of steps comprising:
continuously performing an inspection method a first plurality of times, wherein the inspection method comprises:
obtaining a plurality of first frames arranged in sequence;
detecting motion or scene changes in the plurality of first frames to generate a first detected motion or scene change;
obtaining a first feature value of all pixels affected by the first detected motion or scene change;
obtaining a plurality of second frames arranged in sequence, wherein the plurality of second frames follow the plurality of first frames;
detecting motion or scene changes in the plurality of second frames to generate a second detected motion or scene change;
obtaining a second feature value of all pixels affected by the second detected motion or scene change; and
obtaining a first feature value difference between the first feature value and the second feature value; wherein for each of a second plurality of times of the first plurality of times, the first feature value difference is greater than a first predetermined threshold, and the second plurality of times is greater than or equal to a first predetermined plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames of each of the first plurality of times belong to continuous motion based on the first feature value difference for each of the second plurality of times.
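The feature-value variant of claim 110 can be sketched the same way as the bounding-box variant: run the check over consecutive frame groups and count the runs whose changed-pixel-count difference exceeds the threshold. The feature value and the one-eighth-of-frame-area threshold follow claims 111 and 112, the signed subtraction follows claim 115, and the hit count of two follows claim 116; the run count and names are our assumptions.

```python
def continuous_by_pixel_count(counts, frame_w, frame_h,
                              first_plurality=4, min_hits=2):
    """counts: changed-pixel counts, one per consecutive frame group.
    Declares continuous motion when the signed count difference (second
    minus first, claim 115) exceeds width x height / 8 in at least
    min_hits of the runs."""
    threshold = frame_w * frame_h / 8.0
    runs = list(zip(counts, counts[1:]))[:first_plurality]
    hits = sum(1 for first, second in runs if (second - first) > threshold)
    return hits >= min_hits
```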
111. The system of claim 110, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
112. The system of claim 111, wherein: the first predetermined threshold is equal to a width of one of the plurality of first frames multiplied by one eighth of a height of the one of the plurality of first frames.
113. The system of claim 110, wherein: the first feature value difference is an absolute value.
114. The system of claim 113, wherein: the first predetermined plurality of times is at least three times.
115. The system of claim 110, wherein: obtaining the first feature value difference by subtracting the first feature value from the second feature value.
116. The system of claim 115, wherein: the first predetermined plurality of times is at least two times.
117. The system of claim 110, wherein:
for each of at least one of the first plurality of times, the first feature value difference is less than or equal to zero;
the second plurality of times is greater than the at least one time by at least the first predetermined plurality of times; and
the first predetermined plurality of times is two times.
118. The system of claim 110, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of the second plurality of times, the second feature value difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second feature value difference for each of the second plurality of times.
119. The system of claim 118, wherein: the first predetermined plurality of times is at least two times.
120. The system of claim 110, wherein:
the first feature value difference is an absolute value;
the inspection method further includes:
subtracting the first feature value from the second feature value to obtain a second feature value difference; wherein for each of at least one of the first plurality of times, the first feature value difference is greater than or equal to a second predetermined threshold and the second feature value difference is less than zero; the second plurality of times is greater than the at least one time by at least the first predetermined plurality of times; and
the first predetermined plurality of times is two times.
121. The system of claim 120, wherein: for each of a third plurality of times of the first plurality of times other than the at least one time and the second plurality of times, the first feature value difference is less than the second predetermined threshold and the second feature value difference is less than zero.
122. The system of claim 110, wherein: the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames of each of the first plurality of times are determined to belong to the continuous motion based only on the first feature value difference for each of the second plurality of times.
123. The system of claim 110, wherein:
the inspection method further includes:
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; wherein for each of the second plurality of times, the first size difference is greater than a second predetermined threshold; and
determining the motion or scene change in the plurality of first frames and the motion or scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first size difference for each of the second plurality of times.
124. The system of claim 123, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
125. The system of claim 124, wherein: the second predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
126. The system of claim 125, wherein: the first side length difference is greater than the first predetermined side length threshold, and the second side length difference is greater than the second predetermined side length threshold.
127. The system of claim 125, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
128. The system of claim 125, wherein: the first size difference is an absolute value.
129. The system of claim 128, wherein: the first predetermined plurality of times is at least three times.
130. The system of claim 125, wherein: obtaining the first size difference by subtracting a size of the first bounding box from a size of the second bounding box.
131. The system of claim 130, wherein: the first predetermined plurality of times is at least two times.
132. The system of claim 125, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
133. The system of claim 132, wherein: the first predetermined plurality of times is at least two times.
134. The system of claim 110, wherein:
the inspection method further includes:
obtaining a first bounding box bounding the first detected motion or scene change;
obtaining a second bounding box bounding the second detected motion or scene change;
obtaining a first size difference between the first bounding box and the second bounding box; wherein for at least one of the second plurality of times, the first size difference is less than or equal to a second predetermined threshold; and
determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to belong to the continuous motion also based on the first size difference for the at least one of the first plurality of times.
135. The system of claim 134, wherein:
the first feature value is a pixel number of all pixels affected by the first detected motion or scene change; and
the second feature value is a pixel number of all pixels affected by the second detected motion or scene change.
136. The system of claim 135, wherein: the first predetermined threshold is equal to a width of one of the plurality of first frames multiplied by a height of the one of the plurality of first frames, multiplied by one quarter of one percent.
137. The system of claim 134, wherein: the step of obtaining the first size difference between the first bounding box and the second bounding box comprises:
obtaining a first side length difference between a first side of the first bounding box and a side of the second bounding box parallel to the first side of the first bounding box; and
obtaining a second side length difference between a second side of the first bounding box and a side of the second bounding box parallel to the second side of the first bounding box, wherein the second side is perpendicular to the first side; wherein the first size difference comprises the first side length difference and the second side length difference.
138. The system of claim 137, wherein: the second predetermined threshold includes a first predetermined side length threshold and a second predetermined side length threshold.
139. The system of claim 138, wherein: the first side length difference is less than or equal to the first predetermined side length threshold, and the second side length difference is less than or equal to the second predetermined side length threshold.
140. The system of claim 138, wherein: the first predetermined side length threshold is equal to one tenth of a width of one of the plurality of first frames and the second predetermined side length threshold is equal to one tenth of a height of the one of the plurality of first frames.
141. The system of claim 134, wherein: the first size difference is an absolute value.
142. The system of claim 141, wherein: the first predetermined plurality of times is at least three times.
143. The system of claim 134, wherein: obtaining the first size difference by subtracting a size of the first bounding box from a size of the second bounding box.
144. The system of claim 143, wherein: the first predetermined plurality of times is at least two times.
145. The system of claim 134, wherein:
the first size difference is an absolute value;
the inspection method further includes:
subtracting a size of the first bounding box from a size of the second bounding box to obtain a second size difference; wherein for each of the second plurality of times, the second size difference is greater than zero; and
the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times are determined to belong to the continuous motion also based on the second size difference for each of the second plurality of times.
146. The system of claim 145, wherein: the first predetermined plurality of times is at least two times.
147. The system of claim 110, wherein:
the inspection method further includes:
obtaining a first feature value for all pixels affected by the second detected motion or scene change, wherein the first feature value is greater than a second predetermined threshold for each of the second plurality of times; and
determining that the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times belongs to the continuous motion further based on the first feature value for each of the second plurality of times.
148. The system of claim 147, wherein: the first feature value is a pixel number of all pixels affected by the second detected motion or scene change.
149. The system of claim 148, wherein: the second predetermined threshold is equal to a minimum of a tenth of a width of one of the plurality of first frames and a tenth of a height of one of the plurality of first frames.
150. The system of claim 110, wherein: the plurality of first frames include a third frame, a fourth frame and a fifth frame arranged in sequence.
151. The system of claim 150, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change comprises:
obtaining a first pixel difference image between the third frame and the fifth frame;
obtaining a second pixel difference image between the fourth frame and the fifth frame; and
performing an AND operation using the first pixel difference image and the second pixel difference image to obtain a first image including the first detected motion or scene change.
152. The system of claim 151, wherein: the step of performing the AND operation using the first pixel difference image and the second pixel difference image to obtain the first image comprising the first detected motion or scene change comprises:
performing the AND operation on the first pixel difference image and the second pixel difference image to obtain a second image; and
binarizing the second image to obtain the first image.
153. The system of claim 151, wherein: the step of detecting the motion or scene change in the plurality of first frames to generate the first detected motion or scene change further comprises:
performing an erosion operation on the first image to obtain a second image, wherein the first detected motion or scene change is indicated in the second image.
154. The system of claim 153, wherein: performing the erosion operation with a cross-structure kernel.
155. The system of claim 110, wherein: the system further comprises:
a gyroscope; and
a camera module configured to obtain the first frames and the second frames;
wherein the steps performed by the at least one processor further comprise:
obtaining a plurality of samples from the gyroscope, wherein the plurality of samples indicate that the camera module has no rotational motion; wherein the examination method is performed based on the plurality of samples.
156. The system of claim 110, wherein: the system further comprises:
a camera module configured to obtain the first frames and the second frames;
wherein the step of determining the motion or the scene change in the plurality of first frames and the motion or the scene change in the plurality of second frames for each of the first plurality of times to be continuous motion based on the first feature value difference for each of the second plurality of times comprises:
triggering a slow motion capture of the camera module based on the first feature value difference for each of the second plurality of times.
CN201980095490.9A 2019-04-23 2019-04-23 Method and system for non-spurious motion detection Pending CN113711272A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/083967 WO2020215227A1 (en) 2019-04-23 2019-04-23 Method and system for non-false motion detection

Publications (1)

Publication Number Publication Date
CN113711272A true CN113711272A (en) 2021-11-26

Family

ID=72940850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980095490.9A Pending CN113711272A (en) 2019-04-23 2019-04-23 Method and system for non-spurious motion detection

Country Status (3)

Country Link
JP (1) JP2022529414A (en)
CN (1) CN113711272A (en)
WO (1) WO2020215227A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116711296A (en) * 2021-02-25 2023-09-05 Oppo广东移动通信有限公司 Electronic device, method of controlling electronic device, and computer-readable storage medium
KR102617846B1 (en) * 2022-12-22 2023-12-27 주식회사 핀텔 Method and System for Detecting Moving Object in Video

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060285724A1 (en) * 2005-06-20 2006-12-21 Ying-Li Tian Salient motion detection system, method and program product therefor
CN1885953A (en) * 2005-06-21 2006-12-27 三星电子株式会社 Intermediate vector interpolation method and three-dimensional (3d) display apparatus performing the method
CN101352029A (en) * 2005-12-15 2009-01-21 模拟装置公司 Randomly sub-sampled partition voting(RSVP) algorithm for scene change detection
CN106878668A (en) * 2015-12-10 2017-06-20 微软技术许可有限责任公司 Mobile detection to object
CN107071440A (en) * 2016-01-29 2017-08-18 谷歌公司 Use the motion-vector prediction of previous frame residual error
WO2018201444A1 (en) * 2017-05-05 2018-11-08 Boe Technology Group Co., Ltd. Method for detecting and tracking target object, target object tracking apparatus, and computer-program product
CN109361923A (en) * 2018-12-04 2019-02-19 深圳市梦网百科信息技术有限公司 A kind of time slip-window scene change detection method and system based on motion analysis

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2442512A (en) * 2006-09-09 2008-04-09 David Hostettler Wain Motion detector using video frame differences with noise filtering and edge change accentuation
US8233094B2 (en) * 2007-05-24 2012-07-31 Aptina Imaging Corporation Methods, systems and apparatuses for motion detection using auto-focus statistics
JP5656964B2 (en) * 2012-11-29 2015-01-21 Eizo株式会社 Scene change determination apparatus or method
CN103886617A (en) * 2014-03-07 2014-06-25 华为技术有限公司 Method and device for detecting moving object
CN105303581B (en) * 2014-06-12 2018-12-14 南京理工大学 A kind of moving target detecting method of auto-adaptive parameter
US10686969B2 (en) * 2016-07-08 2020-06-16 NETFLIX Inc. Detecting shot changes in a video
JP2018010165A (en) * 2016-07-13 2018-01-18 キヤノン株式会社 Image blur correction device, method for controlling the same, and imaging apparatus
CN107273815A (en) * 2017-05-24 2017-10-20 中国农业大学 A kind of individual behavior recognition methods and system

Also Published As

Publication number Publication date
WO2020215227A1 (en) 2020-10-29
JP2022529414A (en) 2022-06-22

Similar Documents

Publication Publication Date Title
US20170161905A1 (en) System and method for background and foreground segmentation
EP2709039A1 (en) Device and method for detecting the presence of a logo in a picture
US10079974B2 (en) Image processing apparatus, method, and medium for extracting feature amount of image
CN110796600B (en) Image super-resolution reconstruction method, image super-resolution reconstruction device and electronic equipment
KR20150114437A (en) Image processing apparatus and image processing method
JP5360052B2 (en) Object detection device
CN109005367B (en) High dynamic range image generation method, mobile terminal and storage medium
CN110083740B (en) Video fingerprint extraction and video retrieval method, device, terminal and storage medium
JP2016085487A (en) Information processing device, information processing method and computer program
CN109286758B (en) High dynamic range image generation method, mobile terminal and storage medium
CN110572636B (en) Camera contamination detection method and device, storage medium and electronic equipment
CN111028276A (en) Image alignment method and device, storage medium and electronic equipment
CN113711272A (en) Method and system for non-spurious motion detection
CN100375530C (en) Movement detecting method
JPWO2007129591A1 (en) Shielding object image identification apparatus and method
CN111986229A (en) Video target detection method, device and computer system
JP7197000B2 (en) Information processing device, information processing method and information processing program
TW201913352A (en) Method and electronic apparatus for wave detection
CN110290310B (en) Image processing apparatus for reducing step artifacts from image signals
US10916016B2 (en) Image processing apparatus and method and monitoring system
WO2023160061A1 (en) Method and apparatus for determining moving object in image, electronic device, and storage medium
JP6399122B2 (en) Face detection apparatus and control method thereof
JP6591349B2 (en) Motion detection system and motion detection method
CN111970451B (en) Image processing method, image processing device and terminal equipment
CN110782425A (en) Image processing method, image processing device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination