CN108629808B - Image processing system and marker - Google Patents


Info

Publication number
CN108629808B
CN108629808B (application number CN201810071830.1A)
Authority
CN
China
Prior art keywords
marker
detected
plate members
detection
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810071830.1A
Other languages
Chinese (zh)
Other versions
CN108629808A (en)
Inventor
横井谦太朗
野田周平
田村聪
木村纱由美
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd filed Critical Toshiba Elevator Co Ltd
Publication of CN108629808A publication Critical patent/CN108629808A/en
Application granted granted Critical
Publication of CN108629808B publication Critical patent/CN108629808B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing system and a marker capable of setting a detection region having a complicated shape with high accuracy. An image processing system according to an embodiment includes: an imaging unit that images a subject including 1 or more markers for setting a detection region in which a motion of a person or an object can be detected; a marker detection unit that detects, from a captured image captured by the imaging unit, 1 or more markers that can be rotated in the left-right direction with reference to a rotation fulcrum and that have 2 sides extending from the rotation fulcrum; a region setting unit that sets, as the detection region, a region including the 2 sides of each of the 1 or more markers detected by the marker detection unit; and a detection unit that detects a motion of the person or the object within the detection region set by the region setting unit.

Description

Image processing system and marker
The present application claims priority from Japanese Patent Application No. 2017-058801 (filed March 24, 2017), which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to an image processing system and a marker.
Background
In general, an image sensor that detects the movement of a person or an object from an image captured by a camera requires, at the time of installation, the setting of a detection area within which such movement can be detected.
As a method of setting the detection area, for example, there is a method of causing a display device to display a captured image at the time of installation and manually designating (setting) the detection area on the display device. However, in the case of this method, a display device for displaying the captured image at the time of installation is required, and the detection area is specified manually, so there is an inconvenience that it takes time and labor.
Therefore, a method of setting a detection region by disposing a marker at a site that is a characteristic point of the detection region has been studied.
Disclosure of Invention
However, in the case of the above method, there is an inconvenience that a detection region having a complicated shape in which a large number of feature points exist cannot be set with high accuracy.
An object of the present invention is to provide an image processing system and a marker that can accurately set a detection region having a complicated shape.
An embodiment relates to an image processing system including: an imaging unit that images a subject including 1 or more markers for setting a detection region capable of detecting a motion of a person or an object; a marker detection unit that detects, from the captured image captured by the imaging unit, the 1 or more markers that can be rotated in the left-right direction with a rotation fulcrum as a reference and that have 2 sides extending from the rotation fulcrum; an area setting unit that sets, as the detection region, an area including the 2 sides of each of the 1 or more markers detected by the marker detection unit; and a detection unit that detects a motion of a person or an object in the detection region set by the area setting unit.
According to the image processing system configured as described above, even a detection region having a complicated shape can be set with high accuracy.
Drawings
Fig. 1 is a diagram showing a schematic configuration example of an image processing system according to embodiment 1.
Fig. 2 is a view showing the appearance of the marker of this embodiment.
Fig. 3 is a diagram showing a case where the marker of this embodiment is provided at a corner of a wall surface.
Fig. 4 is another diagram showing the appearance of the marker of this embodiment.
Fig. 5 is another view showing the appearance of the marker of this embodiment.
Fig. 6 is still another view showing the appearance of the marker of this embodiment.
Fig. 7 is still another diagram showing the appearance of the marker of this embodiment.
Fig. 8 is a diagram for explaining the difference between the marker shown in fig. 2 and the markers shown in figs. 4 to 7.
Fig. 9 is a flowchart showing an example of the procedure of the detection region setting process performed by the image processing apparatus of the embodiment.
Fig. 10 is a diagram for supplementary explanation of fig. 9.
Fig. 11 is another diagram for supplementary explanation of fig. 9.
Fig. 12 is still another diagram for supplementary explanation of fig. 9.
Fig. 13 is still another diagram for supplementary explanation of fig. 9.
Fig. 14 is a diagram showing the detection result of the contour edge component in the case where the marker of this embodiment is painted with a single color.
Fig. 15 is another diagram showing the detection result of the contour edge component in the case where the marker of the embodiment is painted with a single color.
Fig. 16 is a diagram showing the detection result of the contour edge component in the case where the marker of this embodiment is painted with a plurality of colors.
Fig. 17 is another diagram showing the detection result of the contour edge component in the case where the marker of the present embodiment is painted with a plurality of colors.
Fig. 18 is a diagram showing a schematic configuration example of an elevator system to which various functions realized by the image processing system of the embodiment are applied.
Fig. 19 is a diagram showing a schematic configuration example of the image processing system according to embodiment 2.
Fig. 20 is a diagram for explaining the function of the marker specifying direction determining unit according to this embodiment.
Fig. 21 is a flowchart showing an example of the procedure of the detection region setting process performed by the image processing apparatus of the present embodiment.
Detailed Description
The following describes embodiments with reference to the drawings.
< embodiment 1 >
Fig. 1 is a diagram showing a schematic configuration example of an image processing system according to embodiment 1.
As shown in fig. 1, the image processing system includes a camera 11 (may also be referred to as an "imaging unit"), an image processing device 12, a marker 13, and the like, and detects a person, an object, and the like existing in a specific area from an image (captured image) captured by the camera 11.
The camera 11 is a small monitoring camera such as a vehicle-mounted camera, has a wide-angle lens, and continuously captures multiple frames of images per second (for example, 30 frames/second).
Each image (video) continuously captured by the camera 11 is analyzed in real time by the image processing device 12. Specifically, the image processing apparatus 12 detects (motion of) a person, an object, or the like from a change in luminance value of an image in a specific area.
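By way of illustration, a minimal sketch of such luminance-change detection, assuming OpenCV in Python; the difference threshold and changed-pixel ratio are placeholders, not values from the patent:

```python
# Minimal sketch of luminance-change detection inside a detection-region mask.
# Both thresholds are illustrative placeholders, not values from the patent.
import cv2
import numpy as np

def motion_detected(prev_bgr, curr_bgr, region_mask,
                    diff_thresh=25, ratio_thresh=0.01):
    """True if enough pixels in the region changed luminance between frames."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)            # per-pixel luminance change
    changed = (diff > diff_thresh) & (region_mask > 0)  # count only in-region pixels
    region_area = max(int((region_mask > 0).sum()), 1)
    return changed.sum() / region_area > ratio_thresh
```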
The marker 13 is used to set the detection area, that is, the area within which the image processing apparatus 12 can detect a person, an object, or the like.
Here, referring to fig. 2, the marker 13 is explained in detail.
Fig. 2 is a diagram for explaining the configuration of the marker 13 for setting the detection region, fig. 2 (a) is a perspective view showing the appearance of the marker 13, fig. 2 (b) is a side view showing the appearance of the marker 13 corresponding to fig. 2 (a), fig. 2 (c) is another perspective view showing the appearance of the marker 13, and fig. 2 (d) is another side view showing the appearance of the marker 13 corresponding to fig. 2 (c).
As shown in fig. 2 (a) and (b), the marker 13 is configured by overlapping a plurality of fan-shaped plate members (thick paper) 131a to 131c having both side portions r1 and r2 (may also be referred to as "radii"), and the plate members 131a to 131c are connected to each other so as to be rotatable in the left-right direction by a shaft member 132 provided near the center point of the fan shape. That is, the shape of the marker 13 is also a fan shape (or a circle), and as shown in fig. 2 (c) and (d), the center angle θ of the fan-shaped marker 13 can be adjusted to an arbitrary angle by rotating a part or all of the plurality of fan-shaped plate members 131a to 131c in the left-right direction.
For example, as shown in fig. 3 (a), in the case where the marker 13 is provided (placed) at a place where the angle of the wall surface is acute (for example, X degrees, where X is 0 ≦ X < 90), the marker 13 is deformed so that the central angle θ becomes (360-X) degrees by rotating the plurality of fan-shaped plate members 131 in the left-right direction.
In addition, as shown in fig. 3 (b), when the marker 13 is provided at a position where the angle of the wall surface is a right angle, the marker 13 is deformed so that the center angle θ becomes 270 degrees by rotating the plurality of fan-shaped plate members 131 in the left-right direction.
Further, as shown in fig. 3c, in the case where the marker 13 is provided at a place where the angle of the wall surface is an obtuse angle (for example, Y degrees, where Y is 90 < Y < 180), the marker 13 is deformed so that the central angle θ becomes (360-Y) degrees by rotating the plurality of fan-shaped plate members 131 in the left-right direction.
The plurality of plate members 131a to 131c constituting the marker 13 are painted with a specific color. The specific color is, for example, a fluorescent color, and is preferably a color having a low frequency used as a color of a floor surface or a wall surface.
In fig. 2, the plate members 131a to 131c are connected to each other so as to be rotatable in the left-right direction (integrated) by a shaft member 132 provided near the center point of the fan shape. However, the plate members 131a to 131c may be connected to each other so as to be rotatable in the left-right direction by providing a concave depression near the center point of the upper surface of each of the plate members 131a to 131c and a convex projection near the center point of the lower surface (at a position facing the concave depression). Alternatively, the plate members 131a to 131c may be connected to each other so as to be rotatable in the left-right direction by providing a convex protrusion near the center point of the upper surface of each of the plate members 131a to 131c and providing a concave depression near the center point of the lower surface (at a position facing the convex protrusion).
The shape of the marker 13 is not limited to the shape shown in fig. 2, and may be, for example, the following shape.
As shown in fig. 4 (a) and (b), the marker 13 is configured by overlapping a plurality of fan-shaped plate members 131 so as to be rotatable in the left-right direction, and each plate member 131 may have a stopper member 133 protruding in the upper direction or the lower direction from both side portions r1 and r2 of the fan shape, unlike the shape shown in fig. 2. As shown in fig. 4 (c), when the plate member 131 is rotated in the left-right direction, the stopper member 133 provided on the predetermined plate member 131 abuts against the stopper member 133 provided on the opposite plate member 131.
The marker 13 is configured such that a plurality of fan-shaped plate members 131 are stacked rotatably in the left-right direction as shown in fig. 5 (a) and (b), and each plate member 131 may have a convex protrusion 134 on the upper surface or the lower surface of the fan shape, which is different from the shape shown in fig. 2 and 4. As shown in fig. 5 (c), when the plate member 131 is rotated in the left-right direction, the projection 134 provided on the predetermined plate member 131 abuts against the projection 134 provided on the opposite plate member 131.
The marker 13 is configured such that a plurality of fan-shaped plate members 131 are stacked rotatably in the left-right direction as shown in fig. 6 (a) and (b), and each plate member 131 may have a convex protrusion 135 provided on the upper surface or the lower surface of the fan shape and a slide groove 136 provided along the arc of the fan shape and into which the protrusion 135 engages, unlike the shapes of fig. 2, 4, and 5. As shown in fig. 6 (c), when the plate member 131 is rotated in the left-right direction, the protrusion 135 provided on the predetermined plate member 131 moves in the slide groove 136 provided on the opposing plate member 131, and abuts against the opposing plate member 131 at the end of the slide groove 136.
As shown in fig. 7, the marker 13 may be formed integrally such that a plurality of fan-shaped plate members 131 are formed in a corrugated shape.
If the marker 13 is formed in the shape shown in fig. 4 to 7, the following advantages can be obtained. Next, the above-described advantages will be described by comparing the shape shown in fig. 2 with the shape shown in fig. 4 with reference to fig. 8.
In the case of the marker 13 having the shape shown in fig. 2, when the plate members 131 are rotated in the left-right direction, a gap (clearance) shown in fig. 8 (a) may be generated. Accordingly, the accuracy of the processing for detecting the edge of the marker 13, which will be described later, may be reduced. On the other hand, in the case of the marker 13 having the shape shown in fig. 4, although it is inferior in compactness (portability) compared to the shape shown in fig. 2, when each plate member 131 is rotated in the left-right direction, as shown in fig. 8 (b), the stopper member 133 abuts against the corresponding stopper member 133, so that the possibility of the occurrence of the above-described gap can be eliminated. Accordingly, the processing for detecting the edge of the marker 13, which will be described later, can be performed with higher accuracy than in the case where the gap is generated.
Note that, although the marker 13 having the shape shown in fig. 4 is described as a representative example, similar advantages can be obtained with respect to the marker 13 having the shape shown in fig. 5 to 7.
In the present embodiment, the marker 13 is configured by the plurality of fan-shaped plate members 131, but the present invention is not limited thereto, and the marker 13 may be configured by a plurality of triangular plate members.
The description returns to fig. 1 again. The image processing apparatus 12 includes a storage unit 121, a detection unit 122, a marker detection unit 123, a marker edge detection unit 124, an area setting unit 125, an area setting information storage unit 126, and the like.
The storage unit 121 sequentially stores images captured by the camera 11, and has a buffer for temporarily holding data necessary for processing by the detection unit 122. The detection unit 122 detects a person, an object, or the like in the detection area based on a change in luminance value in the detection area in the captured image.
The marker detecting unit 123 detects (extracts) the marker 13 provided for setting (calibrating) the detection region on site from the image as an initial setting. In the present embodiment, it is assumed that the marker 13 is coated with a specific color such as a fluorescent color, and the marker detecting unit 123 detects the marker 13 by extracting an area having a hue/chroma corresponding to the specific color from the image.
In the present embodiment, since the shape of the marker 13 is assumed to be a fan shape, the marker detecting unit 123 may detect the marker 13 by extracting a shape close to a circle from the image using a known separation filter or the like.
Further, when the marker 13 is painted with a predetermined pattern, the marker detecting unit 123 may detect the marker 13 by extracting the pattern using a known pattern matching method or the like.
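As a rough sketch of the color-based detection described above, assuming OpenCV and a fluorescent paint whose hue falls in an assumed HSV window (the bounds, minimum area, and the helper name detect_marker_regions are placeholders):

```python
# Minimal sketch of color-based marker extraction, assuming OpenCV.
# The HSV bounds and the minimum blob area are illustrative placeholders.
import cv2
import numpy as np

def detect_marker_regions(image_bgr,
                          hsv_lo=(140, 80, 80), hsv_hi=(170, 255, 255),
                          min_area=200):
    """Return contours of connected regions whose hue/chroma matches the marker color."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    # Close small gaps so each marker forms one connected blob.
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]  # drop noise blobs
```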
The marker edge detection unit 124 detects edges corresponding to the both side portions r1, r2 of the marker 13 detected by the marker detection unit 123. More specifically, first, the marker edge detection unit 124 extracts the contour edge component of the marker 13 detected by the marker detection unit 123 by Sobel edge detection or the like. Thereafter, the marker edge detection unit 124 applies hough transform, for example, to the extracted contour edge components, extracts only straight line components from the extracted contour edge components, and detects edges corresponding to both side portions r1 and r2 of the marker 13.
When 2 edges corresponding to both side portions r1 and r2 of the marker 13 are detected, the marker edge detection unit 124 detects the intersection of the detected 2 edges as the corner of the wall surface on which the marker 13 is provided (in other words, the center point of the fan-shaped marker 13).
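A minimal sketch of this edge processing, assuming OpenCV: Sobel gradients give the contour edge components, the standard Hough transform extracts straight-line components in (rho, theta) form, and solving the two line equations gives the intersection taken as the center point. All thresholds and the helper name marker_edges_and_corner are assumptions.

```python
# Minimal sketch of contour-edge extraction, straight-line detection, and
# corner computation, assuming OpenCV; all thresholds are illustrative.
import cv2
import numpy as np

def marker_edges_and_corner(gray_roi):
    """Return two (rho, theta) edges and their intersection, or None."""
    gx = cv2.Sobel(gray_roi, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray_roi, cv2.CV_32F, 0, 1)
    edges = (cv2.magnitude(gx, gy) > 100).astype(np.uint8) * 255
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=60)
    if lines is None or len(lines) < 2:
        return None
    # A real implementation would merge near-duplicate lines before pairing.
    (r1, t1), (r2, t2) = lines[0][0], lines[1][0]
    a = np.array([[np.cos(t1), np.sin(t1)],
                  [np.cos(t2), np.sin(t2)]])
    if abs(np.linalg.det(a)) < 1e-6:          # edges (nearly) parallel: no corner
        return None
    corner = np.linalg.solve(a, np.array([r1, r2]))  # center point / wall corner
    return (r1, t1), (r2, t2), corner
```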
The region setting unit 125 sets a region including the edges corresponding to the both side portions r1, r2 of the marker 13 detected by the marker edge detection unit 124 as a detection region. More specifically, the region setting unit 125 sets, as the detection region, a region defined by an extension line obtained by extending the edges corresponding to the both side portions r1 and r2 of the marker 13 detected by the marker edge detection unit 124.
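The extension step can be pictured with the following sketch, simplified to a single marker; the extension length `far` and the helper names are placeholders. Each detected edge is extended from the corner away from the wall, and the area bounded by the two extensions is rasterized into a detection-region mask.

```python
# Minimal sketch of the region-setting step for one marker, assuming OpenCV.
# `far` is an arbitrary extension length; helper names are placeholders.
import cv2
import numpy as np

def extend_from_corner(corner, point_on_edge, far=2000.0):
    """Extend an edge from the corner through a point on it, away from the wall."""
    corner = np.asarray(corner, float)
    d = np.asarray(point_on_edge, float) - corner
    n = np.linalg.norm(d)
    if n < 1e-9:                               # degenerate: point coincides with corner
        return corner
    return corner + d / n * far                # far end of the extension line

def region_mask(image_shape, corner, edge_points):
    """Rasterize the area bounded by the two extension lines into a mask."""
    ends = [extend_from_corner(corner, p) for p in edge_points]  # sides r1 and r2
    poly = np.array([corner, ends[0], ends[1]], np.int32)
    mask = np.zeros(image_shape[:2], np.uint8)
    cv2.fillPoly(mask, [poly], 255)
    return mask
```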
The region setting information storage unit 126 stores region setting information indicating the detection region set by the region setting unit 125.
Next, an example of the procedure of the detection region setting process executed by the image processing apparatus 12 configured as described above will be described with reference to the flowchart of fig. 9.
First, the image processing apparatus 12 acquires an image (hereinafter, referred to as an "initial setting image") captured in a state where the marker 13 is set from the camera 11 (step S1). Here, a case is assumed where the initial setting image shown in fig. 10 is acquired from the camera 11.
Next, the marker detecting unit 123 in the image processing apparatus 12 detects the marker 13 in the initial setting image (step S2). In this case, 3 markers 13a to 13c are detected from the initial setting image shown in fig. 10.
Further, as described above, the marker 13 may be detected by extracting a region having a hue/chroma corresponding to the color of the marker 13 from the initial setting image, or may be detected by extracting a shape close to a circle from the initial setting image.
Next, the marker edge detection unit 124 in the image processing apparatus 12 detects the contour edge components of the markers 13a to 13c detected by the marker detection unit 123 (step S3). In this case, as shown in fig. 11, edges ea1 to ea3 are detected from the marker 13a, edges eb1 to eb3 are detected from the marker 13b, and edges ec1 to ec3 are detected from the marker 13c.
Thereafter, the marker edge detection unit 124 extracts only the straight line components of the detected contour edge components, and detects the edges corresponding to both side portions r1 and r2 of each marker (step S4). In this case, as shown in fig. 12, the edges ea1 and ea2 corresponding to both side portions are detected from the marker 13a, the edges eb1 and eb2 corresponding to both side portions are detected from the marker 13b, and the edges ec1 and ec2 corresponding to both side portions are detected from the marker 13c.
Next, the marker edge detection unit 124 detects the intersection of the extracted edges corresponding to both side portions r1 and r2 as the corner of the wall surface (step S5). In this case, as shown in fig. 12, the intersection of the edges ea1 and ea2 corresponding to the marker 13a is detected as the wall corner C1, the intersection of the edges eb1 and eb2 corresponding to the marker 13b is detected as the corner C2 of another wall surface, and the intersection of the edges ec1 and ec2 corresponding to the marker 13c is detected as the corner C3 of yet another wall surface.
Next, the region setting unit 125 sets a detection region based on the edges corresponding to the both side portions r1, r2 of the marker 13 detected by the marker edge detection unit 124 and the angle of the wall surface detected by the marker edge detection unit 124 (step S6).
In this case, as shown in fig. 13, the area bounded by the extension lines ea1' and ea2' obtained by extending the edges ea1 and ea2 corresponding to both side portions of the marker 13a in the direction opposite to the wall corner C1, the extension lines eb1' and eb2' obtained by extending the edges eb1 and eb2 corresponding to both side portions of the marker 13b in the direction opposite to the wall corner C2, and the extension lines ec1' and ec2' obtained by extending the edges ec1 and ec2 corresponding to both side portions of the marker 13c in the direction opposite to the wall corner C3 is set as the detection area E1.
Thereafter, the area setting unit 125 stores the area setting information indicating the set detection area in the area setting information storage unit 126 (step S7), and ends the process here.
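Chaining the sketches above gives a rough end-to-end analog of steps S1 to S7; this is illustrative only, reusing the imports and placeholder helpers from the earlier sketches, with multi-marker merging and error handling simplified. The foot of the perpendicular from the ROI origin is used as a convenient point on each Hough line.

```python
# Rough end-to-end analog of fig. 9 (steps S1 to S7), chaining the sketches above.
def set_detection_region(image_bgr):
    region = np.zeros(image_bgr.shape[:2], np.uint8)
    for contour in detect_marker_regions(image_bgr):              # step S2
        x, y, w, h = cv2.boundingRect(contour)
        roi = cv2.cvtColor(image_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        found = marker_edges_and_corner(roi)                      # steps S3 to S5
        if found is None:
            continue
        (r1, t1), (r2, t2), corner = found
        offset = np.array([x, y], float)
        # The foot of the perpendicular from the ROI origin lies on each line.
        p1 = np.array([r1 * np.cos(t1), r1 * np.sin(t1)]) + offset
        p2 = np.array([r2 * np.cos(t2), r2 * np.sin(t2)]) + offset
        region |= region_mask(image_bgr.shape, corner + offset, [p1, p2])  # step S6
    return region                  # stored as region setting information (step S7)
```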
According to embodiment 1 described above, since the marker detection unit 123, the marker edge detection unit 124, and the region setting unit 125 are provided, the detection region setting process using the marker 13 can be realized. Since the marker 13 has a structure in which the center angle θ can be adjusted to an arbitrary angle, it can specify in which direction the detection region is to be set. That is, the number of markers to be installed can be reduced compared to a general method in which individual point markers are identified and the detection region is set by connecting the identified points.
In the present embodiment, the explanation was made assuming that the marker 13 is coated with 1 color (single color) such as a fluorescent color, for example, but in this case, the following inconvenience may occur.
Figs. 14 and 15 are diagrams showing the detection results of the contour edge components when a left-hand marker 13a painted with the 1st color and a right-hand marker 13b painted with a 2nd color different from the 1st color are provided on floor surfaces F1 and F2 of various colors along wall surfaces W of various colors.
For convenience of explanation, all the drawings shown in fig. 14 and 15 will not be described below, and only some of the drawings will be described as representative examples.
As shown in the left drawing of fig. 14 (a), when the wall surface W and the floor surfaces F1 and F2 are all different colors from the 1 st and 2 nd colors, the marker edge detection unit 124 can detect the contour edge components of the markers 13a and 13b coated with the 1 st and 2 nd colors, respectively, with high accuracy (high intensity).
On the other hand, as shown in the center of fig. 14 (a), when the wall surface W is the 1 st color and the floor surfaces F1 and F2 are both different colors from the 1 st and 2 nd colors, the marker edge detection unit 124 can accurately detect the contour edge component of the marker 13b coated with the 2 nd color, but since the wall surface W is the same color as the marker 13a, it is not possible to accurately detect the contour edge component of the marker 13a coated with the 1 st color. More specifically, there may be inconvenience that edges corresponding to both side portions r1 and r2 of the marker 13a cannot be detected with high accuracy.
However, in the case of the central view in fig. 14 (a), since the marker edge detection unit 124 can detect the edge corresponding to the arc of the marker 13a with high accuracy, the edges corresponding to the both side portions r1 and r2 can be predicted and detected to some extent from the edge corresponding to the arc.
However, as shown in the center view of fig. 15 (a), when all of the wall surface W and the floor surfaces F1 and F2 are the 1 st color, the marker edge detection unit 124 cannot detect (or cannot detect with high accuracy) the contour edge component of the marker 13a coated with the 1 st color at all, and cannot predict and detect an edge as in the case of the center view of fig. 14 (a).
Accordingly, the area setting unit 125 may be unable to set the detection area, or may set the detection area in an incorrect place.
In order to solve such inconvenience, it is preferable to coat the marker 13 with a plurality of colors instead of 1 color.
Figs. 16 and 17 are diagrams showing the detection results of the contour edge components when a left-hand marker 13a whose inner circle is painted with the 1st color and whose outer circle is painted with the 2nd color, and a right-hand marker 13b whose inner circle is painted with the 2nd color and whose outer circle is painted with the 1st color, are provided on floor surfaces F1 and F2 of various colors along wall surfaces W of various colors.
As shown in each of fig. 16 and 17, when the markers 13a and 13b are painted with a plurality of colors, the contour edge components of the markers 13a and 13b are detected with high accuracy in both cases.
For example, even in the center view of fig. 17 (a) in which the colors of the wall surface W and the floor surfaces F1, F2 are the same conditions as those in the center view of fig. 15 (a), the marker edge detection unit 124 can detect the contour edge component of the marker 13a with high accuracy by using the inner portion coated with the 2 nd color.
By painting the markers 13 in a plurality of colors in this manner, the detection of the contour edge components by the marker edge detection unit 124 can be realized with higher accuracy, and the inconvenience described above can be eliminated.
Note that, although the case where the markers 13 are painted in a plurality of colors is described here, the markers 13 may instead be painted in the same color but with different luminance values (brightness).
Further, various functions realized by the image processing system of the present embodiment can be applied to, for example, an elevator system shown in fig. 18. Next, a case where various functions realized by the image processing system are applied to the elevator system will be described.
Fig. 18 shows an elevator (car) 20, a car door 21, a hall door 22, and an elevator control device 23 for controlling opening and closing of the doors of the car door 21.
In recent years, various techniques have been studied to prevent people and objects from being caught by the car door 21 of the elevator 20, and among them, there is an elevator system shown in fig. 18. This elevator system uses the camera 11 to photograph the vicinity of the car door 21, and controls the opening and closing of the car door 21 in accordance with the presence or absence of (movement of) a user in a detection area set in the vicinity of the car door 21. For example, when a user is present in the detection area, the door closing operation of the car doors 21 is prohibited and the door opened state is maintained.
In such an elevator system, a detection area must be set for each floor at which the car 20 stops. When the door pocket arranged around the hall door 22 has a complicated shape, however, a serviceman displays an initial setting image on a terminal such as a tablet, manually designates the area to be used as the detection area, and sets the detection area. This requires carrying around a terminal such as a tablet and manually designating the detection area, which takes time and labor.
However, by utilizing the functions realized by the image processing system, particularly, by providing the marker detection unit 123, the marker edge detection unit 124, and the region setting unit 125 in the image processing apparatus 12, the detection region can be set by the marker 13, and the inconvenience can be resolved.
In addition, the image processing device 12 shown in fig. 18 may be equipped with the following function unique to the elevator system: the user nearest to the car door 21 is detected based on the change in the luminance value of the image, and whether or not the detected user intends to board the car 20 is determined based on whether or not the detected user approaches the car door 21. Further, the following function may also be incorporated: when it is determined that the user intends to board the car 20, the door closing operation of the car doors 21 is prohibited and the door opened state is maintained.
< embodiment 2 >
Next, embodiment 2 will be explained. Embodiment 2 differs from embodiment 1 in that the image processing apparatus 12 further includes a marker specifying direction determining unit 127, as shown in fig. 19. Note that portions having the same functions as those of embodiment 1 are denoted by the same reference numerals, and detailed description thereof is omitted.
In the present embodiment, it is assumed that the markers 13 shown in fig. 14 and 15, which are painted with a single color, are not used, and the markers 13 shown in fig. 16 and 17, which are painted with the 1 st and 2 nd colors, respectively, are used.
The marker specifying direction determining section 127 specifies the position where the detection region is set based on the color of the inner circle (or the color of the outer circle) of the marker 13. More specifically, the marker specifying direction determining unit 127 determines whether or not the position where the detection region is set is the side where the edge corresponding to the arc of the marker 13 is detected with the edges corresponding to the both side portions r1 and r2 of the marker 13 as the boundary, based on whether or not the color of the inner circle of the marker 13 is the 1 st color.
In the present embodiment, when the color of the inner circle of the marker 13 is the 1 st color, the marker specifying direction determining unit 127 sets the detection region to the side where the edge corresponding to the arc of the marker 13 is detected as shown in fig. 20 (a), and when the color of the inner circle of the marker 13 is not the 1 st color (is the 2 nd color), sets the detection region to the side where the edge corresponding to the arc of the marker 13 is not detected as shown in fig. 20 (b). However, this is merely an example, and when the color of the inner circle of the marker 13 is the 1 st color, the position where the detection region is set may be the side where the edge corresponding to the arc of the marker 13 is not detected, and when the color of the inner circle of the marker 13 is not the 1 st color, the position where the detection region is set may be the side where the edge corresponding to the arc of the marker 13 is detected.
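A small sketch of this determination, assuming the inner-circle color can be sampled around the detected center point; the hue window for the 1st color, the sampling radius, and the return labels are assumptions:

```python
# Minimal sketch of the direction determination from the inner-circle color.
# The hue window for the 1st color and the sampling radius are placeholders.
import cv2
import numpy as np

def region_side(image_bgr, corner, first_color_hue=(140, 170), radius=5):
    """Return 'arc side' if the inner circle is the 1st color, else 'open side'."""
    x, y = int(corner[0]), int(corner[1])
    patch = image_bgr[max(y - radius, 0):y + radius, max(x - radius, 0):x + radius]
    hue = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)[..., 0].mean()
    return "arc side" if first_color_hue[0] <= hue <= first_color_hue[1] else "open side"
```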
Here, an example of the procedure of the detection region setting process executed by the image processing apparatus 12 according to embodiment 2 will be described with reference to the flowchart of fig. 21. Note that the same processing as in the flowchart shown in fig. 9 is denoted by the same reference numerals, and detailed description thereof is omitted.
After the above-described processing of steps S1 to S5 is performed, the marker specifying direction determining section 127 determines whether the color of the inner circle of the marker 13 is the 1 st color (step S11).
As a result of the processing at step S11, when the inner circle of the marker 13 is the 1 st color (yes at step S11), the marker specifying direction determining unit 127 determines that the position where the detection region is set is the side where the edge corresponding to the arc of the marker 13 is detected, notifies the region setting unit 125 of the determination result (step S12), and proceeds to the processing at step S14, which will be described later.
On the other hand, as a result of the processing in step S11, when the inner circle of the marker 13 is not the 1 st color (no in step S11), the marker specifying direction determining unit 127 determines that the position where the detection region is set is on the side where the edge corresponding to the arc of the marker 13 is not detected, and notifies the region setting unit 125 of the determination result (step S13).
Next, the region setting unit 125 sets a detection region based on the edges corresponding to the both side portions r1, r2 of the marker 13 detected by the marker edge detection unit 124, the angle of the wall surface detected by the marker edge detection unit 124, and the place (side) of the set detection region determined by the marker specifying direction determination unit 127 (step S14).
Thereafter, the process of step S7 is executed, and the process ends here.
In the present embodiment, the marker specifying direction determining unit 127 defines the position where the detection region is set, but conversely, may define the position where the detection region is not set, that is, the position where the detection region is excluded.
According to embodiment 2 described above, by further providing the marker specifying direction determining unit 127, the hollow region can be set as the detection region as shown in fig. 20 (b). In addition, by using the marker 13 in which the inner circle is painted with the 1 st color and the marker 13 in which the inner circle is painted with the 2 nd color in combination, the detection region can be set in the floor surface and the hollow region at one time. That is, the detection regions can be set at a variety of places at a time.
According to at least one of the embodiments described above, it is possible to provide an image processing system and a marker capable of setting even a detection region having a complicated shape with high accuracy.
Although the embodiments of the present invention have been described, these embodiments are provided as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in other various forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and the equivalent scope thereof.

Claims (12)

1. An image processing system is characterized by comprising:
an imaging unit that images a subject including 1 or more markers for setting a detection region in which a motion of a person or an object can be detected;
a detection unit that detects the 1 or more markers that can be rotated in the left-right direction with a rotation fulcrum as a reference and that have 2 sides extending from the rotation fulcrum, from the captured image captured by the imaging unit;
an area setting unit that sets, as the detection region, an area including the 2 sides of each of the 1 or more markers detected by the detection unit; and
a detection unit that detects a motion of the person or the object in the detection region set by the area setting unit.
2. The image processing system according to claim 1,
the marker has a fan shape having the 2 sides as both sides and the pivot point as a center point.
3. The image processing system according to claim 2,
the detection unit detects a contour edge component of the detected marker, detects 2 straight edge components corresponding to the two side portions from the detected contour edge component, and detects an intersection of the detected 2 straight edge components as the center point.
4. The image processing system according to claim 3,
the region setting unit extends the detected 2 linear edge components in a direction opposite to a position where the center point is detected, and sets a region defined by an extension line obtained by the extension as the detection region.
5. The image processing system according to claim 3,
regarding the marker, an inner circle portion having the center point as a center is painted with a 1 st color, and an outer circle portion is painted with a 2 nd color different from the 1 st color.
6. The image processing system according to claim 5,
the detection device further includes a marker specifying direction determining unit that specifies a region in which the detection region is set, based on a color of an inner circle portion of the marker.
7. The image processing system according to claim 6,
wherein, in a case where the color of the inner circle portion of the marker is the 1st color, the marker specifying direction determining section determines that the region in which the detection region is set lies, with the detected 2 straight-line edge components as the boundary, on the side where an edge component corresponding to an arc exists among the detected contour edge components, and
in a case where the color of the inner circle portion of the marker is the 2nd color, the marker specifying direction determining section determines that the region in which the detection region is set lies, with the detected 2 straight-line edge components as the boundary, on the side where no edge component corresponding to an arc exists among the detected contour edge components.
8. A marker used when setting a detection region for detecting a motion of a person or an object, the marker comprising:
a plurality of fan-shaped plate members; and
a shaft member provided in the vicinity of a center point of each of the plate members,
the plate members are rotatably connected to each other in the left-right direction by the shaft member,
by rotating a part or all of the plate members in the left-right direction, the central angle of the fan shape formed by the plate members can be adjusted to an arbitrary angle.
9. The marker according to claim 8,
further comprising stopper members projecting upward or downward from both side portions of each of the plate members,
wherein each of the stopper members abuts against a stopper member provided on the opposing plate member when each of the plate members rotates in the left-right direction.
10. The marker according to claim 8,
wherein each of the plate members further includes a convex protrusion on an upper surface or a lower surface thereof, and
each of the protrusions abuts against a protrusion provided on an opposing plate member when each of the plate members is rotated in the left-right direction.
11. The marker according to claim 8, further comprising:
a protrusion provided on an upper surface or a lower surface of each of the plate members; and
a slide groove provided along the arc of the fan shape of each of the plate members, with which the protrusion engages,
wherein, when the plate members are rotated in the left-right direction, each protrusion moves in the slide groove provided in the opposing plate member and abuts against the opposing plate member at the end of the slide groove.
12. The marker according to claim 8,
each of the plate members is integrally formed in a corrugated shape.
CN201810071830.1A 2017-03-24 2018-01-25 Image processing system and marker Active CN108629808B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017058801A JP6339259B1 (en) 2017-03-24 2017-03-24 Image processing system and marker
JP2017-058801 2017-03-24

Publications (2)

Publication Number Publication Date
CN108629808A CN108629808A (en) 2018-10-09
CN108629808B true CN108629808B (en) 2021-08-31

Family

Family ID: 62487506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810071830.1A Active CN108629808B (en) 2017-03-24 2018-01-25 Image processing system and marker

Country Status (3)

Country Link
JP (1) JP6339259B1 (en)
CN (1) CN108629808B (en)
SG (1) SG10201800803XA (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104680482A (en) * 2015-03-09 2015-06-03 华为技术有限公司 Method and device for image processing
CN204379311U (en) * 2014-10-21 2015-06-10 无锡海斯凯尔医学技术有限公司 A kind of device and elastomeric check system selecting surveyed area
CN105825498A (en) * 2015-01-27 2016-08-03 株式会社拓普康 Survey data processing device, survey data processing method, and program therefor

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5119533A (en) * 1974-08-08 1976-02-16 Chinon Ind Inc Shinekameraniokeru shatsutakaikakudochosetsusochi
JP3587775B2 (en) * 1999-09-09 2004-11-10 松下電器産業株式会社 Display data analyzer and recording medium
JP4140159B2 (en) * 2000-01-19 2008-08-27 株式会社明電舎 Surveillance camera monitoring area setting apparatus and method
JP2004220510A (en) * 2003-01-17 2004-08-05 Minolta Co Ltd Three-dimensional shape measuring device, three-dimensional shape measuring method and target mark
JP2007033163A (en) * 2005-07-25 2007-02-08 Tokyo Electric Power Co Inc:The Scale
JP2007301330A (en) * 2006-05-11 2007-11-22 Hokkaido Jiki Insatsu Kk Manufacturing method of folding fan
BRPI0920015A2 (en) * 2008-10-28 2015-12-15 Bae Systems Plc methods for selecting a homography model from point matches of an associated image pair, calculating a homography transformation, for differentiating image, and for selecting a homography model, and storage medium that stores processor-implementable instructions
JP5573618B2 (en) * 2010-11-12 2014-08-20 富士通株式会社 Image processing program and image processing apparatus
JP6466297B2 (en) * 2015-09-14 2019-02-06 株式会社東芝 Object detection apparatus, method, depalletizing automation apparatus, and packing box

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204379311U (en) * 2014-10-21 2015-06-10 无锡海斯凯尔医学技术有限公司 A kind of device and elastomeric check system selecting surveyed area
CN105825498A (en) * 2015-01-27 2016-08-03 株式会社拓普康 Survey data processing device, survey data processing method, and program therefor
CN104680482A (en) * 2015-03-09 2015-06-03 华为技术有限公司 Method and device for image processing

Also Published As

Publication number Publication date
SG10201800803XA (en) 2018-10-30
JP6339259B1 (en) 2018-06-06
JP2018163402A (en) 2018-10-18
CN108629808A (en) 2018-10-09

Similar Documents

Publication Publication Date Title
US20020168091A1 (en) Motion detection via image alignment
Porikli et al. Shadow flow: A recursive method to learn moving cast shadows
CN105701756B (en) Image processing apparatus and image processing method
JP3880759B2 (en) Moving object detection method
JP5369175B2 (en) Elevator door detection apparatus and detection method using video
US8228382B2 (en) System and method for counting people
US20020167537A1 (en) Motion-based tracking with pan-tilt-zoom camera
US20060290780A1 (en) Method for modeling cast shadows in videos
US20180206658A1 (en) Mirror display apparatus and the operation method thereof
JPWO2007138858A1 (en) Special effect detection device for video, special effect detection method, special effect detection program, and video reproduction device
US20020176001A1 (en) Object tracking based on color distribution
US10706553B2 (en) Image detection device for detecting shadow or light based on detected edges
US7460705B2 (en) Head-top detecting method, head-top detecting system and a head-top detecting program for a human face
US10762372B2 (en) Image processing apparatus and control method therefor
JP2009048347A (en) Image processing apparatus, method and program
CN108629808B (en) Image processing system and marker
WO2007060987A1 (en) Object monitoring method, object monitoring device, and object monitoring program
JP2009032116A (en) Face authentication apparatus, face authentication method, and access management apparatus
CN111717768A (en) Image processing apparatus
KR20040094984A (en) Context sensitive camera and control system for image recognition
US20220415055A1 (en) Image identification method and image surveillance apparatus
JP2024088185A (en) Elevator System
Sangi et al. Global motion estimation using block matching with uncertainty analysis
JP2000113169A (en) Device and method for detecting vehicle
JP2019102105A (en) Image detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 1259398; Country of ref document: HK)

GR01 Patent grant