WO2004109228A1 - Three-Dimensional Shape Measuring Device - Google Patents
Three-Dimensional Shape Measuring Device
- Publication number
- WO2004109228A1 (application PCT/JP2004/007717)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- movement
- pattern
- image
- dimensional
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/113—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
- A61B5/1135—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing by monitoring thoracic expansion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
- A61B5/0064—Body surface scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1075—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1127—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0233—Special features of optical sensors or probes classified in A61B5/00
Definitions
- the present invention relates to a three-dimensional shape measuring device, and more particularly to a three-dimensional shape measuring device capable of easily and accurately grasping a state of an object.
- Motion detection devices have conventionally been proposed for detecting the motion of an object, for example a person, in a space such as a bathroom or a toilet.
- a typical example is a monitoring device that monitors a sleeping person's breathing by projecting a pattern on a sleeping person on a bed and calculating a moving amount of the pattern from an image obtained by continuously capturing the projected pattern. (For example, see Patent Document 1).
- Patent Document 1 Japanese Patent Application Laid-Open No. 2002-175582 (Pages 5-9, Figure 113)
- With such a conventional device, however, it was difficult to grasp the state of each part of the object simultaneously, for example, the shape of the object and its movement (including a small movement such as breathing). Also, depending on the part of the object (for example, the chest and abdomen when the object is a person), a slight error could occur in the movement measurement result.
- an object of the present invention is to provide a three-dimensional shape measuring device capable of easily and accurately grasping the state of an object.
- To achieve the above object, a three-dimensional shape measuring apparatus 1 according to the present invention includes, as shown in FIGS. 1 and 3, for example, a first three-dimensional sensor 10a having a projection device 11 that projects pattern light onto a target area and an imaging device 12a that is arranged at a first interval d1 from the projection device 11 and images the target area onto which the pattern light is projected;
- a second three-dimensional sensor 10b having a projection device for projecting the pattern light onto the target area and an imaging device 12b that is arranged at a second interval d2 and images the target area onto which the pattern light is projected;
- three-dimensional information calculating means 22 for obtaining outer shape information of the object 2 existing in the target area based on the movement of the pattern on the image obtained by the first three-dimensional sensor 10a;
- fluctuation information calculating means for obtaining fluctuation information of the object based on the movement of the pattern on the image obtained by the second three-dimensional sensor 10b; and information synthesizing means for synthesizing the outer shape information and the fluctuation information.
- With this configuration, the apparatus includes a first three-dimensional sensor 10a having the projection device 11, which projects the pattern light onto the target area, and the imaging device 12a, which is arranged at the first interval d1 from the projection device 11 and captures an image of the target area onto which the pattern light is projected;
- and a second three-dimensional sensor 10b having a projection device 11 for projecting the pattern light onto the target area and an imaging device 12b, which is arranged at a second interval d2, longer than the first interval d1, from the projection device 11 and captures an image of the target area onto which the pattern light is projected.
- each of the three-dimensional sensors can obtain the movement of the pattern on the image.
- Based on the movement of the pattern on the image obtained by the first three-dimensional sensor 10a, the three-dimensional information calculating means 22 can obtain the outer shape information of the object 2, and based on the movement of the pattern on the image obtained by the second three-dimensional sensor 10b, the fluctuation information calculating means 23 can obtain the fluctuation information of the object 2. Furthermore, by synthesizing the outer shape information and the fluctuation information with the information synthesizing means 24, a three-dimensional shape measuring apparatus capable of easily and accurately grasping the state of the object can be provided.
- Here, the first three-dimensional sensor 10a and the second three-dimensional sensor 10b share the projection device 11, and a first imaging device 12a and a second imaging device 12b distinct from it are provided;
- alternatively, the first imaging device 12a and the second imaging device 12b may be shared, and a first projection device and a second projection device distinct from it may be provided.
- The information synthesizing unit 24 may be configured to correct the fluctuation information based on the outer shape information; with this configuration, more accurate fluctuation information can be obtained through the correction.
- the information synthesizing means 24 may be characterized in that the synthesizing is performed so that the movement of each part of the object 2 can be understood.
- a three-dimensional shape measuring apparatus includes an information output unit 40 that displays a synthesis result of the information synthesis unit 24.
- the information output unit 40 displays the synthesis result of the information synthesis unit 24, so that, for example, the movement of each part on the object 2 can be easily grasped by display.
- the pattern light projected by the projection apparatus 11 may be one in which bright spots are arranged.
- The three-dimensional information calculation means 22 may be characterized in that it performs interpolation on parts where the outer shape information is insufficient.
- In other words, the apparatus comprises a first three-dimensional sensor having a projection device that projects pattern light onto a target area and an imaging device that is arranged at a first interval from the projection device and captures an image of the target area onto which the pattern light is projected;
- a second three-dimensional sensor having a projection device for projecting pattern light onto the target area and an imaging device that is arranged at a second interval, longer than the first interval, from the projection device and captures an image of the target area onto which the pattern light is projected;
- and three-dimensional information calculating means for obtaining outer shape information of a target existing in the target area based on the movement of a pattern on an image obtained by the first three-dimensional sensor.
- FIG. 1 is a schematic external view of a monitoring device 1 as a three-dimensional shape measuring device according to a first embodiment of the present invention.
- the monitoring device 1 is configured to monitor a target area.
- The monitoring device 1 includes a first FG sensor 10a as a first three-dimensional sensor, which has a projection device 11 that projects pattern light onto the target area and a first imaging device 12a that is arranged at a first interval from the projection device 11 and captures an image of the target area onto which the pattern light is projected.
- The monitoring device 1 further includes a second FG sensor 10b as a second three-dimensional sensor, which has the projection device 11 for projecting the pattern light onto the target area and a second imaging device 12b that is arranged at a second interval, longer than the first interval, from the projection device 11 and captures an image of the target area onto which the pattern light is projected.
- the monitoring device 1 includes an arithmetic device 20 that controls the first FG sensor 10a and the second FG sensor 10b. That is, the monitoring device 1 is configured to include the first FG sensor 10a, the second FG sensor 10b, and the arithmetic device 20.
- the first FG sensor 10a and the second FG sensor 10b, and the first imaging device 12a and the second imaging device 12b are simply referred to as the FG sensor 10 and the imaging device 12, respectively, unless particularly distinguished.
- the first FG sensor 10a and the second FG sensor 10b share the projection device 11.
- the FG sensor 10 has a measuring device 14 for measuring the movement of the pattern on the image captured by the imaging device 12.
- the first FG sensor 10a and the second FG sensor 10b share the measurement device 14. That is, the measurement device 14 measures the movement of the pattern on the image respectively captured by the first imaging device 12a and the second imaging device 12b.
- the projection device 11 and the imaging device 12 are electrically connected to the measurement device 14 and controlled by the measurement device 14.
- The measuring device 14 is configured integrally with the arithmetic device 20 described later.
- Here, the target area is the upper surface of the bed 3.
- the target object typically performs a respiratory movement. That is, the object is a person or an animal. Further, in the present embodiment, the target object is person 2.
- the person 2 is lying down. That is, the pattern light is projected on the person 2.
- the pattern light is projected onto the bed 3 as it is.
- bedding may be placed on the person 2. In this case, the pattern light is projected onto the bedding.
- the pattern light projected by the projection device 11 is typically one in which bright spots are arranged.
- the projected pattern light is a plurality of bright spots.
- The projected pattern light is a pattern 11a formed by a plurality of bright spots 11b arranged in a substantially square lattice shape, as described later with reference to FIG.
- a projection device 11 projects a pattern 11a on a bed 3.
- the plurality of bright spots projected on the bed 3 correspond to the plurality of measurement points on the bed 3, respectively. That is, the position of each bright spot is the position of the measurement point.
- the measurement point is a point at which the movement and height of the person 2 in the height direction can be measured as described later.
- Here, the height means the height above the upper surface of the bed 3.
- the projection device 11 and the imaging device 12 are arranged above the bed 3.
- the projection device 11 and the first imaging device 12a are arranged approximately above the center of the bed 3, and the second imaging device 12b is arranged approximately above the head of the person 2.
- The first imaging device 12a is arranged at the first interval d1 from the projection device 11, and the second imaging device 12b at the second interval d2.
- the projection device 11, the first imaging device 12a, and the second imaging device 12b are arranged on the same straight line. That is, here, the base line direction of the first FG sensor 10a and the base line direction of the second FG sensor 10b are parallel to each other, and are further on the same straight line.
- The second interval d2 is, for example, about 2 to 20 times, preferably 5 to 15 times, the first interval d1; in the present embodiment it is 10 times. For example, if the first interval d1 is 60 mm, the second interval d2 is 600 mm.
- The angle of view of each imaging device 12 is set so that an image of the central portion of the bed 3 can be captured. Note that the distance between the projection device 11 and the imaging device 12 is referred to as the base line length; it is the base line of the triangulation, that is, the distance between the projection device 11 and the imaging device 12 in the base line direction.
- Here, the projection device 11, the first imaging device 12a, and the second imaging device 12b are arranged on the same straight line, but the arrangement is not limited to this; even if they are not on the same straight line, this can be handled, for example, by correcting the correspondence of the imaged areas in the synthesis described later.
- the FG sensor 10 measures the movement of a bright spot forming a pattern.
- The longer the base line length, the larger the movement amount of a bright spot becomes for a given change of the object.
- When the movement amount of a bright spot is large, a jump to the adjacent bright spot may occur: the spot may be compared against the wrong, neighboring bright spot and judged to have moved from that adjacent spot, so that the measured movement amount comes out small. That is, the movement amount of the bright spot cannot be measured accurately.
- When the base line length is short (the first interval d1), as in the first FG sensor 10a,
- the movement amount of a bright spot is small and the jump described above is unlikely to occur, but a minute movement is difficult to distinguish from noise.
- Conversely, when the base line length is long (the second interval d2), as in the second FG sensor 10b, even a slight movement of the object is reflected as a large movement of the bright spot, so that a minute height or a minute movement in the height direction can be measured; on the other hand, a jump may occur when there is a large movement.
- Therefore, it is desirable to measure the shape of the person 2 based on the movement of the pattern obtained by the first FG sensor 10a, and to measure the movement of the person 2 based on the movement of the pattern obtained by the second FG sensor 10b.
- the projection device 11 and the second imaging device 12b are installed at a certain distance from each other. By doing so, the base line length becomes longer, so that the change can be detected sensitively.
- The projection device 11 has its optical axis (the projection direction of the laser beam L1) set in a direction substantially parallel to the vertical with respect to the upper surface of the bed 3, as shown in the figure.
- Although the projection device 11 is installed here with its optical axis substantially parallel to the vertical direction of the upper surface of the bed 3, it may instead be installed at an angle to the vertical direction.
- the first imaging device 12a is installed so that its optical axis is approximately parallel to the vertical direction of the upper surface of the bed 3. That is, the optical axis of the first imaging device 12a is set in a direction parallel to the optical axis of the projection device 11.
- the second imaging device 12b is installed with its optical axis inclined with respect to the vertical direction of the upper surface of the bed 3. By doing so, for example, it is possible to easily install the second imaging device 12b and the projection device 11 apart from each other. That is, the second interval d2 can be easily made longer. In other words, it is easy to increase the base line length of the triangulation method.
- the optical axes of the projection device 11, the first imaging device 12a, and the second imaging device 12b may be installed so as to be parallel to each other.
- The FG sensor 10 and the arithmetic unit 20 may be configured as a single unit; by doing so, the size of the monitoring device 1 can be reduced.
- the projection device 11 includes a light beam generation unit 105 as a light beam generation unit that generates a coherent light beam, and a fiber grating 120 (hereinafter simply referred to as a grating 120).
- the coherent light beam projected by the light beam generation unit 105 is typically an infrared laser.
- the light beam generator 105 is configured to generate a parallel light beam.
- the light beam generating unit 105 is a semiconductor laser device typically including a collimator lens (not shown), and the generated parallel light beam is a laser light beam L1.
- the laser beam L1 is a beam having a substantially circular cross section.
- the parallel light flux includes a nearly parallel light flux as long as it is substantially parallel.
- grating 120 is arranged parallel to plane 102 (perpendicular to Z axis).
- the laser beam L1 is incident on the grating 120 in the Z-axis direction.
- The laser light L1 incident on each individual optical fiber 121 is condensed by the lens effect of the fiber within a plane, then spreads as a divergent wave; the divergent waves interfere with one another, and the resulting pattern is projected onto the plane 102, which serves as the projection plane.
- In this way, the pattern 11a, which is a bright spot array, is projected.
- Arranging the grating 120 in parallel with the plane 102 means, for example, that the plane containing the axes of the optical fibers 121 of each FG element 122 constituting the grating 120 is parallel to the plane 102.
- the grating 120 includes two FG elements 122.
- the planes of the respective FG elements 122 are parallel to each other.
- the plane of each FG element 122 is referred to as an element plane.
- the axes of the optical fibers 121 of the two FG elements 122 are substantially orthogonal to each other.
- the FG element 122 is formed by, for example, arranging several tens to several hundreds of optical fibers 121 having a diameter of several tens of microns and a length of about 10 mm in a sheet shape in parallel.
- the two FG elements 122 may be arranged in contact with each other, or may be arranged at a distance from each other in the normal direction of the element plane. In this case, the distance between the two FG elements 122 is set so as not to interfere with the projection of the pattern 11a.
- The laser beam L1 is typically incident perpendicularly to the element planes of the grating 120.
- In the projection device 11, the optical system consists simply of the grating 120 including the two FG elements 122, so no complicated optical system is required and the optical housing can be downsized. Further, by using the grating 120, the projection device 11 can project the plurality of bright spots 11b as the pattern 11a onto the target area with a simple configuration.
- The pattern 11a is typically a plurality of bright spots 11b arranged in a square lattice.
- the shape of the luminescent spot is a substantially circular shape including an elliptical shape.
- the imaging device 12 is typically a CCD camera.
- the imaging device 12 includes an imaging optical system 13a (see FIG. 4) and an imaging device 15 (see FIG. 4).
- the imaging device 15 is typically a CCD imaging device.
- Imaging devices with CMOS structures, other than CCDs, have recently been actively reported and can of course also be used as the imaging device 15. In particular, some of these devices themselves have a function of binarization between frames, and the use of such devices is preferable.
- The imaging device 12 may include a filter 13b (see FIG. 4) that attenuates light having wavelengths outside the neighborhood of the wavelength of the laser beam L1 generated by the light beam generation unit 105 (see FIG. 2).
- The filter 13b is typically an optical filter such as an interference filter, and is preferably arranged on the optical axis of the imaging optical system 13a. In this way, since the relative intensity of the light of the pattern 11a projected by the projection device 11, among the light received by the imaging device 15, increases, the imaging device 12 can reduce the influence of disturbance light.
- The laser beam L1 generated by the light beam generation unit 105 is typically an infrared laser beam. The laser beam L1 may be emitted continuously or intermittently; in the case of intermittent emission, imaging by the imaging device 12 is performed in synchronization with the emission timing.
- the arithmetic device 20 is configured integrally with the measuring device 14. Further, here, the measuring device 14 is configured integrally with a control unit 21 described later.
- the projection device 11 and the two imaging devices 12 are electrically connected to the measurement device 14 and controlled.
- The computing device 20 is remotely located with respect to the projection device 11 and the two imaging devices 12; specifically, it may be installed, for example, beside the bed 3 or in a room different from the room where the bed 3 is installed.
- the arithmetic unit 20 is typically a computer such as a personal computer.
- the measuring device 14 measures the movement of the pattern on the image captured by the imaging device 12.
- the measurement device 14 is configured to be able to acquire an image captured by the imaging device 12. Further, the measurement device 14 is configured to measure the movement of each bright spot on the image captured by the imaging device 12.
- the projected luminescent spot and the image of the luminescent spot on the captured image are simply referred to as luminescent spots for convenience.
- measuring the movement of the luminescent spot means measuring the amount of movement of the luminescent spot (hereinafter referred to as the movement amount).
- the measured moving amount of the bright point is a concept including the moving direction of the bright point. In other words, the measured movement amount of the bright spot includes the information of the moving direction.
- The measuring device 14 is configured to measure the movement of the bright spots based on images at two different points in time acquired from each of the two imaging devices 12. In the present embodiment, the movement of the bright spots is measured based on a first pair of images at two different time points and on a second pair of images at two different time points; the first pair is obtained from the first imaging device 12a, and the second pair from the second imaging device 12b.
- the first two different time points are the arbitrary time point (current) and the time point when the person 2 is not on the bed 3 .
- an image acquired at an arbitrary time (current) will be described as an acquired image
- an image acquired at a time when the person 2 does not exist on the bed 3 will be described as a reference image. Note that the reference image is stored in the storage unit 31.
- The acquired image and the reference image are, for example, images captured by the imaging device 12 (here, the first imaging device 12a); the terms are used as a concept that also includes the position information of the bright spots on each image. That is, the acquired image and the reference image are images of the pattern 11a formed by the projection of the projection device 11 at the respective points in time.
- The reference image is stored in the storage unit 31, for example, not as a so-called image but in the form of position information, such as coordinates, of each bright spot.
- Here, the position of a bright spot is the position of its center of gravity. In this way, even a slight shift of the bright spot can be measured.
- The movement amount of each bright spot is measured by comparing the position information of each bright spot on the reference image stored in the storage unit 31 with the position information of the corresponding bright spot on the acquired image. Each movement amount is obtained, for example, by counting how many pixels the position of the bright spot has moved. Since this approach requires no difference image to be generated, as described later, the processing can be simplified.
- The case where the position information of the bright spots is compared has been described here, but a difference image between the reference image and the acquired image may instead be created.
- In that case, the movement amount of each bright spot is measured based on the positions of the corresponding bright spots in the difference image. Since only the bright spots that have moved remain in the difference image, the processing amount can be reduced.
- The movement amount of a bright spot measured by the measuring device 14 may also be a moving average of the amounts measured a certain number of times in the past, or an average over a certain past period. By doing so, random noise and sudden noise, such as that caused by the flickering of sunlight coming in through a window, can be reduced, which improves the reliability of the measured movement amount.
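The measurement described above reduces to comparing spot centroids between a stored reference image and each acquired image, then smoothing over recent frames. The following is a minimal Python sketch of that comparison and the moving-average smoothing; all function and variable names are hypothetical, and it assumes centroid extraction and spot correspondence have already been done.

```python
import numpy as np

def measure_spot_movement(ref_positions, acq_positions, history, window=5):
    """Per-spot movement along the baseline (y) direction, in pixels.

    ref_positions, acq_positions: (N, 2) arrays of (x, y) spot centroids,
    index-aligned between the reference and acquired images.
    history: list accumulating past per-spot movement arrays.
    """
    ref = np.asarray(ref_positions, dtype=float)
    acq = np.asarray(acq_positions, dtype=float)
    # Signed displacement along the baseline (y) direction; the sign
    # carries the moving direction, as the text notes.
    movement = acq[:, 1] - ref[:, 1]
    # Moving average over the last `window` measurements to suppress
    # random noise (e.g., flicker of sunlight through a window).
    history.append(movement)
    return np.mean(history[-window:], axis=0)
```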
- The measuring device 14 performs the above measurement of movement for every bright spot forming the pattern 11a; that is, the positions of the plurality of bright spots serve as the plurality of measurement points. The measuring device 14 outputs the movement measured for each bright spot forming the pattern 11a, that is, the measured movement amount of each bright spot, to the control unit 21 as a measurement result. This measurement result is the movement amount of the bright spots measured based on the first pair of images at two different time points, and it corresponds to the height of the person 2. Hereinafter, this measurement result is referred to as height information. The measuring device 14 outputs the measurement result at each measurement point as height information.
- The second pair of two different time points consists of an arbitrary time point and a slightly earlier time point. "Slightly before" means earlier by just a time interval sufficient to detect the movement of the person 2. When even a slight movement of the person 2 is to be detected, the interval should simply be made short enough that the movement of the person 2 cannot become too large and the scene can be regarded as substantially unchanged, for example about 0.1 second.
- The images at the second pair of two different time points are likewise an acquired image and a reference image.
- These are also images captured by the imaging device 12 (here, the second imaging device 12b) and, as above, the terms are used as a concept that also includes the position information of the bright spots on each image.
- This reference image, too, is stored in the storage unit 31 in the form of position information, such as coordinates, of each bright spot.
- The position of a bright spot is the position of its center of gravity.
- Specifically, the images at the second pair of two different time points are the acquired image (frame N) and the image acquired immediately before it (frame N-1); that is, here the reference image is the image acquired immediately before the acquired image.
- The image acquisition interval may be determined appropriately according to, for example, the processing speed of the apparatus and the content of the motion to be detected, as described above; for example, about 0.1 to 3 seconds, preferably about 0.1 to 0.5 seconds, is suitable.
- acquiring images at shorter time intervals and performing averaging or filtering processing is effective because, for example, the effects of random noise can be reduced.
- The waveform obtained by measuring the movement of the bright spots based on images at two nearby time points, that is, an arbitrary time point and a slightly earlier time point (for example, the sum of the movement amounts of the bright spots), is a differential waveform, that is, a waveform representing a change in speed.
- By integrating this waveform, a distance waveform, that is, a waveform indicating the change in height, is obtained.
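In other words, the per-frame movement amounts behave like a velocity signal, and summing them over time recovers the height-change (distance) waveform. A small illustration, assuming each sample is the height change between consecutive frames; the helper name is hypothetical:

```python
import numpy as np

def integrate_motion(per_frame_height_change):
    """Cumulatively sum per-frame height changes (a differential,
    velocity-like waveform) to recover the height-change waveform,
    as described above. A sketch, not the patent's exact processing."""
    return np.cumsum(np.asarray(per_frame_height_change, dtype=float))
```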
- In this case as well, the measuring device 14 measures the movement amount of each bright spot, including its moving direction, by comparing the position information of each bright spot on the reference image stored in the storage unit 31 with the position information of each bright spot on the acquired image. The measuring device 14 likewise outputs the movement amount measured for each bright spot to the control unit 21 as a measurement result. This measurement result is the movement amount of the bright spots measured based on the second pair of images at two different time points and, as described later with reference to FIG. 4, corresponds to the movement of the object, that is, the person 2, in the height direction at each bright spot (measurement point). Hereinafter, this measurement result is referred to as motion information. The measuring device 14 outputs the measurement result at each measurement point as motion information.
- the movement in the height direction of the person 2 is, for example, a movement accompanying the breathing of the person 2.
- the concept of the movement of the luminescent spot will be described with reference to the conceptual perspective view of FIG.
- The explanation will be made on the assumption that the target area is the plane 102 and that the target is the object 103.
- a case will be described in which the above-mentioned first two images at two different times, that is, a reference image and an acquired image are used.
- the reference image is an image of the pattern 11a when the object 103 does not exist on the plane 102
- The acquired image will be described as an image of the pattern 11a when the object 103 exists on the plane 102.
- the case where there is one imaging device 12 will be described.
- an object 103 is placed on a plane 102.
- an orthogonal coordinate system XYZ is set so that the XY axis is located within the plane 102, and the object 103 is located in the first quadrant of the XY coordinate system.
- a projection device 11 and an imaging device 12 are arranged above the plane 102 on the Z-axis in the figure.
- the imaging device 12 captures an image of the plane 102 on which the pattern 11a is projected by the projection device 11. That is, the object 103 placed on the plane 102 is imaged.
- The imaging lens 13a, as the imaging optical system of the imaging device 12, is arranged so that its optical axis coincides with the Z axis. The imaging lens 13a forms an image of the pattern 11a on the plane 102 or on the object 103 onto the imaging surface 15' (the image plane of the image sensor 15).
- the imaging plane 15 ' is typically a plane orthogonal to the Z axis. Further, an xy orthogonal coordinate system is set in the imaging plane 15 ', and the Z axis passes through the origin of the xy coordinate system.
- The projection device 11 is arranged at a distance d (base line length d) from the imaging lens 13a in the negative direction of the Y axis, at the same distance from the plane 102 as the imaging lens 13a. The projection device 11 projects the pattern 11a, formed by the plurality of bright spots 11b, onto the object 103 and the plane 102. Note that the y-axis direction is also the base line direction of the triangulation.
- The pattern 11a projected toward the plane 102 by the projection device 11 is blocked by the object 103 at the portion where the object 103 exists, and does not reach the plane 102 there.
- Instead, the bright spot 11b that would have been projected at the point 102a on the plane 102 is projected at the point 103a on the object 103.
- Since the imaging lens 13a and the projection device 11 are separated by the distance d (base line length d), on the imaging surface 15' the spot that would have been imaged at the point 102a' (x, y) is imaged at the point 103a' (x, y+δ). That is, between the time when the object 103 is absent and the time when it is present, the image of the bright spot 11b moves by the distance δ in the y-axis direction.
- In other words, the bright spot imaged on the imaging surface 15' of the image sensor 15 moves by δ in the y-axis direction because the object 103 has height.
- Thus, by measuring the movement amount δ of a bright spot, the position of the point 103a on the object 103 can be specified three-dimensionally; for example, the height of the point 103a becomes known. In this way, by measuring, for each bright spot, the difference between the point where its image would be formed on the imaging surface 15' if the object 103 were absent and its actual imaging position, the distribution of the height of the object 103, that is, its three-dimensional shape, can be measured; equivalently, the three-dimensional coordinates of the object 103 can be measured.
- If the pitch of the pattern 11a, that is, the pitch of the bright spots 11b, is made appropriately fine, to the extent that the correspondence between the bright spots does not become ambiguous, the height distribution of the object 103 can be measured more precisely.
- the measuring device 14 can measure the height of the target object by measuring the movement amount of the bright spot.
- When the measuring device 14 measures the movement of the bright spots based on the second pair of images at two different time points, that is, based on the acquired image and its immediately preceding reference image, it observes the amount of change in the position of each bright spot; therefore the height of the object itself cannot be measured, but the change in the height of the object can be measured.
- the measurement device 14 associates the pattern 11a on the image captured by the first imaging device 12a with the pattern 11a on the image captured by the second imaging device 12b. It is configured.
- That is, the respective bright spots 11b forming the pattern 11a are associated with each other. In this way, the positions of the bright spots in the height information and in the motion information, in other words, the movement amounts of the bright spots at the respective measurement points, can be made to correspond. This is possible, for example, if the correspondence between the target areas seen by the two imaging devices 12, that is, how the bed 3 appears to each of them, is checked in advance.
- the optical axes of the first imaging device 12a and the second imaging device 12b are aligned. More specifically, for example, the optical axis is adjusted so that the imaging ranges of the imaging devices 12 overlap as much as possible.
- the association is performed by the following method.
- First, an image of the pattern is acquired by the first imaging device 12a, and the three-dimensional coordinates of each bright spot of the pattern are calculated. Then, based on the three-dimensional coordinates and the arrangement of each imaging device 12, the coordinate system shown in FIG. 6 is transformed.
- The three-dimensional coordinates of each bright spot are expressed in a coordinate system (X, Y, Z) set for the first imaging device 12a, with the imaging lens 13a taken as its origin; let dx be the distance between the first imaging device 12a and the second imaging device 12b.
- Using these quantities, the coordinate system is converted, and the position at which each bright spot should appear on the image of the second imaging device 12b is calculated.
- an image of the pattern is acquired by the second imaging device 12b.
- the calculated position is compared with the pattern image acquired by the second imaging device 12b, and the closest bright spot is regarded as the same bright spot, and is associated with each other.
- The above association process is performed for all bright spots on the image. However, because the imaging ranges of the first imaging device 12a and the second imaging device 12b differ, some bright spots cannot be associated. Such a bright spot is treated as a vanishing spot and is not used for measurement.
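The association step, then, is: compute each spot's 3-D position from camera 1, convert it into camera 2's coordinate system, predict where it should appear on camera 2's image, and match it to the nearest observed spot, treating unmatched spots as vanishing spots. The conversion equation itself does not survive in this text, so the sketch below assumes the simplest geometry consistent with the description: a pure translation by dx along the baseline followed by a pinhole projection. All names are hypothetical.

```python
import numpy as np

def associate_spots(xyz_cam1, spots_cam2, dx, focal_len, max_dist=3.0):
    """Associate bright spots seen by the two imaging devices.

    xyz_cam1:   (N, 3) 3-D spot coordinates computed from camera 1.
    spots_cam2: (M, 2) spot centroids on camera 2's image plane.
    Returns, for each camera-1 spot, the index of the matching camera-2
    spot, or None for spots treated as vanishing spots (no close match).
    """
    xyz = np.asarray(xyz_cam1, dtype=float)
    spots2 = np.asarray(spots_cam2, dtype=float)
    # Translate into camera 2's coordinate system (assumed geometry).
    xyz2 = xyz - np.array([0.0, dx, 0.0])
    # Pinhole projection onto camera 2's image plane (assumed geometry).
    proj = focal_len * xyz2[:, :2] / xyz2[:, 2:3]
    matches = []
    for p in proj:
        dist = np.linalg.norm(spots2 - p, axis=1)
        j = int(np.argmin(dist))
        # Spots with no sufficiently close partner fall outside the
        # shared imaging range and are excluded from measurement.
        matches.append(j if dist[j] <= max_dist else None)
    return matches
```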
- the arithmetic device 20 has a control unit 21 that controls the monitoring device 1. Further, a storage unit 31 is connected to the control unit 21. The storage unit 31 may store images acquired from the imaging device 12 in chronological order. The storage unit 31 can store data such as calculated information.
- the control unit 21 is connected to a display 40 as information output means for displaying a synthesis result of the output information generation unit 24 as information synthesis means described later.
- Display 40 is typically an LCD.
- The display 40 receives and displays, for example, the analysis information output from the output information generation unit 24 described later.
- it is not particularly necessary to provide the display 40.
- the control unit 21 is connected to an input device 35 for inputting information for operating the monitoring device 1.
- the input device 35 is, for example, a touch panel, a keyboard, or a mouse. Although the input device 35 is shown as being externally attached to the arithmetic device 20 in this figure, it may be built in. Further, in the present embodiment, the case where the input device 35 is provided will be described, but there is no problem if the input device 35 is not provided.
- The control unit 21 includes a three-dimensional shape generation unit 22 as three-dimensional information calculating means for obtaining the outer shape information of the person 2 present on the bed 3 based on the movement of the pattern on the image obtained by the first FG sensor 10a, a fluctuation information calculation unit 23 as fluctuation information calculating means for obtaining the fluctuation information of the person 2 based on the movement of the pattern on the image obtained by the second FG sensor 10b, and an output information generation unit 24 as information synthesizing means for synthesizing the outer shape information and the fluctuation information.
- the outer shape information and the variation information will be described below. Hereinafter, each of the above configurations will be described in detail.
- the three-dimensional shape generation unit 22 obtains the outline information of the person 2 existing on the bed 3 as described above.
- The outer shape information is an image indicating a three-dimensional shape (hereinafter simply referred to as the three-dimensional shape).
- the three-dimensional shape generation unit 22 generates a three-dimensional shape as outer shape information based on a measurement result of the measurement device 14 of the first FG sensor 10a, that is, height information.
- the three-dimensional shape generation unit 22 is configured to generate a three-dimensional shape based on height information that is a measurement result of the measurement device 14.
- The height information, which is the measurement result of the measuring device 14, corresponds to the height of the person 2 at each of a plurality of measurement points. The three-dimensional shape generation unit 22 therefore calculates the height of the person 2 at each measurement point by triangulation, based on the movement amount of the bright spot at each measurement point contained in the height information. More specifically, the height from the upper surface of the bed 3 is calculated.
- the calculation of the height of the person 2 will be described with reference to FIG.
- The explanation will be made assuming that the target area is the plane 102 and that the target is the object 103.
- FIG. 8 is a diagram showing the relationship between the projection device 11, the imaging device 12, the object 103, and the plane 102 as viewed in the X-axis direction (see FIG. 4).
- the case where the height of the object 103 is Z1 will be described.
- The center of the projection device 11 (the center of the pattern light source) and the center of the imaging lens 13a are arranged parallel to the plane 102 at a distance d from each other. The distance from the imaging lens 13a to the imaging surface 15' (image sensor 15) is l, which is substantially equal to the focal length of the imaging lens 13a; the distance from the imaging lens 13a to the plane 102 is h; and the height of the point 103a of the object 103 above the plane 102 is Z1.
- Suppose that, on the imaging surface 15', the point 102a' has moved by δ to the point 103a'.
- From these quantities, the height Z1 of the object 103 can be calculated by triangulation.
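The closed-form relation between the spot shift δ and the height Z1 is not legible in this extraction, but the numeric example given later in the correction discussion (base line 600 mm, focal distance 12 mm: δ = 5 μm yields 3.67 mm at h = 2.3 m and 2.77 mm at h = 2.0 m) is reproduced exactly by Z1 = h²δ / (l·d + h·δ). The following sketch therefore offers that relation as a plausible reconstruction, not as the patent's verbatim formula.

```python
def height_from_shift(delta, d, l, h):
    """Height Z1 of a point from the bright-spot shift delta.

    delta: spot movement on the imaging surface (same length unit as l).
    d:     base line length between projection device and imaging lens.
    l:     distance from imaging lens to imaging surface (~focal length).
    h:     distance from the imaging lens to the reference plane.
    Reconstruction consistent with the worked example later in the text.
    """
    return h * h * delta / (l * d + h * delta)

# Check against the numbers quoted in the correction discussion (mm):
print(height_from_shift(0.005, 600.0, 12.0, 2300.0))  # ~3.67
print(height_from_shift(0.005, 600.0, 12.0, 2000.0))  # ~2.77
```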
- The three-dimensional shape generation unit 22 is configured to interpolate the parts where the outer shape information, that is, the three-dimensional shape, is insufficient. When the outer shape information can be acquired in a necessary and sufficient form, interpolation need not be performed.
- As described above, the three-dimensional shape generation unit 22 calculates the height at each measurement point from the height information, which is the measurement result of the measuring device 14, and generates the three-dimensional shape based on the calculated heights. However, since the measurement points (bright spots) are arranged at intervals, the height of the person 2 between the measurement points is not known. If the three-dimensional shape were generated as-is from the calculated heights at the measurement points alone, the outer shape of the person 2 would be difficult to recognize. To compensate for this, the three-dimensional shape generation unit 22 interpolates the parts where the height of the person 2 is missing.
- Specifically, the four measurement points nearest to the three-dimensional coordinate (X, Y, Z) to be interpolated are searched for.
- The four measurement points are chosen so that they belong to the four quadrants around (X, Y), one in each quadrant.
- By performing this calculation for every coordinate to be interpolated, the three-dimensional shape generation unit 22 can interpolate the height of the person 2 at coordinates between the measurement points.
- the three-dimensional shape generation unit 22 generates a three-dimensional shape by performing the above interpolation.
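The text specifies the neighbor search (the four nearest measurement points, one per quadrant around the target coordinate) but not the weighting rule, so the sketch below assumes inverse-distance weighting of those four heights; any of the gridding methods listed next would serve equally well.

```python
import numpy as np

def interpolate_height(px, py, points):
    """Interpolate the height at (px, py) from measured bright spots.

    points: (N, 3) array of (x, y, z) measurement points (spot positions
    and their triangulated heights). Weighting is an assumption here.
    """
    pts = np.asarray(points, dtype=float)
    rel = pts[:, :2] - np.array([px, py])
    chosen = []
    for sx, sy in [(1, 1), (-1, 1), (-1, -1), (1, -1)]:  # four quadrants
        mask = (np.sign(rel[:, 0]) == sx) & (np.sign(rel[:, 1]) == sy)
        if not mask.any():
            continue                      # quadrant empty near the edge
        cand = pts[mask]
        dist = np.linalg.norm(cand[:, :2] - [px, py], axis=1)
        chosen.append((dist.min(), cand[np.argmin(dist), 2]))
    if not chosen:
        return None
    weights = np.array([1.0 / max(d, 1e-9) for d, _ in chosen])
    heights = np.array([z for _, z in chosen])
    return float(np.sum(weights * heights) / np.sum(weights))
```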
- FIG. 9 shows an example of the three-dimensional shape generated in this manner.
- The three-dimensional shape shown is an example of the image displayed on the display 40.
- Interpolation is not limited to the example described above; various interpolation or gridding methods (e.g., Triangulation, Radial Basis Function interpolation, Polynomial Regression, Nearest Neighbor gridding, Natural Neighbor gridding, Modified Shepard's Method, Minimum Curvature, Inverse Distance to a Power gridding, Kriging, etc.) can be applied.
- the fluctuation information calculation unit 23 obtains fluctuation information of the person 2 as described above.
- The fluctuation information is information on the movement of the person 2 in the height direction, including the phase of that movement at each measurement point. Here, the fluctuation information is also taken to include the height change amount described later.
- the fluctuation information calculation unit 23 is configured to identify the phase of the motion at each measurement point from the motion information that is the measurement result of the measurement device 14.
- the variation information calculation unit 23 obtains the phase of the movement identified at each measurement point as variation information.
- the phase is a concept including the direction of movement.
- The fluctuation information calculation unit 23 discriminates the phase of the motion by determining whether the motion measured at each measurement point by the measuring device 14 is an upward motion or a downward motion. By doing so, it becomes possible to know, for example, which part of the body of the person 2 is moving upward and which is moving downward.
- Further, the fluctuation information calculation unit 23 is configured to calculate, as fluctuation information, the amount of change in the height of the person 2 at each measurement point (hereinafter, the height change amount) based on the motion information.
- That is, the fluctuation information calculation unit 23 calculates the height change amount based on the measurement result of the measuring device 14 of the second FG sensor 10b, that is, the motion information.
- the movement information corresponds to the movement of the person 2 in the height direction at each measurement point.
- the actual height change amount is calculated from the movement information.
- Specifically, the height change amount of the person 2 at each measurement point is calculated by triangulation, based on the movement amount of the bright spot at each measurement point contained in the motion information.
- the height change amount may be subjected to interpolation similar to the three-dimensional shape.
- the output information generation unit 24 synthesizes the three-dimensional shape and the variation information.
- the output information generation unit 24 is configured to generate analysis information for combining and displaying the three-dimensional shape obtained by the three-dimensional shape generation unit 22 and the fluctuation information obtained by the fluctuation information calculation unit 23. Have been.
- the generated analysis information is output to the display 40 and displayed on the display 40.
- the term “combining” means, for example, superimposing variation information on a three-dimensional shape.
- the displayed synthesis result also includes, for example, information such as a volume change of the person 2 and a waveform thereof, which will be described later.
- Specifically, the output information generation unit 24 generates, as analysis information, an image in which the fluctuation information is written into the three-dimensional shape so as to correspond to each measurement point (bright spot).
- the determination result by the abnormality determination unit 26 described later is also included in the generated analysis information.
- the output information generation unit 24 is configured to perform the synthesis so that the movement of each part of the person 2 can be understood. Specifically, the phase of the identified movement at each measurement point, which is fluctuation information, is superimposed on the three-dimensional shape so that the coordinates correspond to each other. By doing so, for example, it is easy to see which part of the body of the person 2 is moving upward or downward.
- an example of synthesizing the three-dimensional shape and the variation information that is, an example of the generated analysis information will be described with reference to the schematic diagram of FIG.
- an example of the generated analysis information is shown by an image displayed on the display 40.
- In FIG. 10, the fluctuation information is synthesized so that its positions correspond to the three-dimensional shape described with reference to FIG. 9.
- In this way, the phase of the movement at each measurement point can be identified.
- FIG. 10 (a) shows a case where the abdomen of the person 2 is moving upward, and more specifically, a case of inhalation of abdominal breathing.
- FIG. 10 (b) shows a case where the chest of the person 2 is moving downward, more specifically, a case of exhalation of chest respiration.
- In the figure, the measurement points of each phase are distinguished by pattern (upward motion is shown outlined, downward motion solid); they may instead be distinguished by color (for example, blue for upward and red for downward). The phase of the movement may also be indicated by arrows (shown by broken lines at some measurement points in the figure). By doing so, it is easy to recognize which part of the body of the person 2 is moving upward or downward.
- the monitoring device 1 displays the analysis information thus generated on the display 40.
- Furthermore, by reflecting the magnitude of the height change amount in the display, for example by brightening the color or lengthening the arrow in proportion to the amount, the display makes the change in movement, that is, the change in height, easier to understand.
- the output information generation unit 24 is configured to calculate the volume fluctuation amount of the person 2.
- The volume fluctuation amount can be calculated from the height change amounts, which are the fluctuation information.
- Specifically, the sum of the height change amounts over the measurement points may be used as the volume fluctuation amount.
- the calculated volume fluctuation amount is displayed on the display 40 by being included in the analysis information. If the information is not to be displayed on the display 40, the data may be stored in an electronic medium or the like (for example, the storage unit 31 in this case).
- When the volume fluctuation amount is integrated as an absolute value over one period (since data acquisition is performed at regular time intervals, this actually means adding up the data), the total amount of the motion is obtained; in the case of respiration detection, half of this total corresponds to the tidal volume.
- When data are added over one period, a half period, or multiple periods, the start and end points of a period can be determined by taking a moving average of the several most recent volume fluctuation values and using the timing at which that value changes from negative to positive, or from positive to negative, as the start or end point.
- This prevents spurious negative-to-positive and positive-to-negative transitions, caused by noise contained in the volume fluctuation values, from introducing errors in the timing of the start and end points.
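Putting the last three points together: smooth the volume-fluctuation samples with a moving average, use its negative-to-positive crossings as cycle boundaries, sum absolute values over one cycle, and halve the total. A hedged sketch, with the smoothing window chosen arbitrarily and all names hypothetical:

```python
import numpy as np

def tidal_volume(volume_change_samples, smooth_window=5):
    """Estimate tidal volume from per-sample volume-change values
    acquired at a fixed interval, following the procedure above."""
    v = np.asarray(volume_change_samples, dtype=float)
    kernel = np.ones(smooth_window) / smooth_window
    smoothed = np.convolve(v, kernel, mode="same")
    # Negative-to-positive transitions of the smoothed waveform mark
    # cycle start/end points; smoothing suppresses spurious crossings
    # caused by noise, as the text notes.
    crossings = np.where((smoothed[:-1] < 0) & (smoothed[1:] >= 0))[0]
    if len(crossings) < 2:
        return None                       # no complete cycle observed
    start, end = crossings[0], crossings[1]
    total_motion = np.sum(np.abs(v[start:end]))
    return total_motion / 2.0             # half the total = tidal volume
```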
- The output information generation unit 24 is also configured to correct the fluctuation information based on the outer shape information, here the three-dimensional shape. What is corrected is the height change amount, which is fluctuation information.
- Specifically, the output information generation unit 24 corrects the height change amount corresponding to each point, using as the distance the height of the person 2 at each point forming the three-dimensional shape obtained by the three-dimensional shape generation unit 22.
- the base line length of the second FG sensor 10b is 600 mm
- and the focal length l of the imaging lens 13a of the second imaging device 12b is 12 mm. If the distance h from the imaging lens 13a to the plane 102 is 2.3 m, then when a bright spot moves 5 μm on the imaging surface 15', the height change is calculated to be 3.67 mm. If instead the distance from the imaging lens 13a to the object 103 is 2.0 m (that is, the object 103 is 0.3 m high) and this value is used as h, the height change is calculated to be 2.77 mm. This difference of about 0.9 mm in the calculated height change is the measurement error.
- By this correction, the output information generation unit 24 can calculate the height change amount using the accurate distance h, and can therefore obtain the height change amount more accurately. Furthermore, by calculating the volume fluctuation amount based on the corrected height change amount, the volume fluctuation of the person 2 can be measured more accurately. This is very useful for measuring the amount of a fine movement such as breathing.
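Reusing the height_from_shift() reconstruction from the earlier sketch, the correction amounts to replacing the nominal lens-to-plane distance h with the lens-to-subject distance taken from the three-dimensional shape:

```python
# Nominal geometry from the worked example above (all lengths in mm).
h_plane = 2300.0                  # imaging lens to bed surface
subject_height = 300.0            # from the 3-D shape at this spot
uncorrected = height_from_shift(0.005, 600.0, 12.0, h_plane)
corrected = height_from_shift(0.005, 600.0, 12.0, h_plane - subject_height)
print(uncorrected - corrected)    # ~0.9 mm of error removed by correction
```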
- control unit 21 includes a motion determining unit 25 that determines the type of the motion of the person 2 based on the motion information measured by the measuring device 14 of the second FG sensor 10b. That is, the motion determining unit 25 determines the type of the motion of the person 2 based on the motion information measured at the plurality of measurement points by the measuring device 14, that is, the motion of the person 2 in the height direction.
- the types of movement of the person 2 determined by the movement determining unit 25 are typically breathing, body movement, and no movement (immobile). Further, the motion determining unit 25 is configured to detect the breathing of the person 2 based on the motion information.
- the body movement is a movement of the body of the person 2, and is a concept including, for example, a movement such as standing or sitting, as well as a movement of a limb.
- the motion determining unit 25 may be configured to detect the breathing.
- The motion discriminating unit 25 detects respiration by setting predetermined upper and lower thresholds for both or one of the amplitude and the period (frequency) of the periodic change in the time variation of the averaged movement amount, and comparing the measured values with these thresholds.
- When the measured values fall within the thresholds, the motion may be judged to be respiration, thereby detecting the breathing.
- the upper and lower thresholds of the period may be set to a range that includes the human respiratory cycle; for example, the lower limit may be set to 5 cycles per minute and the upper limit to 60 cycles per minute.
- this is because the respiratory rate of an adult is in the range of about 5 to 30 breaths per minute, while the respiratory rate of an infant tends to be higher.
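- a minimal sketch of such a threshold comparison is shown below; the rate limits come from the text above, but the amplitude thresholds and all names are illustrative assumptions, since the patent only states that upper and lower thresholds are set:

```python
# Minimal sketch (amplitude thresholds illustrative): decide whether a
# periodic component qualifies as breathing by comparing its rate and
# amplitude against preset upper and lower thresholds.

RATE_LOWER = 5.0   # cycles per minute (lower limit mentioned in the text)
RATE_UPPER = 60.0  # cycles per minute (upper limit mentioned in the text)

def is_breathing(rate_cpm: float, amplitude: float,
                 amp_lower: float = 0.5, amp_upper: float = 50.0) -> bool:
    """True when the rate and amplitude fall inside the preset windows."""
    return (RATE_LOWER <= rate_cpm <= RATE_UPPER
            and amp_lower <= amplitude <= amp_upper)

print(is_breathing(rate_cpm=14.0, amplitude=3.2))  # True: typical adult rate
print(is_breathing(rate_cpm=90.0, amplitude=3.2))  # False: above upper limit
```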
- FIG. 11 is a diagram showing an example of a breathing waveform pattern.
- the motion determining unit 25 detects the respiratory rate.
- the respiration rate can be detected, for example, by applying data processing such as a Fourier transform to the time variation of the total sum of the movement amounts of the bright spots in the region where the motion has been determined to be respiration.
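- a minimal sketch of such a Fourier-based estimate follows; it assumes numpy, a fixed sampling rate `fs`, and a signal array `motion_sum` holding the summed bright-spot movement amounts (all names illustrative):

```python
# Minimal sketch: estimate the respiratory rate (cycles/min) from the
# dominant Fourier component of the summed bright-spot movement amounts.
import numpy as np

def respiration_rate_cpm(motion_sum: np.ndarray, fs: float) -> float:
    """Respiratory rate via the FFT peak inside the 5-60 cycles/min band."""
    x = motion_sum - motion_sum.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 5 / 60) & (freqs <= 60 / 60)  # 5-60 cycles/min window
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Synthetic example: 15 breaths/min sampled at 10 Hz for 60 s.
fs = 10.0
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * (15 / 60) * t) + 0.1 * np.random.randn(t.size)
print(respiration_rate_cpm(signal, fs))  # ~15.0
```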
- the control unit 21 further includes an abnormality determination unit 26 that determines an abnormality of the person 2 based on the movement of the person 2 in the height direction measured by the measuring device 14 of the second FG sensor 10b. More specifically, the abnormality determination unit 26 determines an abnormality of the person 2 based on the result of the detection of the respiration of the person 2 by the motion determining unit 25. The abnormality determination unit 26 also determines an abnormality of the person 2 based on the fluctuation information obtained by the fluctuation information calculation unit 23. Here, determining an abnormality of the person 2 means determining whether or not the person 2 is in a dangerous state.
- the criteria by which the abnormality determination unit 26 judges the person 2 to be in a dangerous state may be set in consideration of the following. For example, when breathing is detected by the motion determining unit 25, if the cycle of the breathing pattern becomes disturbed within a short time, or if the cycle of the breathing pattern changes rapidly, a lung disease such as spontaneous pneumothorax or bronchial asthma, a heart disease such as congestive heart failure, or a cerebrovascular disease such as cerebral hemorrhage can be inferred, so such cases are set to be judged dangerous. Likewise, if the breathing pattern disappears and remains absent, it can be inferred that the breathing of the person 2 has stopped, so this is set to be judged dangerous. Further, if body movement of the person 2 occurs frequently in place of the breathing pattern within a short time, it can be inferred that the person 2 is suffering and struggling for some reason, so this state is also set to be judged dangerous.
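- as one way to picture these criteria, here is a minimal rule-set sketch in Python; the window lengths, thresholds, and category labels are purely illustrative assumptions, not values from the patent:

```python
# Minimal sketch: map the determination criteria above onto simple rules
# over a recent window of motion classifications and breathing cycles.
from statistics import pstdev

def is_dangerous(recent_types: list, recent_cycles_s: list) -> bool:
    """recent_types: motion type per interval ('breathing', 'body', 'none').
    recent_cycles_s: durations of recent breathing cycles in seconds."""
    # Breathing pattern has disappeared and stays absent -> breathing stopped.
    if recent_types and all(t == 'none' for t in recent_types):
        return True
    # Frequent body movement replacing the breathing pattern -> struggling.
    if recent_types.count('body') > len(recent_types) // 2:
        return True
    # Cycle of the breathing pattern disturbed / changing rapidly.
    if len(recent_cycles_s) >= 3 and pstdev(recent_cycles_s) > 1.5:
        return True
    return False

print(is_dangerous(['breathing'] * 6, [4.0, 4.1, 3.9]))        # False
print(is_dangerous(['none'] * 6, []))                          # True
print(is_dangerous(['body', 'body', 'body', 'breathing'], []))  # True
```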
- the determination result by the abnormality determination unit 26 as described above is configured to be displayed on the display 40.
- the abnormality determination unit 26 outputs the determination result to the output information generation unit 24.
- the output information generation unit 24 generates analysis information including the determination result and outputs the analysis information to the display 40.
- the result of the determination by the abnormality determination unit 26 is displayed on the display 40, so that, for example, the measurer can easily recognize the abnormality of the person 2.
- in the above description, the pattern projected onto the bed 3 is a plurality of bright spots.
- however, the movement of the person 2 in the height direction may also be measured by using the light-section method.
- a projection device 111 configured to project a bright line as pattern light onto the bed 3 is used as the projection means.
- the number of bright lines to be projected is typically plural, but may also be one; in the case of a single line, for example, a method of scanning the one bright line may be used.
- here, a plurality of bright lines 111b are projected at equal intervals.
- the plurality of bright lines 111b form a pattern 111a'.
- the direction of the bright lines 111b is substantially perpendicular to the baseline direction of the triangulation.
- the image of a bright line formed on the imaging surface 15' of the image sensor 15 moves in the y-axis direction by δ where the line falls on a tall object. As with the bright spots, by measuring this δ, the position of a point on the object can be specified three-dimensionally. The measurement of δ is performed at the position of the center line of the bright-line image; in the case of a bright line, a measurement point corresponds to one pixel of the image sensor 15 at a position along the bright-line image.
- in this way, compared with the case where the pattern light is a plurality of bright spots, the movement of an arbitrary point on the bright line can be measured, and the continuous shape in the bright-line direction can be recognized. In other words, the measurement resolution in the bright-line direction can be improved.
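- a minimal sketch of measuring along the center line of a bright-line image follows; it assumes a grayscale frame with one bright line running roughly along the x axis, and the intensity-weighted centroid used as the center line is an illustrative choice, not the patent's stated method:

```python
# Minimal sketch: per-column center line of a bright-line image, whose
# displacement against a reference frame is the movement delta used above.
import numpy as np

def line_center_y(frame: np.ndarray) -> np.ndarray:
    """Per-column y coordinate of the bright-line center (one line assumed)."""
    ys = np.arange(frame.shape[0], dtype=float)
    weights = frame.sum(axis=0)
    return (frame * ys[:, None]).sum(axis=0) / np.maximum(weights, 1e-9)

# Reference frame (flat bed) vs. current frame (line shifted by an object).
ref = np.zeros((100, 8)); ref[50, :] = 1.0
cur = np.zeros((100, 8)); cur[50, :4] = 1.0; cur[53, 4:] = 1.0
delta = line_center_y(cur) - line_center_y(ref)
print(delta)  # 0 over the bed, 3 pixels where the object raises the line
```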
- in the above description, the first three-dimensional sensor and the second three-dimensional sensor share the same projection device, but the configuration is not limited to this; instead of the first imaging device 12a and the second imaging device 12b, a common imaging device 12 may be used together with two separate projection devices.
- in this case, the first FG sensor 10a and the second FG sensor 10b are installed with the first projection device 11-1 and the common imaging device 12 arranged approximately above the center of the bed 3,
- and with the second projection device 11-2 arranged approximately above the head of the person 2.
- the first projection device 11-1 is arranged at a first interval d1 from the imaging device 12, and the second projection device 11-2 at a second interval d2.
- however, since the projected patterns are not common, that is, the pattern 11a projected from the first projection device 11-1 differs from the pattern 11a' projected from the second projection device 11-2, the synthesis of the three-dimensional shape and the fluctuation information by the output information generation unit 24 is slightly different. Specifically, instead of synthesizing so that the individual measurement points of each FG sensor 10 correspond, the output information generation unit 24 synthesizes the three-dimensional shape and the fluctuation information so that their coordinates on the bed 3 correspond. In this way, even if the projection positions of the patterns of the FG sensors 10 and the pitches of their bright spots differ, the three-dimensional shape and the fluctuation information can be synthesized accurately.
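- a minimal sketch of such bed-coordinate synthesis is given below; it assumes scipy is available and that each sensor's measurement points have already been converted to (x, y) coordinates on the bed 3, with all function and variable names illustrative:

```python
# Minimal sketch: merge two data sets on a common grid over the bed rather
# than point-by-point, so differing projection positions and bright-spot
# pitches of the two sensors do not matter.
import numpy as np
from scipy.interpolate import griddata

def merge_on_bed(shape_pts, shape_vals, fluct_pts, fluct_vals,
                 grid_x, grid_y):
    """Resample both data sets onto one bed-coordinate grid and stack them."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    shape_grid = griddata(shape_pts, shape_vals, (gx, gy), method='linear')
    fluct_grid = griddata(fluct_pts, fluct_vals, (gx, gy), method='linear')
    return np.dstack([shape_grid, fluct_grid])  # (ny, nx, 2)

# Illustrative data: shape points on one pitch, fluctuation on another.
rng = np.random.default_rng(0)
shape_pts = rng.uniform(0, 2, (200, 2)); shape_vals = shape_pts[:, 0]
fluct_pts = rng.uniform(0, 2, (120, 2)); fluct_vals = fluct_pts[:, 1]
merged = merge_on_bed(shape_pts, shape_vals, fluct_pts, fluct_vals,
                      np.linspace(0.2, 1.8, 9), np.linspace(0.2, 1.8, 9))
print(merged.shape)  # (9, 9, 2): height and fluctuation per bed-grid cell
```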
- in this case, processing may be performed while distinguishing the image captured when the pattern 11a is projected from the image captured when the pattern 11a' is projected.
- in this way, the movement of the pattern 11a and the movement of the pattern 11a' can each be measured, so that the three-dimensional shape and the fluctuation information can be obtained.
- moreover, the amount of image processing can be reduced.
- as described above, the monitoring device 1 includes the first FG sensor 10a, the second FG sensor 10b, the three-dimensional shape generation unit 22, and the fluctuation information calculation unit 23.
- the three-dimensional shape of the person 2 can therefore be obtained by the three-dimensional shape generation unit 22 based on the height information obtained by the first FG sensor 10a, and the fluctuation information of the person 2 can be obtained by the fluctuation information calculation unit 23 based on the motion information obtained by the second FG sensor 10b. Further, by providing the output information generation unit 24 and synthesizing the three-dimensional shape and the fluctuation information, an image can be generated from which it is easy to recognize, for example, which part of the body of the person 2 is moving upward or downward, so the state of the person 2, particularly the state of breathing, can be grasped easily and accurately.
- the first FG sensor 10a has a relatively short arrangement interval (baseline length) between the projection device and the imaging device, and is therefore suitable for measuring comparatively large height changes such as the outer shape of the person 2.
- since the second FG sensor 10b has a longer arrangement interval (baseline length) between the projection device and the imaging device than the first FG sensor 10a, even finer movements, such as the small movements accompanying the breathing of the person 2, can be measured accurately.
- since the three-dimensional shape generation unit 22 generates a three-dimensional shape from which the body shape of the person 2 can be recognized, the breathing state of the person 2 can be understood clearly.
- by using the FG sensor 10 as the three-dimensional sensor, the movement of the person 2 in the height direction can be measured simply yet accurately. Further, since the FG sensor 10 can perform measurement without contact, the burden on the person being measured is small.
- the output information generation unit 24 corrects the fluctuation information based on the three-dimensional shape, so that the height change amount of the person 2 can be calculated more accurately. Further, since the volume fluctuation amount is calculated based on the height change amount, a more accurate volume fluctuation amount can be measured.
- the monitoring device 1 also includes a display 40 for displaying the synthesis result of the output information generation unit 24. The monitoring device 1 can thus display on the display 40 the analysis information in which the fluctuation information indicating the movement of the body of the person 2 is superimposed on the three-dimensional shape indicating the outer shape of the body of the person 2, so the movement of each body part (particularly movement due to breathing) can be easily recognized. This can be helpful, for example, for a doctor's diagnosis.
- since the three-dimensional shape generation unit 22 interpolates the insufficient parts of the three-dimensional shape, a continuous outer shape of the person 2 can be obtained even when the measurement points are arranged at intervals.
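- a minimal sketch of such interpolation follows; it assumes the three-dimensional shape is held as a grid with NaN at unmeasured cells and that scipy is available, and the linear/nearest combination is an illustrative choice, not the patent's stated method:

```python
# Minimal sketch: fill unmeasured grid cells of the three-dimensional shape
# by interpolating from the valid neighbours, giving a continuous surface.
import numpy as np
from scipy.interpolate import griddata

def fill_missing(height_grid: np.ndarray) -> np.ndarray:
    ny, nx = height_grid.shape
    gy, gx = np.mgrid[0:ny, 0:nx]
    valid = ~np.isnan(height_grid)
    pts = np.column_stack([gx[valid], gy[valid]])
    filled = griddata(pts, height_grid[valid], (gx, gy), method='linear')
    # Cells that linear interpolation cannot reach fall back to nearest.
    nan = np.isnan(filled)
    filled[nan] = griddata(pts, height_grid[valid], (gx[nan], gy[nan]),
                           method='nearest')
    return filled

grid = np.full((5, 5), np.nan)
grid[::2, ::2] = 1.0          # sparse measurement points at intervals
print(fill_missing(grid))     # continuous surface of ones
```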
- FIG. 1 is a schematic external view schematically showing a monitoring device according to an embodiment of the present invention.
- FIG. 2 is a schematic perspective view illustrating a projection device according to an embodiment of the present invention.
- FIG. 3 is a block diagram showing a configuration example of a monitoring device according to an embodiment of the present invention.
- FIG. 4 is a conceptual perspective view for explaining the concept of movement of a bright spot in the embodiment of the present invention.
- FIG. 5 is a schematic diagram illustrating a bright point formed on an image forming surface in the case of FIG. 4.
- FIG. 6 is a diagram for explaining coordinate conversion at the time of associating a luminescent spot between a first imaging device and a second imaging device according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating the similar triangles used in the case of FIG. 6;
- FIG. 8 is a diagram illustrating calculation of a height of an object according to the embodiment of the present invention.
- FIG. 9 is a schematic diagram illustrating a three-dimensional shape generated by a three-dimensional shape generation unit according to an embodiment of the present invention.
- FIG. 10 shows an example of a result obtained by synthesizing the fluctuation information with the three-dimensional shape in the case of FIG. 9, in which (a) is a schematic diagram showing a case where there is an upward movement of the abdomen, and (b) is a schematic diagram showing a case where there is a downward movement of the chest.
- FIG. 11 is a diagram showing a waveform pattern of respiration used in the embodiment of the present invention.
- FIG. 12 is a schematic external view schematically showing a monitoring device when a plurality of bright lines are used for pattern light projected by the projection device according to the embodiment of the present invention.
- FIG. 13 is a schematic diagram illustrating bright lines imaged on an image plane in the case of FIG. 12.
- FIG. 14 is a schematic external view schematically showing a monitoring device in a case where a first imaging device and a second imaging device according to an embodiment of the present invention are shared and two projection devices are provided.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physiology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/560,048 US7630537B2 (en) | 2003-06-09 | 2004-06-03 | Three-dimensional shape-measuring device |
AU2004245815A AU2004245815B2 (en) | 2003-06-09 | 2004-06-03 | Three-dimensional shape-measuring device |
CA002528824A CA2528824A1 (en) | 2003-06-09 | 2004-06-03 | Three-dimensional shape-measuring device |
DE602004009077T DE602004009077T2 (de) | 2003-06-09 | 2004-06-03 | Einrichtung zur messung dreidimensionaler formen |
EP04745573A EP1645841B8 (en) | 2003-06-09 | 2004-06-03 | Three-dimensional shape-measuring device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003163503A JP3738291B2 (ja) | 2003-06-09 | 2003-06-09 | 三次元形状測定装置 |
JP2003-163503 | 2003-06-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004109228A1 true WO2004109228A1 (ja) | 2004-12-16 |
Family
ID=33508757
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/007717 WO2004109228A1 (ja) | 2003-06-09 | 2004-06-03 | 三次元形状測定装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US7630537B2 (ja) |
EP (1) | EP1645841B8 (ja) |
JP (1) | JP3738291B2 (ja) |
AU (1) | AU2004245815B2 (ja) |
CA (1) | CA2528824A1 (ja) |
DE (1) | DE602004009077T2 (ja) |
WO (1) | WO2004109228A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070076090A1 (en) * | 2005-10-04 | 2007-04-05 | Alexander Eugene J | Device for generating three dimensional surface models of moving objects |
WO2015104763A1 (ja) * | 2014-01-07 | 2015-07-16 | ソニー株式会社 | 分析システム、分析プログラム及び分析方法 |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3782815B2 (ja) * | 2004-02-04 | 2006-06-07 | 住友大阪セメント株式会社 | 呼吸解析装置 |
DE102005013042A1 (de) | 2005-03-18 | 2006-09-28 | Siemens Ag | Einrichtung zur Erzeugung von 3D-Fluoreszenz-oder Lumineszenz-Scans |
JP2008154655A (ja) * | 2006-12-21 | 2008-07-10 | Keio Gijuku | 呼吸機能測定装置及びプログラム |
DE102007006566A1 (de) * | 2007-02-09 | 2008-08-14 | Siemens Ag | Verfahren zur zentralen Überwachung sowie Anordnung zur Aufnahme, Auswertung und selektiven Anzeige von Bildern ruhender Personen |
JP5538667B2 (ja) * | 2007-04-26 | 2014-07-02 | キヤノン株式会社 | 位置姿勢計測装置及びその制御方法 |
GB0822605D0 (en) * | 2008-12-11 | 2009-01-21 | Pneumacare Ltd | Method and apparatus for monitoring an object |
CN102341085B (zh) | 2009-03-06 | 2014-02-12 | 阿特雷奥医疗公司 | 对在表面上的cpr的按压参数的测量 |
WO2011011633A2 (en) * | 2009-07-22 | 2011-01-27 | Atreo Medical, Inc. | Optical techniques for the measurement of chest compression depth and other parameters during cpr |
JP2011033507A (ja) * | 2009-08-03 | 2011-02-17 | Renesas Electronics Corp | 3次元計測装置 |
US11022433B2 (en) | 2010-02-12 | 2021-06-01 | Koninklijke Philips N.V. | Laser enhanced reconstruction of 3D surface |
EP2380493A1 (en) * | 2010-04-21 | 2011-10-26 | Koninklijke Philips Electronics N.V. | Respiratory motion detection apparatus |
US8670029B2 (en) * | 2010-06-16 | 2014-03-11 | Microsoft Corporation | Depth camera illuminator with superluminescent light-emitting diode |
JP5163713B2 (ja) * | 2010-08-24 | 2013-03-13 | カシオ計算機株式会社 | 距離画像センサ及び距離画像生成装置並びに距離画像データ取得方法及び距離画像生成方法 |
JP5187364B2 (ja) * | 2010-08-24 | 2013-04-24 | カシオ計算機株式会社 | 回折光学素子、並びに測距装置及び測距方法 |
EP2619724A2 (en) | 2010-09-23 | 2013-07-31 | Stryker Corporation | Video monitoring system |
JP5655021B2 (ja) | 2011-03-29 | 2015-01-14 | 富士フイルム株式会社 | 光音響画像化方法および装置 |
CN102223553B (zh) * | 2011-05-27 | 2013-03-20 | 山东大学 | 一种二维视频到三维视频的自动转换方法 |
EP2715695B1 (en) * | 2011-05-30 | 2016-03-16 | Koninklijke Philips N.V. | Apparatus and method for the detection of the body position while sleeping |
JP5984409B2 (ja) * | 2012-02-03 | 2016-09-06 | キヤノン株式会社 | 三次元計測システム及び方法 |
US9301710B2 (en) * | 2012-06-01 | 2016-04-05 | Xerox Corporation | Processing a video for respiration rate estimation |
JP6150231B2 (ja) * | 2012-07-24 | 2017-06-28 | 公立大学法人広島市立大学 | 心拍計測方法および装置 |
WO2014059681A1 (zh) * | 2012-10-20 | 2014-04-24 | 因美吉智能科技(济南)有限公司 | 非接触式儿科测量方法和测量设备 |
US10648789B2 (en) * | 2012-11-07 | 2020-05-12 | ARTEC EUROPE S.á r.l. | Method for monitoring linear dimensions of three-dimensional objects |
US9291877B2 (en) | 2012-11-15 | 2016-03-22 | Og Technologies, Inc. | Method and apparatus for uniformly focused ring light |
JP2014119427A (ja) * | 2012-12-19 | 2014-06-30 | Fujitsu Semiconductor Ltd | スポット探索装置、スポット探索方法 |
GB2522452B (en) * | 2014-01-24 | 2017-07-19 | Chapelglade Ltd | System for measuring body characteristics relevant for mattress selection |
US9589359B2 (en) * | 2014-04-24 | 2017-03-07 | Intel Corporation | Structured stereo |
US9814410B2 (en) * | 2014-05-06 | 2017-11-14 | Stryker Corporation | Person support apparatus with position monitoring |
DE102014218140B3 (de) * | 2014-09-10 | 2016-03-10 | Ait Austrian Institute Of Technology Gmbh | Verfahren und Vorrichtung zur Bestimmung des zeitlichen Verlaufs der Atemtiefe einer Person |
FR3026933A1 (fr) * | 2014-10-09 | 2016-04-15 | Inst Nat De La Sante Et De La Rech Medicale (Inserm) | Dispositif et procede de caracterisation de l'activite respiratoire d'un mammifere |
JP6248917B2 (ja) * | 2014-12-11 | 2017-12-20 | カシオ計算機株式会社 | 立体形成方法、立体形成装置、及び、プログラム |
CN104596439A (zh) * | 2015-01-07 | 2015-05-06 | 东南大学 | 一种基于相位信息辅助的散斑匹配三维测量方法 |
CN107708493B (zh) * | 2016-01-13 | 2021-01-08 | 株式会社爱维福 | 长丝三维结合体制造装置以及长丝三维结合体的制造方法和床垫用芯材 |
JP6839799B2 (ja) * | 2016-03-03 | 2021-03-10 | パナソニックIpマネジメント株式会社 | 情報端末装置の制御方法、体動測定装置、及び、プログラム |
FI127962B (en) | 2017-04-10 | 2019-06-14 | Sorvi Consulting Oy | DEVICE, METHOD AND COMPUTER SOFTWARE FOR MANAGING PHYSICAL TRAINING |
WO2020110164A1 (ja) * | 2018-11-26 | 2020-06-04 | 三菱電機株式会社 | 表示データ生成装置、表示データ生成方法、および表示データ生成プログラム |
JP2020094976A (ja) * | 2018-12-14 | 2020-06-18 | 本田技研工業株式会社 | 凹凸検査方法及び凹凸検査装置 |
AT522841A1 (de) * | 2019-07-17 | 2021-02-15 | Ait Austrian Inst Tech Gmbh | Verfahren zur Detektion der Intensität und/oder des Ausmaßes der Körperbewegungen einer Person |
GB201914842D0 (en) * | 2019-10-14 | 2019-11-27 | Binatone Electronics Int Ltd | Breathing detection apparatus and methods for detecting breathing |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002122416A (ja) * | 2000-10-16 | 2002-04-26 | Sumitomo Osaka Cement Co Ltd | 三次元形状測定装置 |
JP2002131017A (ja) * | 2000-10-27 | 2002-05-09 | Honda Motor Co Ltd | 距離測定装置、及び距離測定方法 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4947152A (en) | 1986-02-10 | 1990-08-07 | Mesa Vista Hospital | Patient monitoring system |
US4882566A (en) | 1988-08-03 | 1989-11-21 | Hill-Rom Company, Inc. | Safety control system for a hospital bed |
CA2056370C (en) | 1990-03-09 | 1998-08-11 | Hiroyuki Ogino | Sleep detecting apparatus |
JP3051279B2 (ja) * | 1993-05-13 | 2000-06-12 | シャープ株式会社 | バンプ外観検査方法およびバンプ外観検査装置 |
US5528339A (en) * | 1994-08-26 | 1996-06-18 | Eastman Kodak Company | Color image reproduction of scenes with color enhancement and preferential tone mapping |
US5471198A (en) | 1994-11-22 | 1995-11-28 | Newham; Paul | Device for monitoring the presence of a person using a reflective energy beam |
DE19638727A1 (de) * | 1996-09-12 | 1998-03-19 | Ruedger Dipl Ing Rubbert | Verfahren zur Erhöhung der Signifikanz der dreidimensionalen Vermessung von Objekten |
US6075883A (en) * | 1996-11-12 | 2000-06-13 | Robotic Vision Systems, Inc. | Method and system for imaging an object or pattern |
US6011477A (en) | 1997-07-23 | 2000-01-04 | Sensitive Technologies, Llc | Respiration and movement monitoring system |
US6011595A (en) * | 1997-09-19 | 2000-01-04 | Eastman Kodak Company | Method for segmenting a digital image into a foreground region and a key color region |
JP3263035B2 (ja) | 1997-11-21 | 2002-03-04 | 東芝エンジニアリング株式会社 | 呼吸モニタリングの関心領域設定装置および呼吸モニタリングシステム |
US5914660A (en) | 1998-03-26 | 1999-06-22 | Waterview Llc | Position monitor and alarm apparatus for reducing the possibility of sudden infant death syndrome (SIDS) |
US6049281A (en) | 1998-09-29 | 2000-04-11 | Osterweil; Josef | Method and apparatus for monitoring movements of an individual |
JP3820811B2 (ja) * | 1999-08-02 | 2006-09-13 | 株式会社デンソー | 呼吸器系疾患のモニタ装置 |
US7167575B1 (en) | 2000-04-29 | 2007-01-23 | Cognex Corporation | Video safety detector with projected pattern |
US7106885B2 (en) | 2000-09-08 | 2006-09-12 | Carecord Technologies, Inc. | Method and apparatus for subject physical position and security determination |
JP3689720B2 (ja) | 2000-10-16 | 2005-08-31 | 住友大阪セメント株式会社 | 三次元形状測定装置 |
JP2002164066A (ja) * | 2000-11-22 | 2002-06-07 | Mitsubishi Heavy Ind Ltd | 積層型熱交換器 |
JP3477166B2 (ja) | 2000-12-07 | 2003-12-10 | 学校法人慶應義塾 | 監視装置 |
JP3922694B2 (ja) | 2001-06-15 | 2007-05-30 | 住友大阪セメント株式会社 | 監視装置 |
EP1410755A4 (en) | 2001-06-15 | 2009-01-28 | Sumitomo Osaka Cement Co Ltd | CONTROL DEVICE |
JP4703049B2 (ja) * | 2001-07-17 | 2011-06-15 | 住友大阪セメント株式会社 | 監視装置 |
US7110596B2 (en) | 2002-04-25 | 2006-09-19 | Microsoft Corporation | System and method facilitating document image compression utilizing a mask |
JP3979238B2 (ja) | 2002-09-05 | 2007-09-19 | 住友大阪セメント株式会社 | 空間内監視装置 |
JP3764949B2 (ja) | 2003-06-09 | 2006-04-12 | 住友大阪セメント株式会社 | 状態解析装置 |
- 2003
- 2003-06-09 JP JP2003163503A patent/JP3738291B2/ja not_active Expired - Fee Related
- 2004
- 2004-06-03 CA CA002528824A patent/CA2528824A1/en not_active Abandoned
- 2004-06-03 US US10/560,048 patent/US7630537B2/en active Active
- 2004-06-03 WO PCT/JP2004/007717 patent/WO2004109228A1/ja active IP Right Grant
- 2004-06-03 EP EP04745573A patent/EP1645841B8/en not_active Expired - Lifetime
- 2004-06-03 AU AU2004245815A patent/AU2004245815B2/en not_active Ceased
- 2004-06-03 DE DE602004009077T patent/DE602004009077T2/de not_active Expired - Lifetime
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002122416A (ja) * | 2000-10-16 | 2002-04-26 | Sumitomo Osaka Cement Co Ltd | 三次元形状測定装置 |
JP2002131017A (ja) * | 2000-10-27 | 2002-05-09 | Honda Motor Co Ltd | 距離測定装置、及び距離測定方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1645841A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070076090A1 (en) * | 2005-10-04 | 2007-04-05 | Alexander Eugene J | Device for generating three dimensional surface models of moving objects |
US8848035B2 (en) * | 2005-10-04 | 2014-09-30 | Motion Analysis Corporation | Device for generating three dimensional surface models of moving objects |
WO2015104763A1 (ja) * | 2014-01-07 | 2015-07-16 | ソニー株式会社 | 分析システム、分析プログラム及び分析方法 |
JPWO2015104763A1 (ja) * | 2014-01-07 | 2017-03-23 | ソニー株式会社 | 分析システム、分析プログラム及び分析方法 |
US10535137B2 (en) | 2014-01-07 | 2020-01-14 | Sony Corporation | Analysis system and analysis method |
Also Published As
Publication number | Publication date |
---|---|
AU2004245815A1 (en) | 2004-12-16 |
JP2005003367A (ja) | 2005-01-06 |
EP1645841B1 (en) | 2007-09-19 |
JP3738291B2 (ja) | 2006-01-25 |
EP1645841A4 (en) | 2006-09-06 |
US20060239538A1 (en) | 2006-10-26 |
CA2528824A1 (en) | 2004-12-16 |
EP1645841A1 (en) | 2006-04-12 |
DE602004009077D1 (de) | 2007-10-31 |
US7630537B2 (en) | 2009-12-08 |
DE602004009077T2 (de) | 2008-06-19 |
EP1645841B8 (en) | 2007-11-07 |
AU2004245815B2 (en) | 2010-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004109228A1 (ja) | 三次元形状測定装置 | |
JP3764949B2 (ja) | 状態解析装置 | |
JP3782815B2 (ja) | 呼吸解析装置 | |
US7123758B2 (en) | Method and system for monitoring breathing activity of a subject | |
JP4703049B2 (ja) | 監視装置 | |
JP3922694B2 (ja) | 監視装置 | |
JP4422580B2 (ja) | 動き検出装置 | |
JP3939224B2 (ja) | 領域監視装置 | |
JP3979238B2 (ja) | 空間内監視装置 | |
JP2005253608A (ja) | 状態解析装置 | |
JP4372643B2 (ja) | 動き検出装置 | |
JP2004037274A (ja) | 高さ計測装置及び監視装置 | |
JP2004093376A (ja) | 高さ計測装置及び監視装置 | |
JP4365699B2 (ja) | 状態解析装置 | |
JP2005005912A (ja) | 監視装置 | |
JP4230287B2 (ja) | 動き検出装置 | |
JP2004037871A (ja) | パターン光投影装置及び測定装置 | |
JP2009078179A (ja) | 状態推移モニタ装置及び状態推移表示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2528824 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004245815 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004745573 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2004245815 Country of ref document: AU Date of ref document: 20040603 Kind code of ref document: A |
|
WWP | Wipo information: published in national office |
Ref document number: 2004245815 Country of ref document: AU |
|
WWP | Wipo information: published in national office |
Ref document number: 2004745573 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006239538 Country of ref document: US Ref document number: 10560048 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10560048 Country of ref document: US |
|
WWG | Wipo information: grant in national office |
Ref document number: 2004745573 Country of ref document: EP |