CN110898404A - Deep vision anti-collision swimming goggles - Google Patents

Deep vision anti-collision swimming goggles Download PDF

Info

Publication number
CN110898404A
CN110898404A (application CN201911314669.7A)
Authority
CN
China
Prior art keywords
depth
image
collision
water surface
boundary curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911314669.7A
Other languages
Chinese (zh)
Other versions
CN110898404B (en)
Inventor
孙秀芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Union University
Original Assignee
Beijing Union University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Union University filed Critical Beijing Union University
Priority to CN201911314669.7A priority Critical patent/CN110898404B/en
Publication of CN110898404A publication Critical patent/CN110898404A/en
Application granted granted Critical
Publication of CN110898404B publication Critical patent/CN110898404B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B33/00Swimming equipment attachable to the head, e.g. swim caps or goggles
    • A63B33/002Swimming goggles
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B2071/0694Visual indication, e.g. Indicia
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/05Image processing for measuring physical parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/20Distances or displacements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/807Photo cameras

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pulmonology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a pair of depth vision anti-collision swimming goggles, comprising a tilt angle sensor, a hollow pile head, a lithium battery arranged in the pile head, two hollow goggle frames, an anti-collision early warning lamp arranged at the upper edge of the goggle frames, and lenses arranged in the goggle frames. Two visual angle adjusting brackets are symmetrically arranged at the bridge of the nose between the two goggle frames; a depth camera and a depth image processing unit integrated in one module are arranged between the visual angle adjusting brackets; the module is connected with the visual angle adjusting brackets through a rotating shaft, the rotating shaft is a hollow shaft, and the module is connected with the lithium battery and the anti-collision early warning lamp through wires. The depth camera is configured to collect a depth image of a front object, the tilt angle sensor is configured to collect the angle difference between the glasses and the gravity direction, and the depth image processing unit is configured to obtain distance information between the glasses and the front object according to the depth image and the angle difference and to give an alarm through the anti-collision early warning lamp. In this way the swimmer can avoid a frontal collision, which is particularly useful for backstroke and freestyle swimming.

Description

Deep vision anti-collision swimming goggles
Technical Field
The invention relates to the technical field of swimming goggles, in particular to deep vision anti-collision swimming goggles.
Background
Among the various swimming strokes, backstroke swimmers look at the sky or the ceiling of the swimming pool, and freestyle swimmers look to the side. In both cases it is inconvenient to observe what lies ahead, so during swimming exercise the swimmer easily collides with obstacles such as another swimmer or the pool wall in front.
Therefore, there is a need in the art for new swimming goggles that address the above-mentioned problems.
Disclosure of Invention
The invention aims to solve the problem of how to detect the distance between a swimmer and surrounding obstacles and prompt the swimmer so as to avoid collision. To this end, the invention provides a pair of depth vision anti-collision swimming goggles, comprising a tilt angle sensor, a hollow pile head, a lithium battery arranged in the pile head, two hollow goggle frames, an anti-collision early warning lamp arranged at the upper edge of the goggle frames, and lenses arranged in the goggle frames, wherein two visual angle adjusting brackets are symmetrically arranged at the bridge of the nose between the two goggle frames, a depth camera and a depth image processing unit integrated in one module are arranged between the visual angle adjusting brackets, the module is connected with the visual angle adjusting brackets through a rotating shaft, the rotating shaft is a hollow shaft, and the module is connected with the lithium battery and the anti-collision early warning lamp through a lead;
the depth camera is configured to collect a depth image of a front object, the tilt sensor is configured to collect an angle difference between the glasses and the gravity direction, and the depth image processing unit is configured to obtain distance information between the glasses and the front object according to the depth image and the angle difference and send an alarm through the anti-collision early warning lamp.
Preferably, the depth camera includes an infrared light emitter, a CMOS sensor and a processor,
the infrared light emitter is configured to emit an infrared light pulse with a set wavelength in the forward direction;
the CMOS sensor is configured to receive infrared light pulses reflected by a front object;
the processor is configured to acquire a depth image of a front object based on a time difference between the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor at each pixel point.
Preferably, the processor includes a logical OR operation unit, a distance calculation unit, and a depth image acquisition unit,
the logic or operation unit is configured to output a frequency signal after performing logic or operation on the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor on each pixel point;
the distance calculation unit is configured to obtain a distance value between a corresponding measured point and a pixel point of the CMOS sensor based on the frequency signal and the pulse width of the emitted infrared light pulse;
the depth image acquisition unit is configured to acquire a depth image of a front object based on the distance value.
Preferably, the distance calculation unit is further configured to obtain a distance value between the measured point and a pixel point of the CMOS sensor according to the following method:
L = C × (1/f − t1) / 2

wherein L represents the distance value between the measured point and the pixel point of the CMOS sensor, C represents the speed of light, f represents the frequency value of the frequency signal, and t1 represents the pulse width of the emitted infrared light pulse.
Preferably, the depth image processing unit is further configured to perform the following operations:
constructing a curved surface image of a front object under a spherical coordinate system based on the depth image;
acquiring a water surface boundary curve based on the curved surface image;
acquiring the up-down orientation of the glasses based on the angle difference between the glasses and the gravity direction;
and when the glasses face upwards, acquiring the curved surface image above the water surface boundary curve, performing distance difference comparison on the curved surface image above the water surface boundary curve and a preset three-dimensional spherical safety boundary, and sending an alarm through the anti-collision early warning lamp based on a comparison result.
And when the glasses face downwards, acquiring the curved surface image below the water surface boundary curve, performing difference comparison on the distance between the curved surface image below the water surface boundary curve and the three-dimensional spherical safety boundary, and sending an alarm through the anti-collision early warning lamp based on a comparison result.
Preferably, the "constructing a curved surface image of the front object in the spherical coordinate system based on the depth image" specifically includes:
extracting pixel depth information of the depth image, and constructing a curved surface image of a front object under a spherical coordinate system, wherein the origin of the curved surface image is the central point of the depth camera optical lens, and the curved surface image is represented by the following formula:
A = r(θ, φ)

wherein A represents the curved surface image of the front object, r(θ, φ) represents the distance r from a point on the curved surface image to the origin as a function of θ and φ, θ represents the vertical opening angle, and φ represents the horizontal opening angle.
Preferably, the "acquiring a water surface boundary curve based on the curved surface image" specifically includes:
acquiring the water surface boundary curve according to a method shown as the following formula:
W1 = { (θ, φ) | ∂r(θ, φ)/∂θ is singular }

wherein W1 represents the water surface boundary curve, namely the curve function formed by the set of singular points generated when the partial derivative of A with respect to θ is calculated while φ varies.
Preferably, "acquiring the curved surface image above the water surface boundary curve" specifically includes:
acquiring the curved surface image above the water surface boundary curve by the method shown in the following formula
W2 = { r(θ, φ) | θ > θ_surface, θ_surface ∈ W1 }

wherein W2 represents the curved surface image of the front object above the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part above the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve.
Preferably, "acquiring the curved surface image below the water surface boundary curve" specifically includes:
acquiring the curved surface image below the water surface boundary curve according to the method shown in the following formula
W3 = { r(θ, φ) | θ < θ_surface, θ_surface ∈ W1 }

wherein W3 represents the curved surface image of the front object below the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part below the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve.
Preferably, the tilt angle sensor is mounted in the goggle frame or in the pile head.
The invention has the advantages that:
the depth vision anti-collision swimming goggles provided by the invention can accurately detect the distance between a swimmer and a front obstacle and prompt the swimmer so as to avoid collision.
Furthermore, compared with microwave and ultrasonic distance measurement, the adopted depth camera gives the device stronger environmental adaptability, a smaller size and a lower cost, making it better suited to swimming goggles.
Drawings
Fig. 1 is a main structural schematic diagram of deep vision crashworthy swimming goggles in an embodiment of the invention.
Fig. 2 is a schematic diagram of a circuit function module in the deep vision anti-collision swimming goggles according to an embodiment of the invention.
Fig. 3 is a schematic main structural diagram of a depth camera in an embodiment of the present invention.
Fig. 4 is a schematic diagram of a main structure of a processor in the embodiment of the present invention.
FIG. 5 is a schematic coordinate diagram of a three-dimensional spherical safety boundary and an object surface contour according to an embodiment of the present invention.
Detailed Description
Referring to fig. 1 and 2, fig. 1 and 2 exemplarily show the main structure of the depth vision anti-collision swimming goggles. As shown in fig. 1, the goggles comprise a tilt angle sensor (not shown), a hollow pile head 1, a lithium battery 2 installed in the pile head 1, two hollow goggle frames 3, an anti-collision early warning lamp 4 installed at the upper edge of the goggle frames 3, and a lens 8 installed in each goggle frame 3. Two visual angle adjusting brackets 5 are symmetrically arranged at the bridge of the nose between the two goggle frames 3, and a depth camera 6 and a depth image processing unit 7 integrated in one module are arranged between the visual angle adjusting brackets 5. The module is connected with the visual angle adjusting brackets 5 through a rotating shaft; the rotating shaft is a hollow shaft used for routing the leads, and the module is connected with the lithium battery 2 and the anti-collision early warning lamp 4 through these leads. The depth camera 6 is configured to collect a depth image of a front object, the tilt angle sensor is configured to collect the angle difference between the glasses and the gravity direction, and the depth image processing unit 7 is configured to obtain distance information between the glasses and the front object according to the depth image and the angle difference and to give an alarm through the anti-collision early warning lamp 4.
Specifically, there may be two lithium batteries 2, installed respectively in the two pile heads 1; this increases the battery capacity and also makes the overall weight distribution of the glasses more reasonable, without affecting wearing comfort. The two goggle frames 3 are hollow so that leads can be arranged inside them; the leads are therefore not exposed to the water, which prolongs the service life of the glasses. The lithium battery 2 is connected with the depth camera 6, the depth image processing unit 7 and the two anti-collision early warning lamps 4 through leads; the depth camera 6 is signal-connected with the depth image processing unit 7, the tilt angle sensor is signal-connected with the depth image processing unit 7, and the depth image processing unit 7 is signal-connected with the anti-collision early warning lamps 4. There may be two anti-collision early warning lamps 4, symmetrically arranged on the upper edges of the two goggle frames 3. The anti-collision early warning lamp 4 can use low-brightness LED particles, which give a warning without dazzling the swimmer.
The acquisition of the depth image is performed by the depth camera 6. The depth camera 6 emits infrared light pulses of a specific wavelength, which illuminate the front object and are reflected back; the time difference of the infrared light pulses at each pixel position is measured by the CMOS sensor, and the depth image is thereby obtained. Specifically, the depth camera 6 uses the infrared time-of-flight (TOF) measurement principle: infrared light of a specific wavelength (preferably 850 nm) is pulse-modulated and transmitted by the infrared light emitter 61, the transmitted infrared light pulse is reflected after encountering a front obstacle, the infrared light pulse reflected onto each pixel point is captured by the CMOS sensor 62, the transmitted infrared light pulse and the received infrared light pulse are subjected to a logical OR operation by the processor 63, the frequency of the frequency signal output after the logical OR operation is measured, and the distance from the measured point to the corresponding CMOS pixel point of the depth camera is then calculated.
Referring to fig. 3, fig. 3 exemplarily shows a main structure of the depth camera, as shown in fig. 3, the depth camera 6 includes an infrared light emitter 61, a CMOS sensor 62, and a processor 63, the infrared light emitter 61 is configured to emit an infrared light pulse of a set wavelength in the forward direction; the CMOS sensor 62 is configured to receive infrared light pulses reflected back by objects in front; the processor 63 is configured to acquire a depth image of the forward object based on the time difference between the emitted infrared light pulses and the infrared light pulses received by the CMOS sensor at the respective pixel points.
Referring to fig. 4, fig. 4 exemplarily shows a main structure of the processor, and as shown in fig. 4, the processor 63 includes a logical or operation unit 631, a distance calculation unit 632, and a depth image acquisition unit 633. The logical or operation unit 631 is configured to output a frequency signal after performing logical or operation on the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor at each pixel point; the distance calculation unit 632 is configured to obtain a distance value between a corresponding measured point and a pixel point of the CMOS sensor based on the frequency signal and the pulse width of the emitted infrared light pulse; the depth image acquisition unit 633 is configured to acquire a depth image of a front object based on the distance value. Specifically, a depth image of a front object is acquired based on a set of distance values between all measured points in front and pixels of the CMOS sensor.
Further, the distance calculating unit 632 may be further configured to obtain a distance value between the measured point and a pixel point of the CMOS sensor according to the following method:
L = C × (1/f − t1) / 2

wherein L represents the distance value between the measured point and the pixel point of the CMOS sensor, C represents the speed of light, f represents the frequency value of the frequency signal, and t1 represents the pulse width of the emitted infrared light pulse. In this example, C is in m/s and L is in m.
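By way of illustration only (not part of the original disclosure), the sketch below applies this relationship in Python; it assumes the reconstructed form L = C × (1/f − t1) / 2, in which 1/f is taken as the duration of the merged pulse produced by the logical OR, and all function names are illustrative.

```python
# Minimal sketch of the TOF distance calculation described above.
# Assumption: the frequency signal after the logical OR satisfies 1/f = t1 + 2L/C,
# i.e. the merged pulse lasts from the start of the emitted pulse to the end of the
# received pulse, so L = C * (1/f - t1) / 2. Names here are illustrative only.

C = 299_792_458.0  # speed of light in m/s


def distance_from_or_frequency(f_hz: float, t1_s: float) -> float:
    """Distance (m) of a measured point, from the OR-signal frequency f and
    the emitted pulse width t1."""
    time_of_flight = 1.0 / f_hz - t1_s   # round-trip travel time in s
    return C * time_of_flight / 2.0      # one-way distance in m


if __name__ == "__main__":
    t1 = 30e-9                       # 30 ns emitted pulse width (illustrative)
    f = 1.0 / (t1 + 2 * 3.0 / C)     # OR-signal frequency for an object 3 m away
    print(round(distance_from_or_frequency(f, t1), 3))  # -> 3.0
```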
The depth image processing unit 7 receives the depth image from the depth camera 6 through its digital communication interface, extracts the pixel depth information, and constructs a three-dimensional depth curved surface image, so that the shape and distance of the front object are recognized through a three-dimensional scene reconstruction algorithm. The up-down orientation of the glasses is obtained through the tilt angle sensor, the swimming posture of the swimmer is judged from it, and it is thereby determined whether the curved surface image above the water surface or the curved surface image below the water surface is compared with the preset three-dimensional spherical safety boundary; early warning information is generated based on the comparison result, and the anti-collision early warning lamp 4 warns the swimmer.
Preferably, the depth image processing unit 7 is further configured to perform the following operations:
step S1: constructing a curved surface image of a front object under a spherical coordinate system based on the depth image; specifically, pixel depth information of the depth image is extracted, a curved surface image of a front object under a spherical coordinate system is constructed, an origin of the curved surface image is a central point of an optical lens of the depth camera, and the curved surface image is represented by formula (2):
A = r(θ, φ)        (2)

wherein A represents the curved surface image of the front object, r(θ, φ) represents the distance r from a point on the curved surface image to the origin as a function of θ and φ, θ represents the vertical opening angle, and φ represents the horizontal opening angle. θ and φ may each be less than or equal to 80 degrees; preferably both opening angles are 80 degrees, so that r expresses the surface of the front object within 80 degrees vertically and horizontally in front of the swimmer.
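As an illustrative sketch only, the following Python fragment shows one possible way to hold such a curved surface image A = r(θ, φ): an M × N grid of radial distances indexed by the discretized vertical and horizontal opening angles. The grid sizes and the helper function are assumptions made for the example and are not fixed by the text above.

```python
import numpy as np

# Sketch: store the curved surface image A = r(theta, phi) as an M x N grid of
# radial distances, with theta and phi discretized over +/- 80 degrees.
# M, N and the conversion from a raw depth image are illustrative assumptions.

M, N = 240, 320                      # graduations in the theta and phi directions
theta = np.linspace(-80, 80, M)      # vertical opening angle (degrees)
phi = np.linspace(-80, 80, N)        # horizontal opening angle (degrees)


def surface_from_depth(depth_image: np.ndarray) -> np.ndarray:
    """Return r(theta, phi): the distance of each measured point from the
    optical-lens centre, one value per (theta, phi) grid cell."""
    assert depth_image.shape == (M, N)
    # Each TOF pixel already measures a radial distance, so the grid is used directly.
    return depth_image.astype(float)


if __name__ == "__main__":
    fake_depth = np.full((M, N), 5.0)        # flat wall 5 m ahead (illustrative)
    A = surface_from_depth(fake_depth)
    print(A.shape, A[M // 2, N // 2])        # -> (240, 320) 5.0
```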
Step S2: acquiring a water surface boundary curve based on the curved surface image; specifically, a water surface boundary curve is obtained according to a method shown in formula (3):
W1 = { (θ, φ) | ∂r(θ, φ)/∂θ is singular }        (3)

wherein W1 represents the water surface boundary curve, namely the curve function formed by the set of singular points generated when the partial derivative of A with respect to θ is calculated while φ varies. The depth image processing unit calculates the partial derivative of the curved surface image A in the θ direction, records each singular point, and connects these points to form the curve. For example, suppose the depth image has M and N graduations in the θ and φ directions respectively, i.e. M × N depth points in total. When calculating the water surface boundary curve W1, the unit scans along the φ direction with scan step index n from 1 to N; for each fixed n, it scans along the θ direction with step index m from 1 to M and calculates the partial derivative of the depth r with respect to θ as m changes (a standard first-order difference can be used to calculate the partial derivative). The value of m corresponding to the singular point is found and recorded as m_qn. In this way the m values corresponding to the N singular points, m_q1 … m_qn … m_qN, are found, and these points form the water surface boundary curve W1.
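A minimal Python sketch of this scan is given below for illustration; it takes a first-order difference of r along θ in each φ column and records the index of the largest jump as the singular point m_qn. Using the largest absolute difference as the singularity test is an assumption of the example; the text above only requires that the singular point be detected.

```python
import numpy as np

# Sketch of the water-surface boundary scan described above.
# For each phi column n, take the first-order difference of r along theta and
# record the index m_qn where the derivative is singular (here: the largest jump).

def water_boundary_indices(A: np.ndarray) -> np.ndarray:
    """Return m_qn for n = 1..N: the theta index of the singular point in each column."""
    diff = np.abs(np.diff(A, axis=0))        # first-order difference along theta
    return np.argmax(diff, axis=0) + 1        # index of the largest discontinuity per column


if __name__ == "__main__":
    M, N = 240, 320
    A = np.full((M, N), 5.0)
    A[150:, :] = 4.2                           # refraction at the water surface shifts depths
    m_q = water_boundary_indices(A)
    print(m_q.min(), m_q.max())                # -> 150 150: boundary row found in every column
```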
Step S3: acquiring the up-down orientation of the glasses based on the angle difference between the glasses and the gravity direction. Specifically, the tilt angle sensor may be mounted in the goggle frame or in the pile head, and is configured to capture the angle difference between the glasses and the gravity direction. The depth image processing unit 7 reads the tilt angle value, that is, the angle difference between the glasses and the gravity direction, from the tilt angle sensor, and determines the up-down orientation of the glasses from this angle difference. When the swimmer is doing backstroke, the glasses face upwards and the angle difference is around 180 degrees, for example greater than 160 degrees and less than 200 degrees; in this case it can be understood that the swimmer is doing backstroke and needs the distance information of the front object above the water surface. When the swimmer is not doing backstroke, the glasses face downwards and the angle difference is around 0 degrees, for example greater than -60 degrees and less than 60 degrees; in this case it can be understood that the swimmer is not doing backstroke and needs the distance information of the front object below the water surface. The tilt angle sensor may employ a combined gyroscope and accelerometer sensor chip such as the MPU6050.
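For illustration, the orientation decision with the angle ranges stated above (160 to 200 degrees for face-up, -60 to 60 degrees for face-down) can be sketched as follows; the function name and the fallback for intermediate angles are assumptions of the example.

```python
# Sketch of the orientation decision from the tilt-sensor angle difference.
# The ranges follow the text above; the "unknown" fallback for intermediate
# angles is an assumption, since the text does not state what happens there.

def goggles_orientation(angle_deg: float) -> str:
    if 160.0 < angle_deg < 200.0:
        return "up"      # backstroke: compare the surface above the water boundary
    if -60.0 < angle_deg < 60.0:
        return "down"    # non-backstroke: compare the surface below the water boundary
    return "unknown"     # intermediate angles: no decision


if __name__ == "__main__":
    print(goggles_orientation(178.0), goggles_orientation(12.0))  # -> up down
```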
Step S4: when the glasses face upwards, the curved surface image above the water surface boundary curve is obtained, the distance between the curved surface image above the water surface boundary curve and a preset three-dimensional spherical surface safety boundary is subjected to difference comparison, and an alarm is given out through an anti-collision early warning lamp based on a comparison result. Specifically, a curved surface image above a water surface boundary curve is obtained according to the method shown in formula (4)
W2 = { r(θ, φ) | θ > θ_surface, θ_surface ∈ W1 }        (4)

wherein W2 represents the curved surface image of the front object above the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part above the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve. When calculating the curved surface image W2 over the M × N points, the water surface boundary curve W1 is scanned with step index n from 1 to N; for each n, the θ index m of every point is compared with the index m_qn of the singular point on W1: all points with m greater than m_qn belong to the curved surface image W2, and the remaining points belong to the part below the water surface boundary curve W1. In this way the lattice curved surface W2 above the water surface is obtained, and the total number of points on this curved surface is the sum over n of (M − m_qn).
Step S5: and when the glasses face downwards, acquiring a curved surface image below the water surface boundary curve, performing distance difference comparison on the curved surface image below the water surface boundary curve and the three-dimensional spherical safety boundary, and sending an alarm through an anti-collision early warning lamp based on a comparison result. Specifically, the curved surface image below the water surface boundary curve is obtained according to the method shown in formula (5)
W3 = { r(θ, φ) | θ < θ_surface, θ_surface ∈ W1 }        (5)

wherein W3 represents the curved surface image of the front object below the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part below the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve.
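Continuing the illustrative sketch from step S2, the split of the curved surface image into W2 (above the boundary) and W3 (below the boundary) can be expressed as two masked grids over the M × N points, as below; representing the point sets as NaN-masked arrays is a choice made for the example, and the strict inequalities follow formulas (4) and (5).

```python
import numpy as np

# Sketch of steps S4/S5: split the curved surface image A into the part above
# (W2) and below (W3) the water-surface boundary curve, given the per-column
# singular-point indices m_qn from step S2.

def split_by_boundary(A: np.ndarray, m_q: np.ndarray):
    M, _ = A.shape
    rows = np.arange(M)[:, None]          # theta index m for every grid row
    above = rows > m_q[None, :]            # theta > theta_surface  -> W2
    below = rows < m_q[None, :]            # theta < theta_surface  -> W3
    W2 = np.where(above, A, np.nan)        # points above the water surface
    W3 = np.where(below, A, np.nan)        # points below the water surface
    return W2, W3


if __name__ == "__main__":
    A = np.full((240, 320), 5.0)
    m_q = np.full(320, 150)
    W2, W3 = split_by_boundary(A, m_q)
    print(np.count_nonzero(~np.isnan(W2)), np.count_nonzero(~np.isnan(W3)))
```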
In steps S4 and S5, after the curved surface image above or below the water surface boundary curve is acquired, it is compared, by distance difference, with the three-dimensional spherical safety boundary to determine whether the swimmer is within a safe distance from the front object; if not, an alarm is given by the anti-collision early warning lamp to remind the swimmer.
The following description will be made in detail by taking non-backstroke as an example.
Referring to FIG. 5, FIG. 5 illustrates the three-dimensional spherical safety boundary and the coordinate representation of the object surface profile. As shown in FIG. 5, the glasses are at a depth h below the surface of the swimming pool, the CMOS plane of the CMOS sensor 62 lies behind the optical lens of the depth camera 6, and the infrared light emitter 61 of the depth camera 6 continuously emits infrared light pulses in the forward direction. Because the water surface of the swimming pool divides the measured surface and refraction at the water surface affects the pixel depth information of the depth image, and considering the scene of a person swimming, the evaluation of the measured surface is limited to the part below the water surface; that is, the analyzed profile of the front object only considers the part below the water surface. The curve where the water surface intersects the profile surface of the front object therefore needs to be calculated. As shown in FIG. 5, the vertical opening angle of a point on this curve is β. The opening angle can be obtained from the singular points of the object profile surface: a singular point is a point where the profile surface is not smooth in the vertical direction, because refraction of the infrared light pulses passing through the water surface changes the continuity of the measured surface there, so the water surface boundary curve W1 can be obtained by calculating the first-order partial derivative in the θ direction. Further, the profile W3 of the front object below the water surface is obtained. The collision alarm signal for the swimmer is generated by calculating the distance r of each point of the underwater curved surface image W3 of the object and comparing it with the set three-dimensional spherical safety boundary. The alarm level is set according to how far the measured object surface intrudes into the safety sphere. Preferably, the radius of the safety sphere can be set to three levels, where R red, R orange and R yellow correspond to 1 meter, 5 meters and 10 meters respectively. The anti-collision early warning lamp 4 comprises a red LED, an orange LED and a yellow LED: the red LED flickers when the distance between the front object and the swimmer is less than or equal to 1 meter; the orange LED flickers when the distance is less than or equal to 5 meters and greater than 1 meter; and the yellow LED flickers when the distance is less than or equal to 10 meters and greater than 5 meters.
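The graded warning described above can be summarized, for illustration, by the short sketch below, which maps the closest point of the relevant curved surface (W2 or W3) to one of the three LEDs using the 1 m, 5 m and 10 m radii; the function and its return values are illustrative only.

```python
import numpy as np

# Sketch of the three-level alarm: compare the closest point of the relevant
# curved surface (W2 or W3) with the three spherical safety radii.
# R_RED, R_ORANGE, R_YELLOW follow the 1 m / 5 m / 10 m levels given above.

R_RED, R_ORANGE, R_YELLOW = 1.0, 5.0, 10.0


def alarm_level(surface: np.ndarray) -> str:
    """Return which warning LED should flash, or 'none' if nothing is within 10 m."""
    d = np.nanmin(surface)                 # distance to the closest measured point
    if d <= R_RED:
        return "red"
    if d <= R_ORANGE:
        return "orange"
    if d <= R_YELLOW:
        return "yellow"
    return "none"


if __name__ == "__main__":
    print(alarm_level(np.array([[12.0, 7.5, 3.0]])))  # -> "orange"
```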
In use of the depth vision anti-collision swimming goggles provided by the invention, the swimmer adjusts the depth camera 6 to a suitable angle towards the top of the head and starts swimming after switching on the power. The depth image containing distance information collected by the depth camera 6 is sent to the depth image processing unit 7 for calculation, and when a collision danger ahead is detected, the anti-collision early warning lamp 4 is controlled to flash and gives a graded warning according to the distance. The invention can accurately measure the distance between the front object and the swimmer so as to avoid collision, and is particularly suitable for backstroke and freestyle swimming.
The invention adopts the depth camera 6 for distance measurement of the swimming goggles, and compared with a distance measurement method realized by a microwave technology and an ultrasonic technology, the depth camera 6 also has the advantages of strong environmental adaptability, small volume and low cost.
The processor 63 in the invention performs a logical OR operation on the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor 62 at each pixel point, so that the measurement of the flight time is converted into a frequency measurement of a logic signal. This conversion turns two signal paths into one for the subsequent signal processing, which improves the measurement efficiency; the improvement is significant for the measurement of multipoint depth information.
The above description is of the preferred embodiment of the present invention and the technical principles applied thereto, and it will be apparent to those skilled in the art that any changes and modifications based on the equivalent changes and simple substitutions of the technical solution of the present invention are within the protection scope of the present invention without departing from the spirit and scope of the present invention.

Claims (10)

1. Depth vision anti-collision swimming goggles, characterized by comprising a tilt angle sensor, a hollow pile head, a lithium battery arranged in the pile head, two hollow goggle frames, an anti-collision early warning lamp arranged at the upper edge of the goggle frames and lenses arranged in the goggle frames, wherein two visual angle adjusting brackets are symmetrically arranged at the bridge of the nose between the two goggle frames, a depth camera and a depth image processing unit integrated in one module are arranged between the visual angle adjusting brackets, the module is connected with the visual angle adjusting brackets through a rotating shaft, the rotating shaft is a hollow shaft, and the module is connected with the lithium battery and the anti-collision early warning lamp through a lead;
the depth camera is configured to collect a depth image of a front object, the tilt sensor is configured to collect an angle difference between the glasses and the gravity direction, and the depth image processing unit is configured to obtain distance information between the glasses and the front object according to the depth image and the angle difference and send an alarm through the anti-collision early warning lamp.
2. The depth vision anti-collision swimming goggles of claim 1, wherein the depth camera includes an infrared light emitter, a CMOS sensor, and a processor,
the infrared light emitter is configured to emit an infrared light pulse with a set wavelength in the forward direction;
the CMOS sensor is configured to receive infrared light pulses reflected by a front object;
the processor is configured to acquire a depth image of a front object based on a time difference between the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor at each pixel point.
3. The depth vision anti-collision swimming goggles of claim 2, wherein the processor includes a logical OR unit, a distance calculation unit, and a depth image acquisition unit,
the logic or operation unit is configured to output a frequency signal after performing logic or operation on the emitted infrared light pulse and the infrared light pulse received by the CMOS sensor on each pixel point;
the distance calculation unit is configured to obtain a distance value between a corresponding measured point and a pixel point of the CMOS sensor based on the frequency signal and the pulse width of the emitted infrared light pulse;
the depth image acquisition unit is configured to acquire a depth image of a front object based on the distance value.
4. The deep vision anti-collision swimming goggles of claim 3, wherein the distance calculating unit is further configured to obtain the distance value between the measured point and the pixel point of the CMOS sensor according to the following method:
L = C × (1/f − t1) / 2

wherein L represents the distance value between the measured point and the pixel point of the CMOS sensor, C represents the speed of light, f represents the frequency value of the frequency signal, and t1 represents the pulse width of the emitted infrared light pulse.
5. The depth vision anti-collision swimming goggles of claim 1, wherein the depth image processing unit is further configured to:
constructing a curved surface image of a front object under a spherical coordinate system based on the depth image;
acquiring a water surface boundary curve based on the curved surface image;
acquiring the up-down orientation of the glasses based on the angle difference between the glasses and the gravity direction;
and when the glasses face upwards, acquiring the curved surface image above the water surface boundary curve, performing distance difference comparison on the curved surface image above the water surface boundary curve and a preset three-dimensional spherical safety boundary, and sending an alarm through the anti-collision early warning lamp based on a comparison result.
And when the glasses face downwards, acquiring the curved surface image below the water surface boundary curve, performing difference comparison on the distance between the curved surface image below the water surface boundary curve and the three-dimensional spherical safety boundary, and sending an alarm through the anti-collision early warning lamp based on a comparison result.
6. The depth vision anti-collision swimming goggles as claimed in claim 5, wherein "constructing a curved surface image of the front object in a spherical coordinate system based on the depth image" specifically comprises:
extracting pixel depth information of the depth image, and constructing a curved surface image of a front object under a spherical coordinate system, wherein the origin of the curved surface image is the central point of the depth camera optical lens, and the curved surface image is represented by the following formula:
A = r(θ, φ)

wherein A represents the curved surface image of the front object, r(θ, φ) represents the distance r from a point on the curved surface image to the origin as a function of θ and φ, θ represents the vertical opening angle, and φ represents the horizontal opening angle.
7. The deep vision anti-collision swimming goggles according to claim 6, wherein the "acquiring a water surface boundary curve based on the curved surface image" specifically includes:
acquiring the water surface boundary curve according to a method shown as the following formula:
W1 = { (θ, φ) | ∂r(θ, φ)/∂θ is singular }

wherein W1 represents the water surface boundary curve, namely the curve function formed by the set of singular points generated when the partial derivative of A with respect to θ is calculated while φ varies.
8. The deep vision anti-collision swimming goggles according to claim 7, wherein the step of obtaining the curved surface image above the water surface boundary curve specifically comprises:
acquiring the curved surface image above the water surface boundary curve by the method shown in the following formula
W2 = { r(θ, φ) | θ > θ_surface, θ_surface ∈ W1 }

wherein W2 represents the curved surface image of the front object above the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part above the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve.
9. The deep vision anti-collision swimming goggles according to claim 7, wherein the step of obtaining the curved surface image below the water surface boundary curve specifically comprises:
acquiring the curved surface image below the water surface boundary curve according to the method shown in the following formula
W3 = { r(θ, φ) | θ < θ_surface, θ_surface ∈ W1 }

wherein W3 represents the curved surface image of the front object below the water surface boundary curve, r(θ, φ) is the surface function composed of the depth points of the part below the water surface boundary curve, and θ_surface represents the vertical opening angle corresponding to the water surface boundary curve.
10. The depth vision anti-collision swimming goggles of claim 1, wherein the tilt angle sensor is mounted in the goggle frame or in the pile head.
CN201911314669.7A 2019-12-19 2019-12-19 Deep vision anti-collision swimming goggles Active CN110898404B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911314669.7A CN110898404B (en) 2019-12-19 2019-12-19 Deep vision anti-collision swimming goggles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911314669.7A CN110898404B (en) 2019-12-19 2019-12-19 Deep vision anti-collision swimming goggles

Publications (2)

Publication Number Publication Date
CN110898404A true CN110898404A (en) 2020-03-24
CN110898404B CN110898404B (en) 2021-02-26

Family

ID=69826732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911314669.7A Active CN110898404B (en) 2019-12-19 2019-12-19 Deep vision anti-collision swimming goggles

Country Status (1)

Country Link
CN (1) CN110898404B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN85108521A (en) * 1985-11-18 1986-07-02 徐瑞良 Method of measuring area with knot-point-pattern
CN204203569U (en) * 2014-08-01 2015-03-11 王傲立 The smart mobile phone of stereo display
US20170123065A1 (en) * 2014-11-14 2017-05-04 Microsoft Technology Licensing, Llc Eyewear-mountable eye tracking device
CN105561543A (en) * 2016-02-04 2016-05-11 京东方科技集团股份有限公司 Underwater glasses and control method thereof
CN107817614A (en) * 2017-08-31 2018-03-20 杭州视氪科技有限公司 A kind of blind person's auxiliary eyeglasses for being used to hide the water surface and barrier

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GUO NINGBO, CHEN XIANGNING, XUE JUNSHI: "A review of infrared cameras based on the time-of-flight method", Journal of Ordnance Equipment Engineering *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114783147A (en) * 2022-04-19 2022-07-22 珠海市杰理科技股份有限公司 Intelligent monitoring method and device, wearable device and readable storage medium
CN114783147B (en) * 2022-04-19 2023-10-27 珠海市杰理科技股份有限公司 Intelligent monitoring method and device, wearable device and readable storage medium

Also Published As

Publication number Publication date
CN110898404B (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN104964672B (en) A kind of long-distance barrier detecting sensor based on line-structured light
JP7120002B2 (en) flight device
US9370459B2 (en) System and method for alerting visually impaired users of nearby objects
CN106859929B (en) A kind of Multifunctional blind person guiding instrument based on binocular vision
US9801778B2 (en) System and method for alerting visually impaired users of nearby objects
CN106384382A (en) Three-dimensional reconstruction system and method based on binocular stereoscopic vision
BR102017006726A2 (en) AIRCRAFT COLLISION ALERT SYSTEM AND METHOD FOR GENERATING AN ALERT SIGNAL OF A POTENTIAL AIRCRAFT COLLISION
CN109143247B (en) Three-eye underwater detection method for acousto-optic imaging
CN111685633A (en) Tumble detection method
CN106915303B (en) Automobile A-column blind area perspective method based on depth data and fish eye images
ES2955667T3 (en) Pool cleaning robot and a method to obtain images of a pool
JP7064163B2 (en) 3D information acquisition system
KR102265980B1 (en) Device and method for monitoring ship and port
CN108140066A (en) Drawing producing device and drawing production method
JP2011145924A (en) Moving device and method
CN109106563A (en) A kind of automation blind-guide device based on deep learning algorithm
US11493932B2 (en) Pool cleaning robot and a method for imaging a pool
CN106597690A (en) Visually impaired people passage prediction glasses based on RGB-D camera and stereophonic sound
CN110898404B (en) Deep vision anti-collision swimming goggles
US10402996B2 (en) Distance measuring device for human body features and method thereof
WO2022179207A1 (en) Window occlusion detection method and apparatus
CN106817577A (en) One kind is based on RGB D cameras and stereosonic visually impaired people&#39;s barrier early warning glasses
CN116255908A (en) Underwater robot-oriented marine organism positioning measurement device and method
CN112188059B (en) Wearable device, intelligent guiding method and device and guiding system
CN108917701A (en) A kind of barrier based on smart phone detects automatically and method for early warning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant