WO2024099933A1 - System for protecting an operator of a power tool having a camera - Google Patents

System for protecting an operator of a power tool having a camera

Info

Publication number
WO2024099933A1
Authority
WO
WIPO (PCT)
Prior art keywords
power tool
controller
protection assembly
operator
assembly
Application number
PCT/EP2023/080773
Other languages
French (fr)
Inventor
Guoliang Wang
Niklas SARIUS
Sören KAHL
Carles CARDENAL
Oscar ÅGREN
Neziha AKALIN
Original Assignee
Husqvarna Ab
Application filed by Husqvarna Ab
Publication of WO2024099933A1

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 - ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16P - SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
    • F16P3/00 - Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
    • F16P3/12 - Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
    • F16P3/14 - Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine, the means being photocells or other devices sensitive without mechanical contact
    • F16P3/142 - Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine, the means being photocells or other devices sensitive without mechanical contact using image capturing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23D - PLANING; SLOTTING; SHEARING; BROACHING; SAWING; FILING; SCRAPING; LIKE OPERATIONS FOR WORKING METAL BY REMOVING MATERIAL, NOT OTHERWISE PROVIDED FOR
    • B23D59/00 - Accessories specially designed for sawing machines or sawing devices
    • B23D59/001 - Measuring or control devices, e.g. for automatic control of work feed pressure on band saw blade
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B27 - WORKING OR PRESERVING WOOD OR SIMILAR MATERIAL; NAILING OR STAPLING MACHINES IN GENERAL
    • B27B - SAWS FOR WOOD OR SIMILAR MATERIAL; COMPONENTS OR ACCESSORIES THEREFOR
    • B27B17/00 - Chain saws; Equipment therefor
    • B27B17/08 - Drives or gearings; Devices for swivelling or tilting the chain saw
    • B27B17/083 - Devices for arresting movement of the saw chain
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B27 - WORKING OR PRESERVING WOOD OR SIMILAR MATERIAL; NAILING OR STAPLING MACHINES IN GENERAL
    • B27G - ACCESSORY MACHINES OR APPARATUS FOR WORKING WOOD OR SIMILAR MATERIALS; TOOLS FOR WORKING WOOD OR SIMILAR MATERIALS; SAFETY DEVICES FOR WOOD WORKING MACHINES OR TOOLS
    • B27G19/00 - Safety guards or devices specially adapted for wood saws; Auxiliary devices facilitating proper operation of wood saws
    • B27G19/003 - Safety guards or devices specially adapted for wood saws; Auxiliary devices facilitating proper operation of wood saws for chain saws

Definitions

  • Example embodiments generally relate to power equipment and, more particularly, relate to a system configured to protect the user of a chainsaw or other power equipment such as power cutters with blade or chain.
  • Property maintenance tasks are commonly performed using various tools and/or machines that are configured for the performance of corresponding specific tasks. Some of those tools, like chainsaws, are designed to be effective at cutting trees in situations that could be relatively brief or could take a long time, including, in some cases, a full day of work. When operating a chainsaw for a long period of time, fatigue can play a role in safe operation of the device. However, regardless of how long the operator uses the device, it is important that the operator remain vigilant in following safe operating procedures in order to avoid injury to himself/herself and to others.
  • PPE - personal protective equipment
  • the protection assembly may include a camera, which may be disposed at an operator of the power tool and may record image data, and a controller, which may be disposed at the power tool. The controller may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly.
  • the processing circuitry may include a machine learning model that may recognize the operator and the power tool in the image data.
  • the power tool may include a working assembly that may perform a cutting operation, and a protection assembly that may protect an operator of the power tool during the cutting operation.
  • the protection assembly may include a controller which may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly.
  • the processing circuitry may include a machine learning model that may recognize the operator and the power tool in image data recorded by a camera disposed at the operator.
  • FIG. 1 illustrates a concept diagram of a system in which wearable sensors may operate in accordance with an example embodiment
  • FIG. 2 illustrates a power tool with a protection assembly in accordance with an example embodiment
  • FIG. 3 illustrates the power tool with the protection assembly in accordance with an example embodiment
  • FIG. 4 illustrates the power tool with the protection assembly in accordance with an example embodiment
  • FIG. 5 illustrates the power tool with the protection assembly in accordance with an example embodiment
  • FIG. 6 illustrates the power tool with the protection assembly in accordance with an example embodiment
  • FIG. 7 illustrates a block diagram of the protection assembly in accordance with an example embodiment.
  • FIG. 1 illustrates a protection assembly of an example embodiment where the power tool is a chainsaw 100 having a working assembly that may include an endless chain 102 that rotates about a guide bar to perform cutting operations.
  • an operator 110 may wear wearable sensors.
  • the operator 110 may wear a helmet 112, gloves 114, and boots 116 as examples of PPE.
  • the sensors may be integrated into the PPE, or may be attached thereto.
  • the sensors could alternatively be integrated into or attached to other clothing or gear, and at other locations as well.
  • the specific examples of the protection assembly shown in FIG. 1 should be appreciated as being non-limiting in relation to the numbers of sensors, locations of the sensors, and methods of attaching the sensors to the operator 110 and/or the gear of the operator 110.
  • the wearable sensors of the protection assembly may include IMU-based sensors 120.
  • the IMU-based sensors 120 of FIG. 1 may be disposed on the helmet 112, gloves 114 and boots 116 that the operator 110 may be wearing, but could be at other locations as well, as noted above.
  • additional IMU-based sensors 120 could be provided at the knees, elbows, chest or other desirable locations on the operator 110.
  • the IMU-based sensors 120 may operate in cooperation with a tool position sensor 122, which may be disposed at the working assembly of the power tool (e.g., chainsaw 100).
  • the tool position sensor 122 may itself be an IMU-based sensor and/or may include a set of such sensors.
  • the IMU-based sensors 120 and the tool position sensor 122 may each be configured to perform motion tracking in three dimensions in order to enable relative positions between body parts at which the IMU-based sensors 120 are located and the tool to be tracked.
  • the motion tracking may be performed in connection with the application of motion tracking algorithms on linear acceleration and angular velocity data in three dimensions.
  • the wearable sensors of the protection assembly may include one or more object recognition sensors 125.
  • the object recognition sensor 125 may be disposed at the helmet 112, gloves 114 and/or boots 116 that the operator 110 may be wearing, but could be at other locations as well.
  • the object recognition sensor 125 may be disposed at the helmet 112 of the operator 110 and may thus define a field of view 127 similar to the field of view of the operator 110 themselves.
  • the object recognition sensor 125 may be configured to identify a variety of protected objects in the field of view 127, which may include the chainsaw 100, the working assembly of the chainsaw 100, and/or a body part 115 of the operator 110.
  • the object recognition sensor 125 may be a camera configured to record image data of the entire field of view 127.
  • the image data may include photos, videos, and/or any other type of image file format that the camera may record.
  • the object recognition sensor 125 may be an infrared temperature sensor configured to detect infrared thermal radiation in the field of view 127.
  • the protection assembly may include both the camera and the infrared temperature sensor to reduce the potential for error in identifying objects from either of the object recognition sensors 125 on their own.
  • the IMU-based sensors 120 may track movement in three dimensions while the object recognition sensor 125 may identify the protected objects within the field of view 127. In either case, distances or proximity measurements and object identification may be performed so that the chainsaw 100 (or at least the cutting action thereof) may be disabled based on distance or proximity thresholds that can be defined (e.g., for short distances), or based on combinations of relative motion of body parts 115 and the tool at angular velocities or linear velocities above certain thresholds (e.g., stop delay based distances for larger distances).
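As a concrete illustration of these two criteria, the sketch below (a minimal Python example; the constants, names, and the linear stop-delay model are illustrative assumptions, not values from the publication) disables the cutting action either when a fixed proximity threshold is crossed or when the closing speed implies the body part would reach the tool before the chain could stop:

```python
# Hypothetical sketch of the two trigger criteria described above: a fixed
# proximity threshold for short distances, and a speed-dependent "stop delay"
# distance for larger ones. All constants here are illustrative assumptions.

PROXIMITY_THRESHOLD_M = 0.3   # assumed hard minimum clearance around the working assembly
CHAIN_STOP_DELAY_S = 0.1      # assumed time for the brake to fully stop the chain

def should_disable(distance_m: float, closing_speed_mps: float) -> bool:
    """Return True if the cutting action should be disabled."""
    if distance_m <= PROXIMITY_THRESHOLD_M:
        return True  # body part already inside the fixed protective clearance
    # Distance the body part would cover while the chain is still spinning down.
    stop_distance_m = closing_speed_mps * CHAIN_STOP_DELAY_S
    return closing_speed_mps > 0 and distance_m <= PROXIMITY_THRESHOLD_M + stop_distance_m
```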
  • a controller 140 may be disposed at the power tool (e.g., chainsaw 100) and, in this case, may be provided within a housing 150 of the chainsaw 100.
  • the controller 140 may be configured to communicate with the tool position sensor 122 and/or the IMU-based sensors 120 to perform motion tracking, and with the object recognition sensor 125 to perform object identification, as described herein.
  • the controller 140 and tool position sensor 122 are shown to be collocated. However, such collocation is not necessary.
  • the tool position sensor 122 could be located at any desirable location on the chainsaw 100.
  • the controller 140 may have a wired or wireless connection to the tool position sensor 122.
  • communication between the wearable sensors and the controller may be accomplished via wireless communication (e.g., short-range wireless communication techniques including Bluetooth, WiFi, Zigbee, and/or the like).
  • FIG. 1 illustrates a specific view of a protection assembly for the chainsaw 100 according to an example embodiment
  • FIG. 2 illustrates a general view of an example embodiment of a power tool 200 including a protection assembly 210.
  • FIG. 2 illustrates a power tool 200 with a working assembly 220 which may perform a cutting operation, a housing 230 which may be operably coupled to the working assembly 220, and the protection assembly 210.
  • the housing 230 may include a powerhead disposed therein which may power the working assembly 220.
  • the power tool 200 may be a chainsaw 100.
  • the working assembly 220 may include a cutting chain, and a guide bar about which the chain rotates.
  • the power tool 200 may be power cutters.
  • the working assembly 220 may include a cutting blade.
  • the protection assembly 210 may include the object recognition sensor 125 which may be disposed at the operator 110 of the power tool 200 as shown in FIG. 2, and the controller 140 which may be disposed at the housing 230 of the power tool 200.
  • the object recognition sensor 125 may be a camera 240, while in other cases, the object recognition sensor 125 may be an infrared temperature sensor 250.
  • more than one object recognition sensor 125 may be disposed at the operator 110, and in such cases, multiple cameras 240, multiple infrared temperature sensors 250, or a combination of cameras 240 and infrared temperature sensors 250 may be disposed at the operator 110.
  • the camera 240 disposed at the operator 110 may be a standard optical camera 240.
  • the camera 240 may be configured to record image data that may show the entire field of view 127 and therefore any objects disposed within the field of view 127.
  • the image data recorded by the camera 240 may be sent to the controller 140 to be analyzed either wirelessly or by wire.
  • the field of view 127 may further include a protective zone 260.
  • the protective zone 260 may be a boundary that extends a predetermined distance threshold value (D1) around all sides of the working assembly 220.
  • D1 - distance threshold value
  • the protective zone 260 may define a minimum distance from the working assembly 220 that the body part 115 may approach, occupy or exit, to define a trigger event.
  • the trigger event may be related to the body part 115 contacting the protective zone 260.
  • the controller 140 may save and process the image data from the camera 240 to identify protected objects within the field of view 127, as shown by the dashed boxes in FIG. 3.
  • the controller 140 may be able to determine if the body part 115 may be entering or approaching, may be disposed within, or may be exiting the protective zone 260. Thus, responsive to determining that the body part 115 may be entering/approaching, may be disposed within, or may be exiting the protective zone 260, the controller 140 may initiate a protective action with respect to the power tool 200.
  • the operator 110 may wear a marker 270 disposed at each body part 115.
  • the marker 270 at each body part 115 may assist the controller 140 in identifying the body part 115 within the field of view 127 of the camera 240 and also in distinguishing which body part 115 out of the plurality of body parts 115 is within the field of view 127.
  • each marker 270 may be unique to the body part 115 at which it is disposed.
  • the controller 140 may identify the marker 270 not only to detect the body part 115, but also to distinguish between different body parts 115.
  • the markers 270 may be disposed at the PPE which may be worn by the operator 110. As such, the markers 270 may be disposed at the gloves 114 and boots 116 that the operator 110 may be wearing, but could be at other locations as well. For example, additional markers 270 could be provided at the knees, elbows, chest or other desirable locations on the operator 110 to enhance the ability of the camera 240 to visually capture, and the controller 140 to identify, the body part 115.
  • the working assembly 220 of the power tool 200 may also include a marker 270 disposed thereon to reduce the possibility of the controller 140 failing to identify the working assembly 220.
  • the work environment in which the power tool 200 is operated may be a low-light setting.
  • the camera 240 may be equipped with a low-light setting (i.e. night mode) and the markers 270 may be reflectors. In this regard, the camera 240 may continue to visually capture objects with markers 270 within the field of view 127.
  • the controller 140 may thus be able to identify the protected objects in the image data supplied by the camera 240 and may be able to track their positions accordingly, as shown in FIG. 3.
  • the camera 240 may communicate with the controller 140 to provide the image data either on a continuous, periodic or event-driven basis.
  • continuous image data may be provided to, and evaluated by, the controller 140 at routine and frequent intervals.
  • the image data may only be provided when the body part 115 may be identified or may interact with the protective zone 260.
  • the controller 140 may be configured to analyze the image data relative to initiation of warnings or other protective actions that the controller 140 may control.
  • the controller 140 may include processing circuitry 280.
  • the processing circuitry 280 within the controller 140 may be configured to receive and analyze the image data generated by the camera 240 to identify protected objects within the field of view 127. Accordingly, the processing circuitry 280 may identify the protected objects, such as the chainsaw 100 and/or the body part 115 of the operator 110, from the photo or video input data via a machine learning model.
  • a machine learning model that is already pre-trained in one task may be obtained from any number of online sources including various research institutions.
  • a pre-trained machine learning model for image classification may be retrained, through a form of transfer learning, to identify the protected objects such as the chainsaw 100 and/or the body part 115 and markers 270.
  • Transfer learning may save time and resources when training the machine learning model to identify protected objects by starting off with the pre-trained model for image classification.
  • a pre-trained model may be re-trained by supplying a much smaller dataset of image data of the protected objects in various settings, backgrounds, orientations and lighting conditions.
  • the model may be trained using a dataset including images of the chainsaw taken from different angles that may include different backgrounds and lighting conditions and other non-protected objects in order to thoroughly train the machine learning model to identify the protected objects in a plurality of environments and with a plurality of non-protected objects also populating the field of view 127.
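A rough sketch of such a transfer-learning step is shown below. The publication does not name a framework, backbone, or class list, so PyTorch/torchvision, the ResNet-18 backbone, and the class labels here are all assumptions for illustration:

```python
# Minimal transfer-learning sketch, assuming PyTorch/torchvision (not named in
# the publication). A pre-trained image classifier is re-trained so its final
# layer distinguishes the protected objects; the class names are illustrative.
import torch
import torch.nn as nn
from torchvision import models

CLASSES = ["working_assembly", "body_part_marker", "background"]  # assumed labels

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False          # keep the pre-trained features frozen
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # new trainable head

# Only the new head is optimized, which is why a much smaller dataset of
# protected-object images (varied backgrounds, orientations, lighting) can do.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```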
  • the machine learning model may not necessarily be retrained by supplying a dataset of image data of the protected objects to the model prior to being implemented at the controller 140.
  • the machine learning model may be re-trained at the power tool 200 simply from operating the power tool 200 with the protection assembly 210 in various different environments, lighting conditions, orientations, and backgrounds.
  • the machine learning model may improve with time and exposure to more diverse image data from the camera 240.
  • the model may still be an image classification model that is configured to classify image data based on objects visible in the image data.
  • Having the machine learning model be trained via operation of the power tool 200 with the protection assembly 210 may further save time and resources by not requiring the model to be trained with supplied image data via another computer in a lab setting prior to being implemented at the power tool 200.
  • the machine learning model may thus learn to identify protected objects in the image data from the camera 240 over time with more time spent operating the power tool 200, thereby improving the accuracy, consistency and precision of the protection assembly 210 as a whole.
  • the controller 140 may be configured to initiate a protective action with respect to a working assembly 220 of the power tool 200 responsive to a trigger event occurring.
  • the trigger event may include the operator 110, or a body part 115 of the operator 110, intruding on the protective zone 260 defined by the predetermined distance threshold value (D1) that may extend from all sides of the working assembly 220.
  • the protective action may include stopping any cutting operation (e.g., via activating a chain brake 170) of the chainsaw 100 in FIG. 1 responsive to the controller 140 determining that the trigger event has occurred.
  • the protective action may include providing a warning (e.g., audibly, visually, or via haptic feedback).
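A minimal sketch of this dispatch logic follows; the chain-brake and warning interfaces are hypothetical stand-ins for whatever actuators the controller 140 actually exposes:

```python
# Hedged sketch of the protective-action dispatch described above. The
# `chain_brake` and `user_interface` objects are assumed abstractions, not
# APIs from the publication.
def on_trigger_event(chain_brake, user_interface) -> None:
    """Stop the cutting operation, then warn the operator."""
    chain_brake.engage()  # e.g., activate the chain brake 170 to halt the chain
    user_interface.warn(audible=True, visual=True, haptic=True)
```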
  • FIG. 4 illustrates the protection assembly 210 according to an example embodiment wherein the protection assembly 210 further comprises a radar 290 disposed at the power tool 200.
  • the radar 290 may measure a distance to each protected object (e.g. body part 115) identified by the camera 240 from the power tool 200 and may communicate that distance to the controller 140.
  • the image data recorded by the camera 240 may be two dimensional.
  • the controller 140 may thus be configured to fuse data from the camera 240 and the radar 290 to augment the machine learning model with a three dimensional sense of depth and distance of the objects the machine learning model may identify.
  • the radar 290 on its own may not be able to distinguish between a tree branch and the body part 115. The radar 290 would simply record the distance to the unknown object proximate to the power tool 200.
  • the controller 140 may then initiate the protective action responsive to identifying the body part 115 and determining that the distance to the body part 115 from the power tool 200 is less than or equal to the predetermined distance threshold value (D1).
  • D1 - distance threshold value
  • FIG. 5 illustrates the protection assembly 210 according to an example embodiment where the protection assembly 210 may include further cameras 240 disposed at both the power tool 200 and the operator 110.
  • multiple cameras 240 may be disposed at different locations.
  • the cameras 240 may have significant overlap in their respective fields of view 127, so that the image data may essentially contain perspectives of substantially the same field of view 127, but from different angles.
  • a stereoscopic effect may be achieved by fusing image data from each of the cameras 240 at the controller 140.
  • This stereoscopic effect may provide the controller 140 with more thorough image data, enabling the controller 140 to make better determinations of the spatial positioning of the protected objects relative to the working assembly 220 and the protective zone 260.
  • a plurality of cameras 240 may be used to achieve this effect.
  • the plurality of cameras 240 may be disposed at various locations at the operator 110.
  • the plurality of cameras 240 may be disposed at various locations at the power tool 200.
  • the plurality of cameras 240 may be disposed both at the operator 110 and at the power tool 200.
  • the plurality of cameras 240 may be directed towards substantially the same field of view 127 so as to convey a stereoscopic perspective of objects within the field of view 127 in the image data to the controller 140.
  • the controller 140 may be configured to generate a 3D model of the field of view 127.
  • the controller 140 may utilize the stereoscopic effect of the image data to generate a 3D model of the field of view 127 to more accurately perceive and track distances and positions of the protected objects.
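The classic stereo-depth relation gives a feel for how such a stereoscopic effect yields distance: with two cameras of known baseline and focal length, depth follows from the disparity between the two views. The controller's actual 3D-model generation is not specified in this detail, so this is only a sketch:

```python
# Depth from stereo disparity: Z = f * B / d, where f is the focal length in
# pixels, B the camera baseline in meters, and d the disparity in pixels.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a point matched in both camera images."""
    if disparity_px <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity_px
```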
  • the controller 140 may also augment the 3D models by generating non-visible portions of protected objects. For example, if in the image data a portion of a body part 115 may be visible but another portion of the same body part 115 may be blocked from view by another object, the controller 140 may recognize the body part 115 via the machine learning model and generate the rest of the body part 115 that is blocked in the 3D model.
  • the protected object may be partially or entirely blocked from view of either camera 240 or from both cameras 240.
  • the controller 140 may be configured to generate the occluded protected object in the 3D model based on the determined locations of the other protected objects and knowledge of where the occluded protected object should be in relation to them.
  • the protection assembly 210 may still be able to initiate the protective action responsive to a protected object intruding on the protective zone 260 despite being blocked from view of the one or more cameras 240.
  • FIG. 6 illustrates the protection assembly 210 according to an example embodiment where the protection assembly 210 may include the inertial measurement unit (IMU)-based sensors 120 and the tool position sensor 122 disposed at the power tool 200 and at each body part 115 of the plurality of body parts 115, similar to FIG. 1.
  • IMU - inertial measurement unit
  • the IMU-based sensors 120 may measure the movement of, and the relative positions between, each IMU-based sensor 120 and the tool position sensor 122.
  • the IMU-based sensors 120 may be calibrated based on predefined poses of the operator 110 and corresponding positions of the power tool 200. In this regard, for example, certain positions may have known sensor data associated therewith. Accordingly, the chainsaw 100 may be detected as being held in one or more of such positions during a calibration procedure in order to reset parts of the sensor data to a known state. Given that there may be multiple positions, various different parts of the sensor data may be reset until a full reset is achieved by going through a full sequence of calibration positions. Accordingly, the user manual or a maintenance manual for the chainsaw 100 may list the calibration positions.
  • a calibration mode may be entered, and the corresponding positions may be sequentially cycled through.
  • the calibrated positions may relate to both the chainsaw 100 and the operator 110 in some cases.
  • the calibration may be performed by the operator 110, who may be a maintenance technician or the owner in various cases.
  • the controller 140 may store baseline data corresponding to known distances from each of the IMU-based sensors 120 to the tool position sensor 122 in each of the predefined poses.
  • the controller 140 may be configured to perform a comparison of the baseline data gathered in the predefined poses to current data gathered in real time. Responsive to determining the distance from any of the IMU-based sensors 120 to the tool position sensor 122 is less than or equal to the predetermined distance threshold value, the controller 140 may initiate the protective action.
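The sketch below illustrates this baseline-then-compare scheme; the data layout, the offset-correction step, and the sensor names are assumptions, since the publication only states that baseline and live distances are compared against the threshold:

```python
# Sketch of pose-based calibration and runtime checking. Sensor keys and the
# simple per-sensor offset correction are illustrative assumptions.
BASELINE_M = {"left_glove": 0.45, "right_glove": 0.40, "helmet": 0.80}  # known pose distances

def calibrate_offsets(measured_in_pose: dict) -> dict:
    """Offset = what each sensor reports in a known pose minus the known distance."""
    return {k: measured_in_pose[k] - BASELINE_M[k] for k in BASELINE_M}

def check_trigger(live_distances: dict, offsets: dict, d1_threshold_m: float) -> bool:
    """Apply calibration offsets, then test each IMU-to-tool distance against D1."""
    return any(live_distances[k] - offsets[k] <= d1_threshold_m for k in offsets)
```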
  • FIG. 6 may also illustrate the protection assembly 210 of some embodiments where the protection assembly may also include infrared temperature sensors 250.
  • the infrared temperature sensors 250 may be implemented in addition to the camera 240 and the IMU-based sensors 120 to provide an additional form of object identification.
  • the infrared temperature sensors 250 may be configured to detect infrared thermal radiation from the body part 115 within the field of view.
  • all objects at temperatures above zero kelvin (i.e., absolute zero) emit thermal radiation. At relatively low temperatures (e.g., room temperature), most objects emit thermal radiation in the infrared spectrum, such that the radiation is not visible to the human eye.
  • An object may continue to emit thermal radiation in the infrared spectrum as it heats up until eventually, when the object reaches a temperature of approximately 773 kelvin, the thermal radiation enters the range of visible light. As such, the human body is not hot enough to emit thermal radiation in the visible light spectrum. Thus, humans and animals fall within a temperature range that corresponds to thermal radiation in the infrared spectrum.
  • the infrared temperature sensors 250 may continuously scan the field of view 127 and, in some cases, generate maps or other data regarding objects in the field of view 127 based on the results of the scanning.
  • the infrared temperature sensors 250 may communicate the maps or other data to the controller 140 that may use the maps or other data, along with the image data from the camera 240 and the distance data from the IMU-based sensors 120, to determine when the trigger event occurs.
  • the infrared temperature sensors 250 may therefore be able to detect the presence of the body part 115 (or other object emitting thermal radiation in the range of emissions that are of interest) within the field of view 127.
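One simple way to pick body parts out of such a scan is to threshold the thermal map to the temperature band a human would occupy; the array format and the 305-315 K band below are illustrative assumptions, not values from the publication:

```python
# Sketch of detecting body-temperature pixels in an infrared map, assuming the
# sensor delivers a 2D array of temperatures in kelvin.
import numpy as np

def body_mask(thermal_map_k: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels whose temperature is plausibly a body part."""
    return (thermal_map_k >= 305.0) & (thermal_map_k <= 315.0)  # assumed skin range
```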
  • the protection assembly 210 may distinguish between the body part 115 in the field of view 127, and other objects within the field of view 127.
  • the controller 140 may be configured to respond differently to detecting the body part 115 in the field of view 127 as compared to detecting a different object in the field of view 127.
  • the body part 115 is just one example of a detectable object entering the field of view 127.
  • Other objects detectable by the infrared temperature sensors 250 may include animals or parts of animals emitting infrared thermal radiation.
  • the field of view 127 may be much larger than the protective zone 260.
  • the protection assembly 210 may identify and track protected objects within the field of view 127, even if the protected objects may not be located proximate to the protective zone 260.
  • the controller 140 may dynamically alter the size of the protective zone 260 based on whether or not the protected objects in the field of view 127 have a velocity and an acceleration relative to the working assembly 220.
  • the maps, photos, videos, distance data or other data generated by the object recognition sensors 125 (e.g., the camera 240 and/or the infrared temperature sensors 250) may be used by the controller 140 to determine if the body part 115 is approaching the working assembly 220, and, if so, its velocity and acceleration. If the controller 140 determines that the body part 115 may be approaching the working assembly 220 at a velocity that exceeds a predetermined threshold velocity, then the controller 140 may enlarge the protective zone 260 to create a larger buffer between the body part 115 and the working assembly 220 to allow more time for the protective action to take place.
  • conversely, the controller 140 may reduce the size of the protective zone 260 and create a smaller buffer between the body part 115 and the working assembly 220 to allow for more precise and controlled operation of the power tool 200 in certain settings.
  • the controller 140 may have similar responses for detecting the body part 115 accelerating towards the protective zone 260, and decelerating as it approaches the protective zone 260, respectively.
  • the predetermined threshold velocity may be a function of the current size of the protective zone 260, the distance of the body part 115 to the protective zone 260 and the velocity and acceleration of the body part 115.
  • the predetermined threshold velocity may be a function that may reflect the variety of conditions that must be met rather than a static numerical value.
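A hedged sketch of such a dynamic zone is shown below: the radius grows when a body part closes in quickly and relaxes toward the base D1 when it moves slowly or retreats. The linear scaling and its constants are illustrative assumptions:

```python
# Dynamic protective zone sketch. The gain constants are illustrative; the
# publication only states that the zone size may depend on relative velocity
# and acceleration.
def protective_zone_radius(d1_m: float, closing_speed_mps: float,
                           closing_accel_mps2: float) -> float:
    """Enlarged zone radius for fast approaches; never below the base D1."""
    growth_m = max(0.0, 0.5 * closing_speed_mps + 0.2 * closing_accel_mps2)
    return d1_m + growth_m
```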
  • FIG. 7 shows a block diagram of the controller 140 in accordance with an example embodiment.
  • the controller 140 may include the processing circuitry 280 of an example embodiment as described herein.
  • the processing circuitry 280 may be configured to provide electronic control inputs to one or more functional units of the power tool 200 (e.g., the chain brake 170) or the system (e.g., issuing a warning to the hearing protection 180) and to process data received at or generated by the one or more of the IMU-based sensors 120 or the object recognition sensors 125 regarding various indications of movement or distance between the power tool 200 and the operator 110 or the body part 115.
  • the processing circuitry 280 may be configured to perform data processing, control function execution and/or other processing and management services according to an example embodiment.
  • the processing circuitry 280 may be embodied as a chip or chip set.
  • the processing circuitry 280 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the processing circuitry 280 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip.”
  • a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processing circuitry 280 may include one or more instances of a processor 282 and memory 284 that may be in communication with or otherwise control other components or modules that interface with the processing circuitry 280.
  • the processing circuitry 280 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein.
  • the processing circuitry 280 may be embodied as a portion of an onboard computer housed in the housing 230 of the power tool 200 to control operation of the system relative to interaction with other motion tracking and/or distance measurement devices.
  • the controller 140 may employ or be in communication with a user interface 300.
  • the user interface 300 may be in communication with the processing circuitry 280 to receive an indication of a user input at the user interface 300 and/or to provide an audible, visual, tactile or other output to the operator 110.
  • the user interface 300 may include, for example, a display, one or more switches, lights, buttons or keys, speaker, and/or other input/output mechanisms.
  • the user interface 300 may include the hearing protection 180 of FIG. 1, or one or a plurality of colored lights to indicate status or other relatively basic information. However, more complex interface mechanisms could be provided in some cases.
  • the controller 140 may employ or utilize components or circuitry that acts as a device interface 310.
  • the device interface 310 may include one or more interface mechanisms for enabling communication with other devices (e.g., the object recognition sensors 125, the chain brake 170, the hearing protection 180, the IMU-based sensors 120, etc.).
  • the device interface 310 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to components in communication with the processing circuitry 280 via internal communication systems of the power tool 200 and/or via wireless communication equipment (e.g., a one way or two way radio).
  • the device interface 310 may include an antenna and radio equipment for conducting Bluetooth, WiFi, or other short range communication, or include wired communication links for employing the communications necessary to support the functions described herein.
  • the processor 282 may be embodied in a number of different ways.
  • the processor 282 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like.
  • the processor 282 may be configured to execute instructions stored in the memory 284 or otherwise accessible to the processor 282.
  • the processor 282 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 280) capable of performing operations according to embodiments of the present invention while configured accordingly.
  • when the processor 282 is embodied as an ASIC, FPGA or the like, the processor 282 may be specifically configured hardware for conducting the operations described herein.
  • when the processor 282 is embodied as an executor of software instructions, the instructions may specifically configure the processor 282 to perform the operations described herein.
  • the memory 284 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable.
  • the memory 284 may be configured to store information, data, applications, instructions, the machine learning model or the like for enabling the processing circuitry 280 to carry out various functions in accordance with exemplary embodiments of the present invention.
  • the memory 284 could be configured to buffer input data for processing by the processor 282.
  • the memory 284 could be configured to store instructions for execution by the processor 282.
  • the memory 284 may include one or more databases that may store a variety of data sets.
  • applications may be stored for execution by the processor 282 in order to carry out the functionality associated with each respective application.
  • the applications may include instructions for motion tracking and/or object recognition and distance measuring and distance tracking as described herein.
  • the memory 284 may save data in a circular buffer. In other words, a set amount of storage space may be available in the memory 284 and in order to save new data from the object recognition sensors 125 or the IMU-based sensors 120, the memory 284 may overwrite the oldest saved data with the newest saved data.
  • the processor 282 may then access the memory 284 under the direction of an application from the memory 284 to compare sets of saved data from the plurality of sensors.
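In Python terms, a circular buffer with this overwrite-oldest behavior is exactly what `collections.deque` with a `maxlen` provides; the capacity below is an illustrative choice, not a value from the publication:

```python
# Circular buffer sketch: fixed capacity, oldest samples evicted first.
from collections import deque

sensor_buffer = deque(maxlen=256)  # capacity is an illustrative assumption

def record(sample) -> None:
    """Append a sensor sample, silently overwriting the oldest entry when full."""
    sensor_buffer.append(sample)
```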
  • the protection assembly may include a camera, which may be disposed at an operator of the power tool and may record image data, and a controller, which may be disposed at the power tool. The controller may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly.
  • the processing circuitry may include a machine learning model that may recognize the operator and the power tool in the image data.
  • the protection assembly of some embodiments may include additional, optional features, and/or the features described above may be modified or augmented. Some examples of modifications, optional features and augmentations are described below. It should be appreciated that the modifications, optional features and augmentations listed below may each be added alone, or they may be added cumulatively in any desirable combination.
  • the protection assembly may identify and track a plurality of body parts of the operator.
  • a unique marker may be disposed at each body part of the plurality of body parts of the operator.
  • the machine learning model may be trained to identify and track each of the unique markers.
  • the machine learning model may be a pre-trained image classification model that may be re-trained through a transfer learning process.
  • the machine learning model may be trained on a dataset that may include images of the unique markers and images of the power tool in a variety of settings and orientations.
  • the protection assembly may further include a radar disposed at the power tool.
  • the radar may measure a distance to each body part identified by the camera from the power tool and may communicate the distance to the controller.
  • the controller may initiate the protective action responsive to determining the distance to any body part from the power tool may be less than or equal to the predetermined distance threshold value.
  • the protection assembly may further include inertial measurement unit (IMU)-based sensors that may be disposed at the power tool and at each body part of the plurality of body parts, and a tool position sensor that may be disposed at the working assembly.
  • IMU - inertial measurement unit
  • the IMU-based sensors may measure movement and relative positions between each IMU- based sensor and the tool position sensor.
  • the IMU-based sensors may be calibrated based on predefined poses of the operator and corresponding positions of the power tool.
  • the controller may store baseline data corresponding to known distances from each of the IMU-based sensors to the tool position sensor in each of the predefined poses.
  • the controller may be configured to perform a comparison of the baseline data gathered in the predefined poses to current data.
  • the controller may initiate the protective action responsive to determining the distance from any of the IMU-based sensors to the tool position sensor may be less than or equal to the predetermined distance threshold value.
  • the protection assembly may further include infrared temperature sensors, which may be disposed at the operator and may assist the camera in identifying the body parts by detecting infrared thermal radiation from the body parts.
  • the power tool may be a chainsaw.
  • the working assembly may include a chain and a guide bar.
  • the protective action may include activating a chain brake of the chainsaw responsive to the controller determining that the trigger event may have occurred.
  • the power tool may be a chainsaw or power cutters.
  • the working assembly may include a chain and guide bar or a cutting blade.
  • the protective action may include providing an audible or visual warning to the operator responsive to the controller determining that the trigger event may have occurred.
  • the power tool may include a working assembly that may perform a cutting operation, and a protection assembly that may protect an operator of the power tool during the cutting operation.
  • the protection assembly may include a controller which may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly.
  • the processing circuitry may include a machine learning model that may recognize the operator and the power tool in image data recorded by a camera disposed at the operator.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Wood Science & Technology (AREA)
  • Forests & Forestry (AREA)
  • Numerical Control (AREA)

Abstract

A power tool (200) may include a working assembly (220) that may perform a cutting operation, and a protection assembly (210) that may protect an operator (110) of the power tool (200) during the cutting operation. The protection assembly (210) may include a controller (140) which may include processing circuitry (280) which may be configured to initiate a protective action with respect to a working assembly (220) of the power tool (200) responsive to a trigger event occurring. The trigger event may include the operator (110) intruding on a protective zone (260) defined by a predetermined distance threshold value (D1) that may extend from all sides of the working assembly (220). The processing circuitry (280) may include a machine learning model that may recognize the operator (110) and the power tool (200) in image data recorded by a camera (240) disposed at the operator (110).

Description

SYSTEM FOR PROTECTING AN OPERATOR OF A POWER TOOL HAVING A CAMERA
TECHNICAL FIELD
[0001] Example embodiments generally relate to power equipment and, more particularly, relate to a system configured to protect the user of a chainsaw or other power equipment such as power cutters with blade or chain.
BACKGROUND
[0002] Property maintenance tasks are commonly performed using various tools and/or machines that are configured for the performance of corresponding specific tasks. Some of those tools, like chainsaws, are designed to be effective at cutting trees in situations that could be relatively brief or could take a long time, including, in some cases, a full day of work. When operating a chainsaw for a long period of time, fatigue can play a role in safe operation of the device. However, regardless of how long the operator uses the device, it is important that the operator remain vigilant in following safe operating procedures in order to avoid injury to himself/herself and to others.
[0003] To help improve safety, operators are encouraged to wear protective clothing and other personal protective equipment (PPE). Although wearing of PPE is always recommended while operating power equipment, some operators may nevertheless not do so. Accordingly, it may be desirable to define additional “intelligent” protection solutions that do not rely on the present type of PPE in order to protect users of chainsaws and other outdoor power equipment.
BRIEF SUMMARY OF SOME EXAMPLES
[0004] Some example embodiments may provide for a protection assembly for a power tool. The protection assembly may include a camera, which may be disposed at an operator of the power tool and may record image data, and a controller, which may be disposed at the power tool. The controller may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly. The processing circuitry may include a machine learning model that may recognize the operator and the power tool in the image data.
[0005] Some example embodiments may provide for a power tool. The power tool may include a working assembly that may perform a cutting operation, and a protection assembly that may protect an operator of the power tool during the cutting operation. The protection assembly may include a controller which may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring. The trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly. The processing circuitry may include a machine learning model that may recognize the operator and the power tool in image data recorded by a camera disposed at the operator.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
[0006] Having thus described some example embodiments in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0007] FIG. 1 illustrates a concept diagram of a system in which wearable sensors may operate in accordance with an example embodiment;
[0008] FIG. 2 illustrates a power tool with a protection assembly in accordance with an example embodiment;
[0009] FIG. 3 illustrates the power tool with the protection assembly in accordance with an example embodiment;
[0010] FIG. 4 illustrates the power tool with the protection assembly in accordance with an example embodiment;
[0011] FIG. 5 illustrates the power tool with the protection assembly in accordance with an example embodiment;
[0012] FIG. 6 illustrates the power tool with the protection assembly in accordance with an example embodiment; and
[0013] FIG. 7 illustrates a block diagram of the protection assembly in accordance with an example embodiment.
DETAILED DESCRIPTION
[0014] Some example embodiments now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all example embodiments are shown. Indeed, the examples described and pictured herein should not be construed as being limiting as to the scope, applicability or configuration of the present disclosure. Rather, these example embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. Furthermore, as used herein, the term “or” is to be interpreted as a logical operator that results in true whenever one or more of its operands are true. As used herein, operable coupling should be understood to relate to direct or indirect connection that, in either case, enables functional interconnection or interaction of components that are operably coupled to each other.
[0015] FIG. 1 illustrates a protection assembly of an example embodiment where the power tool is a chainsaw 100 having a working assembly that may include an endless chain 102 that rotates about a guide bar to perform cutting operations. As shown in FIG. 1, an operator 110 may wear wearable sensors. In this regard, the operator 110 may wear a helmet 112, gloves 114, and boots 116 as examples of PPE. The sensors may be integrated into the PPE, or may be attached thereto. Of course, the sensors could alternatively be integrated into or attached to other clothing or gear, and at other locations as well. Thus, the specific examples of the protection assembly shown in FIG. 1 should be appreciated as being non-limiting in relation to the numbers of sensors, locations of the sensors, and methods of attaching the sensors to the operator 110 and/or the gear of the operator 110.
[0016] In this example, the wearable sensors of the protection assembly may include IMU-based sensors 120. The IMU-based sensors 120 of FIG. 1 may be disposed on the helmet 112, gloves 114 and boots 116 that the operator 110 may be wearing, but could be at other locations as well, as noted above. Thus, for example, additional IMU-based sensors 120 could be provided at the knees, elbows, chest or other desirable locations on the operator 110. The IMU-based sensors 120 may operate in cooperation with a tool position sensor 122, which may be disposed at the working assembly of the power tool (e.g., chainsaw 100). Of note, the tool position sensor 122 may itself be an IMU-based sensor and/or may include a set of such sensors. The IMU-based sensors 120 and the tool position sensor 122 may each be configured to perform motion tracking in three dimensions in order to enable relative positions between body parts at which the IMU-based sensors 120 are located and the tool to be tracked. The motion tracking may be performed in connection with the application of motion tracking algorithms on linear acceleration and angular velocity data in three dimensions.
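To make the data flow concrete, the sketch below dead-reckons orientation from angular velocity and position from world-frame linear acceleration. The naive Euler integration, the gravity-compensated acceleration input, and all names are assumptions for illustration; the publication does not specify the motion tracking algorithm, and real trackers add drift correction and sensor fusion.

```python
# Minimal dead-reckoning sketch of 3D motion tracking from IMU samples,
# assuming gravity-compensated, world-frame linear acceleration.
import numpy as np

def integrate_imu(pos, vel, orientation_rpy, accel_world, angular_vel, dt):
    """One update step: integrate angular velocity and linear acceleration."""
    orientation_rpy = orientation_rpy + angular_vel * dt  # naive Euler update
    vel = vel + accel_world * dt
    pos = pos + vel * dt
    return pos, vel, orientation_rpy

# Tracking both a body-part sensor and the tool position sensor 122 this way
# lets the controller compute their separation:
#   distance = np.linalg.norm(pos_body - pos_tool)
```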
[0017] In some cases, the wearable sensors of the protection assembly may include one or more object recognition sensors 125. The object recognition sensor 125 may be disposed at the helmet 112, gloves 114 and/or boots 116 that the operator 110 may be wearing, but could be at other locations as well. In a preferred embodiment, the object recognition sensor 125 may be disposed at the helmet 112 of the operator 110 and may thus define a field of view 127 similar to the field of view of the operator 110 themselves. The object recognition sensor 125 may be configured to identify a variety of protected objects in the field of view 127, which may include the chainsaw 100, the working assembly of the chainsaw 100, and/or a body part 115 of the operator 110. In some cases, the object recognition sensor 125 may be a camera configured to record image data of the entire field of view 127. In an example embodiment, the image data may include photos, videos, and/or any other type of image file format that the camera may record. In other cases, the object recognition sensor 125 may be an infrared temperature sensor configured to detect infrared thermal radiation in the field of view 127. In some cases, the protection assembly may include both the camera and the infrared temperature sensor to reduce the potential for error in identifying objects from either of the object recognition sensors 125 on their own.
[0018] As can be appreciated from the descriptions above, the IMU-based sensors 120 may track movement in three dimensions while the object recognition sensor 125 may identify the protected objects within the field of view 127. In either case, distances or proximity measurements and object identification may be performed so that the chainsaw 100 (or at least the cutting action thereof) may be disabled based on distance or proximity thresholds that can be defined (e.g., for short distances), or based on combinations of relative motion of body parts 115 and the tool at angular velocities or linear velocities above certain thresholds (e.g., stop delay based distances for larger distances).
[0019] In an example embodiment, a controller 140 may be disposed at the power tool (e.g., chainsaw 100) and, in this case, may be provided within a housing 150 of the chainsaw 100. The controller 140 may be configured to communicate with the tool position sensor 122 and/or the IMU-based sensors 120 to perform motion tracking, and with the object recognition sensor 125 to perform object identification, as described herein. In FIG. 1, the controller 140 and tool position sensor 122 are shown to be collocated. However, such collocation is not necessary. Moreover, the tool position sensor 122 could be located at any desirable location on the chainsaw 100. Thus, for example, the controller 140 may have a wired or wireless connection to the tool position sensor 122. If communications between the IMU-based sensors 120 and the controller 140, or between the object recognition sensor 125 and the controller 140, occur, such communication may be accomplished via wireless communication (e.g., short range wireless communication techniques including Bluetooth, WiFi, Zigbee, and/or the like).
[0020] While FIG. 1 illustrates a specific view of a protection assembly for the chainsaw 100 according to an example embodiment, FIG. 2 illustrates a general view of an example embodiment of a power tool 200 including a protection assembly 210. Accordingly, FIG. 2 illustrates a power tool 200 with a working assembly 220 which may perform a cutting operation, a housing 230 which may be operably coupled to the working assembly 220, and the protection assembly 210. In an example embodiment, the housing 230 may include a powerhead disposed therein which may power the working assembly 220. In some cases, the power tool 200 may be a chainsaw 100. In this regard, the working assembly 220 may include a cutting chain, and a guide bar about which the chain rotates. In some other embodiments, the power tool 200 may be power cutters. In this regard, the working assembly 220 may include a cutting blade. The protection assembly 210 may include the object recognition sensor 125 which may be disposed at the operator 110 of the power tool 200 as shown in FIG. 2, and the controller 140 which may be disposed at the housing 230 of the power tool 200. As described above, in some embodiments the object recognition sensor 125 may be a camera 240, while in other cases, the object recognition sensor 125 may be an infrared temperature sensor 250. In an example embodiment, more than one object recognition sensor 125 may be disposed at the operator 110, and in such cases, multiple cameras 240, multiple infrared temperature sensors 250, or a combination of cameras 240 and infrared temperature sensors 250 may be disposed at the operator 110.
[0021] According to an example embodiment, the camera 240 disposed at the operator 110 may be a standard optical camera 240. The camera 240 may be configured to record image data that may show the entire field of view 127 and therefore any objects disposed within the field of view 127. Among the objects that may be disposed within the field of view 127, some may be more desirable to identify and track than others. For example, it may be desirable to identify the body part 115 of the operator 110, and the working assembly 220 of the power tool 200, and track their respective locations accordingly within the field of view 127. In this regard, the image data recorded by the camera 240 may be sent to the controller 140, either wirelessly or by wire, to be analyzed.
[0022] As depicted in FIG. 2, in some embodiments, the field of view 127 may further include a protective zone 260. The protective zone 260 may be a boundary that extends a predetermined distance threshold value (DI) around all sides of the working assembly 220. In other words, the protective zone 260 may define a minimum distance from the working assembly 220 that the body part 115 may approach, occupy or exit, to define a trigger event. Accordingly, the trigger event may be related to the body part 115 contacting the protective zone 260. In this regard, the controller 140 may save and process the image data from the camera 240 to identify protected objects within the field of view 127, as shown by the dashed boxes in FIG. 3. The controller 140 may be able to determine if the body part 115 may be entering or approaching, may be disposed within, or may be exiting the protective zone 260. Thus, responsive to determining that the body part 115 may be entering/approaching, may be disposed within, or may be exiting the protective zone 260, the controller 140 may initiate a protective action with respect to the power tool 200.
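By way of a non-limiting illustration of the trigger-event logic described above, the following sketch checks whether a tracked body part 115 intrudes on the protective zone 260. The spherical zone geometry, the coordinate convention, and all names are assumptions made purely for illustration:

```python
# Illustrative sketch of the trigger-event check: all names and the
# spherical zone geometry are assumptions, not the claimed implementation.
from dataclasses import dataclass
import math

@dataclass
class ProtectiveZone:
    center: tuple[float, float, float]  # assumed centroid of the working assembly 220 (metres)
    d1: float                           # predetermined distance threshold value (DI)

    def contains(self, point: tuple[float, float, float]) -> bool:
        """True if a tracked point is within the protective zone 260."""
        return math.dist(self.center, point) <= self.d1

def check_trigger(zone: ProtectiveZone, body_parts: dict[str, tuple]) -> list[str]:
    """Return the labels of any body parts 115 inside the protective zone."""
    return [label for label, pos in body_parts.items() if zone.contains(pos)]

# Example: a hand tracked ~0.24 m from the working assembly with DI = 0.30 m.
zone = ProtectiveZone(center=(0.0, 0.0, 0.0), d1=0.30)
intruders = check_trigger(zone, {"left_hand": (0.2, 0.1, 0.1)})
if intruders:
    print(f"trigger event: {intruders} -> initiate protective action")
```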
[0023] Also shown in FIG. 3, in an example embodiment, the operator 110 may wear a marker 270 disposed at each body part 115. The marker 270 at each body part 115 may assist the controller 140 in identifying the body part 115 within the field of view 127 of the camera 240 and also in distinguishing which body part 115 out of the plurality of body parts 115 is within the field of view 127. In other words, each marker 270 may be unique to the body part 115 at which it is disposed. In this regard, the controller 140 may identify the marker 270 not only to detect the body part 115, but also to distinguish between different body parts 115.
[0024] In some cases, the markers 270 may be disposed at the PPE which may be worn by the operator 110. As such, the markers 270 may be disposed at the gloves 114 and boots 116 that the operator 110 may be wearing, but could be at other locations as well. For example, additional markers 270 could be provided at the knees, elbows, chest or other desirable locations on the operator 110 to enhance the ability of the camera 240 to visually capture, and the controller 140 to identify, the body part 115. In an example embodiment, the working assembly 220 of the power tool 200 may also include a marker 270 disposed thereon to reduce the possibility of the controller 140 failing to identify the working assembly 220. In some cases, the work environment in which the power tool 200 is operated may be a low-light setting. In such cases, the camera 240 may be equipped with a low-light setting (i.e., night mode) and the markers 270 may be reflectors. In this regard, the camera 240 may continue to visually capture objects with markers 270 within the field of view 127. In any case, the controller 140 may thus be able to identify the protected objects in the image data supplied by the camera 240 and may be able to track their positions accordingly, as shown in FIG. 3.
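The unique markers 270 might be prototyped with off-the-shelf fiducial tags. The sketch below uses OpenCV's ArUco module purely as a stand-in; this is an assumption (the patent does not specify the marker technology), the ID-to-body-part mapping is hypothetical, and the ArucoDetector interface assumes opencv-contrib-python version 4.7 or later:

```python
# Hypothetical marker-detection sketch: ArUco tags stand in for the unique
# markers 270 (assumed; requires opencv-contrib-python >= 4.7).
import cv2

# Assumed assignment of ArUco IDs to the body part each marker is worn on.
MARKER_TO_BODY_PART = {0: "left_glove", 1: "right_glove", 2: "left_boot", 3: "right_boot"}

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def identify_body_parts(frame):
    """Return {body_part: pixel centroid} for every marker visible in a frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    found = {}
    if ids is not None:
        for quad, marker_id in zip(corners, ids.flatten()):
            part = MARKER_TO_BODY_PART.get(int(marker_id))
            if part is not None:
                found[part] = quad[0].mean(axis=0)  # centroid of the 4 corners
    return found
```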
[0025] The camera 240 may communicate with the controller 140 to provide the image data either on a continuous, periodic or event-driven basis. At one end of the spectrum, continuous image data may be provided to, and evaluated by, the controller 140 at routine and frequent intervals. At the other end of the spectrum, the image data may only be provided when the body part 115 may be identified or may interact with the protective zone 260. In any case, the controller 140 may be configured to analyze the image data relative to initiation of warnings or other protective actions that the controller 140 may control.
[0026] In order for the controller 140 to be able to identify protected objects within the field of view 127, the controller 140 may include processing circuitry 280. The processing circuitry 280 within the controller 140 may be configured to receive and analyze the image data generated by the camera 240 to identify protected objects within the field of view 127. Accordingly, the processing circuitry 280 may identify the protected objects, such as the chainsaw 100 and/or the body part 115 of the operator 110, from the photo or video input data via a machine learning model. Generally speaking, a machine learning model that is already pre-trained in one task may be obtained from any number of online sources including various research institutions. In one example embodiment, a pre-trained machine learning model for image classification may be retrained, through a form of transfer learning, to identify the protected objects such as the chainsaw 100 and/or the body part 115 and markers 270.
[0027] Transfer learning may save time and resources when training the machine learning model to identify protected objects by starting off with the pre-trained model for image classification. In this regard, rather than supplying large amounts of image data to a new machine learning model to train it to classify images, a pre-trained model may be re-trained by supplying a much smaller dataset of image data of the protected objects in various settings, backgrounds, orientations and lighting conditions. In this regard, the model may be trained using a dataset including images of the chainsaw taken from different angles that may include different backgrounds and lighting conditions and other non-protected objects in order to thoroughly train the machine learning model to identify the protected objects in a plurality of environments and with a plurality of non-protected objects also populating the field of view 127.
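As a hedged illustration of the transfer-learning step, the sketch below re-trains a pre-trained torchvision classifier on a small dataset of protected objects. The ResNet-18 backbone, the class labels, and the dataset layout are assumptions made for illustration rather than the disclosed training setup:

```python
# Minimal transfer-learning sketch (PyTorch/torchvision >= 0.13); the
# backbone, class names and dataset path are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

CLASSES = ["chainsaw", "body_part", "marker", "background"]  # assumed labels

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False          # freeze the pre-trained feature layers
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # new classifier head

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("protected_objects/", transform=preprocess)  # assumed layout
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)  # train only the new head
criterion = nn.CrossEntropyLoss()
for images, labels in loader:            # one epoch shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```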
[0028] In some other embodiments, the machine learning model may not necessarily be retrained by supplying a dataset of image data of the protected objects to the model prior to being implemented at the controller 140. In this regard, the machine learning model may be re-trained at the power tool 200 simply from operating the power tool 200 with the protection assembly 210 in various different environments, lighting conditions, orientations, and backgrounds. As such, the machine learning model may improve with time and exposure to more diverse image data from the camera 240. The model, however, may still be an image classification model that is configured to classify image data based on objects visible in the image data. Having the machine learning model be trained via operation of the power tool 200 with the protection assembly 210 may further save time and resources by not requiring the model to be trained with supplied image data via another computer in a lab setting prior to being implemented at the power tool 200. The machine learning model may thus learn to identify protected objects in the image data from the camera 240 over time with more time spent operating the power tool 200, thereby improving the accuracy, consistency and precision of the protection assembly 210 as a whole.
[0029] Accordingly, as mentioned above, the controller 140 may be configured to initiate a protective action with respect to a working assembly 220 of the power tool 200 responsive to a trigger event occurring. In this regard, the trigger event may include the operator 110, or a body part 115 of the operator 110, intruding on the protective zone 260 defined by the predetermined distance threshold value (DI) that may extend from all sides of the working assembly 220. As an example, the protective action may include stopping any cutting operation (e.g., via activating a chain brake 170) of the chainsaw 100 in FIG. 1 responsive to the controller 140 determining that the trigger event has occurred. Alternatively or additionally, the protective action may include providing a warning (e.g., audibly, visually, or via haptic feedback). For example, if hearing protection 180 is worn by the operator 110 as shown in FIG. 1, an audible warning could be provided via the hearing protection 180. In some cases, the protective action may include both providing the warning and stopping any cutting operations (e.g., activating the chain brake 170).

[0030] FIG. 4 illustrates the protection assembly 210 according to an example embodiment wherein the protection assembly 210 further comprises a radar 290 disposed at the power tool 200. The radar 290 may measure a distance to each protected object (e.g. body part 115) identified by the camera 240 from the power tool 200 and may communicate that distance to the controller 140. In this regard, the image data recorded by the camera 240 may be two dimensional. Thus, depending on the orientation of the objects within the field of view 127, the depth and distance may be difficult to determine from the perspective of the camera 240 alone. The controller 140 may thus be configured to fuse data from the camera 240 and the radar 290 to augment the machine learning model with a three dimensional sense of depth and distance of the objects the machine learning model may identify. For example, the radar 290 on its own may not be able to distinguish between a tree branch and the body part 115. The radar 290 would simply record the distance to the unknown object proximate to the power tool 200. When the input from the radar 290 is fused with the image data from the camera 240, the controller 140 may then initiate the protective action responsive to identifying the body part 115 and determining that the distance to the body part 115 from the power tool 200 is less than or equal to the predetermined distance threshold value (DI).
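A minimal sketch of the camera/radar fusion just described, assuming simplified bearing-keyed interfaces for both sensors (an illustrative assumption, not the disclosed data format), might look as follows:

```python
# Hedged sketch of camera/radar fusion: the camera says *what* is near the
# tool, the radar says *how far*. Both interfaces here are assumed.
D1 = 0.30  # predetermined distance threshold value (DI), metres (assumed)

def fuse_and_decide(detections, radar_ranges):
    """detections: {object_label: bearing_deg}; radar_ranges: {bearing_deg: distance_m}.
    Initiate the protective action only when an identified body part 115 is
    within DI of the power tool 200."""
    for label, bearing in detections.items():
        distance = radar_ranges.get(bearing)
        if label == "body_part" and distance is not None and distance <= D1:
            return True  # trigger event: stop the working assembly 220
    return False

# A tree branch 0.2 m away does not trigger; a hand at the same range would.
print(fuse_and_decide({"tree_branch": 12}, {12: 0.20}))  # False
print(fuse_and_decide({"body_part": 12}, {12: 0.20}))    # True
```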
[0031] FIG. 5 illustrates the protection assembly 210 according to an example embodiment where the protection assembly 210 may include further cameras 240 disposed at both the power tool 200 and the operator 110. In some cases, in order to provide a better sense of depth and three dimensional perception, multiple cameras 240 may be disposed at different locations. In an example embodiment, the cameras 240 may have significant overlap in their respective fields of view 127, so that the image data may essentially contain perspectives of substantially the same field of view 127, but from different angles. In this regard, a stereoscopic effect may be achieved by fusing image data from each of the cameras 240 at the controller 140. This stereoscopic effect may provide the controller 140 with more thorough image data, enabling the controller 140 to make better determinations of the spatial positioning of the protected objects relative to the working assembly 220 and the protective zone 260. In such cases, a plurality of cameras 240 may be used to achieve this effect. In an example embodiment, the plurality of cameras 240 may be disposed at various locations at the operator 110. In some cases, the plurality of cameras 240 may be disposed at various locations at the power tool 200. In an example embodiment, the plurality of cameras 240 may be disposed both at the operator 110 and at the power tool 200. In any case, the plurality of cameras 240 may be directed towards substantially the same field of view 127 so as to convey a stereoscopic perspective of objects within the field of view 127 in the image data to the controller 140.
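The stereoscopic effect rests on standard two-view geometry: for a rectified camera pair, the depth of a point is inversely proportional to its disparity between the two images (Z = f·B/d). A minimal sketch follows, with the focal length and baseline as assumed values:

```python
# Depth from disparity for two rectified cameras 240 (illustrative values).
FOCAL_PX = 800.0    # assumed focal length, pixels
BASELINE_M = 0.12   # assumed separation between the two cameras, metres

def depth_from_disparity(x_left_px: float, x_right_px: float) -> float:
    """Z = f * B / d, where d is the horizontal disparity in pixels."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must appear further left in the left image")
    return FOCAL_PX * BASELINE_M / disparity

# A marker seen at x=420 px (left) and x=380 px (right) is ~2.4 m away.
print(round(depth_from_disparity(420, 380), 2))  # 2.4
```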
[0032] In some cases, the controller 140 may be configured to generate a 3D model of the field of view 127. In this regard, the controller 140 may utilize the stereoscopic effect of the image data to generate a 3D model of the field of view 127 to more accurately perceive and track distances and positions of the protected objects. In some cases, the controller 140 may also augment the 3D models by generating non-visible portions of protected objects. For example, if in the image data a portion of a body part 115 may be visible but another portion of the same body part 115 may be blocked from view by another object, the controller 140 may recognize the body part 115 via the machine learning model and generate the rest of the body part 115 that is blocked in the 3D model. In an example embodiment, the protected object may be partially or entirely blocked from view of either camera 240 or from both cameras 240. In any case, the controller 140 may be configured to generate the blocked protected object based on the determined locations of the other protected objects and knowledge of where the blocked protected object should be in relation to them. As such, the protection assembly 210 may still be able to initiate the protective action responsive to a protected object intruding on the protective zone 260 despite that object being blocked from view of the one or more cameras 240.
[0033] FIG. 6 illustrates the protection assembly 210 according to an example embodiment where the protection assembly 210 may include the inertial measurement unit (IMU)-based sensors 120 and the tool position sensor 122 disposed at the power tool 200 and at each body part 115 of the plurality of body parts 115, similar to FIG. 1. As described above, in relation to FIG. 1, the IMU-based sensors 120 may measure the movement of, and the relative positions between, each IMU-based sensor 120 and the tool position sensor 122. In some embodiments, the IMU-based sensors 120 may be calibrated based on predefined poses of the operator 110 and corresponding positions of the power tool 200. In this regard, for example, certain positions may have known sensor data associated therewith. Accordingly, the chainsaw 100 may be detected as being held in one or more of such positions during a calibration procedure in order to reset to a known state of parts of the sensor data. Given that there may be multiple positions, various different parts of the sensor data may be reset until a full reset is achieved by going through a full sequence of calibration positions. Accordingly, the user manual or a maintenance manual for the chainsaw 100 may list the calibration positions. A calibration mode may be entered, and the corresponding positions may be sequentially cycled through. The calibrated positions may relate to both the chainsaw 100 and the operator 110 in some cases. Thus, for example, the operator 110 (who may be a maintenance technician, or the owner in various cases) may be guided as to the poses to assume with the chainsaw 100 while wearing the IMU-based sensors 120. Thus, the controller 140 may store baseline data corresponding to known distances from each of the IMU-based sensors 120 to the tool position sensor 122 from each of the predefined poses. During the operation of the power tool 200, the controller 140 may be configured to perform a comparison of the baseline data gathered in the predefined poses to current data gathered in real time. Responsive to determining the distance from any of the IMU-based sensors 120 to the tool position sensor 122 is less than or equal to the predetermined distance threshold value, the controller 140 may initiate the protective action.
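A hedged sketch of the calibration-and-comparison logic described above follows; the pose names, sensor identifiers and distances are chosen purely for illustration:

```python
# Sketch of pose-based calibration and runtime comparison; pose names,
# sensor IDs and all distances are illustrative assumptions.
D1 = 0.30  # predetermined distance threshold value (DI), metres (assumed)

# Baseline data: known sensor-to-tool distances recorded in predefined poses.
baseline = {
    "pose_idle_carry":  {"left_glove": 0.65, "right_glove": 0.45},
    "pose_felling_cut": {"left_glove": 0.55, "right_glove": 0.40},
}

def calibrate(pose: str, raw_readings: dict) -> dict:
    """Return per-sensor offsets that map raw readings onto the known pose."""
    return {s: baseline[pose][s] - r for s, r in raw_readings.items()}

def should_trigger(current: dict, offsets: dict) -> bool:
    """Apply calibration offsets, then compare each distance against DI."""
    return any(r + offsets[s] <= D1 for s, r in current.items())

offsets = calibrate("pose_idle_carry", {"left_glove": 0.63, "right_glove": 0.46})
print(should_trigger({"left_glove": 0.27, "right_glove": 0.50}, offsets))  # True
```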
[0034] FIG. 6 may also illustrate the protection assembly 210 of some embodiments where the protection assembly may also include infrared temperature sensors 250. The infrared temperature sensors 250 may be implemented in addition to the camera 240 and the IMU-based sensors 120 to provide an additional form of object identification. The infrared temperature sensors 250 may be configured to detect infrared thermal radiation from the body part 115 within the field of view. In this regard, according to the known principles of black-body radiation, objects at temperatures above zero kelvin (i.e., absolute zero) emit thermal electromagnetic radiation at varying wavelengths as a function of the object's temperature. At relatively low temperatures (e.g., room temperature) most objects emit thermal radiation in the infrared spectrum, such that the radiation is not visible to the human eye. The hotter an object gets, the shorter the wavelength of the thermal radiation it emits. An object may continue to emit thermal radiation in the infrared spectrum as it heats up until eventually, when the object reaches a temperature of approximately 773 kelvin, the thermal radiation enters the range of visible light. As such, the human body is not hot enough to emit thermal radiation in the visible light spectrum. Thus, humans and animals fall within a temperature range that corresponds to thermal radiation in the infrared spectrum.
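The temperature-wavelength relationship invoked here is Wien's displacement law. A worked example for an assumed skin-surface temperature of 310 K shows why the body radiates deep in the infrared:

```latex
% Wien's displacement law: peak emission wavelength falls as temperature rises.
\lambda_{\mathrm{peak}} = \frac{b}{T}, \qquad b \approx 2.898 \times 10^{-3}\ \mathrm{m\,K}
% For an assumed skin-surface temperature T \approx 310\ \mathrm{K}:
\lambda_{\mathrm{peak}} \approx \frac{2.898 \times 10^{-3}\ \mathrm{m\,K}}{310\ \mathrm{K}} \approx 9.3\ \mu\mathrm{m}
% 9.3 micrometres is far longer than visible wavelengths (roughly 0.38-0.75
% micrometres), i.e., well inside the infrared spectrum.
```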
[0035] The infrared temperature sensors 250 may continuously scan the field of view 127 and, in some cases, generate maps or other data regarding objects in the field of view 127 based on the results of the scanning. The infrared temperature sensors 250 may communicate the maps or other data to the controller 140 that may use the maps or other data, along with the image data from the camera 240 and the distance data from the IMU-based sensors 120, to determine when the trigger event occurs. The infrared temperature sensors 250 may therefore be able to detect the presence of the body part 115 (or other object emitting thermal radiation in the range of emissions that are of interest) within the field of view 127. In this regard, the protection assembly 210 may distinguish between the body part 115 in the field of view 127, and other objects within the field of view 127. The controller 140 may be configured to respond differently to detecting the body part 115 in the field of view 127 as compared to detecting a different object in the field of view 127. As noted above, the body part 115 is just one example of a detectable object entering the field of view 127. Other objects detectable by the infrared temperature sensors 250 may include animals or parts of animals emitting infrared thermal radiation.
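One simple way such maps might be used, sketched under the assumption that the infrared temperature sensors 250 deliver a two-dimensional array of apparent temperatures, is to mask pixels falling in a human-like temperature band; the band limits and pixel-count threshold are illustrative assumptions:

```python
# Sketch: segment human-temperature regions from an infrared map
# (a 2-D array of apparent temperatures in kelvin; limits are assumed).
import numpy as np

HUMAN_BAND_K = (303.0, 312.0)  # assumed skin-temperature band, kelvin

def body_part_mask(thermal_map: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels whose apparent temperature is human-like."""
    lo, hi = HUMAN_BAND_K
    return (thermal_map >= lo) & (thermal_map <= hi)

def body_detected(thermal_map: np.ndarray, min_pixels: int = 50) -> bool:
    """Report a candidate body part 115 when enough human-band pixels appear."""
    return int(body_part_mask(thermal_map).sum()) >= min_pixels
```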
[0036] In some embodiments, the field of view 127 may be much larger than the protective zone 260. In this regard, the protection assembly 210 may identify and track protected objects within the field of view 127, even if the protected objects may not be located proximate to the protective zone 260. When the field of view 127 is much larger than the protective zone 260, the controller 140 may dynamically alter the size of the protective zone 260 based on whether or not the protected objects in the field of view 127 have a velocity and an acceleration relative to the working assembly 220. As such, the maps, photos, videos, distance data or other data generated by the object recognition sensors 125 (e.g. camera 240 and infrared temperature sensors 250) and the IMU-based sensors 120 may be used by the controller 140 to determine if the body part 115 is approaching the working assembly 220, and, if so, its velocity and acceleration. If the controller 140 determines that the body part 115 may be approaching the working assembly 220 at a velocity that exceeds a predetermined threshold velocity, then the controller 140 may enlarge the protective zone 260 to create a larger buffer between the body part 115 and the working assembly 220 to allow more time for the protective action to take place. In contrast, if the controller 140 determines that the body part 115 may be approaching the working assembly 220 at a velocity that is less than a predetermined threshold velocity, then the controller 140 may reduce the size of the protective zone 260 and create a smaller buffer between the body part 115 and the working assembly 220 to allow for more precise and controlled operation of the power tool 200 in certain settings. In an example embodiment, the controller 140 may have similar responses for detecting the body part 115 accelerating towards the protective zone 260, and decelerating as it approaches the protective zone 260, respectively. In some cases, the predetermined threshold velocity may be a function of the current size of the protective zone 260, the distance of the body part 115 to the protective zone 260 and the velocity and acceleration of the body part 115. In this regard, the predetermined threshold velocity may be a function that may reflect the variety of conditions that must be met rather than a static numerical value.
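The dynamic resizing of the protective zone 260 might be sketched as follows; the gains and the threshold velocity are illustrative assumptions rather than disclosed values:

```python
# Hedged sketch of the dynamic protective zone 260: the zone grows for fast
# approaches and shrinks for slow, controlled work. All gains are assumed.
D1_BASE = 0.30        # nominal threshold (DI), metres (assumed)
V_THRESHOLD = 0.5     # predetermined threshold velocity, m/s (assumed)

def resize_zone(approach_velocity: float, approach_accel: float) -> float:
    """Return the adjusted zone radius for the current body-part motion."""
    d1 = D1_BASE
    if approach_velocity > V_THRESHOLD:
        d1 *= 1.0 + (approach_velocity - V_THRESHOLD)   # enlarge the buffer
    elif approach_velocity >= 0:
        d1 *= 0.8                                       # precise, slow work
    if approach_accel > 0:                              # still speeding up
        d1 *= 1.2
    return d1

print(resize_zone(1.0, 0.0))   # fast approach -> ~0.45 m
print(resize_zone(0.1, -0.2))  # slow, decelerating -> ~0.24 m
```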
[0037] The configuration of the controller 140 in accordance with an example embodiment will now be described in reference to FIG. 7. In this regard, FIG. 7 shows a block diagram of the controller 140 in accordance with an example embodiment. As shown in FIG. 7, the controller 140 may include the processing circuitry 280 of an example embodiment as described herein. The processing circuitry 280 may be configured to provide electronic control inputs to one or more functional units of the power tool 200 (e.g., the chain brake 170) or the system (e.g., issuing a warning to the hearing protection 180) and to process data received at or generated by one or more of the IMU-based sensors 120 or the object recognition sensors 125 regarding various indications of movement or distance between the power tool 200 and the operator 110 or the body part 115. Thus, the processing circuitry 280 may be configured to perform data processing, control function execution and/or other processing and management services according to an example embodiment.
[0038] In some embodiments, the processing circuitry 280 may be embodied as a chip or chip set. In other words, the processing circuitry 280 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The processing circuitry 280 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip.” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
[0039] In an example embodiment, the processing circuitry 280 may include one or more instances of a processor 282 and memory 284 that may be in communication with or otherwise control other components or modules that interface with the processing circuitry 280. As such, the processing circuitry 280 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein. In some embodiments, the processing circuitry 280 may be embodied as a portion of an onboard computer housed in the housing 230 of the power tool 200 to control operation of the system relative to interaction with other motion tracking and/or distance measurement devices.
[0040] Although not required, some embodiments of the controller 140 may employ or be in communication with a user interface 300. The user interface 300 may be in communication with the processing circuitry 280 to receive an indication of a user input at the user interface 300 and/or to provide an audible, visual, tactile or other output to the operator 110. As such, the user interface 300 may include, for example, a display, one or more switches, lights, buttons or keys, speaker, and/or other input/output mechanisms. In an example embodiment, the user interface 300 may include the hearing protection 180 of FIG. 1, or one or a plurality of colored lights to indicate status or other relatively basic information. However, more complex interface mechanisms could be provided in some cases.
[0041] The controller 140 may employ or utilize components or circuitry that act as a device interface 310. The device interface 310 may include one or more interface mechanisms for enabling communication with other devices (e.g., the object recognition sensors 125, the chain brake 170, the hearing protection 180, the IMU-based sensors 120, etc.). In some cases, the device interface 310 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software, that is configured to receive and/or transmit data from/to components in communication with the processing circuitry 280 via internal communication systems of the power tool 200 and/or via wireless communication equipment (e.g., a one way or two way radio). As such, the device interface 310 may include an antenna and radio equipment for conducting Bluetooth, WiFi, or other short range communication, or include wired communication links for employing the communications necessary to support the functions described herein.
[0042] The processor 282 may be embodied in a number of different ways. For example, the processor 282 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like. In an example embodiment, the processor 282 may be configured to execute instructions stored in the memory 284 or otherwise accessible to the processor 282. As such, whether configured by hardware or by a combination of hardware and software, the processor 282 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 280) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor 282 is embodied as an ASIC, FPGA or the like, the processor 282 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 282 is embodied as an executor of software instructions, the instructions may specifically configure the processor 282 to perform the operations described herein.
[0043] In an exemplary embodiment, the memory 284 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 284 may be configured to store information, data, applications, instructions, the machine learning model or the like for enabling the processing circuitry 280 to carry out various functions in accordance with exemplary embodiments of the present invention. For example, the memory 284 could be configured to buffer input data for processing by the processor 282. Additionally or alternatively, the memory 284 could be configured to store instructions for execution by the processor 282. As yet another alternative or additional capability, the memory 284 may include one or more databases that may store a variety of data sets. Among the contents of the memory 284, applications may be stored for execution by the processor 282 in order to carry out the functionality associated with each respective application. In some cases, the applications may include instructions for motion tracking and/or object recognition and distance measuring and distance tracking as described herein. In an example embodiment, the memory 284 may save data in a circular buffer. In other words, a set amount of storage space may be available in the memory 284, and in order to save new data from the object recognition sensors 125 or the IMU-based sensors 120, the memory 284 may overwrite the oldest saved data with the newest saved data. The processor 282 may then access the memory 284 under the direction of an application from the memory 284 to compare sets of saved data from the plurality of sensors.
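The circular-buffer behaviour described for the memory 284 can be sketched with a fixed-capacity deque; the capacity and the record format are assumptions made for illustration:

```python
# Sketch of the circular buffer in the memory 284: once full, the oldest
# sensor record is overwritten by the newest. Capacity is an assumed value.
from collections import deque

CAPACITY = 1024  # assumed number of sensor records retained

sensor_log = deque(maxlen=CAPACITY)  # oldest entries are discarded first

def record(sample: dict) -> None:
    """Append a sensor sample; when full, the oldest sample is overwritten."""
    sensor_log.append(sample)

for t in range(2000):  # after 2000 writes, only the newest 1024 remain
    record({"t": t, "distance_m": 0.5})
print(len(sensor_log), sensor_log[0]["t"])  # 1024 976
```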
[0044] Some example embodiments may provide for a protection assembly for a power tool. The protection assembly may include a camera which may be disposed at an operator of the power tool and may record image data, and a controller which may be disposed at the power tool. The controller may include processing circuitry which may be configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring, where the trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly. The processing circuitry may include a machine learning model that may recognize the operator and the power tool in the image data.
[0045] The protection assembly of some embodiments may include additional, optional features, and/or the features described above may be modified or augmented. Some examples of modifications, optional features and augmentations are described below. It should be appreciated that the modifications, optional features and augmentations listed below may each be added alone, or they may be added cumulatively in any desirable combination. For example, in some embodiments, the protection assembly may identify and track a plurality of body parts of the operator. In some cases, a unique marker may be disposed at each body part of the plurality of body parts of the operator. In an example embodiment, the machine learning model may be trained to identify and track each of the unique markers. In some cases, the machine learning model may be a pre-trained image classification model that may be re-trained through a transfer learning process. In an example embodiment, in the transfer learning process, the machine learning model may be trained on a dataset that may include images of the unique markers and images of the power tool in a variety of settings and orientations. In some cases, the protection assembly may further include a radar disposed at the power tool. In an example embodiment, the radar may measure a distance to each body part identified by the camera from the power tool and may communicate the distance to the controller. In some cases, the controller may initiate the protective action responsive to determining the distance to any body part from the power tool may be less than or equal to the predetermined distance threshold value. In an example embodiment, the protection assembly may further include inertial measurement unit (IMU)- based sensors that may be disposed at the power tool and at each body part of the plurality of body parts, and a tool position sensor that may be disposed at the working assembly. In some cases, the IMU-based sensors may measure movement and relative positions between each IMU- based sensor and the tool position sensor. In an example embodiment, the IMU-based sensors may be calibrated based on predefined poses of the operator and corresponding positions of the power tool. In some cases, the controller may store baseline data corresponding to known distances from each of the IMU-based sensors to the tool position sensor in each of the predefined poses. In an example embodiment, the controller may be configured to perform a comparison of the baseline data gathered in the predefined poses to current data. In some cases, the controller may initiate the protective action responsive to determining the distance from any of the IMU-based sensors to the tool position sensor may be less than or equal to the predetermined distance threshold value. In an example embodiment, the protection assembly may further include infrared temperature sensors that may be disposed at the operator that may assist the camera in identifying the body parts by detecting infrared thermal radiation from the body parts. In some cases, the power tool may be a chainsaw. In an example embodiment, the working assembly may include a chain and a guide bar. In some cases, the protective action may include activating a chain brake of the chainsaw responsive to the controller determining that the trigger event may have occurred. In an example embodiment, the power tool may be a chainsaw or power cutters. In some cases, the working assembly may include a chain and guide bar or a cutting blade. 
In an example embodiment, the protective action may include providing an audible or visual warning to the operator responsive to the controller determining that the trigger event may have occurred.
[0046] Some example embodiments may provide for a power tool. The power tool may include a working assembly that may perform a cutting operation, and a protection assembly that may protect an operator of the power tool during the cutting operation. The protection assembly may include a controller which may include processing circuitry configured to initiate a protective action with respect to a working assembly of the power tool responsive to a trigger event occurring, where the trigger event may include the operator intruding on a protective zone defined by a predetermined distance threshold value that may extend from all sides of the working assembly. The processing circuitry may include a machine learning model that may recognize the operator and the power tool in image data recorded by a camera disposed at the operator.
[0047] Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe exemplary embodiments in the context of certain exemplary combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. In cases where advantages, benefits or solutions to problems are described herein, it should be appreciated that such advantages, benefits and/or solutions may be applicable to some example embodiments, but not necessarily all example embodiments. Thus, any advantages, benefits or solutions described herein should not be thought of as being critical, required or essential to all embodiments or to that which is claimed herein. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

THAT WHICH IS CLAIMED:
1. A protection assembly (210) for a power tool (200), the protection assembly (210) comprising:
a camera (240) disposed at an operator (110) of the power tool (200) that records image data; and
a controller (140) disposed at the power tool (200), the controller comprising processing circuitry (280) configured to initiate a protective action with respect to a working assembly (220) of the power tool (200) responsive to a trigger event occurring, the trigger event comprising the operator (110) intruding on a protective zone (260) defined by a predetermined distance threshold value (DI) that extends from all sides of the working assembly (220),
wherein the processing circuitry (280) comprises a machine learning model that recognizes protected objects in the image data.
2. The protection assembly (210) of claim 1, wherein the protection assembly (210) recognizes a plurality of body parts (115) of the operator (110), wherein a unique marker (270) is disposed at each body part (115) of the plurality of body parts (115) of the operator (110), and wherein the machine learning model recognizes each of the unique markers (270).
3. The protection assembly (210) of claim 2, wherein the machine learning model is a pre-trained image classification model that is re-trained through a transfer learning process, and wherein in the transfer learning process, the machine learning model is trained on a dataset comprising images of the unique markers (270) and images of the power tool (200) in a variety of settings and orientations.
4. The protection assembly (210) of claim 3, wherein the protection assembly (210) further comprises a radar (290) disposed at the power tool (200).
5. The protection assembly (210) of claim 4, wherein the radar (290) measures a distance to each body part (115) identified by the camera (240) from the power tool (200) and communicates the distance to the controller (140).
6. The protection assembly (210) of claim 5, wherein the controller (140) initiates the protective action responsive to determining the distance to any body part (115) from the power tool (200) is less than or equal to the predetermined distance threshold value (DI).
7. The protection assembly (210) of claim 3, wherein the protection assembly (210) further comprises inertial measurement unit (IMU)-based sensors (120) disposed at the power tool (200) and at each body part (115) of the plurality of body parts (115), and a tool position sensor (122) disposed at the working assembly (220).
8. The protection assembly (210) of claim 7, wherein the IMU-based sensors (120) measure movement and relative positions between each IMU-based sensor (120) and the tool position sensor (122) in addition to the camera (240) recording image data and the machine learning model recognizing protected objects in the image data.
9. The protection assembly (210) of claim 8, wherein the IMU-based sensors (120) are calibrated based on predefined poses of the operator (110) and corresponding positions of the power tool (200).
10. The protection assembly (210) of claim 9, wherein the controller (140) stores baseline data corresponding to known distances from each of the IMU-based sensors (120) to the tool position sensor (122) in each of the predefined poses, and wherein the controller (140) is configured to perform a comparison of the baseline data gathered in the predefined poses to current data.
11. The protection assembly (210) of claim 10, wherein the controller (140) initiates the protective action responsive to determining a distance from any of the IMU-based sensors (120) to the tool position sensor (122) is less than or equal to the predetermined distance threshold value (DI).
12. The protection assembly (210) of claim 11, wherein the protection assembly (210) further comprises infrared temperature sensors (250) disposed at the operator (110) that assist the camera (240) in identifying the body parts (115) by detecting infrared thermal radiation from the body parts (115).
13. The protection assembly (210) of claim 2, wherein the machine learning model is trained to recognize the protected objects using image data supplied from the camera (240) during the operation of the power tool (200) in a variety of settings and orientations.
14. The protection assembly (210) of claim 1, wherein the protection assembly (210) further comprises a second camera (240) disposed either at the operator (110) or at the power tool (200) to record image data from a different angle, and wherein the controller (140) uses image data from both cameras (240) to produce a stereoscopic effect to improve depth and three dimensional perception.
15. The protection assembly (210) of claim 14, wherein the controller (140) generates a 3D model of a field of view (127) using the stereoscopic effect from the image data, and wherein the controller (140) augments the 3D model with invisible portions of protected objects within the field of view (127).
16. The protection assembly (210) of claim 1, wherein the power tool (200) is a chainsaw (100), wherein the working assembly (220) comprises a chain (102) and a guide bar, and wherein the protective action comprises activating a chain brake (170) of the chainsaw (100) responsive to the controller (140) determining that the trigger event has occurred.
17. The protection assembly (210) of claim 1, wherein the power tool (200) is a chainsaw (100) or power cutters, wherein the working assembly (220) comprises a chain (102) and guide bar or a cutting blade, and wherein the protective action comprises providing an audible or visual warning to the operator (110) responsive to the controller (140) determining that the trigger event has occurred.
18. A power tool (200) comprising:
a working assembly (220) to perform a cutting operation; and
a protection assembly (210) to protect an operator (110) of the power tool (200) during the cutting operation,
wherein the protection assembly (210) comprises:
a controller (140) comprising processing circuitry (280) configured to initiate a protective action with respect to the working assembly (220) of the power tool responsive to a trigger event occurring, the trigger event comprising the operator (110) intruding on a protective zone (260) defined by a predetermined distance threshold value (DI) that extends from all sides of the working assembly (220),
wherein the processing circuitry (280) comprises a machine learning model that recognizes protected objects in image data recorded by a camera (240) disposed at the operator (110).
19. The power tool (200) of claim 18, wherein the machine learning model recognizes a plurality of body parts (115) of the operator (110) via unique markers (270) disposed at each body part (115) of the plurality of body parts (115).
20. The power tool (200) of claim 19, wherein the machine learning model is a pretrained image classification model that is re-trained through a transfer learning process, and wherein in the transfer learning process, the machine learning model is trained on a dataset comprising images of the unique markers (270) and images of the power tool (200) in a variety of settings and orientations.
21. The power tool (200) of claim 20, wherein the protection assembly (210) further comprises a radar (290) disposed at the power tool (200).
22. The power tool (200) of claim 21, wherein the radar (290) measures a distance to each body part (115) identified by the camera (240) from the power tool (200) and communicates the distance to the controller (140).
23. The power tool (200) of claim 22, wherein the controller (140) initiates the protective action responsive to determining the distance to any body part (115) from the power tool (200) is less than or equal to the predetermined distance threshold value (DI).
24. The power tool (200) of claim 20, wherein the protection assembly (210) further comprises a tool position sensor (122) disposed at the working assembly (220) that communicates with inertial measurement unit (IMU)-based sensors (120) disposed at each body part (115) of the plurality of body parts (115).
25. The power tool (200) of claim 24, wherein the tool position sensor (122) and the IMU-based sensors (120) are calibrated based on predefined poses of the operator (110) and corresponding positions of the power tool (200).
26. The power tool (200) of claim 25, wherein the controller (140) stores baseline data corresponding to known distances from each of the IMU-based sensors (120) to the tool position sensor (122) in each of the predefined poses, and wherein the controller (140) is configured to perform a comparison of the baseline data gathered in the predefined poses to current data.
27. The power tool (200) of claim 26, wherein the controller (140) initiates the protective action responsive to determining a distance from any of the IMU-based sensors (120) to the tool position sensor (122) is less than or equal to the predetermined distance threshold value (DI).
28. The power tool (200) of claim 19, wherein the machine learning model is trained to recognize the protected objects using image data supplied from the camera (240) during the operation of the power tool (200) in a variety of settings and orientations.
29. The power tool (200) of claim 18, wherein the protection assembly (210) further comprises a second camera (240) disposed either at the operator (110) or at the power tool (200) to record image data from a different angle, and wherein the controller (140) uses image data from both cameras (240) to produce a stereoscopic effect to improve depth and three dimensional perception.
30. The power tool (200) of claim 29, wherein the controller (140) generates a 3D model of a field of view (127) using the stereoscopic effect from the image data, and wherein the controller (140) augments the 3D model with invisible portions of protected objects within the field of view (127).
31. The power tool (200) of claim 18, wherein the power tool (200) is a chainsaw (100), wherein the working assembly (220) comprises a chain (102) and a guide bar, and wherein the protective action comprises activating a chain brake (170) of the chainsaw (100) responsive to the controller (140) determining that the trigger event has occurred.
32. The power tool (200) of claim 18, wherein the power tool (200) is a chainsaw or power cutters, wherein the working assembly (220) comprises a chain (102) and guide bar or a cutting blade, and wherein the protective action comprises providing an audible or visual warning to the operator (110) responsive to the controller (140) determining that the trigger event has occurred.
PCT/EP2023/080773 2022-11-07 2023-11-06 System for protecting an operator of a power tool having a camera WO2024099933A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2251294-1 2022-11-07
SE2251294 2022-11-07

Publications (1)

Publication Number Publication Date
WO2024099933A1 true WO2024099933A1 (en) 2024-05-16

Family

ID=88697516

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/080773 WO2024099933A1 (en) 2022-11-07 2023-11-06 System for protecting an operator of a power tool having a camera

Country Status (1)

Country Link
WO (1) WO2024099933A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019165079A2 (en) * 2018-02-21 2019-08-29 Lantern Safety Kinetics, Llc High-precision abnormal motion detection for power tools
US20210104006A1 (en) * 2019-10-02 2021-04-08 Lantern Holdings, LLC Keepout zone detection and active safety system
WO2021215992A1 (en) * 2020-04-23 2021-10-28 Husqvarna Ab System for protecting an operator of a power tool

Similar Documents

Publication Publication Date Title
US20220388081A1 (en) System for Protecting an Operator of a Power Tool
EP3363602B1 (en) Monitoring system, monitoring device, and monitoring method
CN107486857B (en) Robot system
EP2878875B1 (en) Monitoring device and monitoring method
CN111226178B (en) Monitoring device, industrial system, method for monitoring and computer program
EP2772336B1 (en) Recognition-based industrial automation control with position and derivative decision reference
US20170108838A1 (en) Building lighting and temperature control with an augmented reality system
EP3422153A1 (en) System and method for selective scanning on a binocular augmented reality device
BR112021009651A2 (en) proximity-based personnel security system and method
EP2772810A2 (en) Recognition-based industrial automation control with person and object discrimination
BR102017015871B1 (en) SYSTEM FOR OPERATING MACHINERY IN A MANUFACTURING ENVIRONMENT INCLUDING MACHINERY AND METHOD FOR OPERATING MACHINERY
JPWO2018131237A1 (en) Collaborative robot system and control method thereof
CN104076761A (en) Recognition-based industrial automation control with confidence-based decision support
JP6544044B2 (en) Image processing apparatus, image processing system and image processing method
EP2772812B1 (en) Recognition-based industrial automation control with redundant system input support
US11756326B2 (en) Keepout zone detection and active safety system
Kumar et al. Spatial object tracking system based on linear optical sensor arrays
Tsun et al. A human orientation tracking system using Template Matching and active Infrared marker
WO2024099933A1 (en) System for protecting an operator of a power tool having a camera
Marvel Sensors for safe, collaborative robots in smart manufacturing
US20160120149A1 (en) Warm Cap with Security Features
BR102020005643A2 (en) PROXIMITY DETECTION IN ASSEMBLY ENVIRONMENTS HAVING MACHINERY
Ostermann et al. Freed from fences-Safeguarding industrial robots with ultrasound
WO2024099932A1 (en) System for protecting an operator of a power tool having infrared sensors
Häger et al. Countering bias in tracking evaluations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23801411

Country of ref document: EP

Kind code of ref document: A1