WO2022038314A1 - Biometric shape recognition - Google Patents

Biometric shape recognition

Info

Publication number
WO2022038314A1
WO2022038314A1 (PCT/FI2021/050559)
Authority
WO
WIPO (PCT)
Prior art keywords
digital
geometry data
further configured
biometric
orthotic
Prior art date
Application number
PCT/FI2021/050559
Other languages
French (fr)
Inventor
Timo Yletyinen
Roope KUISMA
Kimmo Virtanen
Original Assignee
Taika3D Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taika3D Oy filed Critical Taika3D Oy
Publication of WO2022038314A1 publication Critical patent/WO2022038314A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 13/00 Dental prostheses; Making same
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/5044 Designing or manufacturing processes
    • A61F 2/5046 Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, finite-element analysis or CAD-CAM techniques
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 5/00 Orthopaedic methods or devices for non-surgical treatment of bones or joints; Nursing devices; Anti-rape devices
    • A61F 5/01 Orthopaedic devices, e.g. splints, casts or braces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/5044 Designing or manufacturing processes
    • A61F 2/5046 Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, finite-element analysis or CAD-CAM techniques
    • A61F 2002/5047 Designing or manufacturing processes for designing or making customized prostheses, e.g. using templates, finite-element analysis or CAD-CAM techniques using mathematical models
    • A61F 2002/5049 Computer aided shaping, e.g. rapid prototyping

Definitions

  • The present disclosure generally relates to orthotics or prosthetics.
  • Some embodiments of the disclosure relate to computer assisted design of bespoke products based on anatomical data.
  • Orthotics and/or prosthetics are used to treat a range of medical conditions, disabilities or musculoskeletal anomalies.
  • Orthoses and prosthetic sockets are traditionally manufactured by hand, via craft or very basic computer assisted methods.
  • The industry is moving towards 3d scanning patients and designing the products using this geometry.
  • The final product is then created from a digital design using milling systems, or increasingly often via 3d printing.
  • The process usually begins with a clinical assessment where the clinician (orthotist, prosthetist, podiatrist or a similar qualified person) assesses the patient's condition and the issue or disability and decides the intervention they wish to effect.
  • This intention is usually specified in a prescription or order form, either digitally or on paper.
  • The other part of the assessment is the patient shape capture, which can be done through direct 3d scanning of the patient or through plaster casting or similar methods.
  • The negative impression captured is then often 3d scanned in order to have the patient geometry in a digital format.
  • 3d scanning in a general case serves either a) 3d visualization of a physical object with a computer, or b) reverse engineering a physical object to make a replica or another physical object that accurately matches the original one.
  • In case a) the accuracy is not important and therefore the scanners can be low-cost.
  • In case b) the accuracy is usually a requirement, and therefore the scanners are very expensive.
  • For customized orthoses and prosthetics the scanner is typically low-cost, because it is not financially feasible to buy very expensive 3D scanners for each orthotist. Therefore, the scans are also generally of low quality, and combined with poor user experience this may bring a challenge to producing customized orthopedic or prosthetic products.
  • Publication WO2013071416A1 discloses a scanning method. It requires a known positioning such as a rack, and it also requires a ground reference limiting the scan options: for example, the bottom side of the foot or the sides may be impossible to scan because of the rack. It requires a reference object, which may be slow, difficult or even expensive to use. It also requires calibration.
  • A device is configured for design of an orthotic or prosthetic product.
  • The device is further configured to: receive digital 3d geometry data of at least a part of an anatomy in relation to the orthotic or prosthetic product; smooth a surface of the digital 3d geometry data using a smoothing algorithm; identify areas that are not valid as a desired representation of the digital 3d geometry data; recognize biometric shapes from the digital 3d geometry data having the identified areas removed; place a landmark for each biometric shape; and output the digital 3d geometry data having the landmarks. This solution may not require any reference objects or reference planes.
  • The solution may use various kinds of 3d scanner devices or 3d scan data as an input. It may not require known orientation or positioning.
  • The solution may work without human intervention, creating results very fast.
  • The scan may be arbitrary, and if it does not include the human part that is being scanned, the device will produce an error indicating that a new scan is required.
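The receive–smooth–identify–recognize–landmark–output sequence above can be sketched as a minimal pipeline. This is an illustrative sketch only: the `Scan` container, the median-distance validity test and the "lowest point" landmark are hypothetical stand-ins, not the claimed algorithms.

```python
import statistics
from dataclasses import dataclass, field

@dataclass
class Scan:
    points: list                                   # raw 3d points as (x, y, z) tuples
    invalid: set = field(default_factory=set)      # indices of points flagged as not valid
    landmarks: dict = field(default_factory=dict)  # landmark name -> 3d point

def smooth(scan):
    # Placeholder for a surface smoothing algorithm (e.g. Taubin smoothing).
    return scan

def identify_invalid(scan, limit=10.0):
    # Toy validity criterion: flag points far from the per-axis median.
    med = tuple(statistics.median(p[i] for p in scan.points) for i in range(3))
    for i, p in enumerate(scan.points):
        if sum((a - m) ** 2 for a, m in zip(p, med)) > limit ** 2:
            scan.invalid.add(i)
    return scan

def recognize_and_landmark(scan):
    # Recognize shapes only on the valid points and place one toy landmark.
    valid = [p for i, p in enumerate(scan.points) if i not in scan.invalid]
    if valid:
        scan.landmarks["shape_0"] = min(valid)  # stand-in for e.g. a heel point
    return scan

def process(scan):
    # The claimed order: smooth, identify invalid areas (removal is implicit
    # in ignoring them), recognize shapes, place landmarks, output.
    return recognize_and_landmark(identify_invalid(smooth(scan)))
```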
  • The device may be further configured to remove the identified areas. This solution may facilitate work and planning when there are no disturbances that may disturb the recognition process.
  • The device may be further configured to: identify areas that are not valid as a desired representation of the digital 3d geometry data; remove the identified areas; and recognize biometric shapes from the digital 3d geometry data having the identified areas removed.
  • This solution may enable filtering out possibly disturbing parts of the digital 3d geometry data. It may improve the recognition because invalid parts of the 3d geometry data are filtered out.
  • The input can include normal operating room objects, such as sofa corners, and irrelevant biometric data, which may be removed.
  • The device may be further configured to smooth a surface of the digital 3d geometry data using a smoothing algorithm.
  • This solution may remove or reduce sharp corners in the digital 3d geometry data that may disturb the recognition operation.
  • The device may be further configured to orientate the digital 3d geometry data having the landmarks, so that the design of the orthotic or prosthetic product is based on the oriented digital 3d geometry data.
  • This solution may provide a reasonable basis for further processing of the digital 3d geometry data to design the digital orthotic or prosthetic product.
  • The device may be further configured to process the digital 3d geometry data into a meshed surface, so that the smoothed digital 3d geometry data is based on the meshed surface.
  • This solution may clean and smooth the surface and provide a reasonable basis for representing the anatomy of the surface of the digital 3d geometry data.
  • The device may be further configured to apply marching cubes to remove noise from the digital 3d geometry data.
  • This solution may clean and smooth the surface.
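Marching cubes itself extracts a surface from a voxel field; its noise-removal role described above can be approximated by voxelizing the scan and keeping only the largest connected voxel component, discarding small noise islands. A simplified sketch (the voxel size and the 6-connectivity choice are illustrative assumptions, not from the patent):

```python
from collections import deque

def voxelize(points, size=1.0):
    # Map each 3d point to an integer voxel coordinate (floor division,
    # so negative coordinates are handled consistently).
    return {(int(x // size), int(y // size), int(z // size)) for x, y, z in points}

def largest_component(voxels):
    # Keep only the largest 6-connected voxel component; small isolated
    # islands (scan noise) are discarded.
    remaining, best = set(voxels), set()
    while remaining:
        seed = next(iter(remaining))
        comp, queue = {seed}, deque([seed])
        remaining.discard(seed)
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                n = (x + dx, y + dy, z + dz)
                if n in remaining:
                    remaining.discard(n)
                    comp.add(n)
                    queue.append(n)
        if len(comp) > len(best):
            best = comp
    return best
```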
  • The device is further configured to: recognize, from dimensions of the digital 3d geometry data, oversize objects compared to the anatomy; and remove the oversize objects.
  • This solution may remove disturbing or erroneous parts of the digital 3d geometry data with respect to the anatomy.
  • The device may be further configured to: identify flat objects compared to the anatomy from the digital 3d geometry data; and remove the flat objects.
  • This solution may remove disturbing or erroneous parts of the digital 3d geometry data with respect to the anatomy.
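The oversize and flat-object checks can be sketched with axis-aligned bounding boxes: a cluster that is larger than any plausible anatomy is oversize, and a cluster with near-zero thickness (a table corner, a sofa surface) is flat. The millimetre thresholds and the axis-aligned simplification are assumptions for illustration; the patent does not specify them.

```python
def bbox_extents(points):
    # Axis-aligned bounding-box extents (dx, dy, dz) of a point cluster.
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def filter_clusters(clusters, max_dim=400.0, min_thickness=5.0):
    # Assumed thresholds in millimetres: a foot-sized anatomy should fit
    # inside max_dim, and any real anatomical patch should have some
    # thickness. Oversize and entirely flat clusters are removed.
    kept = []
    for c in clusters:
        ex = bbox_extents(c)
        if max(ex) > max_dim:        # oversize compared to the anatomy
            continue
        if min(ex) < min_thickness:  # entirely flat object
            continue
        kept.append(c)
    return kept
```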
  • The device may be further configured to apply an average surface normal to categorize the surface of the digital 3d geometry data. This solution may better establish the intended surface of the anatomy.
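An average surface normal can be computed from triangle cross products; summing the unnormalized normals weights each triangle by its area automatically. The coarse axis-facing categorization below is an illustrative choice, not the patent's method:

```python
import math

def triangle_normal(a, b, c):
    # Unnormalized normal of triangle (a, b, c) via the cross product.
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def average_normal(triangles):
    # Area-weighted average normal of a set of triangles, normalized.
    sx = sy = sz = 0.0
    for a, b, c in triangles:
        nx, ny, nz = triangle_normal(a, b, c)
        sx, sy, sz = sx + nx, sy + ny, sz + nz
    length = math.sqrt(sx*sx + sy*sy + sz*sz) or 1.0
    return (sx / length, sy / length, sz / length)

def categorize(normal):
    # Toy categorization: which axis the surface predominantly faces.
    axis = max(range(3), key=lambda i: abs(normal[i]))
    return "xyz"[axis] + ("+" if normal[axis] >= 0 else "-")
```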
  • The device may be further configured to sample the digital 3d geometry data, having the identified areas removed, before performing the recognition, to collect data for the recognition.
  • This solution may enable better shape recognition based on the sampled surface.
  • The device may be further configured to collect samples from the digital 3d geometry data, wherein a sample comprises a set of 3d points that are placed in a pre-defined formation around a target. Certain points may be set in a predefined format so that shape recognition may be applied.
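One possible pre-defined formation is a ring of sample points around a target; a real sampler would then project or move these points onto the scanned surface, e.g. with a raycaster. The planar ring and its parameters are assumptions for illustration:

```python
import math

def ring_sample(target, radius=10.0, count=8):
    # A pre-defined formation: `count` points on a circle of `radius`
    # around the target, here laid out in the xy-plane for simplicity.
    tx, ty, tz = target
    pts = []
    for k in range(count):
        a = 2.0 * math.pi * k / count
        pts.append((tx + radius * math.cos(a), ty + radius * math.sin(a), tz))
    return pts
```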
  • The device may be further configured to recognize the biometric shapes based on the sampled digital 3d geometry data only. This solution may enable focusing shape recognition on the sampled data.
  • The device may be further configured to return to perform the sampling if the recognition does not recognize any biometric shape, and re-sample the digital 3d geometry data based on the non-recognized biometric shape.
  • This solution may enable one or more iteration rounds between sampling and shape recognition, and gradually improve shape recognition results.
  • The device may be further configured to fill in holes of the surface of the digital 3d geometry data having the recognized biometric shapes.
  • This solution may further process the data to better match the anatomy.
  • A method for design of an orthotic or prosthetic product comprises: receiving digital 3d geometry data of at least a part of an anatomy in relation to the orthotic or prosthetic product; recognizing biometric shapes from the digital 3d geometry data; placing a landmark for each biometric shape; and outputting the digital 3d geometry data having the landmarks.
  • The method of the second aspect may be executed in the device according to any implementation form of the first aspect.
  • A computer program is provided.
  • The computer program may comprise program code configured to cause performance of the method of any implementation form of the second aspect, when the computer program is executed on a computer.
  • A computer program product comprising a computer readable storage medium for storing program code is provided.
  • The program code may comprise instructions for performing any implementation form of the second aspect.
  • Implementation forms of the disclosure can thus provide a device, a method, a computer program, and a computer program product for biometric shape recognition.
  • Fig. 1 illustrates an example of a process for biometric shape recognition and designing an orthotic or prosthetic product, according to an embodiment;
  • Fig. 2 illustrates an example of a device for biometric shape recognition, and further designing an orthotic or prosthetic product, according to an embodiment;
  • Fig. 3 illustrates an example of the 3d geometry data of a part of an anatomy, according to an embodiment;
  • Fig. 4 illustrates an example of the 3d geometry data of a part of an anatomy, according to an embodiment;
  • Fig. 5 illustrates an example of a negative source for the 3d geometry data, according to an embodiment.
  • A device comprises software configured to solve the problem of automatic batch work for different user demands. More specifically, the device is configured to transform input 3d data (such as a 3d scan model) and parameters (such as an order form) into a completed design file (stl or g-code, for example).
  • The embodiment may also solve the problem of generating multiple products at the same time, possibly with scalability, and creates a logic for human understandable processes.
  • The workflow of the method makes batch processing fast while automation is of reasonable quality, and allows for user intervention and validation of critical design elements.
  • The automatic custom 3d geometry data generation applies a set of rules that describe the logic which generates the shape of the product (such as the digital 3d model). These rules may be specified in the set-up stage of the device. These rules can be converted to different operations in the actual generation process, for example validation operations during automatic 3d model generation.
  • A device is configured by software to automatically recognize a biometric shape, for example a hand, foot or head, from the 3d geometry data, such as raw 3d scan data, clean the data and finally orientate it so that further automatic processing for designing the orthotic or prosthetic product can continue.
  • The device comprises components that are configured as follows: at operation 1, recognize a volume or surfaces of interest. Use, for example, marching cubes to remove small noise from the original scan. Recognize, from the dimensions of the scan, whether it is significantly too large to be exactly the object being scanned. If it is too big, use dedicated recognition to remove entirely flat objects (for example table corners or sofa surfaces) and too small objects (shafts, pipes, rack supports etc.) from the scan. Also use average surface normals to roughly categorize surfaces of the digital 3d model. Remove the areas that are not clearly valid as desired surface candidates.
  • Each recognition candidate set includes the first landmark and secondary landmarks (2 or more).
  • The secondary landmarks may be required to orientate and finalize cleaning of the scan.
  • Each landmark set can include, for example, a heel center and two forefoot landmarks, each having unique and mathematically distinguishable anatomical characteristics around them.
  • The device uses mathematical computing to recognize locations of the landmarks.
  • The device is configured to use both surface sampling of the scan and landmark relation computations. These landmark relations include, for example, a distance between the landmarks or the angles they make with each other. When all candidates are found, the best fit is selected as the final landmark set for the next operation.
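The landmark relation computations (distances between landmarks, angles they make with each other) and the best-fit selection can be sketched as follows. The heel/forefoot naming and any template values passed in are illustrative, not taken from the patent:

```python
import math

def angle_at(a, b, c):
    # Angle in degrees at vertex b of the triangle a-b-c.
    u = tuple(x - y for x, y in zip(a, b))
    v = tuple(x - y for x, y in zip(c, b))
    dot = sum(x * y for x, y in zip(u, v))
    nu, nv = math.hypot(*u), math.hypot(*v)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def score(candidate, template):
    # Compare a candidate landmark set (heel, forefoot_1, forefoot_2)
    # against expected relations; lower is better.
    heel, f1, f2 = candidate
    return (abs(math.dist(heel, f1) - template["heel_to_f1"])
            + abs(math.dist(heel, f2) - template["heel_to_f2"])
            + abs(angle_at(f1, heel, f2) - template["spread_deg"]))

def best_fit(candidates, template):
    # Select the candidate set whose relations best match the template.
    return min(candidates, key=lambda c: score(c, template))
```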
  • The device is configured to use the landmark or the landmark set calculated in the previous operation to do a final clean and adjustments for the scan. According to an embodiment, this may optionally include, for example, orientation, removing unnecessary parts of the scan, or smoothing or reshaping areas of the scan that have potential or known unwanted anomalies.
  • The scan may then be processed and ready for any manual or automated orthotics and prosthetics design operations.
  • Fig. 1 illustrates an example of a process for biometric shape recognition and designing an orthotic or prosthetic product, according to an embodiment.
  • The device is configured to perform the operations of Fig. 1.
  • Initial smoothing and cleaning, surface sampling, shape recognition and landmark recognition can be processed for biometric shape recognition and landmark positioning.
  • The recognition of the biometric shape is important in order to automatically process a raw scan file into an orthotic or prosthetic product that is customized for the individual being scanned.
  • The desired surface or volume in the scan needs to be found.
  • The unnecessary parts can be deleted from the scan, so that they do not hinder further processing.
  • The scanned object may be orientated so that the automated design process for creating the desired orthotic or prosthetic product can continue.
  • At operation 11, the device is configured to process the 3d geometry data, for example the scan, into a continuous or discontinuous meshed surface. Operation 11 is optional and may be skipped.
  • The device is configured to remove sharp corners and other noise from the scan data. This may be processed by suitable surface smoothing algorithms such as Marching Cubes or Taubin Smooth. The device is also configured to clean clearly unnecessary surfaces, for example by removing small islands and unconnected faces or vertices.
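Taubin smoothing, named above, alternates a shrink step (positive factor lambda) with an inflate step (negative factor mu), which smooths noise without the volume shrinkage of plain Laplacian smoothing. A minimal vertex-neighborhood sketch; the parameter values are common defaults, not taken from the patent:

```python
def laplacian_step(vertices, neighbors, factor):
    # Move each vertex toward (factor > 0) or away from (factor < 0)
    # the average position of its neighbors.
    out = []
    for i, (x, y, z) in enumerate(vertices):
        nbrs = neighbors[i]
        if not nbrs:
            out.append((x, y, z))
            continue
        ax = sum(vertices[j][0] for j in nbrs) / len(nbrs)
        ay = sum(vertices[j][1] for j in nbrs) / len(nbrs)
        az = sum(vertices[j][2] for j in nbrs) / len(nbrs)
        out.append((x + factor * (ax - x), y + factor * (ay - y), z + factor * (az - z)))
    return out

def taubin_smooth(vertices, neighbors, lam=0.5, mu=-0.53, iterations=10):
    # Taubin smoothing: one shrink step (lam) followed by one inflate
    # step (mu) per iteration.
    for _ in range(iterations):
        vertices = laplacian_step(vertices, neighbors, lam)
        vertices = laplacian_step(vertices, neighbors, mu)
    return vertices
```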
  • At operation 13, the device is configured to process surface sampling.
  • The device is configured to process the surface of the 3d geometry data so that the actual shape recognition is able to process it further. This may be performed, for example, in order to collect data for shape recognition or neural networks.
  • The surface sampling may be a set of algorithms used to collect samples from the original scan data (for example from the digital 3d geometry data) using a raycaster or other means.
  • A set of samples may initially be taken from a number of 3d points that may be referred to as targets on the surface. These targets may be equally distanced on the surface or otherwise placed so that relevant areas are covered for the collection of samples.
  • A sample may be a set of 3d points that are placed in a pre-defined formation around the target and moved with a raycaster; the raycasting result itself; vertex positions; surface normals; or any other arbitrary set of data derived from the original scan data.
  • The device is configured to perform shape recognition that detects the desired landmarks or areas in the original surface. If the shape recognition fails, the process returns to the previous operation 13 to iterate for better surface sampling results.
  • The samples of operation 13 are fed to the shape recognition.
  • Shape recognition may be a set of neural networks, traditional algorithms or a combination of both, specifically trained or designed to identify and locate body parts and/or other biometric surface shapes.
  • The shape recognition of operation 14 and the surface sampling of operation 13 may be used in tandem to grade and refine the initial set of samples.
  • The shape recognition may first attempt to find anatomical features, like the heel, in the initial set of samples and use this information as a basis for the next round of surface sampling at operation 13, iteratively gaining a better understanding of the initial samples. This process between operations 13 and 14 may be repeated multiple times until the location and position of the part of the anatomy, for example a foot, is established.
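The alternation between sampling (operation 13) and recognition (operation 14) can be sketched as a toy local-search loop: each round samples the scan points near the current estimate, recognition picks the best candidate (here simply the lowest point, a stand-in for heel detection), and the result seeds a tighter sampling round. Everything in this sketch is illustrative:

```python
def refine_landmark(points, start, rounds=5):
    # Iterative sampling/recognition loop. The shrinking radius mimics
    # "gaining a better understanding" of the initial samples round by
    # round; the min-z "recognition" is a hypothetical stand-in.
    estimate, radius = start, 100.0
    for _ in range(rounds):
        near = [p for p in points
                if sum((a - b) ** 2 for a, b in zip(p, estimate)) <= radius ** 2]
        if not near:
            break  # recognition failed: no samples near the estimate
        estimate = min(near, key=lambda p: p[2])  # "recognized" candidate
        radius *= 0.5                             # refine the sampling region
    return estimate
```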
  • The device is configured to get a relevant surface by 1) removing unnecessary parts, 2) cutting relevant parts out, or 3) using the surface for generating a new surface.
  • The device may perform adaptive hole filling. Holes in the surface of the data may be filled to form a more continuous surface. This operation may be optional.
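Hole filling can be sketched as a centroid fan over the hole's boundary loop; real adaptive filling would also adapt the patch to local curvature. The ordered-boundary input format is an assumption for illustration:

```python
def fill_hole(boundary):
    # Close a hole given its boundary loop (an ordered list of 3d
    # vertices) by connecting each boundary edge to the loop centroid.
    n = len(boundary)
    cx = sum(p[0] for p in boundary) / n
    cy = sum(p[1] for p in boundary) / n
    cz = sum(p[2] for p in boundary) / n
    centroid = (cx, cy, cz)
    # One triangle per boundary edge, each ending at the centroid.
    return [(boundary[i], boundary[(i + 1) % n], centroid) for i in range(n)]
```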
  • The device is configured to detect and place the necessary landmarks for the actual design automation of the orthotic or prosthetic product.
  • FIG. 2 illustrates an example of a device 200 according to an embodiment.
  • The device 200 may for example be configured for biometric shape recognition.
  • The device 200 may comprise at least one processor 202.
  • The at least one processor may comprise, for example, one or more of various processing devices, such as a co-processor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • MCU microcontroller unit
  • The device may further comprise at least one memory 204.
  • The memory may be configured to store, for example, computer program code or the like, such as operating system software and application software.
  • The memory 204 may also be configured to store neural network(s).
  • The memory may comprise one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination thereof.
  • The memory may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices, or semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
  • The device 200 may further comprise a communication interface 208 configured to enable the device 200 to transmit and/or receive information.
  • The communication interface may be configured to provide at least one wireless radio connection, such as a 3GPP mobile broadband connection (e.g. 3G, 4G, 5G); a wireless local area network (WLAN) connection, for example as standardized by the IEEE 802.11 series or the Wi-Fi Alliance; a short range wireless network connection such as a Bluetooth, NFC (near-field communication), or RFID connection; a local wired connection such as a local area network (LAN) connection or a universal serial bus (USB) connection, or the like; or a wired Internet connection.
  • 3GPP mobile broadband connection (e.g. 3G, 4G, 5G)
  • WLAN wireless local area network
  • RFID radio-frequency identification
  • LAN local area network
  • USB universal serial bus
  • The device 200 may further comprise a user interface 210 comprising at least one input device and/or at least one output device.
  • The input device may take various forms, such as a keyboard, a touch screen, or one or more embedded control buttons.
  • The output device may for example comprise a display, a speaker, a vibration motor, or the like.
  • Some component and/or components of the device 200 may be configured to implement this functionality.
  • This functionality may be implemented using program code 206 comprised, for example, in the memory 204.
  • The functionality described herein may be performed, at least in part, by one or more computer program product components such as software components.
  • The device 200 comprises a processor 202 or processor circuitry, such as for example a microcontroller, configured by the program code 206, when executed, to execute the embodiments of the operations and functionality described herein.
  • The functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • Illustrative types of hardware logic components include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), graphics processing units (GPUs), or the like.
  • FPGAs field-programmable gate arrays
  • ASICs application-specific integrated circuits
  • ASSPs application-specific standard products
  • SOCs system-on-a-chip systems
  • CPLDs complex programmable logic devices
  • GPUs graphics processing units
  • The device 200 may comprise means for, or is configured for, performing the method(s) described herein.
  • The means comprise the at least one processor 202 and the at least one memory 204 including program code 206 configured to, when executed by the at least one processor 202, cause the device 200 to perform the method.
  • The device 200 may comprise, for example, a computing device such as a mobile phone, a tablet computer, a laptop, an internet of things (IoT) device, a server or the like.
  • IoT devices include, but are not limited to, consumer electronics, wearables, sensors, and smart home appliances.
  • Although the device 200 is illustrated as a single device, it is appreciated that, wherever applicable, functions of the device 200 may be distributed to a plurality of devices, for example to implement example embodiments as a cloud computing service.
  • Fig. 3 illustrates an example of the 3d geometry data 30 of a part of an anatomy, according to an embodiment.
  • Anatomical 3d scans may have some common issues, such as holes 31 in the surfaces or poorly stitched surfaces. In the case of an inanimate object, these issues tend to be fewer than in a scan of a body part.
  • Anatomical scans may also have extra surfaces 33, such as corners of chairs, fingers, etc., in the scan data. They may distort the shape being recognized from the scan data, as illustrated in the example of Fig. 4.
  • An embodiment may also be able to recognize plaster casts or other replicas of the original surface, which usually have holes and extra anomalies on the surface that would be impossible for a human anatomy.
  • One traditional way of generating foot orthotics is to capture a foot impression in a foam box 50 as illustrated in Fig. 5.
  • The embodiment may also use this kind of 3d shape, which is a negative of the original physical object.
  • An embodiment can process a negative of the original physical shape of the part of the anatomy.
  • Although subjects may be referred to as 'first' or 'second' subjects, this does not necessarily indicate any order or importance of the subjects. Instead, such attributes may be used solely for the purpose of making a difference between subjects.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • Primary Health Care (AREA)
  • Cardiology (AREA)
  • Transplantation (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Dentistry (AREA)
  • Computer Hardware Design (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Architecture (AREA)
  • Nursing (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manufacturing & Machinery (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Orthopedics, Nursing, And Contraception (AREA)
  • Prostheses (AREA)

Abstract

Various embodiments relate to biometric shape recognition. Digital 3d geometry data (30) of at least a part of an anatomy in relation to an orthotic or prosthetic product is received. Biometric shapes are recognized from the digital 3d geometry data, possibly having the identified areas (33) removed. A landmark is placed for each biometric shape, and the digital 3d geometry data having the landmarks is output. Devices (200), methods, and computer programs are disclosed.

Description

BIOMETRIC SHAPE RECOGNITION
TECHNICAL FIELD
[0001] The present disclosure generally relates to orthotics or prosthetics. In particular, some embodiments of the disclosure relate to computer assisted design of bespoke products based on anatomical data.
BACKGROUND
[0002] Orthotics and/or prosthetics are used to treat a range of medical conditions, disabilities or musculoskeletal anomalies. Orthoses and prosthetic sockets are traditionally manufactured by hand, via craft or very basic computer assisted methods. However, the industry is moving towards 3d scanning patients and designing the products using this geometry. The final product is then created from a digital design using milling systems, or increasingly often via 3d printing.
[0003] The process usually begins with a clinical assessment where the clinician (orthotist, prosthetist, podiatrist or a similar qualified person) assesses the patient's condition and the issue or disability and decides the intervention they wish to effect. This intention is usually specified in a prescription or order form, either digitally or on paper. The other part of the assessment is the patient shape capture, which can be done through direct 3d scanning of the patient or through plaster casting or similar methods. The negative impression captured is then often 3d scanned in order to have the patient geometry in a digital format.
[0004] These two inputs (order data or specification and 3d shape data) are then usually processed by a designer in a computer assisted design (CAD) system, one at a time, into a digital file that can be manufactured. This is typically a g-code file in the case of milling, or an stl file for 3d printing. This design work can be very time-consuming, and the designer has to make many decisions and estimations on the placement of features, orientations, trimline placement and material thicknesses. They also need to ensure that everything in the prescription or specification is considered. This manual design process is not only time-consuming but prone to errors. Each designer will create a slightly different output, which is usually not repeatable once the design is finished. These issues cause delays, expenses, and poor fit and functioning of the device. As a result, there may be poor clinical outcomes, or the patient may not use the orthotic or prosthetic product at all.
[0005] The purpose of 3d scanning in the general case is either a) 3d visualization of a physical object with a computer or b) reverse engineering a physical object to make a replica or another physical object that accurately matches the original one. In case a) accuracy is not important and the scanners can therefore be low-cost. In case b) accuracy is usually a requirement, and the scanners are therefore very expensive. To make customized orthoses and/or prosthetics, the scanner is typically low-cost, because it is not financially feasible to buy very expensive 3d scanners for each orthotist. The scans are therefore also generally of low quality, and combined with a poor user experience this may make producing customized orthopedic or prosthetic products challenging.
[0006] In both cases a) and b) above, the physical objects being scanned are very different every time. The size, shape and proportions, and even the surface quality of the scans, can vary from person to person, for example because of body hair. These kinds of anatomical scans are therefore more difficult than the general scanning case.
[0007] A further issue is that, unlike general non-biological scan targets, the targets here are not inanimate objects that remain stable and motionless. Humans, animals and other organisms constantly breathe and move involuntarily. This causes issues in the scan, such as poorly stitched or disconnected surfaces, holes in the scan and poor triangulation. A body part's shape can also change depending on position and time of day. The reasons mentioned above present a unique set of constraints and problems that may be complex and time consuming to resolve with manual tools.
[0008] Publication W02013071416A1 discloses a scanning method. It requires a known positioning aid such as a rack, and it also requires a ground reference, which limits the scan options: for example, the bottom side of the foot or the sides may be impossible to scan because of the rack. It requires a reference object, which may be slow, difficult or even expensive to use. It also requires calibration.
SUMMARY
[0009] This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0010] It is an objective of the disclosure to provide biometric shape recognition and processing. The foregoing and other objectives may be achieved by the features of the independent claims. Further implementation forms are apparent from the dependent claims, the description, and the figures.
[0011] According to a first aspect, a device is configured for design of an orthotic or prosthetic product. The device is further configured to: Receive digital 3d geometry data of at least a part of an anatomy in relation to the orthotic or prosthetic product. Smooth a surface of the digital 3d geometry data using a smoothing algorithm. Identify areas that are not valid as a desired representation of the digital 3d geometry data. Recognize biometric shapes from the digital 3d geometry data having the identified areas removed. Place a landmark for each biometric shape. Output the digital 3d geometry data having the landmarks. This solution may not require any reference objects or reference planes. Furthermore, the solution may use various kinds of 3d scanner devices or 3d scan data as an input. It may not require a known orientation or positioning. The solution may work without human intervention, creating very fast results. The scan may be random, and if it does not include the human part that is being scanned, it will produce an error indicating that a new scan is required. [0012] According to an implementation form of the first aspect, the device may be further configured to: remove the identified areas. This solution may facilitate work and planning when there are no disturbances that may disturb the recognition process.
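The first-aspect pipeline (receive, smooth, remove invalid areas, recognize shapes, place landmarks, output) can be sketched as follows. This is a hypothetical Python sketch: every function name, the placeholder heuristics, and the 1000.0 coordinate bound are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the first-aspect pipeline. All helpers below are
# illustrative placeholders, not the actual disclosed algorithms.

def smooth_surface(points):
    # Placeholder: a real device would apply e.g. a Taubin-style smoother here.
    return points

def identify_invalid_areas(points):
    # Placeholder heuristic (assumed): flag points far outside plausible
    # anatomical bounds, e.g. stray scanner artifacts.
    return {i for i, p in enumerate(points) if max(abs(c) for c in p) > 1000.0}

def recognize_biometric_shapes(points):
    # Placeholder: a real device would run trained shape recognition here.
    return [points] if points else []

def place_landmark(shape):
    # Placeholder landmark: centroid of the recognized shape.
    n = len(shape)
    return tuple(sum(p[i] for p in shape) / n for i in range(3))

def process_scan(points):
    """Receive -> smooth -> remove invalid areas -> recognize -> landmark."""
    points = smooth_surface(points)
    invalid = identify_invalid_areas(points)
    points = [p for i, p in enumerate(points) if i not in invalid]
    shapes = recognize_biometric_shapes(points)
    if not shapes:
        # Mirrors the behaviour described above: a new scan is required.
        raise ValueError("no biometric shape found; a new scan is required")
    return points, [place_landmark(s) for s in shapes]
```

As in the description, no reference object or known orientation is assumed by the sketch; recognition failure simply signals that a new scan is needed.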
[0013] According to an implementation form of the first aspect, the device may be further configured to: identify areas that are not valid as a desired representation of the digital 3d geometry data; remove the identified areas; and recognize biometric shapes from the digital 3d geometry data having the identified areas removed. This solution may enable filtering out possibly disturbing parts of the digital 3d geometry data. It may improve the recognition because invalid parts of the 3d geometry data are filtered out. The input can include normal operating room objects such as sofa corners and irrelevant biometric data, which may be removed.
[0014] According to an implementation form of the first aspect, the device may be further configured to: smooth a surface of the digital 3d geometry data using a smoothing algorithm. This solution may remove or reduce sharp corners in the digital 3d geometry data that may disturb the recognition operation.
[0015] According to an implementation form of the first aspect, the device may be further configured to: orientate the digital 3d geometry data having the landmarks so that the design of the orthotic or prosthetic product is based on the oriented digital 3d geometry data. This solution may provide a reasonable basis for further processing of the digital 3d geometry data to design the digital orthotic or prosthetic product.
[0016] According to an implementation form of the first aspect, the device may be further configured to: process the digital 3d geometry data into a meshed surface so that the smoothened digital 3d geometry data is based on the meshed surface. This solution may clean and smoothen the surface and provide a reasonable basis for representing the anatomy of the surface of the digital 3d geometry data.
[0017] According to an implementation form of the first aspect, the device may be further configured to: apply marching cubes to remove noise from the digital 3d geometry data. This solution may clean and smoothen the surface.
[0018] According to an implementation form of the first aspect, for the identifying of areas the device is further configured to: recognize, from dimensions of the digital 3d geometry data, oversize objects compared to the anatomy; and remove the oversize objects. This solution may remove disturbing or erroneous parts of the digital 3d geometry data with respect to the anatomy.
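The oversize-object check might be implemented as a bounding-box comparison, as in the following sketch; the 0.5 m extent limit is an assumed illustrative threshold, not a value from the disclosure.

```python
# Sketch of an oversize-object check: compare the point cloud's bounding box
# against plausible anatomy dimensions. The limit below is an assumption.

MAX_ANATOMY_EXTENT = 0.5  # metres; assumed upper bound for e.g. a foot scan

def bounding_box_extents(points):
    """Return the (dx, dy, dz) extents of a 3d point cloud."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    return tuple(maxs[i] - mins[i] for i in range(3))

def is_oversize(points, limit=MAX_ANATOMY_EXTENT):
    """True if the cloud is clearly too large to be the scanned anatomy."""
    return any(extent > limit for extent in bounding_box_extents(points))
```

A scan flagged this way would then be passed to the dedicated removal of flat and too-small objects described in the detailed description.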
[0019] According to an implementation form of the first aspect, the device may be further configured to: identify flat objects compared to the anatomy from the digital 3d geometry data; and remove the flat objects. This solution may remove disturbing or erroneous parts of the digital 3d geometry data with respect to the anatomy.
[0020] According to an implementation form of the first aspect, the device may be further configured to: apply an average surface normal to categorize the surface of the digital 3d geometry data. This solution may better establish the intended surface of the anatomy.
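One way such an average-surface-normal categorization could look is sketched below. Treating a patch as a flat support (e.g. a table) when its average normal is nearly vertical, and the 0.95 threshold, are assumptions for illustration rather than the disclosed method.

```python
# Illustrative sketch: categorize a patch of triangles by its average unit
# normal. A near-vertical average normal suggests a floor/table plane rather
# than anatomy. The 0.95 threshold is an assumed value.
import math

def triangle_normal(a, b, c):
    """Unit normal of a triangle given its three vertices."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(x * x for x in n)) or 1.0
    return [x / length for x in n]

def average_normal(triangles):
    """Normalized average of the unit normals of a patch of triangles."""
    total = [0.0, 0.0, 0.0]
    for a, b, c in triangles:
        n = triangle_normal(a, b, c)
        total = [total[i] + n[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in total)) or 1.0
    return [x / length for x in total]

def looks_like_flat_support(triangles, threshold=0.95):
    """Categorize a patch as a flat support if its average normal is
    almost exactly vertical (z-aligned)."""
    return abs(average_normal(triangles)[2]) > threshold
```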
[0021] According to an implementation form of the first aspect, the device may be further configured to: sample the digital 3d geometry data having the identified area removed before performing the recognition, to collect data for the recognition. This solution may enable better shape recognition based on the sampled surface.
[0022] According to an implementation form of the first aspect, the device may be further configured to: collect samples from the digital 3d geometry data, wherein a sample comprises a set of 3d points that are placed in a pre-defined formation around a target. Certain points may be set in a pre-defined formation so that shape recognition may be applied.
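A pre-defined formation around a target could, for instance, be a ring of points, as in this sketch. The ring radius, the point count and the xy-plane placement are assumptions; a real sampler would project the formation onto the scanned surface, e.g. with a raycaster.

```python
# Hypothetical sketch of one sample: a ring of 3d points in a pre-defined
# formation around a target point. Radius and count are assumed parameters.
import math

def ring_formation(target, radius=0.01, count=8):
    """Place `count` points evenly on a circle of `radius` around `target`
    (here in the xy-plane; a real sampler would then move each point onto
    the scanned surface, e.g. via raycasting)."""
    tx, ty, tz = target
    points = []
    for k in range(count):
        angle = 2.0 * math.pi * k / count
        points.append((tx + radius * math.cos(angle),
                       ty + radius * math.sin(angle),
                       tz))
    return points
```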
[0023] According to an implementation form of the first aspect, the device may be further configured to: recognize the biometric shapes based on the sampled digital 3d geometry data only. This solution may enable focusing the shape recognition on the sampled data.
[0024] According to an implementation form of the first aspect, the device may be further configured to: return to perform the sampling if the recognition does not recognize any biometric shape, and re-sample the digital 3d geometry data based on the non-recognized biometric shape. This solution may enable one or more iteration rounds between sampling and shape recognition, and gradually improve the shape recognition results.
[0025] According to an implementation form of the first aspect, the device may be further configured to: fill in holes of the surface of the digital 3d geometry data having the recognized biometric shapes. This solution may further process the data to better match the anatomy.
[0026] According to a second aspect, a method is provided for design of an orthotic or prosthetic product, the method comprising: Receiving digital 3d geometry data of at least a part of an anatomy in relation to the orthotic or prosthetic product. Recognizing biometric shapes from the digital 3d geometry data. Placing a landmark for each biometric shape. Outputting the digital 3d geometry data having the landmarks.
[0027] According to an implementation form, the method of the second aspect may be executed in the device according to any implementation form of the first aspect. [0028] According to a third aspect, a computer program is provided. The computer program may comprise program code configured to cause performance of the method of any implementation form of the second aspect, when the computer program is executed on a computer.
[0029] According to a fourth aspect, a computer program product comprising a computer readable storage medium for storing program code is provided. The program code may comprise instructions for performing any implementation form of the second aspect.
[0030] Implementation forms of the disclosure can thus provide a device, a method, a computer program, and a computer program product for biometric shape recognition. These and other aspects of the disclosure will be apparent from the example embodiment(s) described below.
DESCRIPTION OF THE DRAWINGS
[0031] The accompanying drawings, which are included to provide a further understanding of the example embodiments and constitute a part of this specification, illustrate example embodiments and together with the description help to explain the example embodiments. In the drawings:
[0032] Fig. 1 illustrates an example of a process for biometric shape recognition and designing an orthotic or prosthetic product, according to an embodiment;
[0033] Fig. 2 illustrates an example of a device for biometric shape recognition, and further designing an orthotic or prosthetic product, according to an embodiment; [0034] Fig. 3 illustrates an example of the 3d geometry data of a part of an anatomy, according to an embodiment;
[0035] Fig. 4 illustrates an example of the 3d geometry data of a part of an anatomy, according to an embodiment; and
[0036] Fig. 5 illustrates an example of a negative source for the 3d geometry data, according to an embodiment.
[0037] Like references are used to designate like parts in the accompanying drawings.
DETAILED DESCRIPTION
[0038] Reference will now be made in detail to example embodiments, examples of which are illustrated in the accompanying drawings. The detailed description provided below in connection with the appended drawings is intended as a description of the present embodiments and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
[0039] While the embodiments address biometric shape recognition for orthotic and prosthetic products, the embodied solution could be utilized in many other medical devices and in other industries where bespoke products are designed based on anatomical data.
[0040] According to an embodiment, a device comprises software configured to solve the problem of automatic batch work for different user demands. More specifically, the device is configured to manipulate input 3d data (such as a 3d scan model) and parameters (such as an order form) into a completed design file (stl or g-code, for example). The embodiment may also solve the problem of generating multiple products at the same time, possibly with scalability, and creates a logic for human understandable processes. The workflow of the method makes batch processing fast while the automation is of reasonable quality, and allows for user intervention and validation of critical design elements. [0041] The software or logic which determines the specific process for each design case depends on the orthotic or prosthetic product that needs to be designed for each patient. The automatic custom 3d geometry data generation applies a set of rules that describe the logic which generates the shape of the product (such as the digital 3d model). These rules may be specified in the set-up stage of the device. These rules can be converted to different operations in the actual generation process, for example validation operations during automatic 3d model generation.
[0042] According to an embodiment, a device is configured by software to automatically recognize a biometric shape, for example a hand, foot or head, from the 3d geometry data, such as raw 3d scan data, clean the data and finally orientate it in a way that further automatic processing for designing the orthotic or prosthetic product can continue.
[0043] The device comprises components that are configured as follows: At operation 1, recognize a volume or surfaces of interest. Use for example marching cubes to remove small noise from the original scan. Recognize, from the dimensions of the scan, if it is significantly too large to be exactly the object being scanned. If it is too big, then use dedicated recognition to remove entirely flat objects (for example table corners, sofa surfaces) and too small objects (shafts, pipes, rack supports etc.) from the scan. Also use average surface normals to roughly categorize surfaces of the digital 3d model. Remove the areas that are not clearly valid as desired surface candidates.
[0044] At operation 2, recognize candidates of the landmarks. Landmarks or landmark sets that define the recognition of the desired shape are recognized. Each recognition candidate set includes the first landmark and secondary landmarks (2 or more). The secondary landmarks may be required to orientate and finalize cleaning of the scan. Each landmark set can include for example a heel center and two forefoot landmarks, each having unique and mathematically distinguishable anatomical characteristics around them. When there is more than one candidate combination of a first landmark and secondary landmarks attached to it, the device selects the best fit. The best fit is mathematically calculated.
[0046] When finding the candidate landmark sets, the device uses mathematical computing to recognize locations of the landmarks. In an embodiment, the device is configured to use both surface sampling of the scan and landmark relations computations. These landmark relations include for example a distance between the landmarks or angles they make with each other. When all candidates are found, the best fit is selected as the final landmark set for the next operation.
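The best-fit selection from landmark relations might be sketched as a simple scoring over pairwise distances, as below; the function names and the expected distances used in the test are invented for illustration, not anatomical reference data.

```python
# Illustrative sketch: rank candidate landmark sets by how well their
# pairwise distances match expected anatomical distances. Angles between
# landmarks could be scored the same way; distances alone are shown here.
import math

def dist(a, b):
    """Euclidean distance between two 3d points."""
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def fit_error(candidate, expected_distances):
    """Sum of absolute deviations between a candidate set's pairwise
    distances and the expected ones. Lower is better."""
    error = 0.0
    for (i, j), expected in expected_distances.items():
        error += abs(dist(candidate[i], candidate[j]) - expected)
    return error

def best_fit(candidates, expected_distances):
    """Select the candidate landmark set with the smallest fit error."""
    return min(candidates, key=lambda c: fit_error(c, expected_distances))
```

For a foot, a candidate set could be (heel center, forefoot landmark 1, forefoot landmark 2); the set whose geometry best matches the expected relations becomes the final landmark set for the next operation.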
[0047] At operation 3, the device is configured to use the landmark or the landmark set calculated in the previous operation to do the final cleaning and adjustments for the scan. According to an embodiment, this may optionally include for example orientation, removing unnecessary parts of the scan, or smoothing or reshaping areas of the scan that have potential or known unwanted anomalies. After operation 3, the scan may be processed and ready for any manual or automated orthotics and prosthetics design operations.
[0048] FIG. 1 illustrates an example of a process for biometric shape recognition and designing an orthotic or prosthetic product, according to an embodiment. The device is configured to perform the operations of Fig. 1. Initial smoothening and cleaning, surface sampling, shape recognition and landmark recognition can be processed for biometric shape recognition and landmark positioning.
[0049] The recognition of the biometric shape is important in order to automatically process a raw scan file into an orthotic or prosthetic product that is customized for the individual being scanned. To enable automatic processing of the scan data (the digital 3d model), the desired surface or volume in the scan needs to be found. After this, the unnecessary parts can be deleted from the scan, so that they do not hinder further processing. Finally, the scanned object may be orientated so that the automated design process for creating the desired orthotic or prosthetic product can continue.
[0050] At operation 11, the device is configured to process the 3d geometry data, for example the scan, into a continuous or discontinuous meshed surface. Operation 11 is optional and may be skipped.
[0051] At operation 12, the device is configured to remove sharp corners and other noise from the scan data. This may be done by suitable surface smoothing algorithms such as Marching Cubes or Taubin Smooth. The device is also configured to clean clearly unnecessary surfaces, for example by removing small islands and unconnected faces or vertices.
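A Taubin-style smoothing pass of the kind mentioned above could be sketched as follows; the lambda/mu parameters and the iteration count are assumed typical values, not parameters from the disclosure.

```python
# Minimal Taubin-style smoothing sketch: a shrink step with positive lambda
# followed by an inflate step with negative mu, which damps sharp noise
# with little overall shrinkage. Parameters below are assumed defaults.

def laplacian_step(points, neighbors, factor):
    """Move each vertex towards (factor > 0) or away from (factor < 0)
    the average of its neighbors; all moves use the old positions."""
    out = []
    for i, p in enumerate(points):
        nbrs = neighbors[i]
        if not nbrs:
            out.append(p)
            continue
        avg = tuple(sum(points[j][k] for j in nbrs) / len(nbrs)
                    for k in range(3))
        out.append(tuple(p[k] + factor * (avg[k] - p[k]) for k in range(3)))
    return out

def taubin_smooth(points, neighbors, lam=0.5, mu=-0.53, iterations=10):
    """Alternate shrink (lam) and inflate (mu) Laplacian steps."""
    for _ in range(iterations):
        points = laplacian_step(points, neighbors, lam)
        points = laplacian_step(points, neighbors, mu)
    return points
```

On a vertex chain with a single spiked vertex, the spike is flattened toward its neighborhood while the overall extent of the chain is largely preserved.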
[0052] At operation 13, the device is configured to process surface sampling. The device is configured to process the surface of the 3d geometry data so that the actual shape recognition is able to process it further. This may be performed, for example, in order to collect data for shape recognition or neural networks.
[0053] The surface sampling may be a set of algorithms used to collect samples from the original scan data (for example from the digital 3d geometry data) using a raycaster or other means. A set of samples may initially be taken from a number of 3d points that may be referred to as targets on the surface. These targets may be equally distanced on the surface or otherwise placed so that relevant areas are covered for the collection of samples. A sample may be a set of 3d points that are placed in a pre-defined formation around the target and moved with a raycaster, the raycasting result itself, vertex positions, surface normals, or any other arbitrary set of data derived from the original scan data.
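Equally distanced targets could, for example, be chosen by farthest-point sampling over the scan vertices, as in this sketch; the greedy strategy and the fixed starting vertex are assumptions for illustration, not the disclosed sampler.

```python
# Sketch of placing sample targets evenly over a point cloud with greedy
# farthest-point sampling: each new target maximizes its distance to the
# targets already chosen. Starting from vertex 0 is an assumption.
import math

def farthest_point_targets(points, count):
    """Return indices of `count` targets spread evenly over `points`."""
    chosen = [0]  # arbitrary starting vertex (assumed)
    dists = [math.dist(points[0], p) for p in points]
    while len(chosen) < count:
        # Pick the point farthest from the current target set.
        nxt = max(range(len(points)), key=lambda i: dists[i])
        chosen.append(nxt)
        # Update each point's distance to its nearest chosen target.
        for i, p in enumerate(points):
            dists[i] = min(dists[i], math.dist(points[nxt], p))
    return chosen
```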
[0054] At operation 14, the device is configured to perform shape recognition that detects the desired landmarks or areas in the original surface. If the shape recognition fails, the process returns to the previous operation 13 to iterate better surface sampling results. [0055] The samples of operation 13 are fed to the shape recognition. Shape recognition may be a set of neural networks, traditional algorithms or a combination of both, specifically trained or designed to identify and locate body parts and/or other biometric surface shapes. The shape recognition of operation 14 and the surface sampling of operation 13 may be used in tandem to grade and refine the initial set of samples. For example, the shape recognition may first attempt to find anatomical features, like the heel, in the initial set of samples and use this information as a basis for the next round of surface sampling at operation 13, iteratively gaining a better understanding of the initial samples. This process between operations 13 and 14 may be repeated multiple times until the location and position of the part of the anatomy, for example a foot, is established. [0056] At operation 15, the device is configured to get a relevant surface by 1) removing unnecessary parts, 2) cutting relevant parts off or 3) using the surface for generating a new surface.
[0057] At operation 16, the device may perform adaptive hole filling. Holes in the surface data may be filled to form a more continuous surface. This operation may be optional.
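A minimal form of hole filling can be sketched by locating rim edges (edges used by exactly one triangle) and fanning new triangles from the rim's centroid; real adaptive hole filling, as described above, would be more elaborate. All names here are illustrative.

```python
# Illustrative hole-filling sketch. Edges belonging to exactly one triangle
# lie on a boundary or hole rim; a simple fill adds the rim centroid as a
# new vertex and fans triangles around it.
from collections import Counter

def boundary_edges(faces):
    """Return edges used by exactly one triangle (boundary/hole rims)."""
    count = Counter()
    for a, b, c in faces:
        for e in ((a, b), (b, c), (c, a)):
            count[tuple(sorted(e))] += 1
    return [e for e, n in count.items() if n == 1]

def fill_hole(points, faces, loop):
    """Fill a hole whose rim is the ordered vertex index list `loop`:
    add the rim's centroid as a new vertex and fan triangles to it."""
    centroid = tuple(sum(points[i][k] for i in loop) / len(loop)
                     for k in range(3))
    points = points + [centroid]
    c = len(points) - 1
    new_faces = faces + [(loop[i], loop[(i + 1) % len(loop)], c)
                         for i in range(len(loop))]
    return points, new_faces
```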
[0058] At operation 17, the device is configured to detect and place necessary landmarks for actual design automation of the orthotic or prosthetic product.
[0059] FIG. 2 illustrates an example of a device 200 according to an embodiment. The device 200 may for example be configured for biometric shape recognition. The device 200 may comprise at least one processor 202. The at least one processor may comprise, for example, one or more of various processing devices, such as for example a co-processor, a microprocessor, a controller, a digital signal processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
[0060] The device may further comprise at least one memory 204. The memory may be configured to store, for example, computer program code or the like, for example operating system software and application software. The memory 204 may also be configured to store neural network(s). The memory may comprise one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination thereof. For example, the memory may be embodied as magnetic storage devices (such as hard disk drives, floppy disks, magnetic tapes, etc.), optical magnetic storage devices, or semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
[0061] The device 200 may further comprise a communication interface 208 configured to enable the device 200 to transmit and/or receive information. The communication interface may be configured to provide at least one wireless radio connection, such as for example a 3GPP mobile broadband connection (e.g. 3G, 4G, 5G); a wireless local area network (WLAN) connection such as for example standardized by the IEEE 802.11 series or the Wi-Fi Alliance; a short range wireless network connection such as for example a Bluetooth, NFC (near-field communication), or RFID connection; a local wired connection such as for example a local area network (LAN) connection or a universal serial bus (USB) connection, or the like; or a wired Internet connection.
[0062] The device 200 may further comprise a user interface 210 comprising at least one input device and/or at least one output device. The input device may take various forms such as a keyboard, a touch screen, or one or more embedded control buttons. The output device may for example comprise a display, a speaker, a vibration motor, or the like.
[0063] When the device 200 is configured to implement some functionality, some component and/or components of the device 200, such as for example the at least one processor 202 and/or the memory 204, may be configured to implement this functionality. Furthermore, when at least one processor is configured to implement some functionality, this functionality may be implemented using program code 206 comprised, for example, in the memory 204.
[0064] The functionality described herein may be performed, at least in part, by one or more computer program product components such as software components. According to an embodiment, the device 200 comprises a processor 202 or processor circuitry, such as for example a microcontroller, configured by the program code 206 when executed to execute the embodiments of the operations and functionality described herein. Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), graphics processing units (GPUs), or the like.
[0065] The device 200 may comprise means for, or be configured for, performing the method(s) described herein. In one example, the means comprise the at least one processor 202 and the at least one memory 204 including program code 206 configured to, when executed by the at least one processor 202, cause the device 200 to perform the method.
[0066] The device 200 may comprise for example a computing device such as for example a mobile phone, a tablet computer, a laptop, an internet of things (IoT) device, a server or the like. Examples of IoT devices include, but are not limited to, consumer electronics, wearables, sensors, and smart home appliances. Although the device 200 is illustrated as a single device it is appreciated that, wherever applicable, functions of the device 200 may be distributed to a plurality of devices, for example to implement example embodiments as a cloud computing service.
[0067] Fig. 3 illustrates an example of the 3d geometry data 30 of a part of an anatomy, according to an embodiment. In Fig. 3 only partial surfaces of the anatomy are successfully captured as input data. Anatomical 3d scans may have some common issues, such as holes 31 in the surfaces or poorly stitched surfaces. In the case of an inanimate object, however, these tend to be fewer than what one finds in a scan of a body part. Furthermore, due to the nature of the scanning process at the clinic, anatomical scans may have extra surfaces 33 such as corners of chairs, fingers etc. in the scan data. These may distort the shape being recognized from the scan data, as illustrated in the example of Fig. 4.
[0068] An embodiment may also be able to recognize plaster casts or other replicas of the original surface, which usually have holes and extra anomalies on the surface that would be impossible for a human anatomy. For example, one traditional way of generating foot orthotics is to capture a foot impression in a foam box 50 as illustrated in Fig. 5. The embodiment may also use this kind of a 3d shape, which is a negative of the original physical object. An embodiment can process a negative of the original physical shape of the part of the anatomy.
[0069] Further features of the method directly result from the functionalities and parameters of the methods and devices, as described in the appended claims and throughout the specification, and are therefore not repeated here.
[0070] Any range or device value given herein may be extended or altered without losing the effect sought. Also, any embodiment may be combined with another embodiment unless explicitly disallowed.
[0071] Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims, and other equivalent features and acts are intended to be within the scope of the claims.
[0072] It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to 'an' item may refer to one or more of those items.
[0073] The steps or operations of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Aspects of any of the embodiments described above may be combined with aspects of any of the other embodiments described to form further embodiments without losing the effect sought.
[0074] The term 'comprising' is used herein to mean including the method, blocks, or elements identified, but that such blocks or elements do not comprise an exclusive list, and a method or device may contain additional blocks or elements.
[0075] Although subjects may be referred to as 'first' or 'second' subjects, this does not necessarily indicate any order or importance of the subjects. Instead, such attributes may be used solely for the purpose of making a difference between subjects.
[0076] It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this specification.

Claims

1. A device (200) for design of an orthotic or prosthetic product, the device configured to: receive digital 3d geometry data (30) of at least a part of an anatomy in relation to the orthotic or prosthetic product; detect biometric shapes from the digital 3d geometry data; characterized by identify a landmark for each biometric shape, wherein each landmark includes unique and mathematically distinguishable anatomical characteristics around it; place the landmark for each biometric shape; output the digital 3d geometry data having the landmarks, wherein the digital 3d geometry data is configured for the orthotic or prosthetic product.
2. The device according to claim 1, characterized by further configured to: identify areas (33) that are not valid as a desired representation of the digital 3d geometry data; remove the identified areas; and recognize biometric shapes from the digital 3d geometry data having the identified areas removed.
3. The device according to any preceding claim, characterized by further configured to: smooth a surface of the digital 3d geometry data using a smoothing algorithm.
4. The device according to any preceding claim, characterized by further configured to: orientate the digital 3d geometry data having the landmarks so that design of the orthotic or prosthetic product is based on the oriented digital 3d geometry data.
5. The device according to any preceding claim, characterized by further configured to: process the digital 3d geometry data into a meshed surface so that the smoothened digital 3d geometry data is based on the meshed surface.
6. The device according to any preceding claim, characterized in that the device is further configured to: apply marching cubes to remove noise from the digital 3d geometry data.
7. The device according to any preceding claim, wherein, for the identifying of areas, the device is characterized in that it is further configured to: determine, from dimensions of the digital 3d geometry data, an oversize or undersize object compared to the anatomy; and remove the oversize or undersize object.
8. The device according to claim 7, characterized in that the device is further configured to: identify flat objects compared to the anatomy from the digital 3d geometry data; and remove the flat objects.
9. The device according to any preceding claim, characterized in that the device is further configured to: apply an average surface normal to categorize the surface of the digital 3d geometry data.
10. The device according to any preceding claim, characterized in that the device is further configured to: sample the digital 3d geometry data having the identified areas removed, before being configured to perform the recognition, in order to collect data for the recognition.
11. The device according to claim 10, characterized in that the device is further configured to: collect samples from the digital 3d geometry data, wherein a sample comprises a set of 3d points that are placed in a pre-defined formation around a target.
12. The device according to claim 10, characterized in that the device is further configured to: recognize the biometric shapes based on the sampled digital 3d geometry data only.
13. The device according to claim 12, characterized in that the device is further configured to: return to the sampling if the recognition does not recognize any biometric shape, and re-sample the digital 3d geometry data based on the non-recognized biometric shape.
14. The device according to any preceding claim, characterized in that the device is further configured to: fill in holes (31) of the surface of the digital 3d geometry data having the recognized biometric shapes.
15. A method for design of an orthotic or prosthetic product, the method comprising:
receiving (11, 12, 13) digital 3d geometry data of at least a part of an anatomy in relation to the orthotic or prosthetic product;
detecting (14) biometric shapes from the digital 3d geometry data;
characterized by:
identifying (15, 17) a landmark for each biometric shape, wherein each landmark includes unique and mathematically distinguishable anatomical characteristics around it;
placing (15, 17) the landmark for each biometric shape; and
outputting the digital 3d geometry data having the landmarks, wherein the digital 3d geometry data is configured for the orthotic or prosthetic product.
16. A computer program comprising program code configured to cause performance of the method according to claim 15, when the computer program is executed on a computer.
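Claim 3 refers only to "a smoothing algorithm" without naming one. As a non-limiting illustration, a single Laplacian smoothing pass — one common choice for mesh surfaces, not necessarily the algorithm intended by this application — moves each vertex a fraction of the way toward the average of its neighbours:

```python
def laplacian_smooth(vertices, edges, iterations=1, factor=0.5):
    """vertices: list of (x, y, z) tuples; edges: list of (i, j) index pairs."""
    # Build an adjacency list from the edge pairs.
    neighbours = {i: set() for i in range(len(vertices))}
    for i, j in edges:
        neighbours[i].add(j)
        neighbours[j].add(i)
    verts = [list(v) for v in vertices]
    for _ in range(iterations):
        new = []
        for i, v in enumerate(verts):
            ns = neighbours[i]
            if not ns:
                new.append(v[:])  # isolated vertex: leave it in place
                continue
            # Average position of the neighbouring vertices.
            avg = [sum(verts[n][k] for n in ns) / len(ns) for k in range(3)]
            # Move the vertex partway toward that average.
            new.append([v[k] + factor * (avg[k] - v[k]) for k in range(3)])
        verts = new
    return verts

# A noisy three-vertex strip: the middle vertex is displaced in z.
vs = [(0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (2.0, 0.0, 0.0)]
es = [(0, 1), (1, 2)]
smoothed = laplacian_smooth(vs, es, iterations=1, factor=0.5)
print(smoothed[1])  # middle vertex pulled toward its neighbours: [1.0, 0.0, 0.5]
```

In practice such a pass would run over the meshed surface of claim 5, with the iteration count and factor tuned so anatomical detail is not smoothed away.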
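Claims 7 and 8 describe removing objects whose dimensions are implausible for the scanned anatomy. A minimal sketch of one way such a filter could work — the helper names and the bounding-box criterion are assumptions, since the application does not specify an implementation — compares each object's axis-aligned bounding box against expected anatomical dimensions:

```python
def bbox_dims(points):
    """Axis-aligned bounding-box dimensions of a set of 3d points."""
    mins = [min(p[k] for p in points) for k in range(3)]
    maxs = [max(p[k] for p in points) for k in range(3)]
    return [maxs[k] - mins[k] for k in range(3)]

def filter_objects(objects, expected_dims, tolerance=0.5):
    """Keep objects whose bounding box lies within +/- tolerance (as a
    fraction) of the expected anatomical dimensions; oversize or undersize
    objects, e.g. scanning artefacts, are dropped."""
    kept = []
    for obj in objects:
        dims = bbox_dims(obj)
        ok = all(
            (1 - tolerance) * e <= d <= (1 + tolerance) * e
            for d, e in zip(dims, expected_dims)
        )
        if ok:
            kept.append(obj)
    return kept

# Hypothetical example: a foot-sized object survives, a small speck does not.
foot = [(0.0, 0.0, 0.0), (25.0, 10.0, 12.0)]
speck = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
kept = filter_objects([foot, speck], expected_dims=(25.0, 10.0, 12.0))
print(len(kept))  # 1
```

A flat-object test in the spirit of claim 8 could reuse `bbox_dims` and reject objects whose smallest dimension is near zero relative to the anatomy.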
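Claim 11 defines a sample as "a set of 3d points that are placed in a pre-defined formation around a target". One possible formation — purely illustrative, as the claim does not fix a geometry — is a ring of evenly spaced points in a plane around the target:

```python
import math

def ring_sample(target, radius=1.0, count=8):
    """Place `count` points in a circular formation in the x-y plane
    around a 3d target point (one example of a pre-defined formation)."""
    tx, ty, tz = target
    pts = []
    for k in range(count):
        angle = 2.0 * math.pi * k / count
        pts.append((tx + radius * math.cos(angle),
                    ty + radius * math.sin(angle),
                    tz))
    return pts

sample = ring_sample((5.0, 5.0, 0.0), radius=2.0, count=8)
print(len(sample))  # 8 points arranged around the target
```

Surface properties collected at such points (heights, normals) could then feed the shape recognition of claim 12, with re-sampling around a new target per claim 13 when recognition fails.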
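Claim 14 mentions filling in holes of the surface. Before a hole can be filled it must be located; a standard way to find hole rims in a triangle mesh — a generic technique, not taken from this application — is to collect edges that are used by exactly one triangle:

```python
from collections import Counter

def boundary_edges(triangles):
    """Edges referenced by exactly one triangle lie on a boundary,
    i.e. on the rim of a hole or on the outer rim of an open surface."""
    counts = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            counts[tuple(sorted(e))] += 1  # orientation-independent edge key
    return [e for e, n in counts.items() if n == 1]

# Two triangles sharing the edge (1, 2): that edge is interior,
# the four remaining edges form the boundary rim.
tris = [(0, 1, 2), (1, 3, 2)]
rim = boundary_edges(tris)
print(sorted(rim))
```

Once the rim loops are known, a hole could be filled by, for example, triangulating each loop against its centroid; the application does not specify which filling strategy is used.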
PCT/FI2021/050559 2020-08-20 2021-08-19 Biometric shape recognition WO2022038314A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20205807A FI20205807A1 (en) 2020-08-20 2020-08-20 Biometric shape recognition
FI20205807 2020-08-20

Publications (1)

Publication Number Publication Date
WO2022038314A1 true WO2022038314A1 (en) 2022-02-24

Family

ID=77519135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2021/050559 WO2022038314A1 (en) 2020-08-20 2021-08-19 Biometric shape recognition

Country Status (2)

Country Link
FI (1) FI20205807A1 (en)
WO (1) WO2022038314A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090306801A1 (en) * 2006-11-27 2009-12-10 Northeastern University Patient specific ankle-foot orthotic device
WO2013071416A1 (en) 2011-11-17 2013-05-23 Techmed 3D Inc. Method and system for forming a virtual model of a human subject
US20200238626A1 (en) * 2017-07-21 2020-07-30 Nike, Inc. Custom Orthotics and Personalized Footwear


Also Published As

Publication number Publication date
FI20205807A1 (en) 2022-02-21

Similar Documents

Publication Publication Date Title
US9474582B2 (en) Personalized orthopedic implant CAD model generation
Wang et al. From laser-scanned data to feature human model: a system based on fuzzy logic concept
US10089413B2 (en) Systems and methods for designing and generating devices using accuracy maps and stability analysis
EP2313868B1 (en) Method, apparatus, signals and media for producing a computer representation of a three-dimensional surface of an appliance for a living body
EP2415025B1 (en) Method and apparatus for applying a rotational transform to a portion of a three-dimensional representation of an appliance for a living body
Comotti et al. Multi-material design and 3D printing method of lower limb prosthetic sockets
CN112307876B (en) Method and device for detecting node
US11741277B2 (en) Predictive modeling platform for serial casting to correct orthopedic deformities
WO2016102027A1 (en) Method of using a computing device for providing a design of an implant
EP3798979A1 (en) Technologies for determining the spatial orientation of input imagery for use in an orthopaedic surgical procedure
Grosland et al. Automated hexahedral meshing of anatomic structures using deformable registration
WO2022038314A1 (en) Biometric shape recognition
EP1761109A2 (en) Method and apparatus for surface partitioning using geodesic distance measure
Yang et al. Direct boolean intersection between acquired and designed geometry
FI20205808A1 (en) Assistive device and method for designing an orthotic or prosthetic product
Lievers et al. Patient-specific modelling of the foot: automated hexahedral meshing of the bones
Bradley Rapid prototyping models generated from machine vision data
Lu et al. Subdivision surface-based finish machining
US20220277113A1 (en) Computer-implemented method for changing a model geometry of an object
Vergeest et al. Freeform surface copy and paste techniques for shape synthesis
CN116342672B (en) Hip joint actual position registration method and device, electronic equipment and storage medium
CN113421326B (en) Walking aid equipment design method and system based on sitting posture point detection
CN116363184B (en) Hip joint position registration method and device, electronic equipment and storage medium
CN111297477B (en) Preparation method of full-information scoliosis model by compounding body surface data and X-ray film
Pham et al. Cost-effective solutions and tools for medical image processing and design of personalised cranioplasty implants

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21762066

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19/06/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21762066

Country of ref document: EP

Kind code of ref document: A1