WO2023152530A2 - Multidimensional anatomic mapping, descriptions, visualizations, and translations - Google Patents


Info

Publication number
WO2023152530A2
WO2023152530A2 (application PCT/IB2022/000814)
Authority
WO
WIPO (PCT)
Prior art keywords
anatomic
paths
site
hierarchical
path
Application number
PCT/IB2022/000814
Other languages
French (fr)
Other versions
WO2023152530A3 (en)
Inventor
Matthew A. MOLENDA
Original Assignee
Mofaip, Llp
Application filed by Mofaip, Llp filed Critical Mofaip, Llp
Priority to US 18/225,872 (published as US20230368878A1)
Priority to PCT/IB2023/000535 (published as WO2024023584A2)
Publication of WO2023152530A2
Publication of WO2023152530A3

Classifications

    • G - PHYSICS
        • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
                    • G16H10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
                • G16H15/00 - ICT specially adapted for medical reports, e.g. generation or transmission thereof
                • G16H30/00 - ICT specially adapted for the handling or processing of medical images
                    • G16H30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
                    • G16H30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • FIG. 2 shows the same anatomic visualization 10 as FIG. 1 with the multidimensional hierarchical anatomic map underlaid rather than overlaid to prevent obscuring the borders of the anatomic visualization 10.
  • the cursor 14 is again placed over an anatomic site 18 as a temporary site selection. It is contemplated that the specific anatomic site 18 could be selected and saved with a pin or anchor or other such tagging mechanism in the software application as one skilled in the art would know.
  • the method of this invention applies such pin or anchor simultaneously in all layers, levels, dimensions, custom axes, and anatomic sites, at any angle.
  • the second point of interest 324 is auto-related to the termination point 322; however, since they have different path-defined axes, a secondary axis is used to auto-relate the two points, in this case, the cross-sectional axis.
  • the relationships between the points are described as "the second point of interest 324 is left from the termination point 322" and "the termination point 322 is right from the second point of interest 324," automatically applying the correct custom axial information.
  • the cross-section axes are reflected in mirror view, even when mirror view does not contain a midline or median. Therefore, the path axes do not require an overall midline, and are mirrored independently of the midline.
  • Two pins 52 are depicted.
  • the diagnosis component and other components like the pin description can change dynamically in the dynamic anatomic address, but the anatomic site information and visualization of the location can remain static. This is especially helpful at different time points, since diagnoses can change with additional information, like a pathology report from the biopsy.
  • FIG. 11 depicts four pins 52 on the same anatomic site 18, in this example the “central forehead” site, that are automatically related to one another. Descriptions of all pins on the same anatomic site 18 are automatically related, even though, in this case, some pins cross or approach the body midline. Additionally, linguistic descriptions that describe the magnitude of axial deviations from each other and from center are called “magnitude modifiers” and may also be applied when appropriate. Example automatic relationships with magnitude modifiers for the pins 52 in FIG. 11 include:
  • B is superior from D
  • FIG. 13 further depicts hierarchical painting that includes treatment recommendations 90 associated with the specific anatomic sites 18.
  • Patients often receive multiple recommendations, sometimes with ten or more products listed, and can understandably become confused about what to use, where, and when.
  • Visual hierarchical painting through mapping with anatomic site or site group descriptions can help to color code this information in an easily digestible format for the patient, in any language, and creates a visual map of recommendations.
  • recommendations could build upon each other in areas of overlap.
  • clinicians can stamp their recommendations, or highlight areas where to apply them, on standardized maps or directly on patient images, creating a personalized visual map of recommendations with simultaneous, condensed, and simplified translated text descriptions of “what to use where” on the body.
  • FIGs. 15 and 16 depict an electronic regimen map 100 with printable labels 102 and affiliated educational instructions 105, respectively.
  • the educational instructions 105 provide visualizations by product, by area, and by condition. A paper workflow generates a digestible report that can be handed to the patient, while the electronic regimen can be saved into the patient’s chart for tracking the regimen over time and initiating other workflows. The electronic regimen can evolve over time, with input from the patient and the professional, in their preferred language.
  • FIG. 22 depicts the customizability of a patient representative avatar to accurately represent patient characteristics 140 of a specific patient, including but not limited to adjusting skin tone, facial features, body features, body shape, and body size. It is contemplated that textural maps could further enhance the avatar.
  • a three-dimensional total body photograph can adjust all dynamic anatomic addresses and describe, categorize, and relate all areas of the body, in any coded, linguistic, or symbolic language.
  • high-resolution three-dimensional captures detect, count, categorize, and assign gross and dermatoscopic-level features of lesions, such as dermatoscopic-level morphologies and measurements.
  • the user interface allows for avatar manipulation 145, such as opening the mouth and sticking the tongue out to document on oral dynamic anatomic addresses or removing underwear to document anogenital dynamic anatomic addresses.
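
As a minimal sketch of the midline-free axis mirroring described above (each path's cross-section axes are reflected in mirror/selfie view independently, with no overall midline required), one could represent each path's axes as its own pair of unit vectors in image space. The representation below is an assumption for illustration, not the patent's implementation:

```python
# Sketch (assumed representation): each path carries its own lateral and
# vertical unit vectors, so mirroring is local to the path and needs no
# global body midline.

def mirror_axes(axes):
    """Reflect a single path's custom axes for selfie/mirror view.

    The x components of both axis vectors flip; every path is mirrored
    independently of any midline.
    """
    (lx, ly) = axes["lateral"]
    (vx, vy) = axes["vertical"]
    return {"lateral": (-lx, ly), "vertical": (-vx, vy)}
```

Because the transform uses only the path's own axis definition, a path whose bounding box is rotated or offset mirrors correctly without reference to any other path.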

Abstract

The present invention includes a mapping and labeling engine for automatic, real-time multidimensional visualization of hierarchical map elements under a point, pin, path segment, entire path, or group of paths with corresponding enhanced descriptors that include components of the descriptor, such as laterality, prefix, or suffix, presented in any language, with natural linguistic sequencing, code strings, and categorization. Any point on the anatomic map is simultaneously labeled in all levels of an anatomic hierarchy, with each level having different custom axes, to create enhanced descriptors; and any path on any level can be targeted, visualized, segmented, colored, patterned, or associated with more data. Custom axes are also used to relate different points or paths with human and machine-readable descriptions. Anatomic sites are visually and linguistically sub-segmented and automatically related by using custom axes, angles, rotations, and offsets of each path, or groups of paths. A method for hierarchical visualization and hierarchical painting of diagnoses, treatments, morphologies, and symptoms to multidimensional anatomic sites is also included. Furthermore, the mapping and labeling engine works in reverse, with mirrored axes to allow visualization, mapping, and labeling, and thus the method includes mirrored axes to allow real-time, multidimensional labeling as an outside observer or in a "selfie/mirror view."

Description

UTILITY PATENT APPLICATION
CONFIDENTIAL INFORMATION
Applicant: MoFalP, LLC
Address: 4327 Pine Ridge Circle, Monclova, OH 43542, USA
Title: Multidimensional anatomic mapping, descriptions, visualizations, and translations
First Named Inventor: Matthew A. Molenda
Attorney: McCarthy, Lebit, Crystal & Liffman Co., L.P.A.
Customer No.: 113863
Attorney Docket No.: AML.P023.PCT
RELATED APPLICATIONS
[0001] This application claims priority from U.S. Provisional Patent Application Serial No. 63/265,216, filed December 10, 2021, and U.S. Provisional Patent Application Serial No. 63/369,717, filed July 28, 2021. Each of these applications is hereby incorporated by reference in its entirety for all purposes.
FIELD OF THE INVENTION
[0002] This invention relates to medical systems and, more particularly, to multidimensional visualization, description, translation, and mapping of data to hierarchical anatomic sites.
SUMMARY OF THE INVENTION
[0003] Currently, mapping and labeling of specific anatomic sites within medical records is achievable with electronic health records that include two-dimensional or three-dimensional avatars and mapped anatomic sites. Some even have the ability to drop a pin and output an anatomic site description. However, none of the current methods are capable of combined real-time site description enhancement, translation, encoding, relational descriptions, site categorization, cross-mapping, and color coding through all levels of an anatomic hierarchy. The present invention operates on real-time axis switching among path-defined anatomic sites that have different levels, layers, centers, offsets, or rotations, with simultaneous labeling and relational descriptions enabled by these different axes. This dramatically improves upon existing anatomical mapping of medical records by enabling multifunctional and multidimensional mapping simultaneously through all levels of a patient’s anatomic hierarchy in real time, creating a more comprehensive patient electronic health record that can track features associated with anatomic sites and regions over time.
[0004] Methods and systems are described herein for specific, enhanced, reproducible, translated, visualized, dynamic, descriptive, and encoded anatomic sites that facilitate improved communication, documentation, tracking, understanding, and description of anatomic sites or regions affected by diseases, treatments, symptoms, and morphologies across different systems and languages. For example, for an English-speaking physician to communicate with a patient who does not know English, the two would traditionally have to rely on a translator or translation device to discuss the patient’s diagnosis and treatment. Furthermore, current labeling of specific anatomic sites within medical records, and associating that anatomy labeling with diagnoses, images, records, billing codes, anatomic site codes, tags, external links, translations, and visualizations, is a manual, multi-step process that requires medical knowledge. Additionally, labeling different sections within a defined anatomic region has been a manual process, performed within a single layer, axis, or dimension at a time. However, by creating a system that incorporates multiple coded, linguistic, or symbolic languages and in real time allows for dynamic visualizations of defined anatomic sites, the patient and physician can simultaneously view the anatomic site of interest, the associated site information, and any updates in their own preferred languages, thus improving communication between the two, eliminating the need for extensive medical knowledge of anatomical names, and improving tracking of the patient’s conditions and treatment.
[0005] For example, with this system a non-English-speaking patient can stamp or hierarchically paint the anatomic locations or regions of their concerns (e.g., symptoms, lesions, rashes) on avatars in a mirrored view in their preferred language; simultaneously and in real time, the English-speaking physician can visualize the patient’s report in English from an outside observer view. The patient can even attach their own photos of lesions they are concerned about directly to the anatomic site in their preferred language, and the physician can add treatment recommendations, observations, diagnoses, and other data in English, which the patient will see in their preferred language. Further, if treatment is ongoing, for example for a rash, and different anatomic regions of the rash are responding differently to treatment, the patient can report on the reproduced distribution in a visual and descriptive workflow without actually knowing anatomy names, thus automatically creating tracking points for disease response by time, by site, and by treatment.
[0006] Conventional solutions are unable to allow for simultaneous, reproducible, and automatically enhanced translations and visualizations of anatomy, nor do they offer translated and visualized travel through an anatomic hierarchy. The present invention applies automatically enhanced translations and visualizations to multiple mapping workflows to facilitate communication, documentation, understanding, and tracking. To overcome limitations in conventional methods of anatomic mapping, the system creates a dynamic anatomic address for every anatomic point and region on the body, with each address serving as a multidimensional data tracking and collation point. Thus, data relevant to anatomic site and distribution can be tracked and collated for different time points in different patients, and for populations.
[0007] Dynamic anatomic addresses apply to points of interest, which may be represented as pinpoints, segments of anatomic sites, groups of anatomic site segments, complete anatomic sites, groups of anatomic sites, different levels of hierarchy, different groups, and different organ systems simultaneously. Visual definitions and dynamic custom coordinates of anatomic addresses can be looked up by anatomic site name in any combination of language or code, and progressively sub-segmented with directional and magnitude modifier terms with mixed code and language order (on which coded and linguistic dissection is applied to provide the visualizations and translations); and those definitions can be detected and visualized in different views, images, and multimedia automatically. The system uses hierarchical painting of anatomy to automatically capture distributions, body surface area calculations, and intensity (such as first-degree, second-degree, and third-degree burns) and simultaneously provide a standardized description of each distribution segment, an isolated visual preview of each distribution segment, and a combined visual preview of the different distribution segments, in both outside observer and mirror views. This provides a data matrix that tracks disease distribution, intensity, and surface area automatically with linguistic descriptions, machine-readable descriptions, and visualizations, more effectively tracking a patient’s conditions.
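
The distribution/intensity/surface-area matrix built from hierarchical painting could be sketched as a small aggregation. This is a minimal illustration, not the system's implementation; the field names and body-surface-area values below are assumptions:

```python
# Sketch: each painted distribution segment carries an anatomic site
# name, an intensity label, and a body-surface-area fraction; totals
# are aggregated per intensity for the tracking matrix.

def distribution_matrix(painted_segments):
    totals = {}
    for seg in painted_segments:
        totals.setdefault(seg["intensity"], 0.0)
        totals[seg["intensity"]] += seg["bsa_percent"]
    return totals

# Illustrative burn distribution (invented example values).
burns = [
    {"site": "left forearm",   "intensity": "second-degree", "bsa_percent": 1.5},
    {"site": "left hand",      "intensity": "second-degree", "bsa_percent": 1.0},
    {"site": "left upper arm", "intensity": "first-degree",  "bsa_percent": 2.0},
]
```

Each segment here would also retain its standardized site description, so the same records can drive both the linguistic summary and the per-segment visual previews.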
[0008] The dynamic anatomic addresses, along with standardized anatomy codes, names, and symbols, patient data, diagnosis data, encounter data, tags, and other data, are used to generate a language-agnostic file naming, grouping, and exporting function with optional universal symbolic low-character-count delimiters to automatically write a language-agnostic story about an anatomic site. The data is combined into a string and separated with meaningful delimiters into an order-independent, structureless, meaningful story. This story does not have to fit into an electronic health record (EHR) or other defined data structure. This creates a data block that can be truncated in a file name, encrypted into static or evolving QR codes (or other codes, with or without encryption), stored in exported file metadata, exported to a database or file wrapper (such as a Digital Imaging and Communications in Medicine (DICOM) wrapper), filtered, searched, de-identified, encoded, and tagged.
[0009] The anatomic site that the dynamic anatomic address correlates to is visually depicted for the system users on anatomic maps. These visualizations can be two- or three-dimensional. Each anatomic map is composed of layers of paths, shapes, and compound paths stacked in two-dimensional or three-dimensional space. A fourth dimension is added when the maps are compared over time, such as for tracking points, anatomic sites, or groups of anatomic sites. Multiple, unlimited dimensions are added when each path or layer has its own set of axes, centers, offsets, angles, medians, and rotations, as does each group of paths or layers, or group of groups. Each path or layer may function alone, and has its own center, offset, rotation, elevation, depression, topography, surface area, hierarchy level, laterality, prefix, suffix, name, axis definition, and metadata. Each path or layer can independently scale, align, rotate, move, remain static, or be shown, hidden, colored, patterned, or highlighted.
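
The delimiter-separated, order-independent story of paragraph [0008] could be sketched as follows. The delimiter symbols and component names here are placeholders of my choosing, not the universal symbolic delimiters the system actually uses:

```python
import re

# Sketch: each story component is self-describing via its leading
# delimiter, so the string can be parsed regardless of component order
# and without a surrounding data structure.
DELIMS = {"site": "@", "diagnosis": "#", "date": "~"}
_REV = {v: k for k, v in DELIMS.items()}
_SYMS = re.escape("".join(DELIMS.values()))

def write_story(components):
    # Concatenate delimiter + value for each component.
    return "".join(DELIMS[k] + str(v) for k, v in components.items())

def read_story(story):
    # Recover components by delimiter, independent of their order.
    return {_REV[s]: v for s, v in re.findall(f"([{_SYMS}])([^{_SYMS}]*)", story)}
```

A round trip preserves every component, which is the property that lets the same string live in a file name, QR code, or DICOM wrapper.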
Each path or layer can also dependently scale, align, rotate, move, remain static, or be shown, hidden, colored, patterned, or highlighted - depending on its membership in a group or on patient characteristics. In one iteration, group membership might be the paths or layers that are part of the left eye group on a facial diagram group, which represents a group (left eye) of groups (facial). In one iteration, patient characteristics might be selectively hidden or filtered out when those characteristics are immaterial to the patient, such as the multidimensional maps for deciduous dentition (children’s teeth) map groups for an adult patient or male genitalia map groups for a female patient. A single path or layer of the patient’s anatomic map may belong to multiple groups of the patient’s anatomic map simultaneously. Further, paths or layers can be selectively targeted and automatically related to one another based on their custom axial definitions or group axial definitions. Groups have the same dependent and independent properties as layers or paths.
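
A per-path record with its own transform, axis-related properties, and polyhierarchical group membership might look like the sketch below. The field names are assumptions for illustration, not the patent's data model:

```python
from dataclasses import dataclass

# Sketch: each path owns its center offset, rotation, and hierarchy
# metadata, and may belong to several groups simultaneously.
@dataclass
class AnatomicPath:
    name: str
    level: int                   # depth in the anatomic hierarchy
    laterality: str = ""         # e.g. "left", "right", or ""
    rotation_deg: float = 0.0    # this path's own rotation
    offset: tuple = (0.0, 0.0)   # this path's own center offset
    groups: frozenset = frozenset()

paths = [
    AnatomicPath("forehead", level=3,
                 groups=frozenset({"face", "head and neck"})),
    AnatomicPath("left eye", level=3, laterality="left",
                 groups=frozenset({"face", "left eye group", "head and neck"})),
]

def members(group):
    # A single path can appear in many groups (polyhierarchy).
    return [p.name for p in paths if group in p.groups]
```

Group-level operations (show, hide, color, align) would then iterate over `members(...)` while each path still applies its own rotation and offset.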
[0010] Multidimensional maps may be overlaid or underlaid, and separately aligned in different dimensions, on anatomic images, avatars, and diagrams. Map visualization software is used to selectively filter or show all levels of an anatomic hierarchy under a point, such as a cursor or a placed pin, while simultaneously showing the point’s relationship to each level of the hierarchy, along with enhanced descriptions and translations of the anatomic site at that particular point. In one iteration, an additive color sequence is used to visualize all levels of an anatomic hierarchy simultaneously at a point, with a real-time, color-coded legend that displays enhanced, translated, and encoded anatomic descriptions based on the multidimensional custom axes.
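
As a toy illustration of the hit-test behind such a legend: collect every hierarchy level containing a point, outermost first, and pair each with the next color in the sequence. The regions below are invented axis-aligned rectangles standing in for real anatomic paths, and the color sequence is only one possible ROYGBIV-style choice:

```python
# Sketch: nested rectangles approximate the FIG. 1 hierarchy.
# Each entry is (name, level, (x0, y0, x1, y1)), outermost to innermost.
REGIONS = [
    ("head and neck", 0, (0, 0, 100, 140)),
    ("head", 1, (0, 0, 100, 100)),
    ("face", 2, (10, 30, 90, 100)),
    ("superior left forehead", 3, (50, 30, 90, 50)),
    ("left superior paramedian forehead", 4, (50, 30, 70, 40)),
]
COLORS = ["red", "orange", "yellow", "green", "blue"]  # additive sequence

def legend_at(x, y):
    # Every region containing the point contributes one legend row.
    hits = [name for name, _level, (x0, y0, x1, y1) in REGIONS
            if x0 <= x <= x1 and y0 <= y <= y1]
    return [(name, COLORS[i]) for i, name in enumerate(hits)]
```

Indenting each row by its position in this list reproduces the hierarchy-indented legend described for FIG. 1.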
[0011] When there are two or more points of interest, the custom axes of the points or the custom axes of a group membership are used in artificial neural networks to automatically describe their directional relationships to one another, in human-readable and machine-readable language. Encoding, using an autoencoder-type neural network, and cross-mapping of each component of the enhanced anatomic site descriptions shown on the map also occur simultaneously and in real time. In one iteration, encoding is converting points on an anatomic image to a code string containing laterality and anatomic site name from a dictionary, such as the International Classification of Diseases (“ICD-11”), a globally used diagnostic tool for epidemiology, health management, and clinical purposes. In another iteration, encoding simultaneously occurs for the prefixes, suffixes, and enhanced modifier description components of the enhanced and translated anatomic site name, with natural linguistic sequencing applied through natural language processing. In another iteration, cross-mapping would automatically show all codes or names from other anatomic lexicons, such as the New York University numbering system (“NYU numbers”), which allows clinicians to easily identify areas of the body with standardized, non-hierarchical, two-dimensional numbered diagrams. Cross-mapping and automatic correlation to multidimensional maps enable lesion tracking over time, for all levels of an anatomic hierarchy, at any point on the diagram, at any cross-mapping point.
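
A minimal sketch of the cross-mapping lookup: one mapped site resolves to its entry in every configured lexicon at once. The lexicon tables and code values below are invented placeholders, not real ICD-11 or NYU entries:

```python
# Sketch: per-lexicon tables keyed by canonical site name.
# All code strings are placeholders for illustration only.
LEXICONS = {
    "icd11": {"forehead": "ICD11-FH", "face": "ICD11-FACE"},
    "nyu":   {"forehead": "NYU-1",    "face": "NYU-0"},
}

def cross_map(site):
    # Return the site's code in every lexicon that defines it.
    return {lex: table[site] for lex, table in LEXICONS.items() if site in table}
```

Because every lexicon is consulted on each lookup, a pin placed once can be tracked over time under any of its cross-mapped identifiers.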
[0012] In one iteration, the anatomy mapping and labeling engine is applied to document enhanced, standardized, optionally segmented, and multidimensional locations and descriptions of procedures, diagnoses, treatments, symptoms, morphologies, or other patient characteristics. Hierarchical selectors in the present invention allow for documented pins and distribution segments to travel through different levels of the anatomic hierarchy and dimensions of the anatomic map, with real-time visualization, translation, and site descriptions for each anatomic site in the hierarchy. The present invention also includes a method of hierarchical painting of anatomic sites and site segments, to simultaneously visualize and label distributions and intensities of properties like diseases, morphologies, symptoms, and patient education on anatomic maps. Additionally, these anatomic maps can allow for visualization from both outside observer and selfie views. Expanding this example, perspective defaults can be mirrored, allowing for documentation, translation, labeling, mapping, and visualization in both outside observer and selfie views.
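
The hierarchical travel of a documented pin through levels of the anatomic hierarchy might be sketched as a walk up a parent table. The table mirrors the five-level hierarchy shown for FIG. 1; the function names are mine:

```python
# Sketch: parent links for one lineage of the anatomic hierarchy.
PARENT = {
    "left superior paramedian forehead": "superior left forehead",
    "superior left forehead": "face",
    "face": "head",
    "head": "head and neck",
}

def travel_up(site, levels=1):
    """Promote a documented site to an ancestor level."""
    for _ in range(levels):
        site = PARENT.get(site, site)  # stop at the hierarchy root
    return site

def lineage(site):
    """Full chain from the site to the hierarchy root."""
    chain = [site]
    while site in PARENT:
        site = PARENT[site]
        chain.append(site)
    return chain
```

A hierarchical selector in the user interface would call something like `travel_up` as the user steps a pin or distribution segment through levels, re-rendering the site description at each step.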
[0013] This system dramatically improves existing anatomical mapping of medical records. Multifunctional and multidimensional mapping is enabled simultaneously through all levels of a patient’s anatomic hierarchy in real-time, creating a more comprehensive patient electronic health record capable of tracking features associated with anatomic sites and regions over time. Still other benefits and advantages of the invention will become apparent to those skilled in the art to which it pertains upon a reading and understanding of the following detailed specification.
BRIEF DESCRIPTION OF DRAWINGS
[0014] FIG. 1 is a screenshot of an example of hierarchical visualization overlay of the present invention.
[0015] FIG. 2 is a screenshot of an example of hierarchical visualization underlay of the present invention.
[0016] FIG. 3 is a cross-section diagram depicting a representation of anatomic site paths in a single plane.
[0017] FIG. 4 describes custom axes parallel and perpendicular to the cross-section of anatomic site paths in a single plane, axial mirroring, and auto-relation.
[0018] FIG. 5 is a screenshot of a plane with a rotated bounding box and custom defined axes.
[0019] FIG. 6 is a screenshot of a patient facial diagram displaying enhanced anatomic sites, hierarchical painting, auto-relation, hierarchical travel, and other capabilities.
[0020] FIG. 7 is a screenshot of a patient facial diagram displaying a translation with quick zoom multidimensional targeting and hierarchical diagnostic distribution painting options.
[0021] FIG. 8 is a screenshot showing map synchronization and pin-level data.
[0022] FIG. 9 is a screenshot of a two-dimensional model with customized axes and defining enhanced granularity zones and automatic pin relationships.
[0023] FIG. 10 is a screenshot of a two-dimensional model with customized axes defining an enhanced detail plane, and customization of an axial boundary in the enhanced granularity zone.
[0024] FIG. 11 is a screenshot of pins on an anatomic site that are automatically related to each other.
[0025] FIG. 12 is a screenshot of a hierarchical painting of morphologies with time point tracking.
[0026] FIG. 13 is a screenshot of a hierarchical painting with associated treatment recommendations.
[0027] FIG. 14 depicts an example printed output of targeted, isolated and combined visual previews with associated treatment recommendations.
[0028] FIG. 15 depicts an example digital regimen map with associated treatment recommendations on combined and isolated visual previews.
[0029] FIG. 16 depicts an example printed output with associated treatment recommendations with isolated visual previews and simplified, combined anatomic site descriptions.
[0030] FIG. 17 is a screenshot of an anatomic site name builder associated with a specific pin along with an isolated visual preview.
[0031] FIG. 18 depicts progressive linguistic and visual sub-segmentation of a dynamic anatomic address.
[0032] FIG. 19 is a partial screenshot of dynamic anatomic addresses with select patient characteristics.
[0033] FIG. 20 is a partial screenshot of dynamic anatomic addresses with alternative selected patient characteristics to dynamically filter the map.
[0034] FIG. 21 is a three-dimensional model depicting a dynamic anatomic map with associated addresses.
[0035] FIG. 22 is a three-dimensional avatar visualization depicting patient characteristics.
[0036] FIG. 23 is a two-dimensional diagram of multiple anatomic perspectives depicting the same pin location.
[0037] FIG. 24 is a patient photograph.
[0038] FIG. 25 is a patient photograph overlaid with the accompanying hierarchical dynamic anatomic addressing system.
DETAILED DESCRIPTION OF DRAWINGS
[0039] The present invention is a software application configured to map and label specific anatomic sites and to provide enhanced directional modifiers labeled simultaneously in unlimited layers, hierarchies, tissues, organ systems, and groups, with simultaneous categorization and translation into any coded, linguistic, or symbolic language in real-time, multidimensional space. The software application of the present invention provides greater functionality than existing electronic anatomic mapping: it produces human- and machine-readable, translated, and enhanced real-time descriptions, with or without magnitude modifiers, that automatically describe relationships, distances, surface areas, volumes, and findings between two or more anatomic sites, regions, or pins. Distance from center can be detected, as well as proximities to center, angles, rotations, medians, and offsets, and these detections are used to create more meaningful and improved anatomic site names, descriptions, borders, and visualizations. The magnitude of axial deviations can automatically and optionally reorder the directional modifier descriptors to make the combined descriptions and visualizations more meaningful, precise, and accurate. Linguistic and visual sub-segmentation are also applied to create enhanced descriptions with corresponding visual previews (in isolation or related to groups) that both change and realign dynamically with the patient when needed and remain static in terms of standardized language and code sequences.
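
Directional auto-relation with magnitude modifiers and magnitude-based reordering, as described above, might be sketched as follows. The coordinate conventions (image y grows downward, +u taken as the patient's left) and the 10-unit "slightly" threshold are assumptions of mine for illustration:

```python
import math

def relate(a, b, axis_deg=0.0):
    """Describe pin `a` relative to pin `b` in a site's custom axes.

    `axis_deg` is the rotation of the site's own axes from the image
    axes, so the description follows the path-defined axis, not the
    screen.
    """
    dx, dy = a[0] - b[0], a[1] - b[1]
    t = math.radians(axis_deg)
    u = dx * math.cos(t) + dy * math.sin(t)    # lateral deviation
    v = -dx * math.sin(t) + dy * math.cos(t)   # vertical deviation
    terms = []
    if abs(v) > 1e-9:
        terms.append(("superior" if v < 0 else "inferior", abs(v)))
    if abs(u) > 1e-9:
        terms.append(("left" if u > 0 else "right", abs(u)))
    # Reorder so the dominant axial deviation leads the description,
    # and qualify small deviations with a magnitude modifier.
    terms.sort(key=lambda nm: -nm[1])
    words = [("slightly " if mag < 10 else "") + name for name, mag in terms]
    return " and ".join(words) if words else "at the same point"
```

For example, a pin well above and barely left of another would be described with the vertical term first and the lateral term softened.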
[0040] Referring to the drawings, FIG. 1 depicts an anatomic visualization 10 with five hierarchically labeled, color-coded, enhanced, and translated anatomic site descriptions 20 correlating with the point of the cursor 14 over color-synced anatomic site regions in this example. The multidimensional anatomic map that simultaneously displays all hierarchical levels has been overlaid on anatomic visualization 10, thus obscuring it. The five enhanced, translated anatomy labels 20 corresponding with an anatomic site 18 are listed in the color-coded legend 12 with a portion of the dynamic anatomic address for this point. A dynamic anatomic address in the system is associated with an anatomic site and acts as a bucket in which to store data associated with that anatomic site, but it is dynamic in that it simultaneously stores the data in relative buckets across its lineage, as well as in proximately located buckets, and is capable of moving based on patient anatomy changes. For example, as a patient naturally ages, the body grows and stretches, and may sustain trauma. Therefore, a dynamic anatomic address on a twenty-year-old may move multiple times as the patient ages and experiences events such as weight gain, childbirth, or injury, as one skilled in the art would understand.
The dynamic anatomic address must include an anatomic site name and may include any combination of additional applicable descriptive elements, including but not limited to: laterality, prefixes, suffixes, enhanced modifiers, custom descriptions, triangulations, measurements, automatic relationship descriptions to other anatomic sites or pins, translations, synonyms, cross-mappings, categorizations, coordinates, paths, axes, rotations, angles, topographies, depressions, elevations, visual definitions, segments, polyhierarchical membership, patterns, colors, intensities, surface areas, linked metadata, centers, offsets, calculations, deviations, proximities, pin IDs, test IDs, historical information, visual previews, dates, notes, identifiers, groups, lineage (hierarchical parents, siblings, children, cousins, relatives, neighbors, etc.), zoning, visualization based on lineage, natural linguistic sequencing, attachments, language, multimedia, detections, symptoms, symbolic definitions, morphologies, skin tone, skin type, sex/gender variability, race/ethnicity variability, time-points, outside-observer view, mirror/selfie view, positions, and site segments. Further, a dynamic anatomic address can overlay or underlay with respect to images and dynamically recalibrate to images such as diagrams, photos, videos, avatars, or partial images, regardless of the image size or segments in view, while simultaneously providing visualization of self, relatives, and lineage and providing correct anatomic site names and categorizations in any coded, linguistic, or symbolic language.
[0041] The present embodiment depicts the following regions: left superior paramedian forehead 21, superior left forehead 22, face 23, head 24, and head and neck 25. In the present embodiment, the user has placed the cursor 14 over path-defined anatomic sites 18 with custom axes as a temporary site selection in the anatomic visualization 10. An enhanced description 16 is displayed as a result of the cursor 14 hovering over the anatomic site 18. The color-coded legend 12 additionally tells the user the hierarchy of the anatomic site 18 that the cursor 14 is within and over, with a portion of the dynamic anatomic address for the anatomic site 18 visible in multiple hierarchical levels based on the custom axes of each level. In the present embodiment, the color-coded legend 12 is indented by the relative hierarchical level. It is contemplated that the color-coded legend 12 could use any visually distinct code, such as a ROYGBIV (red, orange, yellow, green, blue, indigo, violet) scheme or any dynamic color or pattern sequence. With opacity adjustments as an alternative to overlaying, color blending occurs, creating difficulties in hierarchical visualization; however, these can be accounted for programmatically through site segmentation, addition, subtraction, multiplication, and other methods when overlays are needed over other avatars, photos, and other images. It is also contemplated that dynamically changing overlay and underlay rules and functions provides relevant anatomic visualizations 10 to the user and targets the dynamic anatomic address 18 with additional visualization steps depending on its underlay or overlay status.
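The contemplated ROYGBIV legend option could be sketched as a simple depth-to-color mapping, with colors cycling when a hierarchy is deeper than seven levels. The function and variable names are illustrative, not part of the disclosed implementation:

```python
# ROYGBIV scheme as one contemplated visually distinct code.
ROYGBIV = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]

def legend_colors(hierarchy):
    """Return (indent_depth, color, site_name) rows for a legend that is
    color-coded and indented by relative hierarchical level."""
    return [(depth, ROYGBIV[depth % len(ROYGBIV)], name)
            for depth, name in enumerate(hierarchy)]
```

For the FIG. 1 example, the broadest region ("head and neck") would render at depth 0 in red, with each successive level indented one step further and assigned the next color.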
[0042] FIG. 2 shows the same anatomic visualization 10 as FIG. 1 with the multidimensional hierarchical anatomic map underlaid rather than overlaid to prevent obscuring the borders of the anatomic visualization 10. The cursor 14 is again placed over an anatomic site 18 as a temporary site selection. It is contemplated that the specific anatomic site 18 could be selected and saved with a pin, anchor, or other such tagging mechanism in the software application, as one skilled in the art would know. The method of this invention applies such a pin or anchor simultaneously in all layers, levels, dimensions, custom axes, and anatomic sites, at any angle.
[0043] FIG. 3 depicts a diagram of a cross-section 30 representative of a single plane for an anatomic site or site segment and the paths 31, centers 33, segmentation boundaries 34 at different deviations from center, and approach angle 35 to a point of interest 37 related to that anatomic site or site segment. In this example, the cross-section 30 indicates the offset from center 36 for a single path and shows multiple paths 31, including a path that spans two levels 38 and a path of alternate thickness 32. Each path has custom axial definitions, which may differ from one path to another, and a plurality of other properties such as: anatomic site name, identifiers, laterality, prefixes, suffixes, axial definitions (for axes), axial customizations, coordinates, rotations, scales, offsets, centers, thicknesses, bounding boxes, color, pattern, opacity, stroke, visual attributes, animations, group memberships, group axial definitions, curves, zones, linguistic segmentation instructions, visual segmentation instructions, relationships, position, level, mathematical translations, linguistic translations, transformations, links, order, perspective, view, and other metadata. It is contemplated that each path may touch paths in the same level, may span multiple levels, and may touch, overlap, or be separated from other paths (leaving gaps in the level). Gaps may be automatically filled by underlying, overlying, or neighboring paths, passed through, or ignored. It is further contemplated that paths may have different thicknesses in different planes, can belong to multiple groups, and can simultaneously have proximity-based physical neighbors and relatives, defined by the path positions, and data-based neighbors and relatives, defined by data sets such as a text-based anatomic hierarchy. It is contemplated that artificial neural networks are used to describe, translate, relate, encode, and visualize physical and data-based relationships simultaneously.
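The simultaneous, multi-level labeling under a cursor could be sketched as a hit test that collects every containing path across all hierarchical levels. In this illustrative sketch, an axis-aligned bounding box stands in for the real path geometry, and the `Path` record carries only a few of the many path properties enumerated above:

```python
from dataclasses import dataclass

@dataclass
class Path:
    """Hypothetical reduced path record: one name, one hierarchical
    level, and a bounding box standing in for the true path outline."""
    name: str
    level: int
    bbox: tuple  # (xmin, ymin, xmax, ymax)

    def contains(self, x: float, y: float) -> bool:
        xmin, ymin, xmax, ymax = self.bbox
        return xmin <= x <= xmax and ymin <= y <= ymax

def paths_under_cursor(paths, x, y):
    """Every path containing the cursor point, ordered from the broadest
    hierarchical level down, mirroring the multi-level legend of FIG. 1."""
    return sorted((p for p in paths if p.contains(x, y)),
                  key=lambda p: p.level)
```

A real implementation would test containment against the actual path curves and honor gap-filling rules; the sketch only shows how one cursor position can resolve to multiple simultaneous hierarchical labels.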
[0044] FIG. 4 depicts the same cross-section as FIG. 3, except that a median axis 39 has been added which is perpendicular to the cross-section paths 31 and off center. The custom axes for the paths reflect independently of the custom axes of the cross-section, which enables multidimensional mirroring of customized axes even when an anatomic map is off-center, while the cross-section can still reflect along the defined median. Describing the paths for the angled insertion point 320 and termination point 322 in the present embodiment: the insertion point 320 enters on a left medial path and crosses through the median 39, and the termination point 322 is on the right medial path. A second point of interest 324 can be described as on the left central path. In the present embodiment the second point of interest 324 is auto-related to the termination point 322; however, since they have different path-defined axes, a secondary axis is used to auto-relate the two points, in this case the cross-sectional axis. The relationships between the points are described as follows: the second point of interest 324 is left from the termination point 322, and the termination point 322 is right from the second point of interest 324, automatically applying the correct custom axial information. The cross-section axes are reflected in mirror view, even when mirror view does not contain a midline or median. Therefore, the path axes do not require an overall midline and are mirrored independently of the midline. There can be midline-dependent axes, exemplified in this figure as “Right” and “Left,” which mirror along the median. “Right” is shown on the left of this figure because that is how it would appear on the frontal view of human anatomy from an outside-observer perspective.
In a mirror view, “Right” would be on the right, and “Left” would be on the left, resulting in a mirror view of the outside-observer perspective, also known as a “selfie view.” It is contemplated that paths, path segments, path groups, maps, map groups, and relationships can have multiple midlines and medians; for example, the trunk of the body has a median and midline, but so does each extremity (arm or leg) in relation to itself and to the trunk. Another example of multiple medians is when more than one map group and diagram group appears on the same anatomic map set, such as a map set that contains multiple visualizations simultaneously, like multiple views of the face at different angles.
[0045] Proximity-based and data-based grouping also occurs simultaneously, with one iteration applying artificial neural networks. Data-based relationships may be defined within the map structure or from an external source, such as a database. In one example of a structure-based map relationship, both the right and left ears together are one group entity on a single diagram, even though they are not touching each other. In an example of an external data-based relationship, ICD-11 defines the ear (pinna) as a singular concept, with laterality or bilaterality further defined by separate concepts in a separate code group. As each point of interest travels through each path, perpendicular to the plane or at an angle, the axes of interpretation are switched in real-time, enabling descriptions of the angles and measurements needed to reach a terminal point of interest. [0046] FIG. 5 depicts an anatomic plane 45 of a right hand 44 with custom defined axes 40 at a particular rotation and angle. The custom defined axes 40 in this two-dimensional plane 45 intersect at a slight offset from the center for fine tuning and calibration of axial positions. A bounding box 42 is illustrated over the hand 44 with a center point 41. In this example, the figure also depicts the laterality label 43 of the anatomic plane. As one skilled in the art would know, points within the paths, path groups, and segments within these example custom axes could be labeled as proximal, lateral (radial), distal, medial (ulnar), or with a combination of these terms, depending on the point position within the custom axes.
[0047] FIG. 6 is a representative screenshot of an anatomic visualization 10 that depicts simultaneous, hierarchical, multidimensional, real-time coloring and labeling of anatomic sites under a cursor position, with a color-coded legend representing the enhanced, translated anatomic site descriptions for the colored anatomic sites 50. Also depicted are: hierarchically painted anatomy distribution segments with color-coding and patterned, non-patterned, and intensity visualization 51; auto-relation descriptions 54 of the relationships between pins B and C in the pin list 53 (with this embodiment relating pins B and C 52, the pin list 53 describes that “B (this pin) is medial and superior from C” and “C (this pin) is lateral and inferior from B”); simultaneous hierarchical selection, relative visualization (to the borders of each anatomic site), and translation (with the present embodiment describing pin A 52 simultaneously as the “left (superior) paramedian forehead” in the pin list 53, and as “(superior left) forehead”, “face”, “head”, and “head and neck” in the hierarchical selector, which also shows pin A 52 isolated and related to the borders of each anatomic site in the anatomic hierarchy); a description of dynamic anatomic address components for the first list item in the example pin list 53; and other functions. Placed pins 52 on the anatomic visualization 10 indicate locations where patient medical events have occurred. In the present embodiment, dynamic pin descriptions indicate shave biopsies were performed on various locations on the forehead. The pin list 53 shows an isolated visual preview including information associated with each pin 52, including pin order and pin position relative to the selected anatomic site component of the dynamic anatomic address, an automatically populated diagnosis with a description and code, and the dynamic pin description in the same color as the pin, allowing for synchronization to the map and other outputs.
Below the pin list 53, diagnoses 55 are listed in collapsed lists correlating with other pins (melanoma) and painted distribution segments (dermatitis NOS and acne) on the layered anatomic map 50. In the present embodiment, despite pins 52 and hierarchically painted distribution segments with colors, patterns, and opacities occupying anatomic site segments, the color-coded legend displaying the multidimensional site visualizations and translated descriptions relative to the cursor 14 is still visualized without interruption because of its underlayment. Hierarchical overlays, again, would cause real-time visualization obscurement issues. [0048] FIG. 7 depicts the same anatomic visualization 10 in a Spanish translation of the previous English figure. The translation occurs in real-time and simultaneously, allowing multiple users to use the same tools and visualizations in a shared, real-time session but in different languages. It is contemplated that the software application is capable of translating diagnosis extensions, extended descriptions, and all displayed text, including automatic enhanced descriptions of the multidimensional anatomic maps, points, distribution segments, and visualizations, into any coded, linguistic, or symbolic language through a plurality of neural networks and engines. Also depicted is a quick zoom functionality 56 that has translated isolated diagram visualizations to selectively target, find, and zoom in on multidimensional anatomic areas of interest. Also depicted are options in a hierarchical painting method 57 to paint diagnoses with different colors, patterns, intensities, opacities, and properties onto multidimensional anatomic maps or avatars. It is contemplated that surface areas and intensity of involvement are also automatically calculated based on anatomic distribution.
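The contemplated automatic surface-area calculation could, under the simplifying assumption that each painted distribution segment carries a known body-surface-area percentage, reduce to a weighted sum. The intensity weighting and the dictionary keys here are illustrative choices, not the disclosed implementation:

```python
def involved_surface_area(segments):
    """Sum body-surface-area percentages over painted distribution
    segments, scaling each by its painted intensity (default: full)."""
    return sum(s["bsa_percent"] * s.get("intensity", 1.0) for s in segments)
```

For example, a segment covering 4.5% of body surface area painted at half intensity, combined with a fully involved 2.0% segment, would yield 4.25% calculated involvement.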
[0049] It is contemplated to have buckets automatically linked to each dynamic anatomic address that accept, segregate, collate, order, tag, annotate, extract, label, error-correct, and analyze pin-level photos, attachments, and links, and that generate and store dynamic forms (with or without encryption) for printing, signing, sharing, and saving. The forms and files are labeled automatically with dynamic anatomic address metadata in any combination of the filename, file metadata, or file contents. It is contemplated that artificial intelligence and machine learning processes can extract, modify, analyze, and categorize different components of the dynamic anatomic address and the contents of their buckets.
[0050] It is further contemplated that forms will have relevant visual previews, isolated or grouped diagram segments, photographs, thumbnails, and other multimedia, and can accept electronic signatures and other digital inputs while saving back to the relevant buckets, including those in the dynamic anatomic address, such that each bucket will show a dynamic count of relevant photos, attachments, links, and forms contained in it, as depicted in FIG. 8. [0051] Functionality of the software application allows the user to attach a variety of patient data, including but not limited to photos, attachments, links, procedure type, procedure measurements, procedure counts, procedure weight/value, diagnosis, diagnosis category, insurance status, fee schedule, units, country-specific billing rules, region-specific billing rules, deductible status, copay status, coinsurance status, account balance status, discount status, and other billing-associated metadata, as well as comments and other patient data associated with a particular anatomic site. It is contemplated that artificial intelligence and machine learning processes can apply neural networks and computer vision to extract, modify, analyze, segment, combine, and categorize different components of non-anatomy data, the dynamic anatomic address, and the contents of their buckets. It is further contemplated that one skilled in the art would understand that any additional relevant data could be incorporated into the patient record.
[0052] This embodiment further depicts software functionality including printing capability, a visibility toggle, label reordering capability, customization of pins, and different types of pins. It is contemplated that printing could allow a user to print a form that includes the dynamic anatomic address, optionally enhanced anatomic site descriptions, informational codes such as QR codes, visual previews, physical labels with isolated or grouped visual previews in any label format, and all relevant diagnosis, encounter, patient, healthcare provider/physician, and clinic information. It is contemplated that a visibility toggle would hide and show relevant information on the screen. The software functionality would allow a user to reorder the pins. It is contemplated that when reordering occurs, the corresponding list items, visualizations, map, and dynamic anatomic addresses are also dynamically reordered in the list and synchronized to the map. In the present embodiment, a pin list sub-toolbar 58 allows the user to change additional pin-list options, such as pin type, order type (from “A, B, C” to “1, 2, 3” in English, or the equivalent change in alternate languages), grouping, and whether the pin can be placed off the map in the whitespace (i.e., where it is not associated with an anatomic site). It is further contemplated that visual previews could show each pin and distribution segment, along with pin order and position relative to the anatomic site. It is contemplated that additional types of pins could include find pins that find and identify the dynamic anatomic address in any level of hierarchy, in any language, in any dimension, on any image, or an expanded pin allowing further documentation and tagging options, in any language, that are specific to the dynamic anatomic address, the country the application is being used in, diagnosis, procedure, form, or other metadata.
[0053] FIG. 8 further shows a screenshot with real-time patient data 60 synchronized to an anatomic map that includes a mix of alphanumeric and symbolic translations and delimiters. In this example, the patient’s name is followed by a numeric age and a symbolic translation indicating patient sex, a birthday cake emoji precedes the patient's date of birth, and a calendar emoji precedes the encounter date. Symbolic emoji tags are shown for brevity and are also translated with linguistic parallels into any coded or linguistic language. The cursor 14 is shown over the right superior paramedian forehead, as indicated in the color-coded legend 12, and the laterality label 43 indicates “R” for “right,” representing an outside-observer view of the anatomic visualization 10. In mirror or selfie view, the laterality label 43 would transform to an “L” representing “left.” Even though the diagrams and maps shown in this figure do not have the midline of the body centrally located, mirroring the axes still allows for multidimensional labeling, visualization, and translation because of the custom axes for each path.
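The mirror-view transformation of the laterality label 43 can be sketched as a pure label swap, since the underlying dynamic anatomic address stays static while only the displayed label changes. The function name and convention are illustrative:

```python
def laterality_label(laterality: str, mirror_view: bool) -> str:
    """Return the single-letter laterality label for the current view:
    outside-observer view keeps the side; mirror ('selfie') view swaps
    the displayed label while the anatomic site itself is unchanged."""
    flipped = {"right": "left", "left": "right"}
    side = flipped[laterality] if mirror_view else laterality
    return side[0].upper()  # "R" or "L"
```

Thus the same right-sided site renders "R" to an outside observer and "L" in selfie view, matching the behavior described above.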
[0054] Two pins 52 are depicted. The first is depicted as an asterisk (*) representing a cryosurgery procedure for a diagnosis of an inflamed seborrheic keratosis. The second is depicted as “.A” representing a shave biopsy procedure for the diagnosis of a neoplasm of uncertain behavior of skin (2F72.Y). Further, the diagnosis component and other components like the pin description can change dynamically in the dynamic anatomic address, while the anatomic site information and visualization of the location remain static. This is especially helpful at different time points, since diagnoses can change with additional information, like a pathology report from the biopsy. In this example, a user could easily change from “.A” representing the shave biopsy procedure and the diagnosis of “neoplasm of uncertain behavior of skin (2F72.Y)” to a diagnosis of “melanoma” using the pin-to-pin transformation method as shown. Pins can also be transformed to distribution segments with a pin-to-distribution-segment method, and vice versa.
[0055] Additional information shown could include a site name preview, auto-relation, a hierarchical selector, a find pin, an isolated visual preview, a dynamic pin description, a list subtype selector, and a QR code linked to dynamic anatomic address information, such as patient data, re-creation data, anatomic site data, and other health data. Links are also categorized with symbolic delimiters and tagged with language-agnostic symbolic and translatable text tags. As one skilled in the art would know, information could be manually entered by a user or inserted automatically depending on the automatically translatable features of the software application. In the present example, the procedure name (i.e., “shave biopsy”) could automatically be inserted into the notes box by automatically translatable text blocks.
[0056] FIGs. 9 and 10 depict exemplar two-dimensional models of a right dorsum of hand 70. FIG. 9 shows a bounding box 42 segmented into customized Enhanced Detail Regions (EDRs). It is contemplated that an EDR can be a custom shape, path, vector, or compound path, with custom offsets and angles. It is also contemplated that each EDR can determine proximity to and from center, angle to and from center, and magnitude of distance and angle from center, and each EDR can be broken up by an unlimited number of sub-segmentation boundaries. There are seventeen different sub-segmentation boundaries shown for this single EDR. Three pins 52 (A, B, and C) are shown in the central zone, one of the seventeen sub-segmented zones. The EDR custom axes (regardless of their defined thresholds for sub-segmentation boundaries) are used to automatically relate the pins 52 to one another, showing that A is medial from B, A is medial and proximal from C, B is lateral from A, B is lateral and proximal from C, C is distal and lateral from A, and C is distal and medial from B. It is contemplated that magnitude sensitivity may also be applied to enhanced anatomic site descriptions, such that, for example, (distal, medial) is different from (medial, distal), sequenced based on the magnitude of the deviation from center and the custom sub-segmentation boundary thresholds.
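One way to sketch the EDR-based auto-relation with magnitude-sensitive ordering: compute the pin-to-pin offsets along the custom axes, drop near-zero components, and emit the larger deviation first. The +x = lateral, +y = distal convention, the threshold value, and the function name are assumptions for illustration only:

```python
def relate(a, b,
           x_terms=("medial", "lateral"),
           y_terms=("proximal", "distal"),
           threshold=0.05):
    """Describe pin a relative to pin b in a custom EDR axis, ordering
    the directional terms by magnitude of deviation (largest first) and
    omitting axes whose deviation falls below the threshold."""
    dx, dy = a[0] - b[0], a[1] - b[1]
    terms = []
    if abs(dx) > threshold:
        terms.append((abs(dx), x_terms[dx > 0]))  # True indexes "lateral"
    if abs(dy) > threshold:
        terms.append((abs(dy), y_terms[dy > 0]))  # True indexes "distal"
    terms.sort(reverse=True)                      # magnitude-sensitive sequencing
    return " and ".join(term for _, term in terms)
```

Under these conventions, a pin offset mostly distally and slightly medially reads "distal and medial," while the reverse magnitudes read "medial and distal," illustrating the (distal, medial) versus (medial, distal) distinction described above.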
[0057] In addition to EDRs, there are defined Enhanced Detail Planes (EDPs) that serve as backup relational planes, axes, and dimensions for relating two or more dynamic anatomic addresses, paths, or groups of paths that have different or undefined EDRs. For example, if two pins are close to each other but cross the midline of the body, and auto-relation is desired, the medial and lateral dimensions would not make sense, so a different directional plane and axis, like an EDP, is used to describe that relationship. FIG. 10 shows a representative EDP, defined by a bounding box 42 with axial synonym labels. The EDR in this embodiment also shows different sub-segmentation boundary definitions in the y-axis as compared to FIG. 9, omitting the automatic addition of the y-axis terms for this EDR only. It is also contemplated that EDR and EDP functions also work in three dimensions by adding a z-axis to each customized coordinate set, and custom offsets, rotations, scales, measurements, selective alignments, and topographies (including depressions and elevations) are also added for increased accuracy and precision. Because the axes applied to paths are customized, even three-dimensional views can take advantage of multidimensional hierarchical labeling without having an avatar or avatar component center in view or containing a midline. It is also contemplated that dynamic anatomic addresses can be recreated, enhanced, partially changed, and auto-related at different time points, and thus in the fourth dimension of time, and moved, repositioned, or merged into other dynamic anatomic addresses with patient changes such as surgical changes, weight gain or loss, and growth. It is also contemplated that there can be multiple medians and midlines on maps, images, and avatars representing anatomy. For example, the body midline and median are different from the arm midline and median. Such midlines can also change based on the rotation, view, and area of the visualized anatomy.
[0058] FIG. 11 depicts four pins 52 on the same anatomic site 18, in this example the “central forehead” site, that are automatically related to one another. Descriptions of all pins on the same anatomic site 18 are automatically related, even though, in this case, some pins cross or approach the body midline. Additionally, linguistic descriptions that describe the magnitude of axial deviations from each other and from center are called “magnitude modifiers” and may also be applied when appropriate. Example automatic relationships with magnitude modifiers for the pins 52 in FIG. 11 include:
[0059] A is superior and right from B;
[0060] A is superior from C;
[0061] A is very superior and barely right from D;
[0062] B is inferior and left from A;
[0063] B is superior and left from C;
[0064] B is superior from D;
[0065] C is inferior from A;
[0066] C is inferior and right from B;
[0067] C is barely right from D;
[0068] D is very inferior and barely left from A;
[0069] D is inferior from B; and
[0070] D is barely left from C.
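The magnitude-modifier wording in the list above can be sketched with bucketed deviation thresholds. The "barely"/"very" cutoffs, the +x = right, +y = superior convention, and the function names are illustrative assumptions:

```python
def magnitude_term(delta, negative, positive, barely=0.1, very=0.6):
    """Map a signed axial deviation to e.g. 'barely right', 'right', or
    'very right'; return None for a ~zero deviation, reflecting the
    automatic directional omission described below."""
    mag = abs(delta)
    if mag < 1e-9:
        return None
    direction = positive if delta > 0 else negative
    if mag < barely:
        return "barely " + direction
    if mag > very:
        return "very " + direction
    return direction

def describe(a, b):
    """Body of an 'A is ... from B' sentence for pins a and b, listing
    the y-axis term first to match the example list."""
    parts = [magnitude_term(a[1] - b[1], "inferior", "superior"),
             magnitude_term(a[0] - b[0], "left", "right")]
    return " and ".join(p for p in parts if p)
```

With these assumed thresholds, a pin far above and slightly right of another would be described as "very superior and barely right," as in the example relationships listed for pins A and D.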
[0071] It is contemplated that all descriptions are translated automatically into any coded, linguistic, or symbolic language, and directional arrows may also be depicted relating pins to one another. It is further contemplated that automatic directional omissions will be implemented when necessary to help avoid human user confusion when interpreting the automatic linguistic descriptions of the pin relationships. For example, “A is superior from C” automatically omits x-axis descriptions. Reordering or deleting a pin, such as deleting “C” in this sequence, automatically relabels “D” as “C” and updates all auto-relation calculations and descriptions that contain the new “C”. [0072] It is contemplated that the software application can depict painted hierarchical anatomic sites with defined but customizable morphologies. FIG. 12 depicts a screenshot of an example of hierarchical painting of morphologies related to a disease, rather than diseases as shown in a prior figure. The customizable morphologies are painted onto selected and segmented hierarchical components with dynamic anatomic addresses. Each morphology or morphological combination has any combination of color, opacity, pattern, and intensity. Body surface area is automatically calculated for each morphology (not shown) at its selected dynamic anatomic address component. Each mapped morphology has a corresponding visualization, and hierarchical selectors allow for different selections. An exported click-order matrix 85 outputs the morphological findings into a tracking matrix so that disease progression can be tracked by comparing different time points. It is contemplated that output may also be exported to a standardized matrix for easier data tracking and analysis. It is further contemplated that data overflow and duplicates may be shown and accounted for in the standardized matrix exports.
It is further contemplated that inputs and outputs contain trackable secondary features 87, shown as ulceration and angiomatosis in this example, which are output into the tracking matrix. It is further contemplated that when the same site is selected on both lateralities (right and left), the selections can optionally be combined into a single (bilateral) visualization and description or kept separate. It is further contemplated that this hierarchical painting example documents morphologies on a single patient at a single point in time, which was done retrospectively in this example, but hierarchical painting can be applied across a combination of different patients and different time points, thus enabling tracking of disease progression, resolution, or change in individual patients over time, or in populations.
[0073] FIG. 13 further depicts hierarchical painting that includes treatment recommendations 90 associated with specific anatomic sites 18. Patients often receive multiple recommendations, sometimes with ten or more products listed, and can understandably become confused about what to use, where, and when. Visual hierarchical painting through mapping with anatomic site or site group descriptions can help to color-code this information in an easily digestible format for the patient, in any language, and creates a visual map of recommendations. It is contemplated that recommendations could build upon each other in areas of overlap. It is further contemplated that clinicians can stamp their recommendations or highlight areas where their recommendations should be applied on standardized maps or directly on patient images, creating a personalized visual map of recommendations with simultaneous condensed and simplified translated text descriptions of “what to use where” on the patient's body.
[0074] Workflow of associating anatomic sites and regimens can flow in either direction. A user can assign a color to an anatomic site, then add products, treatments, and recommendations to the painted anatomic sites. Conversely, products could be assigned a color and then the products could be painted onto the avatar. Anatomic areas are defined by area names and colors.
[0075] FIG. 14 depicts an exemplar output 90 with the treatment recommendations 90 for the anatomic sites 18. It is contemplated that the output can be printed, digital, or both, with relevant isolated or combined visual previews as well. It is contemplated that regimen mapping can be applied to treatment mapping as well, such as documenting different types of cosmetic procedures or settings. It is further contemplated that printed outputs could be made into labels and then affixed directly to the containers of products, including over-the-counter products, to better instruct the patient how and where each product should be used and to minimize confusion, particularly when there are multiple products being used at different times of day. Printed physical labels inform a patient about the product regimen, such as how to use it, where to use it, frequency, warnings, and more. It is contemplated that emoji labels can also be utilized to communicate how the patient should use the medication. For example, a pill with a mouth and a cup of water could indicate the medication should be taken orally with water, a pill and a food emoji could indicate it should be taken with food, and a pill with a no sign and a cheese emoji could indicate that dairy should be avoided when taking it. If the current exemplar were unilateral, for example directing use on one eyelid, the visualization could be shown in mirror view, since this exemplar is representative of a physical label the patient could put directly on their product bottle, and the patient would likely be applying the product in front of a mirror. Such a label as described (unilateral, in mirror view) would serve to minimize patient confusion.
[0076] FIGs. 15 and 16 depict an electronic regimen map 100 with printable labels 102 and affiliated educational instructions 105, respectively. In the current embodiment, the educational instructions 105 provide visualizations by product, by area, and by condition. While a paper workflow generates a digestible report that can be handed to the patient, the electronic regimen can be saved into the patient’s chart for tracking the regimen over time and for initiation of other workflows. The electronic regimen can evolve over time, with input from the patient and the professional, in their preferred languages. It is contemplated that, in an electronic evolving regimen, the patient could give feedback on a product, report a side effect from a product, initiate a refill request for a product, find up-to-date manufacturer’s coupons for a product, view product recall information, ask their professional about the product, report stopping the product, report starting a new product, or perform other electronic tasks. It is further contemplated that automatic alerts could be sent to the patient and physician if there is a product recall, and automatic reminders could be sent to the patient if they are expected to be running low. From the electronic regimen, when office-dispensed products are recommended, a single link could add all the office-dispensed products that are in stock to a ticket system with real-time pricing updates so that the patient can initiate a purchase, and the patient could also purchase products remotely from the office/retail facility.
[0077] In one embodiment, products can be registered with information including but not limited to: photo, product name, product ingredients, product SKU, product category in the user's country (Rx, office dispensed, OTC), product vehicle, product warnings, product directions, product suggested frequency, product side effects, product anatomic sites to avoid, product recommended anatomic sites, product substitutes, product key recommended ingredients, product active ingredients, product availability in the user's country, product synonyms, and other product metadata. A patient can then select a product that was already recommended, from the global product library, from the localized product library (e.g., based on country), or from their custom/favorite library of products. Automatic product recommendations and warnings will populate based on user preferences, patient condition, user frequency of selection, patient allergies or adverse reactions, patient interactions with other medications or conditions, products the patient has already tried without success, product availability (backordered, discontinued, banned in the country/region, etc.), substitutions, and other metadata. The patient can then print a physical label with a product image/thumbnail and the regimen for easy correlation and visual recognition by the patient. A product thumbnail can be expanded to show other metadata about the product on the electronic regimen.
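The automatic recommendation and warning logic described above can be viewed as a filter over registered product metadata. The `Product` fields and filtering rules below are an illustrative subset of the listed metadata, not the application's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    name: str
    category: str                      # e.g., "Rx", "office dispensed", "OTC"
    active_ingredients: list = field(default_factory=list)
    sites_to_avoid: list = field(default_factory=list)
    availability: str = "available"    # or "backordered", "discontinued", "banned"

def recommend(products, allergies, target_site):
    """Filter out products that are unavailable, contain an allergen,
    or are contraindicated at the target anatomic site."""
    keep = []
    for p in products:
        if p.availability != "available":
            continue  # backordered, discontinued, or banned in this region
        if any(a in p.active_ingredients for a in allergies):
            continue  # patient allergy or adverse reaction
        if target_site in p.sites_to_avoid:
            continue  # product warns against this anatomic site
        keep.append(p)
    return keep
```

A real implementation would extend the same pattern to drug interactions, prior treatment failures, and substitution suggestions listed in the paragraph above.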
[0078] The ability to dissect, sequence, and visualize anatomic site name components within the software application is depicted in FIG. 17. The screenshot shows the Anatomic Site Name Builder 110, which allows a user to reorder components of an anatomic site's name or description. The name components are linguistically dissected and automatically placed into digital "chips" (which themselves can be deleted and reordered) in the appropriate category. In the present embodiment, neural networks output the automatic coded translations, and symbolic groupings are shown for ICD-11, Foundation IDs, AnatomyMapper ID code strings, symbolic emoji grouping of the selected anatomic site component, and laterality. The visibility toggle 112 has been toggled for the ICD-11 codes in this example, the enhanced code string displayed being "XA1Z38&XK8G_(XK4H)".
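The enhanced code string shown in FIG. 17 suggests a compositional encoding: a base code joined to extension codes, with a parenthesized laterality code appended. The sketch below reproduces the displayed string and dissects it back into "chips" under assumed delimiter semantics ('&' joining codes, '_' before the laterality group); these semantics are illustrative, not taken from a normative coding reference.

```python
def build_code_string(base_code, extension_codes=(), laterality=None):
    """Compose an enhanced code string: base and extension codes are
    joined with '&'; an optional laterality code is appended in
    parentheses after '_' (delimiters assumed for illustration)."""
    code = "&".join([base_code, *extension_codes])
    if laterality is not None:
        code += f"_({laterality})"
    return code

def split_chips(code_string):
    """Inverse operation: dissect an enhanced code string back into
    its component chips (base, extensions, laterality)."""
    if "_(" in code_string:
        head, tail = code_string.split("_(", 1)
        laterality = tail.rstrip(")")
    else:
        head, laterality = code_string, None
    parts = head.split("&")
    return {"base": parts[0], "extensions": parts[1:], "laterality": laterality}
```

Round-tripping the example from the figure, `build_code_string("XA1Z38", ["XK8G"], "XK4H")` yields the displayed string, and `split_chips` recovers the three deletable, reorderable components.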
[0079] FIG. 18 shows the capability of the software to apply progressive linguistic and visual sub-segmentation simultaneously to achieve pinpoint precision in defining anatomic sites 18 on an anatomic visualization 10. Each anatomic visualization 10 (moving from left to right) progressively sub-segments from the previous one until the rightmost visualization achieves pinpoint precision for the dynamic anatomic address. Above each visualization 10 are the English enhanced linguistic anatomic site descriptions 120 of the progressively sub-segmented anatomic sites. Each subsequent diagram adds enhanced modification language and simultaneous visualization. In the present embodiment, an enhanced modifier for sequence sensitivity is turned on, causing the term "lateral" to be shown before "superior" in the anatomic site descriptions 120. Sequence insensitivity would visualize the entire upper right aspect of the highlighted area (combining the two rightmost visualizations 10 in this figure). Keeping sequence sensitivity on, the visualization of the rightmost diagram color-codes the site description 120 to pinpoint precision. In this embodiment, since "superior" is listed before "lateral" and sequence sensitivity is on for the modifier terms, the upper right of the highlighted area is more accurately and precisely targeted. It is contemplated that the same sub-segmentation could be applied to a diagram as shown or to a patient photo, avatar, video, live camera for augmented reality, virtual reality avatar, or other multimedia. It is further contemplated that the anatomic site descriptions 120 could be linguistic, symbolic, coded, mathematical, or some combination of descriptors.
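One way to model the sequence-sensitive sub-segmentation described above is as ordered halving of a bounding region: each directional modifier halves the current region, so the order of modifiers determines the final target. The box representation, the y-down convention, and the mapping of "lateral" to the +x direction are assumptions for illustration; in practice the lateral direction depends on the side of the body and the viewing perspective.

```python
def apply_modifiers(box, modifiers, lateral_is_positive_x=True):
    """Sequentially sub-segment an axis-aligned box (x0, y0, x1, y1,
    with y increasing downward) by directional modifier terms.
    Each modifier halves the current region, so order matters:
    this models the sequence-sensitive mode."""
    x0, y0, x1, y1 = box
    for term in modifiers:
        xm, ym = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        if term == "superior":
            y1 = ym                                            # keep upper half
        elif term == "inferior":
            y0 = ym                                            # keep lower half
        elif term == "lateral":
            x0, x1 = (xm, x1) if lateral_is_positive_x else (x0, xm)
        elif term == "medial":
            x0, x1 = (x0, xm) if lateral_is_positive_x else (xm, x1)
        else:
            raise ValueError(f"unknown modifier: {term}")
    return (x0, y0, x1, y1)
```

Under this model, ["superior", "lateral"] first keeps the upper half and then the lateral half of that upper half, targeting one quadrant; a sequence-insensitive mode would instead combine the regions satisfying each modifier independently.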
[0080] FIGs. 19 and 20 depict the capability of the software application to customize the anatomic visualization 10 based on patient characteristics, as well as translation of those customizations. The anatomic visualization 10 is representative of the patient identified by the identifying data 125. The customizable interface aspects 130 can adjust for things like patient sex, language, visibility, filtering of certain anatomies, etc. In the embodiment depicted in FIG. 19, the text appears in English and symbolic language, the selected sex is male, "show oral anatomy" is selected, and these customizations are reflected in the anatomic visualizations and maps 10. In comparison, in the embodiment in FIG. 20, the text appears as Chinese characters and symbolic language, the selected sex is female, the oral anatomy dynamic anatomic addresses are hidden, and these customizations are reflected in the anatomic visualizations 10. The software application allows dynamic anatomic addresses that are not relevant to the patient to be filtered out or hidden automatically, with visibility toggling, or by user preference. The user interface with the anatomic visualization 10 can also be zoomed in or out. Vector and mathematical algorithms and models allow for infinite and seamless scalability of each dynamic anatomic address, where neighbors and relatives are either affected or unaffected based on detected changes.

[0081] FIG. 21 depicts an anatomic visualization 10 with dynamic anatomic addressing on a three-dimensional model, visualizing the anatomic hierarchy over the right lunula of the thumb with a corresponding color-coded legend 12.
As with the two-dimensional anatomic visualizations, the cursor 14 hovers over an anatomic site 18 and a color-coded legend associates the linguistic descriptor with the appropriate color to define the six hierarchical regions 20 in this embodiment, which are: right upper extremity, right hand, right fingers and thumb, right thumb, right thumbnail, and right lunula of thumb. The site colors are underlaid and blended into the avatar, so as not to obscure the avatar or cause the unexpected color blending that results from overlaying semi-transparent colors.
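The legend's ordered chain of enclosing regions can be produced with a simple parent-pointer walk over the anatomic hierarchy. The dictionary below encodes only the regions named in the FIG. 21 example and is illustrative of the data structure, not of the application's actual hierarchy model.

```python
# Parent relationships for the hierarchy named in the FIG. 21 legend
# (child -> parent; the root region has no parent).
PARENT = {
    "right lunula of thumb": "right thumbnail",
    "right thumbnail": "right thumb",
    "right thumb": "right fingers and thumb",
    "right fingers and thumb": "right hand",
    "right hand": "right upper extremity",
    "right upper extremity": None,
}

def hierarchy_chain(site, parent=PARENT):
    """Walk from the hovered site up to the broadest enclosing region,
    returning the chain ordered broadest-first for a color-coded legend."""
    chain = []
    while site is not None:
        chain.append(site)
        site = parent[site]
    chain.reverse()
    return chain
```

Hovering over a site then yields one legend row per hierarchy level, broadest region first, matching the order in which the regions are listed for this embodiment.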
[0082] FIG. 22 depicts the customizability of a patient-representative avatar to accurately represent the patient characteristics 140 of a specific patient, including but not limited to adjusting skin tone, facial features, body features, body shape, and body size. It is contemplated that textural maps could further enhance the avatar. In one embodiment, a three-dimensional total body photograph can adjust all dynamic anatomic addresses and describe, categorize, and relate all areas of the body in any coded, linguistic, or symbolic language. In another embodiment, high resolution three-dimensional captures detect, count, categorize, and assign gross and dermatoscopic-level features of lesions, such as dermatoscopic-level morphologies and measurements. It is further contemplated that the user interface allows for avatar manipulation 145, such as opening the mouth and sticking the tongue out to document on oral dynamic anatomic addresses or removing underwear to document on anogenital dynamic anatomic addresses.
[0083] FIG. 23 is representative of standardized anatomic mapping at consistent dynamic anatomic addresses. When plotting and markup occur in three-dimensional anatomic visualizations, corresponding plots and markups are simultaneously added to the anatomic sites 18 on the two-dimensional anatomic visualizations 10. Likewise, this corresponding mapping occurs on the three-dimensional maps when plotting and markup occur on the two-dimensional maps, ensuring accuracy and consistency among all of the patient's anatomic visualizations. When the same anatomic site is present on multiple two-dimensional map images, markup can selectively be shown or hidden on all views containing that site. FIG. 23 further depicts that regions of interest can be simultaneously shown on all views of the anatomic visualizations 10. In this embodiment, the anatomic site 18 or dynamic anatomic address can be seen in select perspective views of the anatomic visualization 10.
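One way to realize this cross-view consistency is to store each markup once, keyed by its dynamic anatomic address, and to flag every registered view (2D or 3D) containing that address for redraw. The class and method names below are hypothetical; the sketch only illustrates the shared-key synchronization pattern implied by the paragraph above.

```python
class MarkupSync:
    """Illustrative sketch: markups are stored once per dynamic anatomic
    address, and every registered view that displays that address is
    flagged for redraw, keeping all visualizations consistent."""

    def __init__(self):
        self._views = {}   # view name -> set of addresses the view displays
        self._marks = {}   # address -> list of markups

    def register_view(self, name, addresses):
        self._views[name] = set(addresses)

    def add_markup(self, address, markup):
        self._marks.setdefault(address, []).append(markup)
        # Views to redraw: every view showing this address, regardless of
        # whether the markup originated on a 2D map or a 3D model.
        return sorted(v for v, addrs in self._views.items() if address in addrs)

    def markups_at(self, address):
        return list(self._marks.get(address, []))
```

Because the markup lives with the address rather than with any single view, selectively showing or hiding it on every view containing the same site reduces to toggling one record.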
[0084] The mapping capabilities of the software application can be used in conjunction with each other through neural networks, computer vision, and artificial intelligence. FIGs. 24 and 25 depict anatomic visualizations 10 in such a scenario. FIG. 24 is a patient photograph 11 added to the patient's record. FIG. 25 depicts the multidimensional and hierarchical detection and visualization of anatomic sites 155 applied to the patient photograph 11, along with the alignment of the hierarchical visualization underlay 150. It is contemplated that computer vision can automatically count, categorize, characterize, and relate detected abnormalities, such as lesions or redness, in different anatomic sites and groups of anatomic sites. It is further contemplated that such detections will provide automatic diagnostic capabilities through machine learning and artificial intelligence, made possible by the distribution detection determined from the dynamic anatomic addresses of each detection.
[0085] The foregoing descriptions of specific embodiments of the present invention are presented for purposes of illustration and description. They are not intended to be exhaustive, nor are they intended to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in light of the above teaching. The embodiments are chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and its various embodiments with such modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined broadly by the drawings and specification appended hereto and by their equivalents. Therefore, no adverse inference under the rulings of Warner-Jenkinson Company v. Hilton Davis Chemical Co., 520 U.S. 17 (1997), or Festo Corp. v. Shoketsu Kinzoku Kogyo Kabushiki Co., 535 U.S. 722 (2002), or other similar caselaw or subsequent precedent, should be made if any future claims are added or amended subsequent to this patent application.

Claims

CLAIMS

What is claimed is:
1. A method for mapping, visualizing, tracking, translating, encoding, describing, and/or labeling anatomic sites, the method comprising: selecting an anatomic site on an anatomic map, wherein the anatomic map is defined by two or more defined paths of different sizes, wherein the paths are automatically relatable to one another using custom axes or automatically enhanceable through descriptions with directional modifiers using custom axes; associating at least one component of anatomic site data with the anatomic site, wherein the site data is metadata relevant to the anatomic site; creating a dynamic anatomic address that can be tracked spatially and through time, wherein the address accepts images, attachments, records, links, and other data; visualizing a point of interest relative to a path and the point of interest relative to all underlying and overlying paths, in any hierarchical level or group, wherein all paths are described, sequenced, and categorized in any coded, linguistic, or symbolic language; wherein the visualization is overlaid, underlaid, or aligned to other multimedia containing anatomy to apply multidimensional and optionally mirrored anatomic mapping, descriptions, and visualization; wherein the point of interest is movable such that when moved it travels to different positions within all the paths above and below it; generating multidimensional descriptions, visualization, and map positioning in real-time while the point of interest is moving; grouping paths or path segments with other paths or path segments to create a distribution list, and a hierarchical painting method, where the hierarchical painting method allows for targeted, visualized hierarchical path selection of overlying and underlying paths, with translated visualizations and descriptions; wherein the distribution list includes surface area, intensity, and detected data, such as counts, morphology, treatment recommendations, or other metadata, for each group component, which can be output into a plurality of templates; linking multidimensional anatomic points and distribution segments to points in time, wherein the linked points track progression, resolution, and change in diseases, diagnoses, morphologies, treatment regimens, and patient data; and outputting translated, enhanced anatomic descriptions, cross-mappings, and encodings, and/or isolated visualizations for all defined paths under and over a point of interest or targeted defined path.
2. The method of claim 1, wherein the anatomic site is represented by a pinpoint, segment of anatomic sites, groups of anatomic site segments, complete anatomic sites, groups of anatomic sites, different levels of hierarchy, different groups, and different organ systems simultaneously.
3. The method of claim 1, wherein the defined paths are polygons, compound paths, lines, and/or shapes.
4. The method of claim 3, wherein the defined paths are dependently or independently scaled, rotated, excluded, included, aligned, and moved based on path or path group memberships, filters, patient characteristics, or detections.
5. The method of claim 1, wherein the metadata is an anatomic site name, identifiers, laterality, prefixes, suffixes, axial definitions, axial customizations, orientation, synonyms, coordinates, rotations, scales, offsets, centers, thicknesses, bounding boxes, color, pattern, opacity, stroke, visual attributes, animations, group memberships, group axial definitions, curves, zones, linguistic segmentation instructions, visual segmentation instructions, relationships, position, level, mathematical translations, linguistic translations, transformations, links, order, perspective, view, and/or mirrored axis.
6. The method of claim 1, wherein the custom axes label, describe, and relate points of interest in multiple map dimensions and hierarchical levels.
7. The method of claim 6, wherein the custom axes are reversible to apply mapping, descriptions, labeling, translations, and visualization in an opposite perspective.
8. The method of claim 1, wherein the visualization intensity is user-assigned.
9. The method of claim 1, wherein the visualization intensity is automatically calculated with variables from path intersections, overlays, and underlays related to the other paths' variables, including but not limited to opacities, color blending, pattern blending, addition, division, multiplication, and subtraction.
10. The method of claim 1, wherein the visualizations are automatically categorized, grouped, described, and labeled into human and machine-readable labels and color coding.
11. A method for hierarchical painting of anatomic sites, the method comprising: selecting an anatomic site or site segment on a hierarchical anatomic map, wherein the anatomic map is comprised of paths or path segments of different sizes and the paths or path segments define the anatomic sites and site segments; applying a color and/or pattern and/or intensity to an anatomic site or site segment, wherein the application is associated with health data such as diagnoses, symptoms, morphologies, and treatment recommendations; traveling through the hierarchical anatomic map with hierarchical selectors that visualize the anatomic sites and translated descriptions, wherein the travel applies the color and/or pattern and/or intensity to the destination anatomic site; outputting a distribution list or tracking matrix that contains the anatomic site lists; wherein the distribution list includes surface area, intensity, and detected data, such as counts, morphology, treatment recommendations, or other metadata, for each group component, which can be output into a plurality of templates; linking multidimensional anatomic points and distribution segments to points in time, wherein the linked points track progression, resolution, and change in diseases, diagnoses, morphologies, treatment regimens, and patient data; associating at least one component piece of anatomic site data with the anatomic site, wherein the site data is metadata relevant to the anatomic site; creating a dynamic anatomic address that can be tracked spatially and through time, wherein the address accepts images, attachments, records, and other data; visualizing a point of interest relative to a path and the point of interest simultaneously relative to all underlying and overlying paths, in any hierarchical level or group, wherein all paths are described, sequenced, and categorized in any coded, linguistic, or symbolic language; wherein the visualization is overlaid, underlaid, or aligned to other multimedia containing anatomy to apply multidimensional and optionally mirrored anatomic mapping, descriptions, labeling, and visualization; wherein the point of interest is movable such that when moved it travels to different positions within all the paths above and below it; simultaneously generating multidimensional descriptions, labeling, visualization, and map positioning in real-time while the point of interest is moving; grouping paths or path segments with other paths or path segments to create a distribution list, and a hierarchical painting method, where the hierarchical painting method allows for targeted, visualized hierarchical path selection of overlying and underlying paths, with translated real-time visualizations and descriptions; wherein the distribution list includes surface area, intensity, and detected data, such as counts, morphology, treatment recommendations, or other metadata, for each group component, which can be output into a plurality of templates; linking multidimensional anatomic points and distribution segments to points in time, wherein the linked points track progression, resolution, and change in diseases, diagnoses, morphologies, treatment regimens, and patient data; and outputting translated, enhanced anatomic descriptions, cross-mappings, and encodings, and/or isolated visualizations for all defined paths under and over a point of interest or targeted defined path.
PCT/IB2022/000814 2021-12-10 2022-12-12 Multidimensional anatomic mapping, descriptions, visualizations, and translations WO2023152530A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/225,872 US20230368878A1 (en) 2021-12-10 2023-07-25 Systems and methods using multidimensional language and vision models and maps to categorize, describe, coordinate, and track anatomy and health data
PCT/IB2023/000535 WO2024023584A2 (en) 2022-07-26 2023-07-25 Systems and methods using multidimensional language and vision models and maps to categorize, describe, coordinate, and track anatomy and health data

Applications Claiming Priority (26)

Application Number Priority Date Filing Date Title
US202163265216P 2021-12-10 2021-12-10
US63/265,216 2021-12-10
US202163294653P 2021-12-29 2021-12-29
US63/294,653 2021-12-29
US202263267269P 2022-01-28 2022-01-28
US63/267,269 2022-01-28
US202263315289P 2022-03-01 2022-03-01
US63/315,289 2022-03-01
US202263269516P 2022-03-17 2022-03-17
US63/269,516 2022-03-17
US202263362791P 2022-04-11 2022-04-11
US63/362,791 2022-04-11
US202263364393P 2022-05-09 2022-05-09
US63/364,393 2022-05-09
US202263364764P 2022-05-16 2022-05-16
US63/364,764 2022-05-16
US202263365026P 2022-05-20 2022-05-20
US63/365,026 2022-05-20
US202263365373P 2022-05-26 2022-05-26
US63/365,373 2022-05-26
US202263366107P 2022-06-09 2022-06-09
US63/366,107 2022-06-09
US202263366816P 2022-06-22 2022-06-22
US63/366,816 2022-06-22
US202263369717P 2022-07-28 2022-07-28
US63/369,717 2022-07-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/225,872 Continuation-In-Part US20230368878A1 (en) 2021-12-10 2023-07-25 Systems and methods using multidimensional language and vision models and maps to categorize, describe, coordinate, and track anatomy and health data

Publications (2)

Publication Number Publication Date
WO2023152530A2 true WO2023152530A2 (en) 2023-08-17
WO2023152530A3 WO2023152530A3 (en) 2023-12-21

Family

ID=87565216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/000814 WO2023152530A2 (en) 2021-12-10 2022-12-12 Multidimensional anatomic mapping, descriptions, visualizations, and translations

Country Status (1)

Country Link
WO (1) WO2023152530A2 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3428883B1 (en) * 2012-10-26 2021-06-09 Brainlab AG Matching patient images and images of an anatomical atlas
KR20150108701A (en) * 2014-03-18 2015-09-30 삼성전자주식회사 System and method for visualizing anatomic elements in a medical image
US20160015469A1 (en) * 2014-07-17 2016-01-21 Kyphon Sarl Surgical tissue recognition and navigation apparatus and method
JP6947759B2 (en) * 2016-07-08 2021-10-13 アヴェント インコーポレイテッド Systems and methods for automatically detecting, locating, and semantic segmenting anatomical objects
BR112018000197A2 (en) * 2016-12-02 2018-09-11 Avent Inc method for providing a user's navigational indications for locating a target anatomical object and medical imaging system for use in a medical procedure
US10878576B2 (en) * 2018-02-14 2020-12-29 Elekta, Inc. Atlas-based segmentation using deep-learning

Also Published As

Publication number Publication date
WO2023152530A3 (en) 2023-12-21

Similar Documents

Publication Publication Date Title
US11783929B2 (en) Graphical generation and retrieval of medical records
US9668702B2 (en) Systems, methods, and computer readable media for using descriptors to identify when a subject is likely to have a dysmorphic feature
CN112102937B (en) Patient data visualization method and system for chronic disease assistant decision making
Ackerman The visible human project
CN105940401B (en) System and method for providing executable annotations
Dionisio et al. MQuery: A visual query language for multimedia, timeline and simulation data
Bezgin et al. Matching spatial with ontological brain regions using Java tools for visualization, database access, and integrated data analysis
Lekschas et al. Pattern-driven navigation in 2D multiscale visualizations with scalable insets
CN117501375A (en) System and method for artificial intelligence assisted image analysis
Tschandl Risk of bias and error from data sets used for dermatologic artificial intelligence
Band et al. Application of explainable artificial intelligence in medical health: A systematic review of interpretability methods
Gotz et al. Multifaceted visual analytics for healthcare applications
WO2023152530A2 (en) Multidimensional anatomic mapping, descriptions, visualizations, and translations
Jin et al. TrammelGraph: visual graph abstraction for comparison
Mörth et al. Radex: Integrated visual exploration of multiparametric studies for radiomic tumor profiling
US10503867B1 (en) System for interacting with medical images
WO2020262810A2 (en) Health record system
CN108537893A (en) A kind of three-dimensional visualization model generation method of thyroid gland space occupying lesion
Liu et al. Perioperative nursing care of vascular decompression for trigeminal neuralgia under AR medical technology
WO2024023584A2 (en) Systems and methods using multidimensional language and vision models and maps to categorize, describe, coordinate, and track anatomy and health data
WO2023170442A2 (en) Targeted isolation of anatomic sites for form generation and medical record generation and retrieval
WO2024035976A2 (en) Dynamic areas of interest that travel and interact with maps and void spaces
Rau Cross-Cultural Design. Methods, Tools and User Experience: 11th International Conference, CCD 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, July 26–31, 2019, Proceedings, Part I
KR102632865B1 (en) Deep Learning Digital Templating System for Pre-operative Surgical Planning of Total Knee Replacement
Chen et al. An Image-based Typology for Visualization