US20220207771A1 - Heart Position Estimation - Google Patents

Heart Position Estimation

Info

Publication number
US20220207771A1
Authority
US
United States
Prior art keywords
heart
torso
information
marker
chest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/571,783
Inventor
Peter Michael van Dam
Eelco Matthias VAN DAM
Samir Alioui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Peacs Investments BV
Original Assignee
Peacs Investments BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Peacs Investments BV filed Critical Peacs Investments BV
Priority to US17/571,783
Assigned to PEACS INVESTMENTS B.V. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: VAN DAM, PETER MICHAEL; VAN DAM, Eelco Matthias; Alioui, Samir
Publication of US20220207771A1
Current legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0062 Arrangements for scanning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0062 Arrangements for scanning
    • A61B 5/0064 Body surface scanning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/25 Bioelectric electrodes therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/684 Indicating the position of the sensor on the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/7405 Details of notification to user or communication with user or patient; user input means using sound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/90 Identification means for patients or instruments, e.g. tags
    • A61B 90/98 Identification means for patients or instruments, e.g. tags using electromagnetic means, e.g. transponders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10544 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K 7/10712 Fixed beam scanning
    • G06K 7/10722 Photodetector array or CCD scanning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1413 1D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1417 2D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30048 Heart; Cardiac
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30204 Marker

Definitions

  • the present invention relates to a method to be performed by a computing device part of or coupled to an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a torso, preferably applying a marker element in a process of determining an orientation of a heart, preferably in a torso model of the torso of the human body, more preferably a heart-torso model specifically including data pertaining to the heart of the human body.
  • the present invention provides a method to be performed by a computing device part of or coupled to an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a torso, preferably applying a marker element in a process of determining an orientation of a heart, preferably in a torso model of the torso of the human body, more preferably a heart-torso model specifically including data pertaining to the heart of the, preferably human, body, the method comprising steps of:
  • the chest dimensions can be determined based on the information received from the measurement source.
  • the method comprises steps of estimating the heart orientation based on the chest dimensions, such as chest depth, chest width, and/or circumference.
  • the heart position can be estimated and as such information relating to the heart position can be output by the computing device for use by for example a physician.
  • the steps of receiving information relating to the human body comprise steps of receiving imaging information taken from the torso, preferably the 3D imaging information.
  • the steps of determining chest dimensions comprises steps of receiving measurement information relating to physical dimensions of the chest from input means or a database.
  • the method comprises steps of scaling heart dimensions of a reference model to the determined chest dimensions.
  • the method comprises steps of determining a position of the xyphoid, preferably a virtual determination of the xyphoid point, further preferably based on at least one of the shoulder height and the length of the sternum.
  • the method comprises steps of receiving imaging information of a marker element, the marker element being arranged in an area comprising an actual position on the body, preferably on the sternum, from a marker element optical imaging device, OID, the ECG electrodes imaging device and the marker element imaging device preferably being the same imaging device, and performing an image recognition on the imaging information for obtaining a presence determination, preferably a positive or negative determination, of the marker element in the imaging information.
  • the method comprises steps in which the chest depth and/or heart orientation is used in steps for estimating a torso model or a heart torso model.
  • the method comprises steps in which the torso model or the heart torso model is based on measuring points such as a cloud of points based on the received information, preferably 3D imaging information.
  • the method comprises steps in which the torso model or heart torso model is based on geometrical assemblies, such as triangles, created based on the cloud of points.
  • the method comprises steps of classifying partitions of the received information relating to the human body from a measurement source.
  • the method comprises steps of determining at least one angle for a long axis, mitral-tricuspid axis and left basal axis.
  • the method comprises steps of determining a rotation of the heart using at least one predetermined axis and angles, preferably over the three axes, preferably such that the cross section fills up about ⅓ of the thorax width.
  • the method comprises steps of estimating a heart orientation based on the estimated chest dimensions.
  • the method comprises steps of shifting the heart position such that the lower part of the heart coincides with the xiphoid position, preferably the heart being at a predetermined, such as minimal, distance from the chest wall.
  • the method comprises steps of applying parameters when determining the position of the heart relating to the body, such as age, existing conditions, weight, fitness, chest width.
  • FIG. 1 shows a preferred embodiment of a method according to the invention.
  • FIGS. 1B-1D show three embodiments of a marker element according to the present invention.
  • FIG. 2 shows a preferred embodiment of a further, more detailed method according to the invention.
  • FIG. 3 shows a preferred embodiment of a further method according to the invention.
  • FIG. 4 shows heart position examples relating to age and gender.
  • FIG. 5 shows an example of a heart in a chest and its exemplary relation to the xyphoid.
  • FIG. 6 shows an example of a shoulder height and sternum length in relation to a heart position in a torso.
  • FIG. 7 shows examples of positions in which a heart can be oriented in a torso depending on body parameters.
  • FIG. 8 shows an example of a heart position relative to a xyphoid or height thereof.
  • a typical embodiment comprises a computing device with receiving means for receiving from an ECG device the ECG measurements during an ECG session, such as during a procedure or for obtaining data to base a subsequent diagnosis on.
  • the computing device is provided with a processor and memory.
  • the memory comprises program code for enabling the processor to perform the method according to the invention.
  • the computing device is coupled to a monitor for displaying resulting images.
  • a user interface is also displayed on the monitor for allowing input to be provided. Further user input devices, such as a keyboard and mouse, a touch screen, or any other known input device preferred by the user, may be coupled to the computer through readily applicable connecting ports.
  • a 3-D camera is available for taking imaging information recordings from the torso.
  • a capability to record from several sides of the torso is preferred. This is obtained either by one camera that is movable to capture images from the top, left and right side of the torso.
  • two or more cameras may be fixedly mounted relative to the position of the torso in order to combine the 3-D imaging information recordings of the two or more cameras.
  • the computer is preferably connected to a database of 3-D torso models.
  • a database of 3-D torso models preferably comprises unique torso models obtained by imaging devices, such as an MRI, CT or sound echo device.
  • the respective information can advantageously be obtained during the ECG session, before the ECG session or based on historical measuring data for performing of this method.
  • the 3-D photo is recorded by means of a 3-D camera providing a cloud of points in a 3-D space.
  • the cloud of points represents the subject of the imaging information recording.
  • the 3-D camera is used to capture an image of a torso of a subject in the form of 3-D information comprising information with respect to depth and color of the subject and of the surroundings of the subject.
  • a single camera can be moved relative to the subject, such as along a generally circular line around the torso perpendicular to a longitudinal axis of the subject. Multiple cameras mounted around the subject can also be used for taking the appropriate recordings.
  • the heart position and the orientation of the heart are still unknown. Based on patient-specific information a good approximation of the heart position and orientation can be made.
  • the aortic valve is approximately behind the sternum.
  • the lower part of the heart at the sternum is well approximated by the lower part of the sternum (xyphoid). What exactly the lower part of the heart at the sternum is depends on the orientation of the heart. Knowing the xyphoid position therefore supports the estimation of the heart position significantly.
  • FIG. 1 shows an exemplary method as follows. Initially, optionally, a marker is detected as described in the marker description below, incorporated by reference from the priority application.
  • In step 130 the xyphoid position is determined. If a marker is present at this position, that position can be used; if not, a virtual position of the xyphoid is determined.
  • In step 135 a virtual determination of the xyphoid point is performed.
  • the shoulder height is determined from the reconstructed model.
  • the length of the sternum is a function of the patient's length; for adults this is approximately 20 cm on average.
  • the sternum height is estimated and is used to localize the xyphoid location.
  • In step 140 the depth of the chest at the selected xyphoid position is determined.
  • In step 145 the reference model used (the selection of the reference model is described in the referenced vector patent) has a defined xyphoid depth, and both depths are used to scale the heart model.
  • In step 150 the angles for the three axes are determined. With less sternum depth the heart remains more vertical; when the patient is overweight the heart is more horizontal. The axis angles are thus determined by weight and xyphoid depth (other patient characteristics might be incorporated to tune the estimated angles).
  • FIG. 2 shows an exemplary method as follows.
  • The input is imaging information, i.e. 3D photo(s), from which a torso model and marker positions are to be determined.
  • This can be any imaging from which the chest dimensions and xyphoid and/or sternum marker positions can be determined.
  • a 3D camera is a preferred input device.
  • a marker is detected in steps 505-530 as described in the marker description below, incorporated by reference from the priority application.
  • the method continues in step 540 .
  • the method continues in step 535 .
  • Step 700 is an optional manual step in which manual measurements are entered if no device measurements are available.
  • the database provides information about the available models and the creation of each model.
  • In step 530 the xyphoid position is determined. If an externally applied marker is present at this position, that position can be used; if not, a virtual position of the xyphoid is determined based on the used marker(s) or a manually indicated point on the torso. In case a xyphoid marker is detected, the best positioning of the xyphoid is obtained from a special marker on top of the xyphoid. With this information the sternum length can be determined, as well as the lower part of the heart. Thus, such information helps with the estimation of the heart position.
  • In step 535 a virtual determination of the xyphoid point is performed.
  • the shoulder height is determined as the position where the relative change in shoulder height along the left and right side of the body is more than a predefined slope, e.g. 30% (see figure xx).
  • the length of the sternum is, at least preferably, assumed to be a function of the patient's length; for adults this is approximately 20 cm on average.
  • the sternum height is estimated and is used to localize the xyphoid location. If no marker is present the xyphoid position can be estimated.
  • since the sternum is in the middle of the torso, only the xyphoid z-position needs to be estimated. For adults this is on average 20 cm, decreasing with age, and it is slightly influenced by the thorax dimensions (the depth of the sternum, for instance).
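The virtual xyphoid determination above (shoulder line plus an assumed sternum length) can be sketched as follows. This is a minimal illustration only: it assumes the torso point cloud is an N x 3 NumPy array with x running left-right and z running foot-to-head, and the 30% slope and 20 cm sternum length are the example values from the text; all function and parameter names are invented for this sketch.

```python
import numpy as np

def estimate_virtual_xyphoid_z(points, slope_threshold=0.30, sternum_length=0.20,
                               step=0.01, band=0.005):
    """Rough virtual xyphoid height from a torso point cloud (illustrative only).

    Assumptions: points is an (N, 3) array in metres, x = left-right,
    z = vertical; the shoulder line is taken where the torso width changes by
    more than `slope_threshold` between neighbouring height levels, and the
    xyphoid is placed one (assumed) sternum length below that line.
    """
    pts = np.asarray(points, dtype=float)
    z_levels = np.arange(pts[:, 2].min(), pts[:, 2].max(), step)
    widths = []
    for z in z_levels:
        band_pts = pts[np.abs(pts[:, 2] - z) < band]
        widths.append(np.ptp(band_pts[:, 0]) if len(band_pts) else 0.0)
    widths = np.asarray(widths)

    shoulder_z = z_levels[-1]
    # Walk down from the top of the cloud; the shoulder level is where the
    # relative change in width between neighbouring levels exceeds the slope.
    for i in range(len(z_levels) - 1, 0, -1):
        if widths[i - 1] > 0 and abs(widths[i - 1] - widths[i]) / widths[i - 1] > slope_threshold:
            shoulder_z = z_levels[i]
            break

    return shoulder_z - sternum_length
```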
  • In step 540 the depth, width and circumference of the chest at the selected xyphoid position are determined. Based on this, the chest ratio (chest depth divided by the chest width) can also be determined. Preferably the chest dimensions at the xyphoid are determined. The width (from left to right) and the depth (from front to back) are calculated from the cross sections of the x and y axes going through the middle of the thorax. These dimensions determine the bounding box of the heart, i.e. the ribcage.
  • the reference heart model has to fit the torso model.
  • the most limiting space is the chest depth.
  • the depth of the heart (in the back-to-front chest direction) is scaled such that this heart depth is equal to half the chest depth of the model used (the selection of the reference model is described in the referenced vector patent).
  • Another method is to scale the heart by a factor between the chest depth of the reference model, which has a defined xyphoid depth, and the patient-specific chest depth; both depths are used to scale the heart model. For example, given the estimated ribcage space (540), the heart is scaled. The heart can take up approximately 50% of the sternum depth (the factor α below).
  • the reference heart is scaled given the thorax depth at the xyphoid position:
  • Heart = Heart_ref × thorax_depth × α
  • In step 550 the angles for the three axes are computed. With less sternum depth the heart is relatively more vertical; when the patient is overweight the heart is more horizontal. The axis angles are thus determined by weight and xyphoid depth (other patient characteristics might be incorporated to tune the estimated angles). Preferably, the angles for the long axis, the mitral-tricuspid axis and the left basal axis are computed. The heart is then rotated such that it also matches the thorax width at the xyphoid position.
  • In step 555 the heart is rotated using the three estimated angles when the chest dimensions indicate a small chest size, e.g. a chest circumference of less than 1000 mm.
  • Heart = rotation(Heart_ref, thorax_width): the rotation is done over the three axes, such that the cross section fills up about ⅓ of the thorax width.
  • the heart position is adjusted so that the lower part matches the xyphoid.
  • Heart = Heart + (x, y, z)
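Steps 545 to 560 (scale the reference heart to the available chest depth, rotate it over the three estimated axes, and shift it onto the xyphoid) can be sketched roughly as below. The sketch assumes the reference heart is an N x 3 point set, approximates the three anatomical axes by simple x/y/z Euler rotations, and takes the ~50% factor α literally; these are illustrative assumptions, not the patent's exact procedure.

```python
import numpy as np
from scipy.spatial.transform import Rotation

ALPHA = 0.5  # the heart may take roughly 50% of the chest (sternum) depth

def fit_reference_heart(heart_ref, ref_heart_depth, chest_depth, angles_deg, xyphoid_pos):
    """Scale, rotate and translate a reference heart point set (illustrative).

    heart_ref      : (N, 3) reference heart points, y = back-to-front, z = vertical
    ref_heart_depth: depth of the reference heart along y
    chest_depth    : patient chest depth at the xyphoid
    angles_deg     : three estimated rotation angles (stand-ins for the long,
                     mitral-tricuspid and left basal axis angles)
    xyphoid_pos    : (3,) xyphoid position in torso coordinates
    """
    heart = np.asarray(heart_ref, dtype=float)

    # Scale so that the heart depth equals ALPHA times the patient's chest depth.
    heart = heart * (ALPHA * chest_depth / ref_heart_depth)

    # Rotate over three axes (approximated here by intrinsic x-y-z rotations).
    heart = Rotation.from_euler("xyz", angles_deg, degrees=True).apply(heart)

    # Shift so that the lowest point of the heart coincides with the xyphoid.
    lowest = heart[np.argmin(heart[:, 2])]
    return heart + (np.asarray(xyphoid_pos, dtype=float) - lowest)
```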
  • A reference torso-heart model database contains models and descriptions of the models, including for instance, but not limited to, chest dimensions, etiology, gender, weight and height of the models, i.e. a database with models of heart-torso combinations.
  • In step 700, if no imaging data is available to determine the xyphoid and chest dimensions, e.g. chest depth at the xyphoid, chest width at xyphoid height and chest circumference, the values can be measured on the patient directly with manual measurement devices.
  • other measurements can be taken into account here as well, such as age, gender, height, weight, and etiology of the disease.
  • Patients with a failing heart often have an enlarged left ventricle; such a model should then be taken from the database. Based on these measurements a model is selected from the database and scaled to match the measurements, i.e. the sternum length, chest circumference and xyphoid position match the measured values.
  • In step 710 the shoulder height is determined; a preferred marker for this point is the manubrium of the sternum, the top of the sternum.
  • In step 720 the distance between the shoulder height and the xyphoid, the lower end of the sternum, is determined.
  • In step 730 the chest circumference at the xyphoid position is determined.
  • In step 740 the chest width and depth at the xyphoid are obtained. These are preferably measured, or computed using both the circumference and one of the other measures (depth or width).
  • In step 750 other demographics of the patient are input or obtained, such as weight, height, gender, and etiology of a (cardiac) disease.
  • In step 760, based on the information collected, the reference torso-heart model that most closely matches the patient-specific values is selected.
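Step 760, selecting the reference torso-heart model that most closely matches the collected measurements and demographics, could look roughly like the sketch below. The database field names, the normalised-difference score and the categorical penalty are all invented for illustration.

```python
def select_reference_model(models, patient,
                           numeric_fields=("chest_circumference", "sternum_length",
                                           "chest_depth", "weight", "height"),
                           categorical_fields=("gender", "etiology")):
    """Pick the model whose description best matches the patient (illustrative).

    `models` is a list of dicts describing the database entries; `patient` holds
    the measured or manually entered values from steps 700-750. The scoring is
    a simple normalised absolute difference plus a penalty for mismatching
    categorical fields (e.g. etiology, so that a failing-heart model with an
    enlarged left ventricle is preferred for such patients).
    """
    def score(model):
        s = sum(abs(model[f] - patient[f]) / max(abs(patient[f]), 1e-6)
                for f in numeric_fields if f in model and f in patient)
        s += sum(1.0 for f in categorical_fields
                 if patient.get(f) is not None and model.get(f) != patient.get(f))
        return s

    return min(models, key=score)
```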
  • a further subject of the present invention is the use of a marker element to be used as a reference point relative to the torso.
  • a marker element provides an optically recordable element, such as a surface, that serves as an input for the analysis of the 3-D imaging information recording.
  • It may take the form of a patch, optionally comprising communication electronics providing an identification, having predetermined recognizable characteristics for detection thereof by means of the computing device executing the appropriate program means.
  • the computing device is partly or wholly integrated into the camera device.
  • the position on the thorax is predefined and enables the computing device to match, orient and/or detect the thorax in the 3-D imaging information recording under clinical circumstances.
  • the computing device is able to perform an analysis eliminating disturbances, such as blankets, equipment, objects or cables momentarily arranged on top of the person, etc.
  • a quality check of the 3-D photo can be based on imaging information relating to the marker element and/or the position of the camera towards the torso and/or marker.
  • Alternative embodiments of the marker element comprise recognition by means of color, signal, patterns, geometry, such as a shape; wherein optionally through-openings are also provided in the marker for discerning the skin color.
  • the marker element provides a means to use as a basis for analysis.
  • Algorithms of analysis are preferably adaptable to a range of predetermined marker types, such as distinguished by means of for example presence of information elements providing directional information such as a pattern of dots, color, shape, dimension, lighting, sound.
  • the position of the marker element, or marker elements on the thorax is predefined, for example by having the upper side of the marker element coincide with the upper part of the sternum or suprasternal notch and having the marker elements positioned along the sternum.
  • Several characteristics of several preferred embodiments of the marker element provide distinct advantages. Analysis of the color or combinations of colors of the marker element permits the detection of the marker element, which enables analysis of an area of the subject where the marker is present. Providing a certain order of colors then provides information regarding orientations, such as left, right, top, bottom and depth orientations of the subject, and allows such information to be used as input in the analysis.
  • the geometry or shape of the marker element provides the advantage of improved performance of analysis, such as during detection of the marker on the subject.
  • Characteristics such as sound, light or a signal from an RFID chip provided in the marker element provide advantages in the detection of the marker and in performing the analysis according to the present invention, such as in identifying the marker element on the thorax of the patient.
  • the marker element represents a reference point towards the algorithm performing the analysis in a way that defines the 3-D space independently of how the recording of the imaging information is performed. That is, independent of which camera is used and what the orientation of the camera is, as long as the marker is comprised in the imaging information recording.
  • the marker provides a basis for the algorithms to determine the orientation of the marker and, based on that, create an initial estimate of the orientation of the thorax. If non-preferable outcomes are obtained, information may be output as to a change in positioning of the camera relative to the subject, such as to provide a better alignment with regard to e.g. a longitudinal axis of the torso, or to provide a better alignment relative to the marker element.
  • the marker element is set to improve.
  • algorithms are provided to detect the shape of a breast relative to the marker position. Based on this, specific analysis of the 3-D imaging information recording is performed. An advantage thereof is that the time required for performing the calculations of the analysis is reduced, allowing for real-time usable results.
  • the marker element is preferably the starting point of the comparison between the 3-D imaging information recording and the 3-D model of the torso obtained.
  • An initial verification step in recording the 3-D imaging information recording comprises verification of the presence of the marker element. Preferably a verification is also made of the acceptability of the image with respect to the general photographic circumstances of the area, or of the room, in which the recording is performed.
  • the 3-D imaging information recording is generated and verified with respect to the presence of the marker element, whereupon it is saved and used for further analysis.
  • FIGS. 1A-1C show three preferred embodiments 1, 2, 3 of the marker element according to the present invention.
  • Marker element 1 consists of a blue rectangle allowing for identification based on at least the color and shape. The color allows for identification of the marker and of the meaning of such color as predetermined. Aspects of the analysis of orientation with this embodiment will come from information in the 3-D image recording relating to e.g. the torso.
  • Marker element 2 consists of a rectangle having two areas of color. Such a rectangle also allows an analysis as to up/down and left/right based on the information of the colors. Therefore, more aspects of the analysis can be determined from the marker element itself.
  • Marker element 3 consists of a rectangle having a general element of color and a shape defined therein, in this case comprising a vertical line or bar with a circle generally positioned at the middle of the marker element 3.
  • FIG. 2 is provided to show an overview of a torso with a marker element on the sternum and ECG electrodes oriented on the torso.
  • the quality of the ECG recordings, and comparability of a range of recordings over time is dependent on a correct orientation or the same orientation in the several recordings over time.
  • the present invention has as an important advantage that it becomes possible to relate the position of the ECG electrodes to the position of the marker element and thus to a fixed position on the torso. Furthermore, the present invention enables relating the imaging information recording to the model of the torso, relating the position of the marker element to the model of the torso, and relating the position of the ECG electrodes to the torso model, preferably wherein the marker element provides a basis for calculating such relations and allowing such relations to be calculated very speedily, such as fast enough for providing a result usable within the session, defined as real-time within the context of the present invention.
  • the 3-D imaging information recording is divided in areas.
  • a main area of analysis is the marker area 11 .
  • the marker area 11 is an area defined around a detected marker.
  • Other areas comprise areas 16, 17, 18, which are areas defined to compare parts of the 3-D imaging information recording with the torso model information for reaching a match between those.
  • the electrodes 13 are regular ECG electrodes, preferably recognizable by shape or color for identification thereof, which electrodes are to be matched to the torso model by means of imaging information and the presence of the marker therein.
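The division of the 3-D imaging information into a marker area and further comparison areas can be illustrated by a simple distance-based split of the point cloud; the 3-5 cm radius follows the distances mentioned later in the text, and the function name and return layout are assumptions.

```python
import numpy as np

def split_marker_and_control_zones(points, marker_position, marker_radius=0.05):
    """Split a point cloud into a marker zone and a control ('random') zone.

    The marker zone contains the points within `marker_radius` of the detected
    marker (the text mentions distances of e.g. 3 cm or 5 cm); everything else
    forms the control zone used for comparison with the torso model.
    """
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts - np.asarray(marker_position, dtype=float), axis=1)
    return pts[dist <= marker_radius], pts[dist > marker_radius]
```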
  • FIG. 3 provides a general overview of the method according to an embodiment.
  • the method is started in step 20 .
  • imaging information as obtained from the 3-D camera is interpreted to assess the presence of a marker. It is preferred that imaging information is recorded in case such presence of a marker is assessed, in order to record usable information.
  • it is determined whether a marker is present in the imaging information. In case there is no marker detected in the imaging information, the method returns to step 21.
  • the imaging information is structured into coordinates, color information and/or depth. The result is a cloud of points in step 23.
  • In step 24 the results of step 23 are divided into areas of analysis, such as areas to be compared with areas of the 3-D torso model.
  • In step 25 it is determined whether enough areas for further analysis are defined. In case it is determined that not enough areas for further analysis are defined, the method returns to step 21. In case it is determined that enough areas for comparison are defined, and as such a quality check of the 3-D imaging information recording provides a positive determination, the method continues in step 26.
  • In step 26 the preprocessed 3-D imaging information recording is matched to matchable information of the 3-D torso model as obtained.
  • In step 27 it is determined whether a match was possible between the information from the 3-D imaging information recording and the 3-D torso model. In case it is determined that the match was not possible, the method returns to step 21 in order to reprocess with a new 3-D imaging recording. In case it is determined that the method provided a match of an acceptable quality, such as within certain predetermined limits, the electrodes are detected from the 3-D imaging information and matched with the torso model for adding information relating to the electrodes to the 3-D torso model.
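The control flow of steps 21 to 27 can be summarised as a retry loop. All helper callables in this sketch (frame grabbing, marker detection, point-cloud conversion, area splitting, matching and electrode detection) are placeholders passed in as arguments; none of them refers to an existing API.

```python
def acquire_and_match(grab_frame, detect_marker, to_point_cloud, split_into_areas,
                      match_to_model, detect_electrodes, torso_model,
                      min_areas=3, max_attempts=10):
    """Retry-loop sketch of the FIG. 3 method (steps 21-27), illustrative only."""
    for _ in range(max_attempts):
        frame = grab_frame()                            # step 21: 3-D imaging info
        if not detect_marker(frame):                    # step 22: marker present?
            continue                                    # no -> back to step 21
        cloud = to_point_cloud(frame)                   # step 23: coords, color, depth
        areas = split_into_areas(cloud)                 # step 24: areas of analysis
        if len(areas) < min_areas:                      # step 25: quality check
            continue
        match = match_to_model(areas, torso_model)      # step 26: match to torso model
        if match is None:                               # step 27: acceptable match?
            continue
        return detect_electrodes(cloud, match)          # add electrodes to the model
    return None
```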
  • FIG. 4 provides a further embodiment of the method according to the present invention.
  • the method starts in step 100 as a configuration step, loading the resources needed, based on an instruction entered into the user interface for generating a 3-D image recording.
  • the detection algorithm is initialized by providing characteristics of the marker element used in the respective session. Those characteristics, such as colors (e.g. blue, green or pink), geometry (e.g. rectangular, square or triangular), or its dimensions (such as width or height), are retrieved from the database. These characteristics are used for creating groups of points having at least one of the marker descriptors, such as the color.
  • In step 110 the information relating to the 3-D imaging information, comprising e.g. color and depth, is received from the 3-D camera.
  • the information is structured to coordinates having color information.
  • the 3-D imaging information is prepared for analysis for detecting the marker; the received imaging information is analyzed for the presence of the marker.
  • the marker has to be oriented in the marker area 11 that is displayed, preferably in yellow, on a display of the camera or a monitor of the computing device showing the imaging information. Pixels that are received inside the marker element area 11 are added to a list with the same criteria; a list is created for each criterion. This helps in finding the marker, as pixels with the same color, such as on the marker, will be on the same list.
  • the marker is identified based on the created lists.
  • Information as to the geometry of the marker is extracted from the information of the pixels in the lists. If this is not successful, a calibration is performed with respect to constraints such as the light in the room that can influence the colors of the recording. If the marker is found, the method continues in step 140.
  • An example is a marker that is a rectangle, half a centimeter wide, 5 cm high and having a blue color.
  • the step of analyzing will identify the list having the pixels providing the rectangular shape, such as by taking 4 points of the selected list and calculating the angle created by each set of 3 points. If the angles are 90° and the distances and the color match, then the list comprises information relating to the marker.
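A hedged sketch of the shape test described above: given four candidate corner points from a colour-matched list (ordered around the contour), check that each corner angle is about 90° and that the side lengths match the known marker dimensions. The tolerances and the ordering requirement are assumptions.

```python
import numpy as np

def looks_like_marker_rectangle(corners, expected_w=0.005, expected_h=0.05,
                                angle_tol_deg=5.0, length_tol=0.2):
    """Check four ordered corner candidates against the expected rectangle.

    Defaults follow the 0.5 cm x 5 cm example from the text; `corners` must be
    four 2-D (or projected 3-D) points ordered around the contour.
    """
    pts = np.asarray(corners, dtype=float)
    if pts.shape[0] != 4:
        return False
    for i in range(4):
        v1 = pts[i - 1] - pts[i]
        v2 = pts[(i + 1) % 4] - pts[i]
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if abs(angle - 90.0) > angle_tol_deg:
            return False
    sides = sorted(np.linalg.norm(pts[i] - pts[(i + 1) % 4]) for i in range(4))
    width, height = np.mean(sides[:2]), np.mean(sides[2:])
    return (abs(width - expected_w) / expected_w < length_tol and
            abs(height - expected_h) / expected_h < length_tol)
```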
  • In step 140 the order of calibration and a zone of calibration are processed. If the marker was not detected, the camera is directed such that the marker is in the marker element area 11. The said calibration is performed and step 130 is repeated. In step 150, it is determined that the 3-D imaging information contains the marker and the 3-D imaging information is recorded.
  • One session according to the present invention may comprise a number of recordings over the duration of the ECG session.
  • a full 3-D imaging recording of the torso is created. Images are taken of the torso from several angles, all comprising the marker. For example, the camera is moved by starting the capture from the left part of the torso and moving over the torso to the right part. During such movement the camera takes a temporary image recording every second, after which the recordings are combined into a 3-D image recording of the full torso. All individual recordings are processed as described in the above in order to assess the presence of the marker.
  • In step 170 the 3-D image recording of the full torso is verified. In case the combination of the separate recordings leads to deformations due to inconsistent camera movement, the recordings have to be taken again by repeating the above steps.
  • In step 180 it is determined whether the resulting marker from the combination of separate 3-D imaging information recordings is valid. Such validity is obtained if the combined 3-D image recording comprises sufficient information, such as for example determined by analyzing the 3-D imaging information recording starting from the marker position down to the bottom in steps of 3 cm and checking the percentage of holes present in the 3-D photo. Acceptability is for instance defined if the percentage is below 3% (a sketch of such a check follows below).
  • In step 190 the method ends with outputting a validated 3-D photo.
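The validity check of step 180 (walk downward from the marker in 3 cm steps and reject the photo if too many bands are missing) might be sketched as follows. What exactly counts as a "hole" is not specified in the text, so the near-empty-band criterion here is an assumption.

```python
import numpy as np

def combined_recording_is_valid(points, marker_z, bottom_z, step=0.03,
                                max_hole_pct=3.0, min_points_per_band=50):
    """Accept the combined 3-D photo only if the hole percentage stays below ~3%.

    Illustrative: horizontal bands of `step` metres are scanned from the marker
    position down to the bottom; a band with fewer than `min_points_per_band`
    points is counted as a hole.
    """
    pts = np.asarray(points, dtype=float)
    band_edges = np.arange(bottom_z, marker_z, step)
    holes = sum(
        np.count_nonzero((pts[:, 2] >= z) & (pts[:, 2] < z + step)) < min_points_per_band
        for z in band_edges
    )
    hole_pct = 100.0 * holes / max(len(band_edges), 1)
    return hole_pct < max_hole_pct
```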
  • FIG. 4 describes an embodiment of pre-analysis in preparation for computation during analysis.
  • This phase represents a pre-analysis of the 3-D imaging information recording for extracting information relating to the subject, which information assists in the correction of errors in the 3-D imaging information and in the recognition of parts of the subject such as the shoulders, head and breast areas.
  • the purpose of this embodiment is to analyze usability of available information for the extraction of the thorax information relating to the electrode positions.
  • the method is initialized by loading the 3-D imaging information and analyzing means, such as a tracker for shoulders.
  • the characteristics of the marker are related to a specific use.
  • a blue marker is for instance used for analyzing the anatomy of the body, such that for instance if the marker is rectangular and its color is plain blue, the analysis of the body will be performed relating to the circumference of the torso, the width, or the circumference of the arm.
  • Step 210 comprises loading or detecting anatomical thorax elements/parts from the 3D photo and categorizing those elements/parts by relating them to their body positions.
  • the detection in this level is a categorization per region relative to the marker position as depicted in FIG. 9 .
  • a coordinate analyzer separates the points of the 3-D imaging information into points which are in a higher position than the position of the marker, and then divides this upper part into left and right categories.
  • the other points from the 3-D imaging information, preferably having an altitude lower than the marker's altitude, will be in the list (group) representing the belly.
  • the detection of the thorax is a combination of information given by specific analyzers.
  • a shoulders analyzer gives the position of the shoulders; a circumference analyzer gives different circumferences related to an altitude (horizontal position). From those two pieces of information the upper offset of the thorax is obtained, and the left and right offsets are deduced from the ellipse equation of the biggest circumferences found.
  • Step 220 is defining the shape of the shoulders, for example using a cylindrical geometry approach, such as shown in FIG. 10 .
  • the shoulders analyzer gets lists of points and studies their relations toward each other. In other words, the analyzer searches for an area where the depth of the points decreases progressively such that those points lie at the same distance from a line, such as the central axis of the cylinder they form.
  • step 230 the circumference of the torso is analyzed.
  • the symmetry and the variation in the distance of the torso at the same altitude are analyzed. Curved lines are created with points at the same altitude at, for instance, a 2 cm interval. Ellipses are created in this way for determining the circumference of the torso, permitting the prediction of the dimensions of the subject and reducing the required manipulation; a circumference sketch follows below.
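A small sketch of the circumference analyzer: slice the cloud at a given altitude, approximate the slice by an ellipse from its extents, and estimate the perimeter with Ramanujan's approximation. Treating the slice extents as the ellipse semi-axes is a simplifying assumption.

```python
import numpy as np

def torso_circumference_at(points, z, band=0.01):
    """Approximate the torso circumference at altitude z (illustrative).

    The horizontal slice of the cloud is reduced to an ellipse whose semi-axes
    are half the x and y extents; the perimeter uses Ramanujan's approximation.
    """
    pts = np.asarray(points, dtype=float)
    slice_pts = pts[np.abs(pts[:, 2] - z) < band]
    if len(slice_pts) < 10:
        return None
    a = (slice_pts[:, 0].max() - slice_pts[:, 0].min()) / 2.0   # semi-axis (width)
    b = (slice_pts[:, 1].max() - slice_pts[:, 1].min()) / 2.0   # semi-axis (depth)
    h = ((a - b) / (a + b)) ** 2
    return np.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + np.sqrt(4.0 - 3.0 * h)))
```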
  • In step 240 other characteristics of the torso are determined, such as the head, hair, or skin color. Elements in the photo that are distracting when analyzing the torso, such as a blanket on the subject, are also rejected as much as possible. Preferably it is known what color of blanket is used, such that imaging points having the same color as the blanket can be removed from the analysis, thereby saving the analysis time otherwise spent on such noisy information.
  • In step 250 the information from the analyzers of steps 220, 230 and 240 is combined to improve the analysis. For example, by knowing the position of the shoulders and the points in the imaging information constituting the shoulders, the central axis of the cylinder that they form helps in detecting borders where the imaging information may end, for example because there are no electrodes beyond such areas.
  • FIG. 6 shows a method for matching the 3-D torso model to the 3-D imaging information recording.
  • the method starts in step 500 with loading information, and initializing the computing device for performing the calculations.
  • the input of step 510 is the information relating to the 3-D imaging information recording and the 3-D torso model.
  • the position of the marker element in the imaging information recording, as well as its equivalent position in the 3-D torso model, is taken as the basis to calculate the difference of distance and the angles created by the curve of the marker zone in the model and in the imaging information.
  • Step 510 provides: take the marker position and its equivalent position by the model, then calculate the difference of distance and the angles created by the curve of the marker zone in the model and 3D photo.
  • the coordinates of the marker in the 3D photo are initially expressed in camera space (it is the camera which gives the points of the 3D photo their coordinates), whereas the marker coordinates in the model are defined by the MRI device. As a consequence, except by coincidence, the positions of the marker in the 3D photo and in the model are different.
  • the positions of the marker and the equivalent of the marker are separated by a non-zero distance, see FIG. 12.
  • Step 520 provides: the 3D-Photo is moved to the model until the distance of the marker and its equivalent position is reduced to 0.
  • the distance between the marker in the 3D photo and its equivalent position in the model is calculated, and the 3D photo is then translated to the position of the marker. The translation vector is calculated and the coordinates of every point of the 3D photo are moved using the calculated vector.
  • Step 530 provides: the 3D photo is rotated until the curves are parallel, and it is checked whether the distance between the marker and its equivalent position is null, see FIG. 14.
  • Step 540 provides: the newly updated 3D photo is present, but that does not guarantee that the photo and the model are in the same orientation.
  • This phase permits the correction of the orientation of the 3D photo relative to the model by taking two equivalent random zones, calculating the two vectors going from the marker position to those areas, and rotating the 3D photo until the two points coincide or become equal, see FIG. 12 and FIG. 15.
  • Step 550 provides: estimation of the matching is the calculation of the difference in distance between the projection of the elements of the 3D photo and their equivalents from the model.
  • Step 560 provides: compare with the latest estimation whether the value increased or not and apply the best transformation. If the percentage increased, the calculation is retried with another configuration until a maximum related to the model and the 3D photo is reached.
  • In step 550 the best percentage relative to the selected list of second points (points from the random zone) is obtained. If the current best percentage is better than the old one, the estimation has improved.
  • Step 570 provides: the new 3D photo is stored and can be used to extract the specific information and apply it to the model, like the electrode positions for example.
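The translate-then-rotate part of the matching (steps 520-530) can be illustrated with a rigid transform that moves the photo's marker onto the model's marker and then aligns the local surface normals of the two marker zones, using Rodrigues' rotation formula. Using the zone normals as the "curves" to be made parallel is an interpretation, not the patent's exact procedure.

```python
import numpy as np

def align_photo_to_model(photo_points, photo_marker, photo_zone_normal,
                         model_marker, model_zone_normal):
    """Translate and rotate the 3-D photo towards the torso model (illustrative)."""
    pts = np.asarray(photo_points, dtype=float)
    model_marker = np.asarray(model_marker, dtype=float)

    # Step 520: translate so the photo marker lands on the model marker.
    pts = pts + (model_marker - np.asarray(photo_marker, dtype=float))

    # Step 530: rotate about the marker so that the zone normals become parallel.
    n1 = np.asarray(photo_zone_normal, dtype=float)
    n2 = np.asarray(model_zone_normal, dtype=float)
    n1, n2 = n1 / np.linalg.norm(n1), n2 / np.linalg.norm(n2)
    v = np.cross(n1, n2)
    s, c = np.linalg.norm(v), float(np.dot(n1, n2))
    if s < 1e-9:
        return pts                                   # normals already (anti)parallel
    k = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    rot = np.eye(3) + k + k @ k * ((1.0 - c) / s ** 2)   # Rodrigues' formula
    return (pts - model_marker) @ rot.T + model_marker
```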
  • FIG. 7 shows a method for analyzing areas on the imaging information recording and defining a coordinate system relative to the marker element.
  • Step 600 provides: load the 3D-Photo and load the resources needed for the process (load the specific analyzer).
  • Step 610 provides: the marker analyzer goes to the position of the marker, uses the information related to it, and creates from that information the "marker" to which all data is related.
  • the points of the 3D photo in the space (coordinates) are positioned related to that marker, see FIG. 19 .
  • the marker defines 3 axes in which every coordinate of a point from the 3D photo is expressed, see FIG. 20.
  • a point is taken from the 3D photo. After recording, that point is expressed with the coordinates that the camera gives to it on one side. On the other side, the coordinates of a point from the model are expressed with the coordinates given by the MRI device.
  • the marker in the 3D photo and its equivalent in the model are the same; as a consequence, the coordinates are expressed using the same reference.
  • Step 620 provides: at the start of the analysis, using the information generated in step 610, the zone of analysis is defined, see FIG. 20.
  • the zone of analysis is the result of the combination of the information that the analyzers provide.
  • the line of the shoulders is the upper border of analysis.
  • the radius of the biggest circumference, together with the ellipse centers of all circumferences, provides the left and right offsets from the axis formed by the ellipse centers.
  • Step 630 provides: the nearby area (or zone of analysis) is inspected and prepared to be categorized into the marker zone or the random zone.
  • a nearby area, or the marker zone, is the area related to the list of points which are close to the marker within a certain distance (3 cm, 5 cm, . . . ). All elements outside of the marker zone are in the random zone, or what is called the control zone, see FIG. 14.
  • the analyzer calculates the curve of the geometry where the marker is positioned.
  • the curve is how the concavity of this zone of the 3D photo behaves, see slide 6
  • the objective is to find the relation of the points and how the global curvature of the area behaves (manifests) due to their repartition.
  • the result of the analysis of the curve is a normal vector that characterizes the area; a plane-fit sketch of such a normal follows at the end of this passage. For example, if the area is a plane, then the normal (perpendicular vector) of the plane is a representation of the curvature, which does not change along that plane. If the area is spherical, then the normal is perpendicular to the tangent plane of the sphere at the selected position.
  • the analyzer gets the non-marker zone (random area), gets its characteristics and stores them for use in the matching process.
  • the areas are characterized by their distances and their positions, see FIG. 19, see step 630.
  • at the end of the analysis the obtained information is structured and stored for use in further processing.
  • the objective is to have 3D points with the same characteristic from the model and their equivalents in the 3D photo (e.g. a 3D point from the model which is 80 millimeters away from the marker), to compare every point with its equivalent and thereby control the quality of the matching, see FIG. 15.
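The "curve" characterisation of a zone described above (a normal vector that is constant for a planar area and follows the tangent plane for a spherical one) can be approximated by a plane fit; this PCA-based normal is an assumption standing in for whatever curvature descriptor the patent intends.

```python
import numpy as np

def zone_normal(zone_points):
    """Normal of the best-fit plane through a zone of the 3-D photo (illustrative).

    The direction of least variance (last right-singular vector of the centred
    points) is the plane normal; for a flat zone it is constant across the zone.
    """
    pts = np.asarray(zone_points, dtype=float)
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)
```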

Abstract

The present invention relates to a method to be performed by a computing device along with an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a torso, preferably applying a marker element in a process of determining an orientation of a heart, preferably in a torso model of the torso of the human body, more preferably a heart-torso model specifically including data pertaining to the heart of the, preferably human, body, the method comprising receiving information relating to the human body from a measurement source, such as dimensional measurement information input into the computing device from an input interface or an optical imaging device, preferably a 3D imaging device, the information being information of the exterior of the body, such as imaging information of the exterior of the torso, and determining chest dimensions, such as at the xyphoid position.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/651,752, filed on Mar. 27, 2020, which is the United States national phase of International Application No. PCT/NL2018/000020 filed Nov. 27, 2018, and claims priority to The Netherlands Patent Application No. 2019635 filed Sep. 27, 2017, the disclosures of which are hereby incorporated by reference in their entirety.
  • BACKGROUND OF THE INVENTION Field of the Invention
  • The present invention relates to a method to be performed by a computing device part of or coupled to an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a torso, preferably applying a marker element in a process of determining an orientation of a heart, preferably in a torso model of the torso of the human body, more preferably a heart-torso model specifically including data pertaining to the heart of the human body.
  • It is often preferable to have information relating to the position of the heart of a person, also when there is no scanning result available, such as a scanning result from a CT scan, an MRI scan or an echo. It may be the case that such scanning equipment is not available in a hospital or area or that there is no time to perform such scan at the moment the information relating to the position of the heart of a person is desirable, such as when performing an ECG.
  • As such, it is part of the inventive work of the present inventor to perceive such need and to consider the possibility of deriving a model of the torso and the heart in the torso from other means.
  • As such, the present invention provides a method to be performed by a computing device part of or coupled to an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a torso, preferably applying a marker element in a process of determining an orientation of a heart, preferably in a torso model of the torso of the human body, more preferably a heart-torso model specifically including data pertaining to the heart of the, preferably human, body, the method comprising steps of:
      • receiving information relating to the human body from a measurement source, such as dimensional measurement information input into the computing device from an input interface or an optical imaging device, preferably a 3D imaging device, the information relating to the human body comprising:
        • information of the exterior of the body, at least comprising imaging information of the exterior of the torso,
      • determining chest dimensions, such as at the xyphoid position.
  • It is an advantage of such a method that the chest dimensions can be determined based on the information received from the measurement source.
  • According to a first preferred embodiment, the method comprises steps of estimating the heart orientation based on the chest dimensions, such as chest depth, chest width, and/or circumference. As such, preferably by applying measurements from a 3D camera, the heart position can be estimated and as such information relating to the heart position can be output by the computing device for use by for example a physician.
  • According to a further preferred embodiment, the steps of receiving information relating to the human body, comprise steps of receiving imaging information taken from the torso, preferably the 3D imaging information.
  • Further preferably, the steps of determining chest dimensions comprises steps of receiving measurement information relating to physical dimensions of the chest from input means or a database.
  • Preferably, the method comprises steps of scaling heart dimensions of a reference model to the determined chest dimensions.
  • Further preferably, the method comprises steps of determining a position of the xyphoid, preferably a virtual determination of the xyphoid point, further preferably based on at least one of the shoulder height and the length of the sternum.
  • Further preferably, the method comprises steps of receiving imaging information of a marker element, the marker element being arranged in an area comprising an actual position on the body, preferably on the sternum, from a marker element optical imaging device, OID, the ECG electrodes imaging device and the marker element imaging device preferably being the same imaging device,
      • performing an image recognition on the imaging information for obtaining a presence determination, preferably a positive or negative determination, of the marker element in the imaging information.
  • Preferably, the method comprises steps in which the chest depth and/or heart orientation is used in steps for estimating a torso model or a heart torso model.
  • Preferably, the method comprises steps in which the torso model or the heart torso model is based on measuring points such as a cloud of points based on the received information, preferably 3D imaging information.
  • Preferably, the method comprises steps in which the torso model or heart torso model is based on geometrical assemblies, such as triangles, created based on the cloud of points.
  • Preferably, the method comprises steps of classifying partitions of the received information relating to the human body from a measurement source.
  • Preferably, the method comprises steps of determining at least one angle for a long axis, mitral-tricuspid axis and left basal axis.
  • Preferably, the method comprises steps of determining a rotation of the heart using at least one predetermined axis and angles, preferably over the three axes, preferably such that the cross section fills up about ⅓ of the thorax width.
  • Preferably, the method comprises steps of estimating a heart orientation based on the estimated chest dimensions.
  • Preferably, the method comprises steps of shifting the heart position such that the lower part of the heart coincides with the xiphoid position, preferably the heart being at a predetermined, such as minimal, distance from the chest wall.
  • Preferably, the method comprises steps of applying parameters when determining the position of the heart relating to the body, such as age, existing conditions, weight, fitness, chest width.
  • Further advantages, features and details of the present invention will be further elucidated on the basis of a description of one or more embodiments with reference to the accompanying figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a preferred embodiment of a method according to the invention.
  • FIGS. 1B-1D show three embodiments of a marker element according to the present invention.
  • FIG. 2 shows a preferred embodiment of a further, more detailed method according to the invention.
  • FIG. 3 shows a preferred embodiment of a further method according to the invention.
  • FIG. 4 shows heart position examples relating to age and gender.
  • FIG. 5 shows an example of a heart in a chest and its exemplary relation to the xyphoid.
  • FIG. 6 shows an example of a shoulder height and sternum length in relation to a heart position in a torso.
  • FIG. 7 shows examples of positions in which a heart can be oriented in a torso depending on body parameters.
  • FIG. 8 shows an example of a heart position relative to a xyphoid or height thereof.
  • SUMMARY OF THE INVENTION
  • The present invention is for instance performed in cooperation with an ECG system as input or an ECG system with features added thereto for embodying the invention. A typical embodiment comprises a computing device with receiving means for receiving from an ECG device the ECG measurements during an ECG session, such as during a procedure or for obtaining data to base a subsequent diagnosis on. The computing device is provided with a processor and memory. The memory comprises program code for enabling the processor to perform the method according to the invention.
  • Furthermore, the computing device is coupled to a monitor for displaying resulting images. A user interface is also displayed on the monitor for allowing input to be provided. Additional aspects of the user interface comprise a keyboard and mouse, a touch screen, and any other input device, known per se, preferred by the user, which may be coupled to the computer through readily applicable connecting ports.
  • Furthermore, a 3-D camera is available for taking imaging information recordings of the torso. For obtaining the 3-D imaging information recordings, a capability to record from several sides of the torso is preferred. This is obtained either by one camera that is movable to capture images from the top, left and right sides of the torso, or alternatively by two or more cameras fixedly mounted relative to the position of the torso, in order to combine the 3-D imaging information recordings of the two or more cameras.
  • Furthermore, the computer is preferably connected to a database of 3-D torso models. Such a database of 3-D torso models preferably comprises unique torso models obtained by imaging devices, such as an MRI, CT or sound echo device. Depending on available time and equipment the respective information can advantageously be obtained during the ECG session, before the ECG session or based on historical measuring data for performing of this method.
  • Preferably, the 3-D photo is recorded by means of a 3-D camera providing a cloud of points in 3-D space. The cloud of points represents the subject of the imaging information recording. To this end, the 3-D camera is used to capture an image of a torso of a subject in the form of 3-D information comprising information with respect to depth and color of the subject and of the surroundings of the subject. As indicated in the above, a single camera can be moved relative to the subject, such as along a generally circular line around the torso perpendicular to a longitudinal axis of the subject. Multiple cameras mounted around the subject can also be used for taking the appropriate recordings.
  • When taking a photo of the torso, the heart position and orientation of the heart are still unknown. Based on patient-specific information a good approximation of the heart position and orientation can be made. The aortic valve is approximately behind the sternum. The lower part of the heart at the sternum is well approximated by the lower part of the sternum (xyphoid). What exactly the lower part of the heart at the sternum is depends on the orientation of the heart. Knowing the xyphoid position therefore significantly supports the estimation of the heart position.
  • When a heart is vertical (as in the example of a 20-year-old male), the right heart is on the right and the left ventricle is on the left. With increasing weight the heart is pushed upward, which means the long axis of the heart becomes more horizontal; at the same time the left chamber of the heart rotates backward. These rotations can be described as below. The chosen angles (long axis, left-right angle and the left basal angle) largely depend on the depth of the chest at the xyphoid.
  • DESCRIPTION OF THE INVENTION
  • FIG. 1 shows an exemplary method as follows. Initially, optionally, a marker is detected as described relative to the below description and inserted from the priority application by reference.
  • Further, in step 130 the xyphoid position is determined. If a marker at this position is present, this position can be used; if not, a virtual position of the xyphoid is determined.
  • In step 135 a virtual determination of the xyphoid point is performed. First the shoulder height is determined from the reconstructed model. The length of the sternum is a function of the patient's length; for adults this is approximately 20 cm on average. For adults the sternum length is estimated and used to localize the xyphoid location.
  • In step 140, the depth of the chest at the selected xyphoid position is determined.
  • In step 145, the reference model used (the selection of the reference model is described in the vector patent) has a defined xyphoid depth, and both depths are used to scale the heart model.
  • In step 150, the angles for the three axes are determined. With less sternum depth the heart remains more vertical. When the patient is overweight the heart is more horizontal. The axis angles are thus determined by weight and xyphoid depth (other patient characteristics might be incorporated to tune the estimated angles).
  • In step 155, the heart is rotated using the three estimated angles.
  • In step 160, the heart position is shifted such that the lower part of the heart coincides with the xyphoid position (z axis = vertical body axis) and is at least a minimal distance from the chest wall (x direction = xyphoid depth axis).
  • FIG. 2 shows an exemplary method as follows. In step 500, it is preferably determined whether imaging information is available, i.e. 3D photo(s) from which a torso model and marker positions are to be determined. This can be any imaging from which the chest dimensions and the xyphoid and/or sternum marker positions can be determined. A 3D camera is one preferred input device.
  • Initially, optionally, a marker is detected in steps 505-530 as described relative to the below description and inserted from the priority application by reference. In case a marker is detected, the method continues in step 540. In case no marker is detected, the method continues in step 535. Step 700 is an optional manual step, allowing manual measurements to be entered if no device measurements are available.
  • The database provides information as to the models and the creation of the models.
  • In step 530 the xyphoid position is determined. If an externally applied marker at this position is present, this position can be used; if not, a virtual position of the xyphoid is determined based on the used marker(s) or a manually indicated point on the torso. In case a xyphoid marker is detected: the best positioning of the xyphoid is from a special marker on top of the xyphoid. With this information the sternum length can be determined, as well as the lower part of the heart. Thus, such information helps with the estimation of the heart position.
  • In step 535 a virtual determination of the xyphoid point is performed. First the shoulder height is determined from the reconstructed model. The shoulder height is determined as the position where the relative change in shoulder height along the left and right side of the body is more than a predefined slope, e.g. 30% (see figure xx). The length of the sternum is, at least preferably, assumed to be a function of the patient's length; for adults this is approximately 20 cm on average. For adults the sternum length is estimated and used to localize the xyphoid location. If no marker is present the xyphoid position can be estimated. As the sternum is in the middle of the torso, only the xyphoid z-position needs to be estimated. For adults this is on average 20 cm, decreasing with age, and is slightly influenced by the thorax dimensions (the depth of the sternum for instance):
  • Z_xyphoid = 20 − α · age, where the z axis is the axis from feet to head.
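  • By way of non-limiting illustration, the virtual xyphoid determination of step 535 may be sketched in Python as follows. The width-slope heuristic for the shoulder line, the slab sizes and the value of the age coefficient α are assumptions of the sketch; the description above only prescribes the slope criterion (e.g. 30%) and the relation Z_xyphoid = 20 − α · age.

```python
import numpy as np

def estimate_shoulder_height(points, slope_threshold=0.30):
    """Estimate the shoulder z-position from a torso point cloud (x, y, z in cm).

    Walks down from the top of the cloud and returns the first height at which
    the relative increase in body width exceeds the predefined slope (e.g. 30%),
    as a crude stand-in for the shoulder line of step 535.
    """
    z_levels = np.arange(points[:, 2].max(), points[:, 2].min(), -1.0)  # 1 cm steps
    prev_width = None
    for z in z_levels:
        slab = points[np.abs(points[:, 2] - z) < 0.5]
        if slab.size == 0:
            continue
        width = slab[:, 0].max() - slab[:, 0].min()          # left-right extent
        if prev_width is not None and prev_width > 0:
            if (width - prev_width) / prev_width > slope_threshold:
                return z
        prev_width = width
    return points[:, 2].max()

def estimate_xyphoid_z(shoulder_z, age_years, sternum_cm=20.0, alpha=0.02):
    """Xyphoid z-position: one sternum length below the shoulder line.

    The sternum length follows 20 - alpha * age (alpha is an assumed
    age-correction coefficient, not a value given in the description).
    """
    sternum_length = sternum_cm - alpha * age_years
    return shoulder_z - sternum_length
```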
  • In step 540 the depth, width, and circumference of the chest at the selected xyphoid position are determined. Based on this, the chest ratio (chest depth divided by the chest width) can also be determined. Preferably the chest dimensions at the xyphoid are determined. The width (from left to right) and depth (from front to back) are calculated from the cross section of the x and y axes going through the middle of the thorax. These dimensions determine the bounding box of the heart, i.e. the ribcage.
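  • A non-limiting sketch of the chest-dimension determination of step 540 from a torso point cloud is given below; the 1 cm slab around the xyphoid level and the Ramanujan approximation for the elliptical circumference are illustrative choices of the sketch, not requirements of the method.

```python
import numpy as np

def chest_dimensions_at(points, z_xyphoid, slab_cm=1.0):
    """Chest width, depth, ratio and an approximate circumference at the xyphoid level.

    points: (N, 3) torso surface points in cm; x = left-right, y = back-front,
    z = feet-head. The circumference is approximated from an ellipse with the
    slab extents as axes (Ramanujan's approximation).
    """
    slab = points[np.abs(points[:, 2] - z_xyphoid) < slab_cm / 2.0]
    width = slab[:, 0].max() - slab[:, 0].min()    # left-right
    depth = slab[:, 1].max() - slab[:, 1].min()    # back-front
    ratio = depth / width                          # chest ratio as defined in step 540
    a, b = width / 2.0, depth / 2.0
    h = ((a - b) / (a + b)) ** 2
    circumference = np.pi * (a + b) * (1 + 3 * h / (10 + np.sqrt(4 - 3 * h)))
    return width, depth, ratio, circumference
```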
  • In step 545 the reference heart model has to fit the torso model. The most limiting space is the chest depth. The depth of the heart (back-to-front chest direction) is scaled such that this heart depth is equal to half the chest depth of the model used (the selection of the reference model is described in the vector patent). Another method is to scale the heart by the factor between the chest depth of the reference model, which has a defined xyphoid depth, and the patient-specific chest dimensions; both depths are then used to scale the heart model. For example, given the estimated ribcage space (540), the heart is scaled. The heart can take approximately 50% (β) of the sternum depth. The reference heart is scaled given the thorax depth at the xyphoid position:
  • Heart = Heart_ref × thorax_depth × β
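  • The scaling of step 545 may, purely by way of illustration, be sketched as follows; uniform scaling about the heart centroid and the axis convention (y = back-front) are assumptions of the sketch, the description only fixing that the heart depth takes approximately β ≈ 50% of the thorax depth at the xyphoid.

```python
import numpy as np

def scale_reference_heart(heart_ref_points, thorax_depth_cm, beta=0.5):
    """Scale the reference heart so that its depth occupies about beta (about 50%)
    of the patient's thorax depth at the xyphoid, per step 545.

    heart_ref_points: (N, 3) vertices of the reference heart model in cm,
    with y = back-front (depth) direction.
    """
    centroid = heart_ref_points.mean(axis=0)
    ref_depth = heart_ref_points[:, 1].max() - heart_ref_points[:, 1].min()
    scale = beta * thorax_depth_cm / ref_depth          # target depth / reference depth
    return centroid + scale * (heart_ref_points - centroid)
```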
  • In step 550 the angles for the three axes are computed. With less sternum depth the heart is relatively more vertical. When the patient is overweight the heart is more horizontal. The axis angles are thus determined by weight and xyphoid depth (other patient characteristics might be incorporated to tune the estimated angles). Preferably, the angles for the long axis, mitral-tricuspid axis and left basal axis are computed. The heart is then rotated such that it also matches the thorax width at the xyphoid position.
  • In step 555 the heart is rotated using the three estimated angles. When the chest dimensions indicate a small chest size, e.g. a chest circumference of less than 1000 mm, the smaller the chest, the less space there is for the heart and the more vertical the heart will be, as is the case for children for instance. The heart is rotated using the three axes and angles: Heart = rotation(Heart_ref, thorax_width), where the rotation is done over the three axes such that the cross section fills up about ⅓ of the thorax width.
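  • A non-limiting sketch of the rotation of steps 550-555 is given below; the assignment of the three angles to the three body axes and the rotation order are assumptions of the sketch, the description only stating that the heart is rotated over three axes by angles derived from the chest dimensions and patient characteristics.

```python
import numpy as np

def rotation_matrix(axis, angle_deg):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    a = np.radians(angle_deg)
    k = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(a) * k + (1 - np.cos(a)) * (k @ k)

def rotate_heart(heart_points, long_axis_deg, mitral_tricuspid_deg, left_basal_deg):
    """Apply the three estimated angles about three predefined body axes.

    Axis convention (x = left-right, y = back-front, z = feet-head) and the
    mapping of angles to axes are illustrative choices of this sketch.
    """
    centroid = heart_points.mean(axis=0)
    r = (rotation_matrix([1, 0, 0], long_axis_deg)
         @ rotation_matrix([0, 1, 0], mitral_tricuspid_deg)
         @ rotation_matrix([0, 0, 1], left_basal_deg))
    return centroid + (heart_points - centroid) @ r.T
```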
  • In step 560 the heart position is shifted such that the lower part of the heart coincides with the xyphoid position (z axis = vertical body axis), is at least a minimal distance from the chest wall (x direction = xyphoid depth axis), and the right side of the aortic valve is behind the sternum (y direction). The heart position is thus adjusted so that its lower part matches the xyphoid.
  • The average of the points of the model aortic, pulmonary, and tricuspid valves is computed and this point is moved such that it lies behind the sternum. Subsequently the lowest part of the heart at the sternum position (the z coordinate) is moved to the z coordinate of the xyphoid point.
  • Heart = Heart + (x, y, z)
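  • The shift of step 560 may be sketched as follows; the axis convention follows the step (z = vertical body axis, x = xyphoid depth axis), while the chest-wall coordinate, the minimal gap value and the omission of the valve-plane alignment in y are simplifications and assumptions of the sketch.

```python
import numpy as np

def shift_heart(heart_points, z_xyphoid, x_chest_inner_front, min_wall_gap_cm=1.0):
    """Translate the scaled and rotated heart per step 560 (sketch).

    The lowest heart point is moved to the xyphoid z-position, and the heart is
    pushed back so its front face stays at least min_wall_gap_cm behind the
    inner chest wall. Alignment of the valve plane behind the sternum (y) is
    omitted in this sketch.
    """
    shift = np.zeros(3)
    shift[2] = z_xyphoid - heart_points[:, 2].min()            # lower part -> xyphoid height
    front_excess = heart_points[:, 0].max() - (x_chest_inner_front - min_wall_gap_cm)
    if front_excess > 0:
        shift[0] = -front_excess                               # keep a gap to the chest wall
    return heart_points + shift                                # Heart = Heart + (dx, dy, dz)
```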
  • 600 denotes a reference torso-heart model database with models and descriptions of the models, characterized for instance, but not limited, by chest dimensions, etiology, gender, weight and height of the models, i.e. a database with models of heart-torso combinations.
  • In step 700, if no imaging data is available to determine the xyphoid and chest dimensions, e.g. chest depth at the xyphoid, chest width at xyphoid height and chest circumference, the values can be measured on the patient directly with manual measurement devices. For the proper selection of the model, other measurements can be taken into account here as well, such as age, gender, height, weight, and etiology of the disease. Patients with a failing heart often have an enlarged left ventricle; such a model should then be taken from the database. Based on these measurements a model is selected from the database and scaled to match the measurements, i.e. such that sternum length, chest circumference and xyphoid position match the measured values.
  • In step 710, the shoulder height is determined; a preferred marker for this point is the manubrium of the sternum, the top of the sternum.
  • In step 720, the distance between the shoulder height and the xyphoid, the lower end of the sternum, is determined.
  • In step 730, the chest circumference at the xyphoid position is determined.
  • In step 740, the chest width and depth at the xyphoid are obtained. These are preferably measured or computed using both the circumference and one of the other measures (depth or width).
  • In step 750, other demographics of the patient are inputted or obtained, such as weight, height, gender, etiology of a (cardiac) disease.
  • In step 760, based on the information collected, the reference torso-heart model that most closely matches the patient-specific values is selected.
  • A further subject of the present invention is the use of a marker element as a reference point relative to the torso. According to embodiments, such a marker element provides an optically recordable element, such as a surface, forming an input for the analysis of the 3-D imaging information recording. It may take the form of a patch, optionally comprising communication electronics providing an identification, having predetermined recognizable characteristics for detection thereof by means of the computing device executing the appropriate program means. Optionally, the computing device is partly or wholly integrated into the camera device.
  • Preferably, the position on the thorax is predefined and enables the computing device to match, orient and/or detect the thorax in the 3-D imaging information recording under clinical circumstances. By applying the marker, according to the embodiment, the computing device is able to perform an analysis eliminating disturbances, such as blankets, equipment, objects or cables momentarily arranged on top of the person, etc. Also, a quality check of the 3-D photo can be based on imaging information relating to the marker element and/or the position of the camera relative to the torso and/or marker.
  • Alternative embodiments of the marker element comprise recognition by means of color, signal, patterns or geometry, such as a shape; wherein optionally through-openings are also provided in the marker for discerning the skin color.
  • The marker element provides a basis for the analysis. Algorithms of analysis are preferably adaptable to a range of predetermined marker types, distinguished by means of, for example, the presence of information elements providing directional information such as a pattern of dots, color, shape, dimension, lighting or sound. Preferably, the position of the marker element, or marker elements, on the thorax is predefined, for example by having the upper side of the marker element coincide with the upper part of the sternum or suprasternal notch and having the marker elements positioned along the sternum.
  • Several characteristics of several preferred embodiments of the marker element provide distinct advantages. Analysis of the color or combinations of colors of the marker element provides advantages in permitting the detection of the marker element, which enables analysis of an area of the subject where the marker is present. Providing a certain order of colors further provides information regarding orientations, such as left, right, top, bottom and depth orientations of the subject, and allows such information to be used as inputs in the analysis.
  • The geometry or shape of the marker element provides the advantage of improved performance of analysis, such as during detection of the marker on the subject.
  • Characteristics such as sound, light or a signal from an RFID chip provided in the marker element provide advantages in determining the orientation of the marker and in performing the analysis according to the present invention, such as in identifying the marker element on the thorax of the patient.
  • The marker element represents a reference point for the algorithm performing the analysis in a way that defines the 3-D space independently of how the recording of the imaging information is performed. That is, independent of which camera is used and what the orientation of the camera is, as long as the marker is comprised in the imaging information recording. The marker provides a basis for the algorithms to determine the orientation of the marker and, based on that, to create an initial estimate of the orientation of the thorax. If non-preferable outcomes are obtained, information may be output as to a change in positioning of the camera relative to the subject, such as to provide a better alignment with regard to e.g. a longitudinal axis of the torso, or to provide a better alignment relative to the marker element.
  • Analysis of the external shape of the subject is a further aspect that the marker element is set to improve. In case of e.g. a female subject, algorithms are provided to detect the shape of a breast relative to the marker position. Based on this, specific analysis of the 3-D imaging information recording is performed. An advantage thereof is that the time required for performing calculations for the analysis is reduced, allowing for real-time usable results. As such, the marker element is preferably the starting point of the comparison between the 3-D imaging information recording and the 3-D model of the torso obtained.
  • An initial verification step in recording the 3-D imaging information recording comprises verification of the presence of the marker element. Preferably a verification is also made of the acceptability of the image given the general photographic circumstances of the area, or the area in the room, in which the recording is performed.
  • Furthermore, the 3-D imaging information recording is generated and verified with respect to the presence of the marker element, whereupon it is saved and used for further analysis.
  • FIGS. 1A-1C show three preferred embodiments 1, 2, 3 of the marker element according to the present invention. Marker element 1 consists of a blue rectangle allowing for identification based on at least the color and shape. The color allows for identification of the color and the predetermined meaning of such color. Aspects of analysis of orientation with this embodiment will come from information in the 3-D image recording relating to e.g. the torso. Marker element 2 consists of a rectangle having two areas of color. Such a rectangle also provides an analysis as to up and down and left and right based on the information of the colors. Therefore, more aspects of analysis can be determined based on the marker element in itself. Marker element 3 consists of a rectangle having a general element of color and a shape defined therein, in this case comprising a vertical line or bar with a circle generally oriented at the middle of the marker element 3.
  • FIG. 2 is provided to show an overview of a torso with a marker element on the sternum and ECG electrodes oriented on the torso. The quality of the ECG recordings, and comparability of a range of recordings over time is dependent on a correct orientation or the same orientation in the several recordings over time.
  • The present invention has as an important advantage that it becomes possible to relate the position of the ECG electrodes to the position of the marker element and thus to a fixed position on the torso. Furthermore, the present invention enables relating the imaging information recording to the model of the torso. Furthermore, the present invention enables relating the position of the marker element to the model of the torso. Furthermore, the present invention enables relating the position of the ECG electrodes to the torso model, preferably wherein the marker element provides a basis for calculating such relation and allows for calculating such relation very speedily, such as fast enough for providing a result usable within the session, defined as real-time within the context of the present invention.
  • In an embodiment of the analysis, the 3-D imaging information recording is divided into areas. A main area of analysis is the marker area 11. The marker area 11 is an area defined around a detected marker. Other areas comprise areas 16, 17, 18, which are areas defined to compare parts of the 3-D imaging information recording with the torso model information for reaching a match between those. The electrodes 13 are regular ECG electrodes, preferably recognizable by shape or color for identification thereof, which electrodes are to be matched to the torso model by means of the imaging information and the presence of the marker therein.
  • FIG. 3 provides a general overview of the method according to an embodiment. Initially the method is started in step 20. In step 21, imaging information as obtained from the 3-D camera is interpreted to assess the presence of a marker. It is preferred that imaging information is recorded in case such presence of a marker is assessed, in order to record usable information. In step 22, it is determined whether a marker is present in the imaging information. In case there is no marker detected in the imaging information, the method returns to step 21. In case a marker is determined to be detected in step 22 in the imaging information, the imaging information is structured into coordinates, color information and/or depth. The result is a cloud of points in step 23.
  • In step 24, the results of step 23 are divided into areas of analysis, such as areas to be compared with areas of the 3-D torso model. In step 25 it is determined whether enough areas for further analysis are defined. In case it is determined that not enough areas for further analysis are defined, the method returns to step 21. In case it is determined that enough areas for comparison are defined, and as such a quality check of the 3-D imaging information recording provides a positive determination, the method continues in step 26.
  • In step 26, the preprocessed 3-D imaging information recording is matched to matchable information of the 3-D torso model as obtained. In step 27 it is determined whether a match was possible between the information from the 3-D imaging information recording and the 3-D torso model. In case it is determined that the match was not possible, the method returns to step 21 in order to reprocess with a new 3-D imaging recording. In case it is determined that the method provided a match of an acceptable quality, such as within certain predetermined limits, the electrodes are detected from the 3-D imaging information and matched with the torso model for adding information relating to the electrodes to the 3-D torso model.
  • FIG. 4 provides a further embodiment of the method according to the present invention. The method starts in step 100 as a configuration step, loading the resources needed, based on an instruction into the user interface for generating a 3-D image recording. The detection algorithm is initialized by providing characteristics of the marker element used in the respective session. Those characteristics, such as colors, e.g. blue, green or pink, geometry, such as rectangular, square or triangular, or its dimensions, such as width or height, are retrieved from the database. These characteristics are used for creating groups of points having at least one of the marker descriptions, such as the color.
  • In step 110, the information relating to the 3-D imaging information is received from the 3-D camera, comprising e.g. color and depth. The information is structured into coordinates having color information.
  • In step 120, the 3-D imaging information is prepared for analysis for detecting the marker: the received imaging information is analyzed for the presence of the marker. The marker has to be oriented in the marker area 11 that is displayed, preferably in yellow, on a display of the camera or a monitor of the computing device showing the imaging information. Pixels that are received inside the marker element area 11 are added to a list with the same criteria. A list is created for each criterion. This helps in finding the marker, as pixels with the same color, such as on the marker, will be on the same list.
  • In step 130, the marker is identified based on the created lists. Information as to the geometry of the marker is extracted from the information of the pixels in the lists. If this is not successful, a calibration is performed with respect to constraints such as circumstances, for example light in the room, that can influence the colors of the recording. If the marker is found, the method continues with step 140. An example is a marker that is a rectangle, half a centimeter wide, 5 cm high and having a blue color. The step of analyzing will identify the list having the pixels providing the rectangular shape, such as by taking 4 points of the selected list and calculating the angle created by each set of 3 points. If the angle is 90° and the distance and the color match, then the list comprises information relating to the marker.
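  • The rectangle test described for step 130 may be sketched as follows; the tolerances and the assumption that the four candidate corner points are ordered around the rectangle are illustrative, the color criterion being handled upstream by the per-criterion pixel lists.

```python
import numpy as np

def is_marker_rectangle(corners, width_cm=0.5, height_cm=5.0, tol_deg=5.0, tol_cm=0.3):
    """Check whether four candidate points form the expected marker rectangle.

    corners: (4, 3) points ordered around the rectangle. Each corner must form
    an angle of ~90 degrees with its two neighbours, and the side lengths must
    match the expected marker dimensions within a tolerance.
    """
    corners = np.asarray(corners, dtype=float)
    sides = [np.linalg.norm(corners[(i + 1) % 4] - corners[i]) for i in range(4)]
    expected = sorted([width_cm, height_cm] * 2)
    if not np.allclose(sorted(sides), expected, atol=tol_cm):
        return False                                   # side lengths do not match
    for i in range(4):
        v1 = corners[(i - 1) % 4] - corners[i]
        v2 = corners[(i + 1) % 4] - corners[i]
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if abs(angle - 90.0) > tol_deg:
            return False                               # corner is not a right angle
    return True
```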
  • In step 140, the order of calibration and a zone of calibration is processed. If the marker was not detected, the camera is directed such that the marker is in the marker element area 11. The said calibration is performed and step 130 is repeated. In step 150, it is determined that the 3-D imaging information contains the marker and the 3-D imaging information is recorded. One session according to the present invention may comprise a number of recordings over the duration of the ECG session.
  • Based on the created 3-D imaging recordings, a full 3-D imaging recording of the torso is created. Images are taken of the torso from several angles, all comprising the marker. For example, the camera is moved by starting capturing from the left part of the torso moving over the torso to the right part of the torso. During such movement of the camera the camera takes a temporary image recording every second after which the recordings are combined to a 3-D image recording of the full torso. All individual recordings are processed as described in the above in order to assess the presence of the marker.
  • In step 170, the 3-D image recording of the full torso is verified. In case the combination of the separate recordings leads to deformations due to inconsistent camera movement, the recordings have to be taken again by repeating the above steps.
  • In step 180, it is determined whether the resulting marker from the combination of separate 3-D imaging information recordings is valid. Such validity is obtained if the combined 3-D image recording comprises sufficient information, as for example determined by analyzing the 3-D imaging information recording starting from the marker position down to the bottom in steps of 3 cm and checking the percentage of holes present in the 3-D photo. Acceptability is for instance defined if the percentage is below 3%. In step 190, the method ends with outputting a validated 3-D photo.
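  • The hole-percentage check of step 180 may be sketched as follows; treating a 3 cm band as a hole when it contains fewer than an assumed minimum number of points is an illustrative criterion, the description only requiring the percentage of holes to stay below e.g. 3%.

```python
import numpy as np

def hole_percentage(points, z_marker, z_bottom, step_cm=3.0, min_points_per_band=50):
    """Walk from the marker position down to the bottom of the recording in
    3 cm bands and report the percentage of bands with too few points ("holes")."""
    bands = np.arange(z_marker, z_bottom, -step_cm)
    holes = 0
    for z in bands:
        band = points[(points[:, 2] <= z) & (points[:, 2] > z - step_cm)]
        if len(band) < min_points_per_band:
            holes += 1
    return 100.0 * holes / max(len(bands), 1)

# The combined recording would then be accepted when, for instance,
# hole_percentage(cloud, z_marker, cloud[:, 2].min()) < 3.0
```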
  • FIG. 4 describes an embodiment of pre-analysis in preparation for computation during analysis. This phase represents a pre-analysis of the 3-D imaging information recording for extracting information relating to the subject, which information is of assistance in the correction of errors in the 3-D imaging information and in the recognition of parts of the subject such as the shoulders, head and breast areas. The purpose of this embodiment is to analyze the usability of the available information for the extraction of the thorax information relating to the electrode positions. In step 200, the method is initialized by loading the 3-D imaging information and analyzing means, such as a tracker for shoulders. Here, the characteristics of the marker are related to a specific use. A blue marker is for instance used for analyzing the anatomy of the body, such that for instance if the marker is rectangular and its color is plain blue, the analysis of the body will be performed relating to the circumference of the torso, the width, or the circumference of the arm.
  • Step 210 comprises loading or detecting anatomical thorax elements/parts from the 3D-Photo and categorizing those elements/parts by relating them to their body positions. The detection at this level is a categorization per region relative to the marker position as depicted in FIG. 9.
  • A coordinate analyzer separates the points of the 3-D imaging information into points which are in a higher position than the position of the marker and then divides this upper part into a left and a right category. The other points from the 3-D imaging information, preferably those having an altitude lower than the marker's altitude, will be in the list (group) representing the belly.
  • The detection of the thorax is a combination of information given by specific analyzers. A shoulders analyzer gives the position of the shoulders; a circumference analyzer gives different circumferences related to an altitude (horizontal position). From those two pieces of information the upper offset of the thorax is obtained, and the left and right offsets are deduced from the ellipse equation of the biggest circumferences found.
  • Step 220 is defining the shape of the shoulders, for example using a cylindrical geometry approach, such as shown in FIG. 10. The shoulders analyzer gets lists of points and studies their relations toward each other. In other words, the analyzer searches for an area where the depth of the points decreases progressively such that those points lie at the same distance from a line, such as a central axis of the cylinder they form.
  • In step 230, the circumference of the torso is analyzed. For the detection of the circumference of the torso, the symmetry and the variation in the distance of the torso at the same altitude are analyzed. Curved lines are created with points at the same altitude at, for instance, a 2 cm interval. Ellipses are created in this way for determining the circumference of the torso, permitting the prediction of the dimensions of the subject and reducing the manipulation required.
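  • The per-altitude ellipse construction of step 230 may be sketched as follows; fitting an axis-aligned ellipse from the band extents and using the same Ramanujan circumference approximation as in the chest-dimension sketch above are illustrative choices of the sketch.

```python
import numpy as np

def torso_slices(points, interval_cm=2.0, min_points=10):
    """Fit an axis-aligned ellipse to each 2 cm altitude band of the torso cloud.

    Returns (z, center_x, center_y, circumference) per band; the band with the
    largest circumference and the ellipse centers can then be used by the
    analyzers to bound the thorax left and right.
    """
    slices = []
    for z in np.arange(points[:, 2].min(), points[:, 2].max(), interval_cm):
        band = points[(points[:, 2] >= z) & (points[:, 2] < z + interval_cm)]
        if len(band) < min_points:
            continue
        a = (band[:, 0].max() - band[:, 0].min()) / 2.0      # half width (left-right)
        b = (band[:, 1].max() - band[:, 1].min()) / 2.0      # half depth (front-back)
        cx, cy = band[:, 0].mean(), band[:, 1].mean()
        h = ((a - b) / (a + b)) ** 2
        circ = np.pi * (a + b) * (1 + 3 * h / (10 + np.sqrt(4 - 3 * h)))
        slices.append((z, cx, cy, circ))
    return slices
```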
  • In step 240, other characteristics of the torso are determined, such as the head, hair, or skin color. Also elements in the photo that are distracting for analyzing the torso, such as a blanket on the subject, are rejected as much as possible. Preferably, it is known what color of blanket is used, such that imaging points having the same color as the blanket can be removed from analysis, thereby saving analysis time otherwise spent on such noisy information.
  • In step 250, the information from the analyzers of steps 220, 230, and 240 is combined to improve the analysis. For example, by knowing the position of the shoulders and the points in the imaging information constituting the shoulders, the central axis of the cylinder that they form helps in detecting borders where the imaging information may end, for example because there are no electrodes beyond such areas.
  • FIG. 6 shows a method for matching the 3-D torso model to the 3-D imaging information recording. The method starts in step 500 with loading information and initializing the computing device for performing the calculations. The input of step 510 is the information relating to the 3-D imaging information recording and the 3-D torso model. The position of the marker element in the imaging information recording, as well as its equivalent position in the 3-D torso model, is taken as the basis to calculate the difference of distance and the angles created by the curve of the marker zone in the model and the imaging information.
  • Step 510 provides: take the marker position and its equivalent position in the model, then calculate the difference of distance and the angles created by the curve of the marker zone in the model and the 3D photo.
  • The coordinate of the marker in the 3D photo is initially expressed in camera space (it is the camera which gives the points of the 3D photo their coordinates), whereas the marker coordinate in the model is defined by the MRI device. As a consequence, except by coincidence, the positions of the marker in the 3D photo and in the model are different. The positions of the marker and the equivalent of the marker are separated by a non-zero distance, see FIG. 12.
  • The normal at the position of the marker in the 3D photo and in the model should be the same. That is why the angle between them should be null; as a consequence the transformation has to reduce that angle to 0, see FIG. 11.
  • Step 520 provides: the 3D-Photo is moved to the model until the distance of the marker and its equivalent position is reduced to 0.
  • The distance between the marker in the 3D photo and its equivalent position in the model is calculated, then the 3D photo is translated to the position of the marker. Indeed, the vector of translation is calculated and the coordinates of every point of the 3D photo are moved using the calculated vector.
  • Example
  • A → B (translation): A + vector = B, so vector = B − A.
  • See FIG. 2 and FIG. 13
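  • A minimal sketch of the translation of step 520, assuming the marker position is known both in the camera space of the 3D photo and in the model space:

```python
import numpy as np

def translate_photo_to_model(photo_points, marker_photo, marker_model):
    """Step 520 (sketch): translate every point of the 3D photo by the vector
    that maps the marker position in the photo onto its equivalent position in
    the model (vector = B - A)."""
    vector = np.asarray(marker_model) - np.asarray(marker_photo)   # B - A
    return photo_points + vector
```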
  • Step 530 provides: the 3D-Photo is rotated until their curves are parallel in the same direction, and it is controlled whether the distance between the marker and its equivalent position is null, see FIG. 14.
  • After translating the 3D photo to the model, the normal vectors (representations of the curve) of the model and of the 3D photo must have an angle equal to 0 degrees. If that is not the case, the points of the 3D photo are rotated until that angle becomes null. See FIG. 11.
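  • The rotation of step 530 may be sketched with Rodrigues' construction as below; rotating about the marker position and about the axis given by the cross product of the two normals is one possible construction, not the only one contemplated.

```python
import numpy as np

def align_normals(photo_points, pivot, n_photo, n_model):
    """Rotate the (already translated) 3D photo about the marker position until
    the photo's marker-zone normal is parallel to the model's normal.

    Uses Rodrigues' formula for the rotation that maps n_photo onto n_model.
    """
    a = np.asarray(n_photo, dtype=float); a /= np.linalg.norm(a)
    b = np.asarray(n_model, dtype=float); b /= np.linalg.norm(b)
    v = np.cross(a, b)
    s, c = np.linalg.norm(v), np.dot(a, b)
    if s < 1e-9:
        # normals already parallel; the exactly anti-parallel case is not
        # handled in this sketch
        return photo_points
    k = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    r = np.eye(3) + k + k @ k * ((1 - c) / s**2)
    return np.asarray(pivot) + (photo_points - np.asarray(pivot)) @ r.T
```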
  • Step 540 provides: the new update of the 3D-Photo is present, but that does not guarantee that the photo and the model are in the same orientation.
  • This phase permits the correction of the orientation of the 3D-Photo and the model by taking two equivalent random zones, calculating two vectors which go from the marker position to that area, and rotating the 3D-photo until those two points coincide or become equal, see FIG. 12 and FIG. 15.
  • Step 550 provides: estimation of the matching is the calculation of the difference of distance between the projection of the elements of the 3D-Photo and their equivalents from the model.
  • The following configuration is considered: matching the marker position and a second point which is 180 millimeters from the marker position. A list of second points (random points) which are at the same distance from the marker in the 3D photo is selected; the transformation is calculated every time a point from the list of second points is selected. Then the percentage of the difference of distance of every point from the 3D photo and its projection onto the model is calculated, and the best percentage is taken after comparing all the transformations there are. See FIG. 13, FIG. 16 and FIG. 17.
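  • A non-limiting sketch of the search of steps 550-560 is given below; compute_transform and match_score are hypothetical helper functions standing in for the translation/rotation steps and for the distance-percentage measure, and the convention that a higher score is better is an assumption of the sketch.

```python
import numpy as np

def best_match(photo_points, model_points, marker_photo, marker_model,
               candidate_second_points, compute_transform, match_score):
    """For every candidate second point at a fixed distance (e.g. 180 mm) from
    the marker, compute a candidate transformation, score how well the
    transformed photo points project onto the model, and keep the best one.

    compute_transform(marker_photo, marker_model, second_point) is assumed to
    return a callable applying the translation/rotation to a point cloud;
    match_score(transformed_points, model_points) is assumed to return the
    percentage of points whose projection onto the model is close.
    """
    best_transform, best_score = None, -np.inf
    for p2 in candidate_second_points:
        transform = compute_transform(marker_photo, marker_model, p2)
        transformed = transform(photo_points)
        score = match_score(transformed, model_points)
        if score > best_score:          # keep the improvement, as in step 560
            best_transform, best_score = transform, score
    return best_transform, best_score
```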
  • Step 560 provides: compare with the latest estimation whether the value increased or not and apply the best transformation. If the percentage increased, the calculation will be retried with another configuration until a maximum related to the model and the 3D-Photo is reached.
  • After step 550 the best percentage relative to the selected list of second points (points from the random zone) is obtained. If the current best percentage is better than the old one, the estimation has improved.
  • Every time there is an improvement the process is reiterated, until every percentage resulting from a selected configuration stays lower than the saved one.
  • By comparing every possibility it is ensured that the best result that can be provided with that 3D photo is obtained, see FIG. 15, FIG. 13 and FIG. 18.
  • Step 570 provides: the new 3D-photo is stored and can be used to extract the specific information and apply it to the model, such as the electrode positions for example.
  • FIG. 7 shows a method for analyzing areas on the imaging information recording and defining a coordinate system relative to the marker element.
  • Step 600 provides: load the 3D-Photo and load the resources needed for the process (load the specific analyzer).
  • Step 610 provides: the marker analyzer goes to the position of the marker, uses the information related to it, and creates from that information the “marker” to which all data is related. The points of the 3D photo in space (coordinates) are positioned relative to that marker, see FIG. 19.
  • The marker contains 3 axes in which every coordinate of a point from the 3D photo is expressed, see FIG. 20.
  • A point is taken from the 3D photo. After recording, that point is expressed with the coordinates that the camera gives to it on one side. On the other side, the coordinates of a point from the model are expressed with the coordinates given by the MRI device. The marker in the 3D photo and its equivalent in the model are the same. As a consequence, the coordinates can be expressed using the same reference.
  • Step 620 provides: at the start of analyzing, using the information generated from step 610, the zone of analysis is defined, see FIG. 20.
  • The zone of analysis is the result of the combination of the information that the analyzers provide. The line of the shoulders is the upper border of analysis. The radius of the biggest circumference plus the ellipse centers of all circumferences provide the left and right offsets from the axis formed by the ellipse centers.
  • Step 630 provides: the nearby area (or zone of the analysis) is inspected and is prepared to be categorized as the marker zone or the random zone.
  • A nearby area, or the marker zone, is the area related to the list of points which are close to the marker within a certain distance (3 cm, 5 cm, . . . ). All elements outside of the marker zone are in the random zone, or what is called the control zone, see FIG. 14.
  • The analyzer calculates the curve of the geometry where the marker is positioned. The curve is how the concavity of this zone of the 3D-photo behaves, see slide 6.
  • The objective is to find the relation of the points and how the global curvature of the area behaves (manifests) due to the repartition of the points. The result of the analysis of the curve is a normal vector that characterizes the area. For example, if the area is a plane, then the normal (perpendicular vector) of the plane is a representation of the curvature, which does not change along that plane. If the area is spherical, then the normal is perpendicular to the tangent plane of the sphere at that selected position.
  • The analyzer gets the non-marker zones (random areas), gets their characteristics and stores them for use in the matching process. The areas are characterized by their distances and their positions, see FIG. 19, see step 630.
  • At the end of the analysis, the obtained information is structured and stored for use in further processing.
  • The objective is to have 3D points with the same characteristic from the model and their equivalents in the 3D photo (e.g. a 3D point from the model which is 80 millimeters away from the marker), in order to compare every point with its equivalent and to control the quality of the matching, see FIG. 15.
  • Finally, the marker is combined with the information and corrections are made if errors are present.
  • The present invention is described in the foregoing on the basis of several preferred embodiments. Different aspects of different embodiments can be combined, wherein all combinations which can be made by a skilled person on the basis of this document must be included. These preferred embodiments are not limitative for the scope of protection of this document. The rights sought are defined in the appended claims.

Claims (20)

The invention claimed is:
1. A method performed by a computing device part of or coupled to an ECG device for estimating an orientation of the heart based on 3-D imaging information taken from a human torso, applying a marker element in a process of determining an orientation of a heart of a human body, the method comprising steps of:
receiving information relating to the human body from a measurement source, such as dimensional measurement information input into the computing device from an input interface or an optical 3D imaging device, the information relating to the human body comprising:
information of the exterior of the torso; and
determining chest dimensions, such as at the xyphoid position.
2. The method according to claim 1 further comprising the step of estimating the heart orientation based on the chest dimensions, such as chest depth, chest width, and/or circumference.
3. The method according to claim 1 wherein the steps of receiving information relating to the human body comprises the step of receiving imaging information taken from the torso.
4. The method according to claim 1 wherein the step of determining chest dimensions further comprises the step of receiving measurement information relating to physical dimensions of the chest from input means or a database.
5. The method according to claim 1, further comprising the step of determining heart dimensions of a reference model to the determined chest dimensions.
6. The method according to claim 1, further comprising the step of determining a position of the xyphoid based on at least one of the shoulder height and the length of the sternum.
7. The method according to claim 1, further comprising the steps of receiving imaging information of a marker element, the marker element being arranged in an area comprising an actual position on the body, from a marker element optical imaging device (OID), the ECG electrodes imaging device and the marker element imaging device being the same imaging device; and,
performing an image recognition on the imaging information for obtaining a presence determination, of the marker element in the imaging information.
8. The method according to claim 1 in which the chest depth and/or heart orientation is used in steps for estimating a torso model or a heart torso model.
9. The method according to claim 8 in which the torso model or the heart torso model is based on measuring points such as a cloud of points based on the received information, preferably 3D imaging information.
10. The method according to claim 8 in which the torso model or heart torso model is based on geometrical assemblies, such as triangles, created based on the cloud of points.
11. The method according to claim 1, further comprising the step of classifying partitions of the received information relating to the human body from a measurement source.
12. The method according to claim 1, further comprising the step of determining at least one angle for a long axis, mitral-tricuspid axis and left basal axis.
13. The method according to claim 1, further comprising the step of determining a rotation of the heart using at least one predetermined axis and angles, such that the cross section fills up about ⅓ of the thorax width.
14. The method according to claim 1, further comprising the step of estimating a heart orientation based on the estimated chest dimensions.
15. The method according to claim 1, further comprising the step of shifting the heart position such that the lower part of the heart coincides with the xiphoid position, the heart being at a predetermined, such as minimal, distance from the chest wall.
16. The method according to claim 1, further comprising the step of applying parameters when determining the position of the heart relating to the body, such as age, existing conditions, weight, fitness, chest width.
17. The method according to claim 1, wherein the method is performed using a heart-torso model of the human body.
18. The method according to claim 1, wherein the method is performed using a heart-torso model including data pertaining to the heart of the human body.
19. The method according to claim 3, wherein the imaging information is 3D imaging information.
20. The method according to claim 6, wherein the step of determining a position of the xiphoid comprises determining a virtual determination of the xiphoid point.
US17/571,783 2017-09-27 2022-01-10 Heart Position Estimation Pending US20220207771A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/571,783 US20220207771A1 (en) 2017-09-27 2022-01-10 Heart Position Estimation

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
NL2019635 2017-09-27
NL2019635 2017-09-27
PCT/NL2018/000020 WO2019093877A2 (en) 2017-09-27 2018-11-27 Heart position estimation
US202016651752A 2020-03-27 2020-03-27
US17/571,783 US20220207771A1 (en) 2017-09-27 2022-01-10 Heart Position Estimation

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US16/651,752 Continuation US20200250815A1 (en) 2017-09-27 2018-11-27 Heart Position Estimation
PCT/NL2018/000020 Continuation WO2019093877A2 (en) 2017-09-27 2018-11-27 Heart position estimation

Publications (1)

Publication Number Publication Date
US20220207771A1 true US20220207771A1 (en) 2022-06-30

Family

ID=66437926

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/651,804 Active 2039-08-13 US11532101B2 (en) 2017-09-27 2018-11-27 Marker element and application method with ECG
US16/651,752 Abandoned US20200250815A1 (en) 2017-09-27 2018-11-27 Heart Position Estimation
US17/571,783 Pending US20220207771A1 (en) 2017-09-27 2022-01-10 Heart Position Estimation

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US16/651,804 Active 2039-08-13 US11532101B2 (en) 2017-09-27 2018-11-27 Marker element and application method with ECG
US16/651,752 Abandoned US20200250815A1 (en) 2017-09-27 2018-11-27 Heart Position Estimation

Country Status (3)

Country Link
US (3) US11532101B2 (en)
EP (2) EP3782120A2 (en)
WO (2) WO2019093877A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111199674B (en) * 2020-01-21 2022-07-08 珠海赛纳三维科技有限公司 Heart model, and three-dimensional printing method and system of heart model
CN112914583B (en) * 2021-02-25 2022-10-21 中国人民解放军陆军特色医学中心 Method for determining arrangement position of electrocardiogram acquisition electrodes in non-contact manner
WO2024018009A1 (en) 2022-07-20 2024-01-25 Corify Care, S.L. Methods to determine the morphology and the location of a heart within a torso

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RS49856B (en) 2004-01-16 2008-08-07 Boško Bojović METHOD AND DEVICE FOR VISUAL THREE-DIMENSIONAL PRESENTATlON OF ECG DATA
WO2014037939A1 (en) 2012-09-05 2014-03-13 Body Pass Ltd. System and method for deriving accurate body size measures from a sequence of 2d images
WO2014107700A1 (en) * 2013-01-07 2014-07-10 Alivecor, Inc. Methods and systems for electrode placement
GB2510627B (en) 2013-02-11 2015-03-04 Siemens Medical Solutions Reorientation of cardiac PET images
US11172860B2 (en) 2014-05-06 2021-11-16 Peacs Investments B.V. Estimating distribution fluctuation and/or movement of electrical activity through a heart tissue
KR101560282B1 (en) * 2015-03-06 2015-10-14 주식회사 휴이노 Mobile device having functions of measuring bio-signals and realtime-monitoring blood pressure estimation based on measured bio-signals
US10565782B2 (en) 2015-08-29 2020-02-18 Intel Corporation Facilitating body measurements through loose clothing and/or other obscurities using three-dimensional scans and smart calculations
US9883835B2 (en) * 2015-10-16 2018-02-06 General Electric Company Method and system of directing positioning of ECG electrodes
US20170156615A1 (en) * 2015-12-07 2017-06-08 Reza Shirazi Disposable heart monitoring system, apparatus and method
US10441180B2 (en) 2016-08-10 2019-10-15 Huami Inc. Episodical and continuous ECG monitoring
US10736570B2 (en) * 2016-03-24 2020-08-11 CardiacSense Ltd. Methods circuits assemblies devices systems facets and associated machine executable code for detecting vital signs
US11246662B2 (en) * 2017-08-01 2022-02-15 Catheter Precision, Inc. Methods of cardiac mapping and model merging

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150320331A1 (en) * 2014-05-06 2015-11-12 Peacs B.V. Estimating distribution, fluctuation and/or movement of electrical activity through a heart tissue
US20170330075A1 (en) * 2016-05-12 2017-11-16 Siemens Healthcare Gmbh System and method for deep learning based cardiac electrophysiology model personalization

Also Published As

Publication number Publication date
WO2019093878A8 (en) 2019-08-01
WO2019093878A3 (en) 2020-02-06
US20200250815A1 (en) 2020-08-06
WO2019093877A2 (en) 2019-05-16
WO2019093877A8 (en) 2019-08-01
US20200242802A1 (en) 2020-07-30
EP3782120A2 (en) 2021-02-24
EP3709873A2 (en) 2020-09-23
US11532101B2 (en) 2022-12-20
WO2019093877A3 (en) 2019-12-05
WO2019093878A2 (en) 2019-05-16

Legal Events

Date Code Title Description
AS Assignment

Owner name: PEACS INVESTMENTS B.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VAN DAM, PETER MICHAEL;VAN DAM, EELCO MATTHIAS;ALIOUI, SAMIR;SIGNING DATES FROM 20201206 TO 20210222;REEL/FRAME:058594/0951

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED