WO2024008803A1 - System, method, computer program and computer-readable medium - Google Patents


Info

Publication number
WO2024008803A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
modality
feature
designed
measurement data
Application number
PCT/EP2023/068567
Other languages
German (de)
English (en)
Inventor
Martin Vossiek
Marc Stamminger
Ingrid Ullmann
Vanessa WIRTH
Johanna BRÄUNIG
Simon HEINRICH
Birte COPPERS
Sigrid LEYENDECKER
Anna-Maria LIPHARDT
Original Assignee
Friedrich-Alexander-Universität Erlangen-Nürnberg
Application filed by Friedrich-Alexander-Universität Erlangen-Nürnberg
Publication of WO2024008803A1

Classifications

    • G01S13/42: Simultaneous measurement of distance and other co-ordinates
    • G01S13/536: Discriminating between fixed and moving objects or between objects moving at different speeds, using transmission of continuous unmodulated waves, amplitude-, frequency-, or phase-modulated waves
    • G01S13/583: Velocity or trajectory determination systems using transmission of continuous unmodulated waves, amplitude-, frequency-, or phase-modulated waves and based upon the Doppler effect resulting from movement of targets
    • G01S13/584: As G01S13/583, adapted for simultaneous range and velocity measurements
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862: Combination of radar systems with sonar systems
    • G01S13/865: Combination of radar systems with lidar systems
    • G01S13/867: Combination of radar systems with cameras
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S7/415: Identification of targets based on measurements of movement associated with the target
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • the present invention relates to a system for determining a feature of a hand.
  • IMU inertial measurement units
  • EMG electromyography
  • marker-based optical methods, in which the position of reflecting or emitting markers is determined using infrared cameras, are known from [1, 2]. These systems are known for high accuracy in the sub-millimeter range [3].
  • OMS optical marker-based system
  • kinematic variables such as specific joint angles, movement speeds and accelerations from the marker trajectories.
  • model-based algorithms are used to calculate the joint angles. These vary depending on the marker set used. The following variants are currently used: (1) one marker per segment is used primarily in the clinical field and for recording static positions, as the movements of the hand are restricted the least; (2) with two markers per segment, the markers are attached to the distal and proximal ends of the segments; (3) with three, the markers are arranged in a triangular shape; or (4) in a cluster.
  • markerless methods in which the pose of the person or an object is determined using depth or color cameras, as is known from [1, 2, 6, 7, 8]. Many of these markerless methods are used to determine whole body poses, for example when walking, as is known from [1, 6, 7, 8].
  • RGB data as is known from [9] [10]
  • RGB-D data as is known from [11]
  • one or more (depth) cameras are used to provide the input data.
  • additional sensors are also used that can be worn on the body, as is known from [9].
  • the methods used to determine gestures differ mainly in the type of hand model used and the underlying optimization method in order to track its movement over time.
  • synthetic models come in many versions and can be found, for example, in [12], [13] and [9]. If a synthetic hand model does not already exist, it is usually created by fusing the input data across several recordings. Such processes are also referred to as “template-free” or “model-free”.
  • DynamicFusion as known from [14]
  • successor works which can successively generate models of any objects.
  • generative methods require an existing model and minimize the error between the current state of the model and the input data over time. Especially in traditional methods without deep learning, as known from [12], this error is usually minimized between two individual, successive input frames. Such methods are called frame-to-frame tracking, or model-to-frame tracking if the current state of a model is used for the error determination. In the deep learning area, discriminative methods are often used instead, which usually estimate the current pose of the hand from the most recent image alone, regardless of the previous input data. A collection of well-known deep learning works can be found in [10].
  • both categories benefit from greater information content in the input data.
  • (depth) cameras are not designed to measure speed directly. Instead, speed is obtained indirectly from the temporal tracking of spatial points that were captured with the (depth) camera at a previous point in time.
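To illustrate this indirect camera-based speed estimation, the following sketch differences tracked positions over the frame interval; the frame rate and fingertip positions are assumed values, not taken from the publication.

```python
import numpy as np

# A depth camera delivers positions per frame; speed is not measured
# directly but obtained by differencing tracked points over time.
dt = 1 / 30                                 # assumed 30 fps depth camera
positions = np.array([[0.00, 0.0, 0.4],     # hypothetical tracked fingertip
                      [0.01, 0.0, 0.4],     # positions in metres, one row
                      [0.02, 0.0, 0.4]])    # per frame
velocities = np.diff(positions, axis=0) / dt
# each row is approximately [0.3, 0, 0] m/s; in practice, position noise
# is amplified by the division by the small frame interval dt
```

This amplification of position noise is precisely the weakness a Doppler-capable radar modality avoids, since it measures speed directly.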
  • a three-dimensional radar-based image of the human body has been state of the art in security technology for several years, as is known, for example, from [15] or [16].
  • broadband multiple-input-multiple-output (MIMO) radar systems with a two-dimensional antenna arrangement are used.
  • Radar-based detection of the human hand is currently the subject of intensive research.
  • the focus is on recognizing hand gestures. Typical applications for this are human-machine interaction, medical radar for detecting medically relevant activities or automated detection of pedestrian movements for future autonomous driving.
  • the recognition of hand gestures is technically implemented using artificial intelligence, based on Doppler or micro-Doppler signatures, which are extracted from short-term Fourier spectrograms.
  • the movement is usually not recorded with spatial resolution, but only the (micro) Doppler signature is evaluated as a time signal, as known from [19], [20], [21].
  • publications that include spatial resolution use range-Doppler maps for the analysis, as known from [22], range-angle maps, or both, as known from [23], [24], but not a high-resolution, three-dimensional image of the hand.
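The extraction of a micro-Doppler signature from short-time Fourier spectrograms, as referenced above, can be sketched as follows; the carrier frequency, sample rate and motion profile are assumed values chosen for illustration.

```python
import numpy as np

# Simulated baseband return of a 24 GHz CW radar (assumed carrier) from a
# point on the hand that first approaches, then recedes, at 0.1 m/s.
fc = 24e9
wavelength = 3e8 / fc                      # 12.5 mm
fs = 2000                                  # baseband sample rate in Hz
t = np.arange(0, 1.0, 1 / fs)
velocity = np.where(t < 0.5, 0.1, -0.1)    # m/s, sign flip at t = 0.5 s
displacement = np.cumsum(velocity) / fs
signal = np.exp(1j * 4 * np.pi * displacement / wavelength)

# Short-time Fourier transform: window a segment, FFT it, hop forward.
win_len, hop = 256, 64
window = np.hanning(win_len)
n_frames = (len(signal) - win_len) // hop + 1
spectrogram = np.empty((win_len, n_frames))
for i in range(n_frames):
    seg = signal[i * hop : i * hop + win_len] * window
    spectrogram[:, i] = np.abs(np.fft.fftshift(np.fft.fft(seg)))

# The micro-Doppler signature is the time course of the Doppler frequency;
# here the per-frame peak jumps from +2v/lambda = +16 Hz to -16 Hz.
doppler_bins = np.fft.fftshift(np.fft.fftfreq(win_len, 1 / fs))
peak_doppler = doppler_bins[np.argmax(spectrogram, axis=0)]
```

Note that this signature is a pure time-frequency signal: it carries no information about *where* on the hand the motion occurred, which is the gap the spatially resolved imaging described in this document addresses.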
  • EP 1 178 330 A1 discloses a radar system for motion detection. However, there is no fusion with another sensor, and radar imaging is not explicitly mentioned. Nor is any model-based data processing or virtualization of the hand provided.
  • US 10,928,921 B2 discloses motion identification including processing for gesture recognition, but no virtualization of the hand.
  • US 2015/0277569 A1 discloses the use of radar and camera for measuring movement characteristics of the hand with the aim of gesture recognition. However, explicit model-based virtualization is not disclosed.
  • WO 2017/131545 A1 discloses a radar system including functionality for measuring the hand. However, it is explicitly limited to radar. Furthermore, no explicit imaging is disclosed, only a two-dimensional angle estimate. No virtualization of the hand is disclosed either.
  • the present invention is based on the object of providing a system for optimized detection of a feature of a hand.
  • the system comprises a primary modality, an assistive modality and a controller
  • the primary modality being a radar system with at least one transmitting antenna and at least one receiving antenna
  • the assistive modality comprises a three-dimensional imaging sensor
  • the primary and the assistive modality are each designed to detect the feature, to generate measurement data with respect to the feature, to communicate with the controller designed for this purpose, to be controlled by it, and to send the measurement data to it
  • the controller having means which are designed to control the primary modality in such a way that the measurement data generated by the assistive modality are taken into account, and the controller having means which are designed to determine the feature of the hand from all or part of the measurement data generated.
  • the radar system is a narrow-band MIMO radar system and/or that the three-dimensional imaging sensor is an optical measuring system, in particular a camera, depth camera, a stereo camera or a laser scanner.
  • the feature of the hand relates to the shell of the hand and/or that the feature of the hand is a position, speed, acceleration and/or a movement sequence of at least one area of the hand or the shell of the hand.
  • the feature may also be phase information of a wave reflected from an area of the hand or the shell of the hand.
  • Control is preferably understood to mean any means, such as a circuit or a computer, designed to perform the functions assigned to the control or a means of control within this invention. Control is therefore preferably to be interpreted broadly and also includes, for example, regulation.
  • control includes means for inputting, displaying and/or outputting, in particular the determined feature of the object.
  • the primary modality further comprises a wave-based sensor modality, in particular an imaging MIMO radar, which is preferably designed to record additional dimensions.
  • control has means which are designed to carry out a virtualization of the hand, the virtualization preferably being carried out by a model based on all or part of the totality of the measurement data generated.
  • the system further comprises an object which is at least partially transparent or approximately transparent for at least a part of a modality and/or is non-transparent for at least another part of the modality or a part of another modality.
  • control has means that are designed to recognize gestures of the hand, in particular based on a synthetic model which consists of or comprises a kinematic structure and/or the surface of the hand, and/or to model hand-object interactions.
  • control has means which are designed to store the measurement data.
  • control has means which are designed to compare several determined features and/or generated measurement data and thereby determine a comparison value from which a further feature, in particular a position, a speed and/or an acceleration, is derived.
  • the primary modality comprises a radar system with at least one transmitting antenna and at least one receiving antenna, with means preferably being present which are designed so that waves with fewer than 10, preferably with exactly one, two or three, different frequency support points are emitted by a transmitting antenna, and/or the assistive modality comprises a, preferably three-dimensional, imaging sensor.
  • Frequency support points can also be referred to as frequencies and preferably describe the temporal succession of only one or more transmission frequencies.
  • the antennas of the radar system form a MIMO aperture, with means being present which are designed to generate a laterally focused image of the object through aperture synthesis.
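The lateral focusing by aperture synthesis mentioned above can be sketched as a delay-and-sum (back-projection) over a virtual MIMO aperture; the aperture geometry, single frequency support point and scatterer position below are assumed values. In the system described, the depth plane at which the image is focused would come from the assistive modality.

```python
import numpy as np

c = 3e8
f = 24e9                                   # single frequency support point
k = 2 * np.pi * f / c                      # wavenumber

# Hypothetical 8 x 8 virtual MIMO aperture in the z = 0 plane, 5 mm pitch.
xs = (np.arange(8) - 3.5) * 5e-3
elem = np.array([[x, y, 0.0] for x in xs for y in xs])

# One point scatterer (a point of the hand shell) 20 cm from the aperture.
target = np.array([0.01, -0.005, 0.2])
d = np.linalg.norm(elem - target, axis=1)
signals = np.exp(-1j * 2 * k * d)          # monostatic round-trip phase

# Back-projection: coherently sum the element signals over a lateral grid
# at the depth that the assistive modality would supply (here: true depth).
grid = np.linspace(-0.03, 0.03, 61)
image = np.empty((61, 61))
for ix, px in enumerate(grid):
    for iy, py in enumerate(grid):
        dg = np.linalg.norm(elem - np.array([px, py, 0.2]), axis=1)
        image[ix, iy] = np.abs(np.sum(signals * np.exp(1j * 2 * k * dg)))

# The image maximum lands at the scatterer's lateral position.
peak = np.unravel_index(np.argmax(image), image.shape)
# grid[peak[0]] ≈ 0.01, grid[peak[1]] ≈ -0.005
```

With only one frequency there is no range resolution from bandwidth, which is why the rough distance prior from the assistive sensor is essential in this arrangement.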
  • control has means which are designed to optimize the determination of the feature of the object using features and/or measurement data.
  • control has means which are designed to use a statistical filter, preferably a Kalman filter, a movement model and/or a method known from information technology, and/or the means are designed so that a vector speed measurement takes place.
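A minimal sketch of the statistical filtering mentioned above, assuming a one-dimensional constant-velocity motion model; the frame interval and noise levels are illustrative values, not parameters from the publication.

```python
import numpy as np

dt = 0.01                                  # assumed frame interval in s
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = np.diag([1e-8, 1e-6])                  # process noise (illustrative)
R = np.array([[1e-6]])                     # measurement noise (illustrative)

x = np.array([[0.0], [0.0]])               # state: [position; velocity]
P = np.eye(2)                              # initial uncertainty

# Synthetic measurements: a point on the hand moving at 0.5 m/s,
# observed with additive position noise.
rng = np.random.default_rng(0)
zs = 0.5 * np.arange(100) * dt + rng.normal(0.0, 1e-3, 100)

for z in zs:
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend in the noisy position measurement.
    y = z - (H @ x)[0, 0]                  # innovation
    S = (H @ P @ H.T + R)[0, 0]            # innovation variance
    K = P @ H.T / S                        # Kalman gain, shape (2, 1)
    x = x + K * y
    P = (np.eye(2) - K @ H) @ P

# x[1, 0] now estimates the speed (about 0.5 m/s) even though only
# positions were measured.
```

In the multimodal system described here, a radar Doppler reading could enter the same filter as a direct velocity measurement with its own measurement matrix row, rather than being inferred from positions.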
  • the system is part of a sports, training or fitness measurement or fitness information system, or the system is used for a medical, psychological, diagnostic or therapeutic purpose and/or to generate digital human avatars.
  • the invention also relates to a use of a system according to the invention in or with a virtual augmented reality.
  • the invention also relates to a method for determining a feature of a hand using a system according to the invention with the following steps: a) detecting the feature by the assistive modality and generating measurement data with respect to the feature; b) detecting the feature by the primary modality, the primary modality being controlled by the controller in such a way that the measurement data generated in step a) is taken into account and generating measurement data with reference to the feature; c) Determining the feature from the measurement data.
  • the feature of the hand relates to the shell of the hand and/or that the feature of the hand is a position, speed, acceleration and/or a movement sequence of at least one area of the hand or the shell of the hand.
  • the method is carried out again, with the determined feature and/or the measurement data generated from the first implementation being compared with the determined feature and/or with the measurement data generated from the second implementation, and thereby determining a comparison value, from which a further and/or improved feature, in particular a position, a speed and/or an acceleration, is derived.
  • waves with fewer than 10, preferably with exactly one, two or three different frequency support points are emitted by a transmitting antenna through the primary modality.
  • the method includes a further step, in particular an aperture synthesis, whereby a laterally focused image of the object is generated.
  • the method is carried out again and a feature and/or measurement data from a previous implementation is used to optimize the implementation. It is preferably provided that a statistical filter, preferably a Kalman filter, a motion model and/or a method known from information technology is used in the method and/or that a vector speed measurement is carried out.
  • the method further comprises the step: d) virtualization of the hand, wherein the virtualization is preferably carried out by a model based on all or part of the entirety of the measurement data generated.
  • the method further comprises the step or steps: e) recognition of gestures of the hand, in particular based on a synthetic model which consists of or comprises a kinematic structure and/or the surface of the hand, and/or f) modeling of hand-object interactions.
  • the method further includes the step: g) recording repeated standardized movement sequences of the hand under controlled measurement conditions at different times, in particular to check functional changes in the hand's condition.
  • the method is used in a sports, training, or fitness measurement or fitness information system or for a medical, psychological, diagnostic or therapeutic purpose and/or to generate digital human avatars.
  • the invention also relates to a computer program which comprises instructions which cause the system according to the invention to carry out the method steps of a method according to the invention.
  • the invention also relates to a computer-readable medium on which the computer program according to the invention is stored.
  • a multimodal arrangement and an associated method for recording and assessing hand movements and hand poses with the aim of virtualizing the hand are preferably provided.
  • precise detection of hand movements while simultaneously interacting with objects should be achieved through the use of different modalities.
  • the system can preferably consist of at least one arrangement comprising two modalities and a hand which is placed in front of, above and/or below the modalities.
  • the primary modality is preferably a narrow-band MIMO radar.
  • an arrangement for detecting the body shell of a living being comprising a primary sensor modality which includes a radar system with at least one transmitting antenna and at least one receiving antenna, the arrangement comprising a further assistive three-dimensional imaging sensor modality; in a first step, the assistive sensor system determines an at least rough location and distance area in which at least one point of the shell to be detected is located, and, in a second step, an image of this at least one point of the shell is reconstructed by the primary sensor modality using at least one radar signal frequency.
  • the image of this at least one reconstructed point is compared, with respect to its phase, with its image from a previous measurement; at least one phase difference value is determined therefrom, and a movement feature is derived from the phase difference value.
  • at least one further radar signal frequency is used to image this at least one point of the shell; the phases of the images that were obtained with the first and with at least one further radar signal frequency are compared, and a distance value of at least one point of the body shell is derived from at least one phase difference value obtained in this way.
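The two phase-based evaluations above can be illustrated numerically: (1) the phase difference of one point between two measurements at a single frequency yields a sub-wavelength displacement, and (2) the phase difference between two radar signal frequencies yields an absolute distance with an enlarged unambiguous range. All frequencies and distances below are assumed values.

```python
import numpy as np

c = 3e8

# (1) Motion from the phase change between two measurements. For a
# monostatic radar the round-trip phase is phi = 4*pi*d / lambda, so a
# phase difference directly gives sub-wavelength displacement.
f1 = 24.0e9                        # first radar frequency (assumed)
lam1 = c / f1
d_true = 0.3                       # true distance of the point in m
d_moved = d_true + 200e-6          # the point moved by 200 micrometres
phi_a = 4 * np.pi * d_true / lam1
phi_b = 4 * np.pi * d_moved / lam1
dphi = np.angle(np.exp(1j * (phi_b - phi_a)))   # wrap to (-pi, pi]
displacement = dphi * lam1 / (4 * np.pi)
# displacement ≈ 200e-6 m

# (2) Absolute range from the phase difference at two nearby frequencies.
# The two-frequency phase difference behaves like a measurement at the
# synthetic wavelength c / (f2 - f1), extending the unambiguous range.
f2 = 24.1e9                        # second frequency support point (assumed)
phi1 = np.angle(np.exp(1j * 4 * np.pi * d_true * f1 / c))
phi2 = np.angle(np.exp(1j * 4 * np.pi * d_true * f2 / c))
dphi12 = np.mod(phi2 - phi1, 2 * np.pi)
range_est = dphi12 * c / (4 * np.pi * (f2 - f1))
# unambiguous up to c / (2 * (f2 - f1)) = 1.5 m; range_est ≈ 0.3 m
```

This is why very few frequency support points suffice in the described arrangement: the assistive sensor resolves the coarse ambiguity, and the radar phase supplies the fine distance and motion information.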
  • An iterative process can be provided in which the location and speed information of at least one previous measurement is used to improve the representation or image of the shell and to improve the determination of its movement.
  • a sensor fusion of a depth camera and radar can also take place.
  • An advantageous embodiment is the multimodal arrangement of a narrow-band MIMO radar and an optical depth camera.
  • the combination of a primary modality with an assistive modality for determining the shell of a hand and its movement is provided in order to generate a virtual image of the hand in order to enable virtualization of the hand.
  • the individual orientations of the two modalities are preferably related to one another. This can be achieved, for example, by a structure in which both modalities are integrated into one another and thus a priori there is a common orientation.
  • Both modalities can preferably also be placed offset and the relative orientation to one another can then be calculated.
  • An advantageous embodiment is a combination of an imaging, narrow-band MIMO radar with a depth camera.
  • the two modalities are preferably placed in such a way that their generated data overlaps as much as possible in time and space, so that they can be merged and combined with one another as precisely as possible.
  • a further sensor modality that complements the above multimodal sensor system can be used.
  • This further sensor modality can be a wave-based sensor system, for example based on light, radar or ultrasound.
  • the supplementary modality is preferably designed with at least one transmitter and at least one receiver and enables the hand to be detected in the sense of wave-based imaging and/or speed information.
  • the multimodal measuring system is placed above the hand and the supplementary sensor system is placed under the hand.
  • the sensor modality placed under the hand can be placed under a table on which the hand is placed.
  • the table preferably has a point that is transparent to the sensor modality placed below.
  • Transparent is preferably understood to mean that the material in question does not cause any reflection in the sensor modality in question.
  • Optically transparent materials are known to be, for example, air or glass.
  • materials that are transparent to a radar system are those that have a relative permittivity of one or, in a broader sense, of almost one. The former is only the case in a vacuum, so in practice the second case is the relevant one.
  • a material that has a permittivity value of almost one is, for example, Styrodur.
  • the use of a table in this context provides two advantages. On the one hand, the table can be used as a storage area for objects that are to be interacted with during hand movements. On the other hand, the table serves as a suitable reference point to make the same gesture comparable over several measurement cycles.
  • the location and/or speed of individual points can be determined even more precisely in three dimensions on the surface of the hand.
  • for speed, this corresponds to a measurement of the vector speed.
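A sketch of how radial speed measurements from two spatially separated radar modalities combine into a vector speed; the sensor layout, point position and velocity are assumed values for illustration.

```python
import numpy as np

# Two radar modalities at different positions each measure only the
# radial speed of the same point on the hand. With known line-of-sight
# unit vectors, the full (here 2D) velocity vector can be recovered.
point = np.array([0.0, 0.3])                             # point on hand, m
sensors = [np.array([-0.2, 0.0]), np.array([0.2, 0.0])]  # assumed layout
v_true = np.array([0.10, -0.05])                         # true velocity, m/s

los = np.array([(point - s) / np.linalg.norm(point - s) for s in sensors])
radial = los @ v_true              # the radial speeds each radar observes

# Least-squares inversion of the line-of-sight geometry.
v_est, *_ = np.linalg.lstsq(los, radial, rcond=None)
# v_est ≈ [0.10, -0.05]
```

With a third, non-coplanar viewpoint the same inversion yields the full three-dimensional velocity vector, which motivates the supplementary sensor modality placed under the hand.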
  • the additional sensor modality is preferably implemented as a further MIMO radar system, that is, as a radar system with at least one transmitter and at least one receiver.
  • a millimeter wave radar is preferably used.
  • the arrangement is preferably an arrangement with two supplementary MIMO radar systems.
  • combinations of the aforementioned sensor modalities can also be used.
  • the system and measurement modality can be expanded to include additional objects with which a subject interacts. These can be balls, for example.
  • the interaction with the objects serves to determine normal and impaired movement patterns.
  • the objects can consist of one or more materials that are transparent to one or more sensor modalities. As above, transparent means that the material in question does not cause any reflection in the sensor modality in question.
  • an arrangement which comprises a primary sensor modality, which includes a radar system with at least one transmitting antenna and at least one receiving antenna, and comprises a further, assistive three-dimensional imaging sensor modality.
  • the arrangement can be expanded to include one or more further wave-based sensor modalities to record additional dimensions.
  • the arrangement can be intended for the virtualization of a hand and can be expanded to include one or more additional imaging MIMO radars to capture additional dimensions.
  • Virtualization of a hand approximated by a model based on data captured by the array may be provided.
  • the recording of movement sequences of the hand can be provided.
  • the arrangement can be used with the additional use of objects by a human, the objects consisting of one or more materials that are transparent to at least one of the waveforms used.
  • the recording of repeated standardized movement sequences under controlled measurement conditions at different times can be provided, in particular to check functional changes in status.
  • the movement sequences can be compared with previous measurements in both qualitative and quantitative form and can provide conclusions about a gradual improvement or deterioration in hand function, which can be applied for a medical diagnostic or therapeutic purpose as well as in performance diagnostics in sports, music and the working world with a focus on hand function.
  • the digital hand can be used for gesture recognition.
  • a very simple synthetic model can be used for this purpose, which only consists of a kinematic structure.
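A kinematic-structure-only model of the kind just mentioned can be sketched as a chain of joint angles. The following planar single-finger example (function and variable names are illustrative, not from the application) computes joint positions by forward kinematics:

```python
import numpy as np

def finger_joint_positions(segment_lengths, flexion_angles, base=(0.0, 0.0)):
    """Forward kinematics of one planar finger: each joint rotates the chain
    by its flexion angle relative to the previous segment.  A purely
    kinematic model like this, with no surface, can already suffice for
    simple gesture classification from joint-angle trajectories."""
    points = [np.asarray(base, dtype=float)]
    orientation = 0.0
    for length, angle in zip(segment_lengths, flexion_angles):
        orientation += angle
        step = length * np.array([np.cos(orientation), np.sin(orientation)])
        points.append(points[-1] + step)
    return np.array(points)
```

A full hand model would repeat such a chain per finger and attach the chains to a common wrist frame.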
  • the digital hand can also be used in virtual and augmented reality.
  • a synthetic model is preferred which, in addition to the kinematic structure, also includes the surface and can, for example, model hand-object interactions.
  • Objects that consist of one or more materials that are transparent to at least one of the waveforms used are particularly suitable.
  • An object made of a material that causes a reflection for one of the modalities used and not for another is particularly suitable here.
  • the present invention solves many of the limitations present in the prior art.
  • the invention presents an arrangement and advantageous methods for operating the arrangement for the high-precision, non-contact detection of the shell of a hand and of the velocity vector of one, several or every point on this shell.
  • a primary radar-based modality is combined with an assistive 3D imaging modality and the strengths of the sensors are combined.
  • the shell and its movements are recorded with comparatively little effort and with high spatial resolution and excellent speed accuracy in all three spatial directions and at a high measurement rate.
  • shell or body shell is preferably understood to mean the interface or the sum of many coupled interfaces that create the boundary between inside and outside in a living being's body or a hand.
  • the outside is usually air or the atmosphere surrounding the living being, and the inside is preferably the body or the material structures of the body.
  • the covering or the body covering can preferably be defined by the surface of the skin, although this is only one possible definition: depending on the application, it may make sense to count clothing, hair, dirt or other adhesions on the body, or other objects firmly attached to the body, as part of the shell, or to exclude them from it.
  • detection of the shell is preferably understood to mean a metrological process in which the geometric shape of the shell is determined.
  • the detection process is preferably an imaging process in which an image of the shell that is as geometrically correct as possible is generated.
  • the spatial positions of several points of the shell are preferably determined, and for at least some points, but preferably all points of this group, further features are also determined, such as certain reflectivity properties, for example the strength of the reflection, the frequency dependence of the reflection (called color in the optical range) or the polarimetric reflection behavior.
  • the detection of the envelope preferably also includes determining the positions of the envelope points relative to a reference coordinate system.
  • a motion feature can be a scalar velocity, a vector velocity, a one-dimensional or multi-dimensional acceleration quantity or even a complex multi-dimensional equation of motion of a higher order.
  • a model can be created directly from the generated raw data, or preferably an already synthetic model, which is initially defined generically, is personalized afterwards.
  • a model is preferably understood to be an approximate representation of the hand and/or its characteristics. Possible characteristics include, for example, anatomical structure, movement properties, material properties of the skin surface or color properties.
  • a model can be constructed manually by merging the input data from the specified sensor system, or it can already be available in digital form.
  • An approximate representation of the model can be created, for example, by estimating the interfaces of the hand.
  • models without a surface representation can also be used. These can, for example, be skeleton-like models of the hand that only contain a bone-like structure to record hand movements.
  • a gesture preferably describes a temporal sequence of hand poses that result in a movement of the hand.
  • Input data is preferably data that is used as input for an algorithmic method.
  • the input data often corresponds to the data generated by a sensor system used. This can be, for example, RGB images from a camera, RGB-D images from a depth camera and/or location and/or speed information from a radar.
  • pre-processed data can also serve as input data. This could, for example, be predetermined movement information from a marker-based system.
  • preprocessing works on data generated directly by one or more sensors or modalities.
  • Preprocessing creates a preliminary, application-related dataset.
  • preprocessing creates a preliminary geometric shape of the shell.
  • preprocessing creates preliminary movement characteristics.
  • post-processing is to be classified after pre-processing and adds further information to the preliminary data set. This can be, for example, a correction or a completion of the previous data.
  • Post-processing in relation to the capture of the shell can, for example, recalculate or adjust the previously determined geometric profile so that it approximates the real shell even better.
  • post-processing can also serve to complete the shell, which could only be partially determined in pre-processing, since, for example, the back of an object could not be determined due to shadowing from the sensors used.
  • post-processing to determine the movement of the shell can also be used to correct or refine the preliminary movement features. In particular, it can expand the preliminary movement features assigned to a set of points by a further set of points with movement features if the post-processing for determining the movement of the shell is combined, for example, with post-processing for detecting the shell.
  • the cover and its movement should preferably be detected by at least two different wave-based sensor modalities and their sensor data fusion.
  • a wave-based sensor modality is preferably understood to mean a measurement arrangement which uses a waveform to capture measurement information, e.g. to capture an image of a body.
  • Electromagnetic waves in the microwave or optical range or sound/ultrasound waves can be used as waveforms.
  • the modality is preferably defined here in such a way that a modality is defined both by the waveform, e.g. radar or optical or ultrasound as well as by the measuring principle.
  • the entirety of reconstruction methods for an optical measurement system, preferably referred to as an optical reconstruction method or optical reconstruction, is preferably explicitly distinguished from the entirety of reconstruction methods for a radar-based measurement system, preferably referred to as a radar-based reconstruction method or radar-based reconstruction.
  • Fig. 1 an embodiment of a system according to the invention.
  • Fig. 2 another embodiment of a system according to the invention.
  • Fig. 3 two matrices with phase information or image phases per pixel.
  • Fig. 4 a flowchart of a preferred method.
  • FIG. 1 shows the combination of a primary modality 20 with an assisting modality 30 for determining the cover of a hand 10 and its movement.
  • a general structure of the measurement system with primary and assistive modality for recording hand movement and, if necessary, object interaction can be seen.
  • the thumb and the little finger of the hand each move in FIG. 1 with a velocity vector V with the reference number 11, which can be broken down, for example, into the Cartesian components V1 and V2.
  • the velocity vector V can also be a three-dimensional vector that can be broken down into three Cartesian components.
  • the primary modality 20 and the assistive modality 30 are connected to a computer 40, on which algorithms for determining the image and movement of the hand 10 are executed; the computer 40 can be connected to a monitor 41.
  • the individual orientations of the two modalities 20 and 30 are related to one another.
  • the MIMO radars preferably have receiving antennas 21 and transmitting antennas 22.
  • a narrow-band MIMO radar 20 and the depth camera 30 are arranged above the hand 10 to be stretched and another MIMO radar 20 is arranged under a table top of a table 50 that is transparent to the MIMO radar 20.
  • individual points of the hand move in FIG. 2 with a velocity vector V with the reference number 11, which can be broken down into the Cartesian components V1 and V2.
  • the depth camera or the assistive modality 30 is connected to a computer 40 through a connection 35, via which distance features of the hand 10 recorded by the depth camera or the assistive modality 30 are transmitted.
  • the MIMO radars are connected to the computer 40 through connections 25 through which speed and distance characteristics of the hand 10 detected by the MIMO radars are communicated.
  • the computer 40 can be connected to a monitor 41 on which the results of the algorithms executed on the computer 40 for determining the image and movement of the hand 10 can be displayed.
  • the primary modality initially provides the radial velocity by evaluating time-varying phase information.
  • suitable signal forms here include continuous wave (CW) signals. These can consist of either just one or several frequency support points. Using only one CW frequency, velocities as well as relative changes in distance can be measured. Due to the periodicity of the phase, with each period running from 0 to 360°, absolute distances can only be determined within a comparatively small unambiguous range, provided the relevant period, i.e. the distance interval that this period represents, is known.
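The relation between CW phase and motion described above can be illustrated as follows (a minimal sketch under the usual two-way-path assumption; all names are illustrative, not from the application):

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in m/s

def radial_velocity_from_cw_phase(phase_samples, times, carrier_freq):
    """Radial velocity from the phase of a CW radar baseband signal.
    A two-way range change of dr shifts the phase by -4*pi*f*dr/c, so the
    slope of the unwrapped phase gives v_r = -c/(4*pi*f) * dphi/dt.
    Because the raw phase is only known modulo 360 degrees, the absolute
    distance remains ambiguous (unambiguous only within half a wavelength),
    but relative distance changes and velocities follow directly."""
    phase = np.unwrap(np.asarray(phase_samples, dtype=float))
    slope = np.gradient(phase, times)
    return -C0 / (4.0 * np.pi * carrier_freq) * slope
```

The unwrap step is what turns the periodic raw phase into a usable relative distance track, matching the limitation on absolute distances noted above.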
  • an absolute distance measurement with a larger unambiguous range is also possible.
  • an image of the recorded shell of the hand is measured.
  • the locally distributed speed of this shell can be determined using a suitable preprocessing method. In the simplest case, this can be a comparison of the consecutive radar images. Either the evaluation of the complex phase of consecutive images or the evaluation of the continuous change in the reception phases is preferred.
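The comparison of consecutive radar images mentioned above can, for example, look like the following sketch (an illustration only; it assumes the common exp(-j·4π·f·d/c) two-way phase convention for the complex image values):

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in m/s

def pixelwise_radial_velocity(img_prev, img_curr, carrier_freq, dt):
    """Radial velocity per image point from the phase change between two
    consecutive complex radar images.  Valid as long as the per-frame phase
    step stays below 180 degrees, i.e. |v| < c / (4 * f * dt)."""
    dphi = np.angle(img_curr * np.conj(img_prev))  # wrapped phase change
    return -C0 * dphi / (4.0 * np.pi * carrier_freq * dt)
```

This is the "evaluation of the complex phase of consecutive images" in its simplest form; evaluating the continuous change of the reception phases instead avoids the per-frame ambiguity limit.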
  • the assistive modality initially provides spatially resolved distance data of the recorded hand cover.
  • the reconstruction method with which this distance data is generated is preferably irrelevant.
  • the assistive modality is used to roughly localize the captured hand in space, thereby defining the area in which the shell of interest is located. Based on this, a radar-based reconstruction is carried out in this area using a known method. By limiting the radar-based reconstruction to selected points in the relevant area, the radar-based reconstruction is significantly accelerated. In addition, the localization of the hand that has already taken place allows the use of narrow-band radar signal forms that have no or only a very low distance resolution. These can be generated much faster than the broadband signal forms previously used in radar imaging and are therefore advantageous in time-critical applications. If the primary modality uses at least two frequency support points, spatially resolved images of the shell can also be generated in addition to the existing data from the assistive modality.
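How the assistive localization can restrict a narrow-band radar reconstruction to a small set of candidate points can be sketched as follows (a minimal single-frequency backprojection; function and variable names are illustrative, not from the application):

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in m/s

def backproject_cw(samples, tx_positions, rx_positions, freq, candidate_points):
    """Single-frequency backprojection evaluated only at candidate points
    pre-selected from the assistive depth data.  `samples[i]` is the complex
    CW baseband value of the i-th transmit/receive combination; restricting
    the evaluation to the localized region around the hand is what makes the
    narrow-band reconstruction fast."""
    image = np.zeros(len(candidate_points), dtype=complex)
    for s, x_tx, x_rx in zip(samples, tx_positions, rx_positions):
        # expected two-way path from TX via each candidate point to RX
        path = (np.linalg.norm(candidate_points - x_tx, axis=1)
                + np.linalg.norm(candidate_points - x_rx, axis=1))
        # correlate the measured sample with the expected propagation phase
        image += s * np.exp(1j * 2.0 * np.pi * freq * path / C0)
    return image
```

Points where the per-antenna phases add up coherently mark the shell; everywhere else the contributions largely cancel.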
  • two two-dimensional images of the shell are reconstructed at the distance or z coordinate d_s estimated by the assistive sensor system or modality, using two closely adjacent frequencies f1, f2, according to known methods.
  • the starting point for the reconstruction is provided by the two CW baseband signals s_b,f1 and s_b,f2, which were recorded at the two frequency support points f1 and f2 with a transmitting antenna at location x_tx and a receiving antenna at location x_rx and were caused by a point scatterer at location x_s.
  • the following formulas show one way in which the CW baseband signals can be determined.
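The formulas themselves are not reproduced in this text; a plausible form, consistent with the standard point-scatterer model and the quantities defined above, is:

```latex
s_{b,f_i}(x_{tx}, x_{rx}) = a_i \, \exp\!\left(-j \, 2\pi f_i \, \frac{\lVert x_{tx} - x_s \rVert + \lVert x_s - x_{rx} \rVert}{c}\right), \qquad i = 1, 2
```

where a_i is an amplitude factor and c the propagation speed.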
  • the estimated distance d_s is the sum of the real distance d and either a positive or negative difference distance Δd. This results in the hypotheses as shown in the following two formulas.
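The two hypotheses referred to above, reconstructed from the surrounding definitions (the original formulas are not reproduced in this text), read:

```latex
d_s = d + \Delta d \qquad \text{or} \qquad d_s = d - \Delta d
```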
  • Fig. 3 shows two matrices with phase information or image phases per pixel for a 2D reconstruction at the estimated distance d_s for two frequency support points, the left matrix containing the image phases for the frequency support point with the frequency f1 and the right matrix the image phases for the frequency support point with the frequency f2.
  • the correlation is carried out and summed over all transmit-receive combinations. This results in two complex radar images at the two different frequencies f1 and f2.
  • Δd can be calculated, and thus the exact distance of the shell to the radar can be determined for each pixel, as shown in the following formula.
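The per-pixel evaluation described above can be sketched as follows (an illustrative implementation of the two-frequency phase comparison; it assumes a residual two-way image phase of 4π·f·(d_s − d)/c per frequency support point, consistent with correlating the measured signal against the expected phase at d_s):

```python
import numpy as np

C0 = 299_792_458.0  # speed of light in m/s

def fsk_distance_offset(img_f1, img_f2, f1, f2):
    """Per-pixel distance offset dd = d_s - d from two complex radar images
    reconstructed at the same estimated distance d_s with two frequency
    support points f1 and f2.  Unambiguous as long as
    |dd| < c / (4 * |f1 - f2|)."""
    dphi = np.angle(img_f1 * np.conj(img_f2))  # wrapped phase difference
    return C0 * dphi / (4.0 * np.pi * (f1 - f2))

def corrected_distance(d_s, img_f1, img_f2, f1, f2):
    """Exact per-pixel distance of the shell: d = d_s - dd."""
    return d_s - fsk_distance_offset(img_f1, img_f2, f1, f2)
```

The closely spaced frequencies make the inter-frequency phase difference small, which is what yields the comparatively large unambiguous range for Δd.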
  • if the primary modality uses at least two frequency support points, for example according to the FSK-MIMO principle, with which a spatially resolved image of the shell is generated, this image is subsequently compared with the data of the assistive modality and, if necessary, complemented and fine-tuned. This step is not necessary when using only one frequency.
  • an image of different boundary surfaces of the shell can be generated, in particular by using different wavelengths, for example by combining a radar-based primary modality and an optical assistive modality, since the radar can penetrate material such as clothing or cardboard whereas an optical measuring system cannot.
  • the initially estimated speed information can be subsequently corrected.
  • the imaging of several interfaces through the combination of radar-based and optical measuring systems can be particularly advantageous if the recorded hand has several shells that move differently.
  • the methods described enable precise, high-performance determination of the movement of the shell of any object, with the potential for real-time capability.
  • FIG. 4 shows an overview, in the form of a flowchart, of a preferred method for operating a preferred configuration.
  • in step S2, a query is made as to whether the modalities have the same orientation. If not, a calibration is carried out in step S3. After calibration, or if the modalities have the same orientation, an optical reconstruction of the object or the hand or the shell is carried out in step S4 by the assistive modality. Depth data D1 is generated. This depth data is used in step S5 to localize the object or the hand or the shell. In step S6, a radar-based reconstruction of the object or the hand or the shell is carried out by the primary modality. This yields data D2 with the speed and/or distance of the object or the hand or the shell. In step S7, the data D1 and D2 are merged as part of a sensor data fusion.
  • in step S8, the movement of the object or the hand or the shell is determined.
  • the data D3 and the data D4 from a previous recording serve as input data, on the basis of which the data D5 of the movement of the object or the hand or the shell are generated for the current recording.
  • Step S9 asks whether further recordings are available. If further recordings are available, step S8 is repeated. If there are no further recordings, the recording is ended in step S10.
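The flow of FIG. 4 can be summarized in code form as follows (a structural sketch only; the step functions are placeholders for the modality-specific processing, and none of these names come from the application):

```python
def run_capture(recordings, modalities_aligned, calibrate, optical_reconstruct,
                localize, radar_reconstruct, fuse, estimate_motion):
    """Structural sketch of the method of FIG. 4 (steps S2 to S10)."""
    if not modalities_aligned:                    # S2: same orientation?
        calibrate()                               # S3: calibration
    previous_fused = None
    motions = []
    for recording in recordings:                  # S9: further recordings?
        depth_d1 = optical_reconstruct(recording)        # S4: depth data D1
        region = localize(depth_d1)                      # S5: localize the hand
        radar_d2 = radar_reconstruct(recording, region)  # S6: speed/distance D2
        fused_d3 = fuse(depth_d1, radar_d2)              # S7: sensor data fusion
        if previous_fused is not None:                   # S8: motion from D3, D4
            motions.append(estimate_motion(previous_fused, fused_d3))
        previous_fused = fused_d3
    return motions                                # S10: recording ended
```

Note that, as in the description, the motion data D5 for each recording is derived from the current fused data D3 together with the data D4 of the previous recording.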


Abstract

The invention relates to a system for determining a feature of a hand, the system comprising a primary module, an assistive module and a control device. The primary module comprises a radar system with at least one transmitting antenna and at least one receiving antenna, and the assistive module comprises a three-dimensional imaging sensor. The primary module and the assistive module are designed to each detect the feature, to generate measurement data relating to the feature, to communicate with the control device designed for this purpose, to be operated by the control device designed for this purpose, and to transmit the measurement data to the control device designed for this purpose. The control device comprises means designed to operate the primary module in such a way that the measurement data generated by the assistive module are taken into account, and the control device comprises means designed to determine the feature of the hand from all or part of the generated measurement data.
PCT/EP2023/068567 2022-07-05 2023-07-05 System, method, computer program and computer-readable medium WO2024008803A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022116737.2A DE102022116737A1 System, method, computer program and computer-readable medium
DE102022116737.2 2022-07-05

Publications (1)

Publication Number Publication Date
WO2024008803A1 2024-01-11

Family

ID=87158234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/068567 WO2024008803A1 (fr) 2022-07-05 2023-07-05 Système, procédé, programme informatique et support lisible par ordinateur

Country Status (2)

Country Link
DE (1) DE102022116737A1 (fr)
WO (1) WO2024008803A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1178330A1 1993-04-12 2002-02-06 The Regents Of The University Of California Ultra-wideband radar-based motion sensor
US20120280900A1 * 2011-05-06 2012-11-08 Nokia Corporation Gesture recognition using plural sensors
US20150277569A1 2014-03-28 2015-10-01 Mark E. Sprenger Radar-based gesture recognition
WO2017131545A1 2016-01-26 2017-08-03 Novelic D.O.O. Millimetre-wave radar sensor system for motion and gesture analysis
US20200097092A1 * 2018-09-21 2020-03-26 International Business Machines Corporation Gesture recognition using 3d mm-wave radar
US10928921B2 2010-06-17 2021-02-23 Apple Inc. Gesture based user interface
US20210231775A1 * 2020-01-27 2021-07-29 Plato Systems, Inc. System and method for smart device control using radar
WO2021154831A1 * 2020-01-27 2021-08-05 Plato Systems, Inc. Radar-based smart device control

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2817785B1 2012-02-23 2019-05-15 Charles D. Huston System and method for creating an environment and sharing a location-based experience in an environment
US10168785B2 2015-03-03 2019-01-01 Nvidia Corporation Multi-sensor based user interface
DE102017003937A1 2017-04-22 2018-10-25 Arnulf Deinzer Sensor finger ring
DE102019200177B4 2019-01-09 2022-08-11 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for identifying a living being
US20230260155A1 (en) 2020-04-24 2023-08-17 Cornell University Deep continuous 3d hand pose tracking


Also Published As

Publication number Publication date
DE102022116737A1 (de) 2024-01-11


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23738731

Country of ref document: EP

Kind code of ref document: A1