EP3376944A1 - System for formulating, emitting and interpreting a composite stream and associated method - Google Patents
System for formulating, emitting and interpreting a composite stream and associated method
- Publication number
- EP3376944A1 (Application EP16828759.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- interest
- encoding
- environment
- processing unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4863—Measuring or inducing nystagmus
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/12—Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/124—Quantisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/167—Position within a video image, e.g. region of interest [ROI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/177—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a group of pictures [GOP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
Definitions
- the invention relates to a method for transmitting a composite data stream, said data being composed of data of interest and environment data respectively characterizing the digital production of information of interest and environmental information included in one or more items of study information previously captured by capture means.
- the invention further relates to a method for interpreting and exploiting such a composite data stream and a system for implementing said methods.
- the invention relates to the transmission of a video stream in a videonystagmoscopy system used to observe eye movements in humans or animals or, more generally, a subject of study, and to look for a possible nystagmus.
- a nystagmus is an involuntary and jerky motion of the eyeball caused by a disturbance of the muscles of the eye.
- the observation of a nystagmus may, by way of non-limiting example, make it possible to determine a dysfunction of the inner ear of a patient that may cause a feeling of vertigo.
- Dizziness is an erroneous sensation of movement of the body relative to the surrounding space, usually reflecting dysfunction or imbalance between the two vestibular apparatus of the inner ear of a human or animal.
- Each vestibular apparatus is composed of several sensors, such as semicircular canals and otolithic organs, whose walls are covered with ciliated sensory cells bathed in endolymphatic fluid.
- the semicircular canals detect the amplitude of the angular rotations of the head, while the otolithic organs detect the vertical and/or horizontal linear accelerations of the head as well as the inclination of the head with respect to the axis of gravity.
- the hair cells move and transmit information related to the rotation to the nervous system via the vestibular nerve.
- the interpretation by the patient's nervous system of such information causes movements of the eyeball in order to guarantee the stability of the gaze and the stability of the patient's posture.
- when a vestibular apparatus malfunctions, said stabilizations are not performed correctly, causing a nystagmus as well as the feeling of vertigo.
- a videonystagmoscopy system generally comprises a helmet, a mask or glasses comprising one or more cameras and an electronic device, such as by way of non-limiting examples a tablet or a personal computer.
- during an examination of the inner ear, commonly called a vestibular examination, the patient's eye is plunged into darkness, thus preventing its inhibition by fixation, which could alter the outcome of the examination.
- the movements of the eyeball are therefore only caused by the patient's nervous system following information received from the sensors of the inner ear of said patient.
- the camera, generally of infrared type, captures a succession of images in order to create a video stream.
- the latter is transmitted to the electronic device, responsible for rendering images or graphics via a man-machine rendering interface, such as by way of non-limiting example a computer screen or a printer.
- the practitioner can therefore analyze the restitution and make a diagnosis in the light of this tool.
- the observation of a nystagmus requires a high image resolution. Indeed, the horizontal and/or vertical jerky motions of the eyeball and/or its twists are low-amplitude events. In addition, said events can be very short.
- the video stream must therefore be transmitted at between 25 Hz and 200 Hz for images in 256 gray levels and up to a size of 640×480 pixels.
- the stream thus generated between the infrared camera and the electronic device conveys a large amount of information expressed in the form of bit frames, thus occupying a large bandwidth.
- videonystagmoscopy systems generally comprise a mask and a device communicating by a wired connection, for example by means of a cable with a USB-type connector, acronym for "Universal Serial Bus".
- Such systems have the disadvantage of hindering the practitioner and the patient during the examination. Indeed, during an examination of the vestibular system, the practitioner must rotate the head of said patient from left to right, shake the patient's head and/or rotate the chair on which the patient sits at different speeds, in order to observe possible eye distortions and determine which organ is malfunctioning and causing vertigo.
- the disadvantage of this system thus lies mainly in said cable that connects the camera positioned in front of the patient's eyes to the practitioner's computer. Indeed, this cable can wrap around the patient during the examination.
- the USB standard may have insufficient bandwidth thus causing delays in receiving data packets or loss of data packets.
- the invention makes it possible to respond particularly effectively to all or some of the disadvantages raised by the solutions mentioned above.
- the invention proposes to reduce the volume of data to be transmitted by performing processing upstream of the transmission. Indeed, the invention provides for delegating to a microcontroller previously installed in the videonystagmoscopy glasses and cooperating with the camera local processing upstream of the transmission, in order to reduce the volume of data exchanged with an electronic device to just the information useful for the medical use that is made of the product. Indeed, during a vestibular examination, it suffices to observe the iris and the pupil of the patient's eye to diagnose a possible dysfunction of the inner ear, the rest of the image being of lower interest, or at least not requiring the same precision.
- the invention therefore provides initially for compressing and encoding the data of interest Di related to the zone of interest Zi according to a first encoding function F1 and the environment data Dn related to the environment zone Zn according to a second encoding function F2, said first encoding function F1 having a data loss rate, related to zero or low data compression, with respect to said second encoding function F2.
- the data of interest Di and the environment data Dn thus encoded will respectively be called encoded data of interest Di' and encoded environment data Dn'.
- a function F1, F2 is understood to mean any transcription of data from a first format into a second format, which may comprise a step of compressing the data.
- a function F1, F2 may, by way of non-limiting example, consist in transcribing the digital representation R(t) of physical quantities captured by a matrix sensor of a camera, encoded according to the ASCII standard (acronym for "American Standard Code for Information Interchange"), into the JPEG standard (acronym for "Joint Photographic Experts Group").
- Such functions may have reversible or irreversible compression features. Reversible compression is a processing that guarantees the integrity of the encoded data with respect to the original data; that is to say, after decompression the decoded data and the original data are identical.
- a transcription according to the RLE standard is an encoding according to a reversible compression.
- an irreversible compression is a processing that reduces the amount of data to be encoded, for example by encoding only part of the data (as a non-limiting example, every other datum), or by performing a preliminary processing to determine the average of a set of neighboring data and encoding said average.
- a transcription according to the JPEG standard, acronym for "Joint Photographic Experts Group", is an encoding according to an irreversible compression.
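- To make the distinction concrete, the following sketch (illustrative only; the function names `rle_encode`, `rle_decode` and `average_pairs` are not from the patent) contrasts a reversible run-length encoding with an irreversible averaging encoding applied to a row of gray levels:

```python
# Illustrative sketch: reversible vs. irreversible encoding of a row of gray levels.

def rle_encode(row):
    """Reversible run-length encoding: a list of [value, run_length] pairs."""
    out = []
    for v in row:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

def rle_decode(pairs):
    """Exact inverse of rle_encode: the original row is fully recovered."""
    return [v for v, n in pairs for _ in range(n)]

def average_pairs(row):
    """Irreversible encoding: keep only the average of each pair of neighbours."""
    return [(row[i] + row[i + 1]) // 2 for i in range(0, len(row) - 1, 2)]

row = [0, 0, 0, 255, 255, 10, 10, 10]
assert rle_decode(rle_encode(row)) == row  # integrity guaranteed
print(average_pairs(row))                  # [0, 127, 132, 10] -- original lost
```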
- processing the study data upstream of their transmission makes it possible to preserve the bandwidth of the communication networks.
- the study data thus adapted can then be transported according to standard proximity communication protocols, such as, by way of non-limiting examples, Wi-Fi or Bluetooth. Therefore, it becomes possible to use standard communication equipment, thus reducing the hardware costs of the system while preserving its ergonomics and relevance.
- such a reduction in the volume of study data to be transmitted, said study data being produced from a determined sensor, for example a matrix image sensor, makes it possible, for a given communication network bandwidth, to convey together with said study data other study data produced by one or more other sensors, for example an accelerometer, a gyroscope and/or a second matrix image sensor.
- the invention firstly relates to a method for interpreting a composite data stream implemented by a processing unit of an electronic device cooperating with at least one communicating electronic object comprising capture means, said device comprising, in addition to said processing unit, communication means providing a mode of communication with said electronic object through a communication network.
- the data stream includes study data previously encoded by the electronic object, said study data comprising data of interest encoded according to a first encoding function and environment data encoded according to a second encoding function, said study data being previously produced by the capture means of said electronic object.
- the method comprises:
- a step for decoding said encoded study data; said step consists in decoding said encoded data of interest and environment data respectively according to determined first and second decoding functions;
- said message may furthermore comprise a descriptor translating a reference frame common to the encoded data of interest and environment data.
- the step of jointly exploiting said decoded data of interest and environment data can therefore consist in producing restitution data from the decoded data of interest and environment data and from said common reference frame.
- the method may comprise a step, prior to the step of receiving a message comprising encoded data of interest and environment data, for elaborating and triggering the transmission by the communication means to the electronic object of a control request interpretable by said electronic object and comprising a first encoding parameter and a second encoding parameter, said first and second parameters being different.
- the method may comprise a step, prior to the step of receiving a message comprising encoded data of interest and environment data, for elaborating and triggering the transmission by the communication means to the electronic object of a control request interpretable by said electronic object and comprising an area-of-interest parameter designating data of interest among the study data produced by the capture means of the electronic object, said area-of-interest parameter being interpretable by the electronic object to discriminate the data of interest from the environment data previously produced by said object.
- the electronic device may include a rendering interface cooperating with the processing unit of said device.
- the step of jointly exploiting said decoded data of interest and environment data may consist in producing restitution data.
- the method can comprise a step, subsequent to said step for jointly exploiting said decoded data of interest and environment data, to trigger the restitution of said restitution data by the rendering interface.
- the invention relates to a computer program product comprising program instructions which, when they are previously stored in a program memory of an electronic device comprising, in addition to the program memory, a processing unit and communication means providing a determined communication mode, said program memory and said communication means cooperating with said processing unit, cause the implementation of a method for interpreting a composite data stream according to the invention.
- the invention provides an electronic device comprising a processing unit, a program memory, a data memory and communication means cooperating with said processing unit, said device being characterized in that it comprises, in the program memory, the instructions of a computer program product according to the invention.
- the invention provides a method for transmitting a composite data stream implemented by a processing unit of a communicating electronic object cooperating with at least one electronic device, said electronic object comprising, in addition to said processing unit, capture means and communication means providing a mode of communication with said electronic device through a communication network.
- said method comprises:
- a step for triggering the production of study data by said capture means of said electronic object, said study data consisting of a digital representation of a measured physical quantity;
- the method may comprise a step, prior to the step of encoding the data of interest according to a first encoding function and the environment data according to a second encoding function, for receiving via the communication means a control request transmitted from the electronic device and comprising a first encoding parameter and a second encoding parameter, and extracting said parameters, said first and second encoding functions respectively being implemented according to the content of the first and second encoding parameters.
- the method may include a step, prior to the step of discriminating in the study data the data of interest and the environment data according to a discrimination function, for receiving via the communication means a control request transmitted from the electronic device comprising an area-of-interest parameter designating data of interest, and extracting said parameter, said discrimination function being implemented according to the content of said area-of-interest parameter.
- the electronic object may further comprise storage means comprising a zone of interest parameter designating data of interest from among the study data.
- Said method can therefore comprise a step, prior to the step of discriminating in the study data the data of interest and the environment data, to extract from said storage means said area-of-interest parameter denoting data of interest and to implement a discrimination function according to the content of said parameter.
- the electronic object may further comprise storage means comprising first and second encoding parameters.
- the step of encoding the data of interest according to a first encoding function and the environment data according to a second encoding function can therefore consist in extracting the content of said encoding parameters, said first and second encoding functions respectively being implemented according to said content of the first and second encoding parameters.
- the capture means of the electronic object may comprise a matrix image sensor, the study data produced by said sensor being the digital representation of a scene captured by said matrix sensor.
- the study data may consist of the digital representation of an eye, and the data of interest may be the digital representation of the iris and the pupil of said eye.
- the invention relates to a computer program product comprising program instructions which, when they are previously stored in a program memory of a communicating electronic object comprising, in addition to said program memory, a processing unit, communication means ensuring a determined communication mode, and capture means, said program memory, said communication means and said capture means cooperating with said processing unit, cause the implementation of a method for transmitting a composite data stream according to the invention.
- the invention relates to a communicating electronic object comprising a processing unit, a program memory, a data memory, communication means and capture means cooperating with said processing unit, said object being characterized in that it comprises in the program memory the instructions of a computer program product according to the invention.
- the capture means may comprise a first capture means and a second capture means for producing first and second study data respectively.
- the processing unit can therefore implement a method for transmitting a composite data stream, according to the invention, for each of said first and second capture means.
- the invention relates to a system comprising an electronic device according to the invention, and at least one communicating electronic object according to the invention.
- said system may consist of a videonystagmoscopy system, for which the electronic device consists of a personal computer and the electronic object consists of a videonystagmoscopy mask comprising at least one matrix image sensor.
- the invention relates to a data processing method comprising:
- a step for triggering a production of study data by capture means of a communicating electronic object according to the invention, said step being implemented by a processing unit of said object; a step for discriminating in the study data the data of interest and the environment data by said electronic object;
- the data processing method may comprise:
- FIG. 2 is a block diagram of a method according to the invention for interpreting a composite data stream
- FIG. 3 is a block diagram of a method according to the invention for transmitting a composite data stream
- FIG. 4 presents a study scene captured by a system according to the invention
- FIG. 5 shows a reproduction of a composite image by a system according to the invention
- FIG. 6 shows a logic diagram implemented by a system according to the invention.
- the invention will be described through an application relating to the development, transmission and interpretation of a composite video stream, such as produced during an examination of the vestibular system of a human or an animal or more generally of a subject of study.
- a study scene according to the invention, in connection with the examination of the vestibular system, comprises the region of an eye O of a patient, which we will name study zone Ze, as described in FIG. 4. Ze is the area observed by capture means 25 positioned, by way of non-limiting example, facing said eye O.
- the study zone Ze includes a zone of interest Zi comprising the pupil P and the iris I of said eye O, represented by dashed lines in FIG. 4, and an environment zone Zn corresponding to the study zone Ze minus the zone of interest Zi, represented by hatching in FIG. 4.
- FIG. 1 makes it possible to present an exemplary system according to the invention.
- such a system consists of an electronic device 10 cooperating with one or more communicating electronic objects 20, 20-2 via a wired or wireless communication link N1.
- Such electronic objects, referenced 20 in the remainder of the document for the sake of simplification, comprise capture means 25 cooperating with a processing unit 21.
- Such capture means 25 may consist of a matrix sensor of a single infrared camera.
- the processing unit 21 is responsible for collecting the data delivered by the capture means 25 and encoding them before transmitting them to the electronic device 10 by the communication means 24.
- the processing unit 21 therefore advantageously comprises one or several microcontrollers or processors cooperating by coupling and/or by wired bus, represented by double arrows in FIG. 1.
- the capture means 25 may, in addition or alternatively, provide other study information, for example in connection with the trajectory and / or the movements made by the patient.
- Such means 25 may therefore, for example, comprise an inclinometer, a gyroscope or an accelerometer, or more generally any sensor for determining a movement, such as, by way of non-limiting example, a movement of the head of a patient.
- the capture means 25 can measure one or more physical quantities related to the subject of study.
- the capture means 25 produce a digital representation R (t) of the study area Ze.
- the latter is recorded in storage means 22, 23 cooperating by coupling and/or wired bus with the processing unit 21.
- Such storage means 22, 23 may consist of a data memory 23 arranged to record data.
- the storage means 22, 23 may further consist of a program memory 22 comprising the instructions of a computer program product P2. Said program instructions P2 are arranged such that their execution by the processing unit 21 of the object 20 causes the implementation of a method for developing and transmitting a composite data stream. Such storage means 22, 23 may optionally constitute one and the same physical entity.
- a temporal datum t characterizing an acquisition period can be recorded jointly with the digital representation R(t).
- the latter, and consequently the capture carried out by the capture means 25, is thus time stamped.
- the capture means 25 may include a matrix sensor, such as an infrared camera for example.
- the digital representation R (t) delivered by such a sensor consists of an array of pixels encoding a shade of gray for each of them.
- the capture means 25 may comprise or be associated with a light emitting diode emitting in the infrared, not shown in Figure 1, to allow adequate illumination for the acquisition of data by said capture means 25.
- while FIG. 1 explicitly describes only a single capture means 25, in the form of a camera, other identical or complementary capture means can also be connected directly or indirectly to the processing unit 21.
- a system according to the invention may therefore include, by way of nonlimiting examples, two cameras respectively capturing a scene of interest corresponding to the left eye of a patient and a second scene of interest corresponding to the right eye of the patient.
- Such a system may further comprise a gyroscope-type angular sensor to give the angular position of the patient's head in a given reference frame and thus make it possible to determine, by way of non-limiting example, the direction of rotation of the head or the body of the patient during an examination.
- a digital representation R (t) delivered by such an angular sensor could consist of a vector of accelerations along several reference axes.
- such an electronic object 20 also comprises communication means 24, in the form of a modulator-demodulator enabling the electronic object 20 to communicate through a communication network N1, for example of the Bluetooth or Wi-Fi type.
- the communication means 24 may further consist of a USB ("Universal Serial Bus") port, to implement a wired link N1.
- such an electronic object 20 may have a battery or, more generally, any internal electrical source, or be connected to the electrical network, in order to draw the electrical energy sufficient and necessary for its operation.
- the use of an internal battery power source will be preferred so as not to hinder the mobility of the object by the presence of an electric cable.
- an electronic object 20 may consist, by way of non-limiting examples, of a mask, glasses or a helmet positioned on the head of a patient and having capture means 25. Unlike known solutions, it further comprises a processing unit 21, storage means 22, 23, and communication means 24. Such an object 20 no longer requires wired connectivity to cooperate with a third-party device.
- Figure 1 further describes such a third electronic device, such as a personal computer 10 or a touch pad for example.
- said device 10 comprises a processing unit 11, for example in the form of one or more microcontrollers or processors, cooperating with storage means 12, 13 in the form of a program memory 12 and a data memory 13, said program memory 12 and data memory 13 being able to be dissociated or possibly to form one and the same physical entity.
- the electronic device 10 can be adapted to interpret a composite data stream by the loading of a computer program product P1 according to the invention into the program memory 12.
- Said electronic device 10, thus adapted, becomes capable of receiving, interpreting and exploiting a composite data stream in the form of one or more incoming messages M1 conveyed by a communication network N1, advantageously exploiting a Wi-Fi or Bluetooth type wireless communication protocol.
- the invention does not as such exclude a wired communication mode, for example of USB type, which, thanks to the reduction in the volume of data transmitted, is no longer penalized by the bandwidth limits imposed by such a link.
- the electronic device 10 thus comprises communication means 14 arranged to provide such communication, by receiving M1 messages previously encoded by an electronic object 20.
- the storage means 12, 13 and the communication means 14 advantageously cooperate with the processing unit 11 by one or more communication buses, represented by double arrows in FIG. 1.
- the data memory 13 can be arranged to record the content of the messages M1 received by the communication means 14.
- An electronic device 10 furthermore advantageously comprises a man-machine setpoint and/or rendering interface 1D cooperating with the processing unit 11.
- Said interface 1D makes it possible to restore to a user U of said device 10, for example, the content of the data transmitted by the electronic object 20 or produced by the processing unit 11 of said device 10.
- Said man-machine setpoint and/or rendering interface 1D can also be used to translate a gesture or a voice command of said user U into setpoint data C interpretable by the processing unit 11 of said device 10.
- Such a 1D interface may for example consist of a touch screen or be in the form of any other means allowing a user U of the device 10 to interact with said electronic device 10.
- the electronic device 10 may comprise two separate man-machine interfaces, one for translating instructions emanating from a user U and the other for restoring to him or her graphical and/or sound contents.
- a setpoint interface may consist of a keyboard or a microphone and such a rendering interface may consist of a screen or a speaker.
- a capture means 25 for example of image matrix sensor type, captures the eyeball of a patient.
- the zone of interest Zi is therefore composed of the pupil P and the iris I of an eye O, and the environment zone Zn of the rest of the eyeball and the eyelid of the patient, as shown in FIG. 4.
- Figure 3 depicts a block diagram according to the invention of a method 200 for developing a composite data stream.
- a method 200 according to the invention, implemented by a processing unit 21 of an electronic object 20 as described in connection with FIG. 1, comprises a first step 202 for triggering the capture of a study zone or scene Ze.
- Said step 202 consists in producing a digital representation R(ti) of the study zone Ze in connection with a current period ti.
- the study data De characterizing the computer transcription of the digital representation R(ti) are stored within the storage means 22, 23, for example in the form of an array structure of integers, each field being respectively associated with a different pixel and describing the gray level of said associated pixel, such as, for example, a zero value describing a low gray level such as black and a value equal to 256 describing a high gray level such as white.
- Such a structure may furthermore, and by way of nonlimiting example, record one or more attributes related to the capture of the study scene. For example, such an attribute can record the acquisition period ti and an identifier Idc characterizing the capture means 25.
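- As a hedged illustration of such a structure (the field names are assumptions, not taken from the patent), a study-data record might look like:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class StudyRecord:
    """Sketch of a study-data record: a gray-level pixel array plus capture attributes."""
    pixels: np.ndarray  # integer array, one gray level per pixel (0 = black)
    ti: float           # acquisition period ti
    idc: str            # identifier Idc characterizing the capture means 25
```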
- the method 200 also comprises a step 203 for discriminating, among the study data De, the data of interest Di and the environment data Dn.
- step 203 consists in implementing a discrimination function or processing that makes it possible to identify and isolate the data of interest Di according to a predefined criterion. For example, we will try to dissociate the data Di relating to the pupil P and the iris I of the studied eye from the data Dn relating to the rest of the eyeball.
- a discrimination function may consist in analyzing the digital representation, i.e. the study data De, of the captured study area, for example in the form of an array of pixels comprising the light intensities.
- Such image analysis methods may, by way of non-limiting example, consist firstly in performing a thresholding of the study data De in order to obtain a binarized digital representation of the image, i.e. one having only two values.
- Such thresholding may, for example, consist in replacing by zero the gray level of all the pixels having a gray level lower than a predetermined value, and in replacing by the maximum gray level the gray level of all the pixels having a gray level greater than or equal to said predetermined value.
- the predetermined value can be set at 125.
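- A minimal thresholding sketch (assuming 8-bit gray levels stored in NumPy arrays; illustrative only, not the patent's implementation):

```python
import numpy as np

def binarize(de: np.ndarray, threshold: int = 125, max_gray: int = 255) -> np.ndarray:
    """Binarize the study data De: pixels below the threshold become 0 (black),
    the others take the maximum gray level (white)."""
    return np.where(de < threshold, 0, max_gray).astype(de.dtype)
```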
- Another thresholding technique may consist in calculating the histogram of the image, that is to say determining the distribution of the light intensities of the pixels of the image, then modifying the general intensity of the image to spread the shades of gray in order to increase the contrasts.
- the image analysis method can then implement a so-called cleaning step.
- the cleaning may consist, for example in a morphological analysis of the image from predetermined rules. For example, a pupil of an eye is usually circular in shape.
- Such a cleaning can therefore consist in excluding from the image analysis all the non-circular shaped areas, such as, for example, the oblong shapes that can characterize a makeup area on the top of the eyelid of the eye.
- Such an image analysis method may comprise a calculation step for roughly estimating the location of the pupil.
- a step may, for example, consist in virtually isolating a group of pixels representing a disk and having a low gray level close to black, and then determining the center of said disk, for example by calculating its center of gravity.
- the discrimination function can provide for defining a geometrical shape around said center, for example a square or a circle of determined sides or radius, said geometric shape including the pixels of the area of interest, in this case the pupil and the iris.
- said geometric shape may consist of a square of sides of 12 millimeters or 144 pixels.
- the invention therefore provides for discriminating the data of interest Di by writing, in a second integer-array memory structure, the data associated with only the pixels enclosed by said geometric shape.
- such a second data structure Di can be recorded, as a non-limiting example, in the data memory 23 of said object 20.
- the environment data Dn may consist of the structure recording the study data De or, as a variant, result from a copy of these data in which the intensity values associated with the pixels of interest have been replaced by a predetermined value, for example zero.
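- A possible sketch of such a discrimination function, under the assumptions made so far (dark pixels locate the pupil, a 144-pixel square delimits the zone of interest; nothing here is prescribed by the patent):

```python
import numpy as np

def discriminate(de: np.ndarray, half_side: int = 72, threshold: int = 125):
    """Sketch of step 203: locate the pupil as the centre of gravity of the dark
    pixels, then split the study data De into Di (zone of interest) and Dn."""
    ys, xs = np.nonzero(de < threshold)          # dark pixels ~ pupil and iris (assumed present)
    cy, cx = int(ys.mean()), int(xs.mean())      # centre of gravity of the dark disk
    y0, x0 = max(cy - half_side, 0), max(cx - half_side, 0)
    y1, x1 = y0 + 2 * half_side, x0 + 2 * half_side
    di = de[y0:y1, x0:x1].copy()                 # data of interest Di (144-pixel square)
    dn = de.copy()
    dn[y0:y1, x0:x1] = 0                         # pixels of interest zeroed in Dn
    return di, dn, (x0, y0)                      # (x0, y0): common reference point
```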
- the storage means 22, 23 of the electronic object 20 comprise a recording arranged for storing area-of-interest parameters P1 making it possible to discriminate the data of interest Di within the study data.
- Such parameters thus denote the data Di.
- area-of-interest parameters P1 may correspond to a pair of X, Y coordinates denoting a reference pixel, that is to say the data associated with said pixel when they are advantageously arranged in the form of an array in the storage means 22, 23 of the object 20.
- Such area-of-interest parameters P1 may further comprise a pair of values W, H characterizing a width and/or a height of a geometric figure, such as for example a square, delimiting a desired area of interest.
- Step 203, for discriminating the data of interest Di from the study data De and thus distinguishing them from the environment data Dn, therefore consists in extracting said area-of-interest parameters P1 and in implementing a discrimination function according to the content of said parameters P1.
- the data of interest Di can correspond to the whole set of study data De.
- the method 200 therefore comprises a step 204 for encoding the data of interest Di according to a first encoding function F1 and the environment data Dn according to a second encoding function F2.
- the encoding functions F1, F2 can respectively rely on encoding parameters E1, E2 characterizing an encoding standard, a desired image resolution expressed in number of pixels, and/or irreversible compression parameters characterizing a pixel-grouping ratio known as "binning", etc.
- first encoding parameters E1 can characterize an attribute translating an RLE standard, acronym for "Run Length Encoding", an attribute characterizing an image of 744×480 pixels, an attribute characterizing a zero binning rate, and an attribute requiring the transmission of all the light intensities of the pixels.
- the second encoding parameters E2 encoding the environment data Dn can, on the other hand, characterize an attribute expressing a JPEG standard, an attribute characterizing an image of 320×240 pixels, an attribute characterizing a binning rate of 50%, and an attribute characterizing the transmission of the light intensities of only the pixels of odd rank.
- the data of interest Di thus encoded can then be recorded in the form of a data table. The same applies to the encoded environment data Dn, hereinafter denoted Dn'.
- the data memory 23 can thus comprise a table of encoded environment data Dn' and a table of encoded data of interest Di'. It can be noted that, depending on the content of the parameters E1 and E2, the encoded data of interest Di' can be less degraded than the encoded environment data Dn'.
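- A hedged sketch of this dual encoding: a lossless pass-through stands in for the RLE path described by E1, and 2×2 binning stands in for the lossy path described by E2 (real codecs would plug in here; the parameter keys are assumptions):

```python
import numpy as np

E1 = {"lossless": True}             # data of interest Di: no information lost
E2 = {"lossless": False, "bin": 2}  # environment data Dn: 2x2 pixel binning

def encode(data: np.ndarray, params: dict) -> np.ndarray:
    """Encode according to an E1/E2-style parameter set (illustrative only)."""
    if params["lossless"]:
        return data.copy()                       # F1: identity, fully reversible
    b = params["bin"]                            # F2: average b x b pixel blocks
    h, w = (data.shape[0] // b) * b, (data.shape[1] // b) * b
    blocks = data[:h, :w].reshape(h // b, b, w // b, b)
    return blocks.mean(axis=(1, 3)).astype(data.dtype)

di_enc = encode(np.zeros((144, 144), dtype=np.uint8), E1)  # Di', unchanged
dn_enc = encode(np.zeros((480, 744), dtype=np.uint8), E2)  # Dn', reduced to 240x372
```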
- Such encoding parameters E1, E2 can advantageously be recorded in the storage means 22, 23 of the electronic object 20, in the form of one or more recordings.
- the step 204 for encoding the data of interest Di according to a first encoding function F1 and the environment data Dn according to a second encoding function F2 then consists in extracting, prior to the encoding of the data Di and Dn as such, said parameters E1, E2 and in elaborating, parameterizing or choosing the first and second encoding functions F1 and F2 from the encoding parameters E1 and E2.
- the method 200 then comprises a step 205 for generating and triggering the transmission, by the communication means 24 to an electronic device 10, of one or more messages M1 each comprising all or part of the data Di' and/or Dn' and a descriptor, for example in the form of a header.
- a descriptor of a message M1 may comprise, for example, an identifier Idc of the capture means 25, a field comprising an identifier of the acquisition period t of the study data from which the data Di' and/or Dn' originate, an attribute characterizing the type of study data transmitted, an encoding attribute designating the encoding function of said transmitted data, an identifier of the object transmitting said message M1, or even an identifier of the device receiving said message M1.
- Such a descriptor may further include an attribute characterizing a reference frame common to the data of interest Di' and the environment data Dn' transmitted in a batch of messages M1.
- Such a reference frame may consist, by way of example, in the coordinates of a reference pixel present in the two digital representations of the zone of interest Zi and the zone of environment Zn, for example a pixel associated with a pupillary center.
- the transmission of the previously encoded data Di' and Dn' can be translated into a plurality of messages M1 whose respective descriptors may further comprise a sequential indicator, for ordering said messages upon their reception, or even redundancy data, integrity data and/or possible encryption data.
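- The descriptor can be pictured as a small header; the following sketch is an assumption-laden illustration (field names inferred from the description above, not specified by the patent):

```python
from dataclasses import dataclass

@dataclass
class MessageM1:
    """Sketch of a message M1: a descriptor (header) plus an encoded payload."""
    idc: str              # identifier of the capture means 25
    t: float              # acquisition period of the originating study data
    data_type: str        # "interest" (Di') or "environment" (Dn')
    encoding: str         # encoding attribute, e.g. "RLE" or "JPEG"
    ref_frame: tuple      # common reference frame, e.g. a reference pixel (x, y)
    seq: int              # sequential indicator for reordering on reception
    payload: bytes = b""  # all or part of the encoded data Di' / Dn'
```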
- the electronic device 10 receives through its communication means 14 one or more messages M1 during a step 102 of a method 100 for interpreting a composite data stream, an example of such a method 100 being described in connection with Figures 2 and 6.
- Such a method 100 for interpreting a composite data stream is implemented by the processing unit 11 of said device 10.
- the method 100 includes a step 103 for decoding the content of said message M1.
- Such a step 103 may consist in decoding and extracting from the descriptor of said message M1 the attribute characterizing the type of study data transmitted and the encoding attribute designating the encoding function of the study data implemented by the electronic object 20 having sent said message M1.
- Step 103 may then consist in decoding the study data transmitted according to a decoding function F1', F2'.
- the choice and / or parameterization of such a decoding function can depend on said encoding attribute thus decoded or be predetermined.
- such an attribute may furthermore comprise an encoding parameter E1 or E2 of encoding function F1 or F2 implemented by the object 20.
- Step 103 therefore consists in implementing a first decoding function F1' for decoding the data of interest Di' and a second decoding function F2' for decoding said environment data Dn'.
- the storage means 12, 13 of the device 10 advantageously comprise data structures, for example in the form of one or more arrays of integers, for storing the data of interest Di'' and the environment data Dn'' thus decoded. Such tables can thus, for example, record light intensities and/or gray levels of pixels of an image that the device 10 can recompose.
- Said tables are therefore digital representations similar to those of the zone of interest Zi and the environment zone Zn previously captured by the electronic object 20.
- the study data De may have been encoded and transmitted according to an irreversible compression factor implemented by an encoding function F1, F2, for example by encoding and transmitting the light intensities of only the pixels of odd rank.
- the decoding functions F1', F2' must therefore, by way of non-limiting example, interpolate the study data De' thus obtained, for example by recording in the even ranks of the integer arrays the value of the light intensities of the immediately lower odd ranks. Any other interpolation function could alternatively be implemented.
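- A minimal decoding-side interpolation sketch (nearest-neighbour: each transmitted row is duplicated into the adjacent missing rank; illustrative only):

```python
import numpy as np

def interpolate_rows(transmitted: np.ndarray, height: int) -> np.ndarray:
    """Rebuild a full image when only every other row rank was transmitted,
    by duplicating each received row into the missing neighbouring rank."""
    return np.repeat(transmitted, 2, axis=0)[:height]

# e.g. 240 transmitted rows rebuild a 480-row image
full = interpolate_rows(np.zeros((240, 744), dtype=np.uint8), height=480)
```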
- the method 100 comprises a step 104 for jointly exploiting said data of interest Di'' and environment data Dn'' thus decoded.
- Such an exploitation can, by way of non-limiting example, consist in producing time-stamped records dedicated to archiving in the data memory 13, in order to create a history of the decoded data of interest Di'' and environment data Dn''.
- a record may, by way of non-limiting example, be arranged to store the acquisition period t of the study data from which the data Di' and/or Dn' previously extracted from the message M1 originate, and the integer tables associated with the data of interest Di'' and the decoded environment data Dn''.
- Such archiving can be used for computational purposes or for the development of new data.
- when the study data relate to acceleration vectors delivered by a gyroscope, such archiving can make it possible to reproduce the trajectory and the speed of the eyeball during an examination.
- the study data can also be produced from a matrix image sensor observing the eyeball.
- the processing unit 11 of the device 10 can then determine the location of the pupillary center, according to the known image analysis methods exposed previously, from the data of interest Di'' extracted from several records dedicated to archiving.
- the data memory 13 therefore comprises a trajectory restitution data structure having a plurality of records arranged to record trajectory restitution data Dr comprising said pupillary center locations and the associated acquisition periods.
- the joint exploitation of the decoded data of interest Di'' and decoded environment data Dn'' resulting from the same acquisition by the capture means 25 of the object 20 may consist in recomposing, from said data Di'' and Dn'', an image of the study scene Ze intended to be restored by a rendering interface 1D of the electronic device 10 or a rendering interface cooperating with the latter.
- the data memory 13 therefore comprises an image restitution structure, for example in the form of an array of integers, for recording image reproduction data Dr necessary for the production of such an image.
- Said image reproduction data Dr consist of data of interest Di'' and environment data Dn'' previously decoded and coming from the same acquisition.
- Step 104 may therefore consist in recording in the rendering table the content of the decoded environment data Dn'', then in virtually superimposing the data of interest Di'' on the environment data Dn''.
- Such a superimposition can consist in determining the location of the data of interest Di'' with respect to the environment data Dn'' from the common reference frame extracted from the previously received message M1. The values of the light intensities of the common pixels are therefore replaced by those of the data of interest Di''.
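- A sketch of this superimposition, assuming NumPy arrays and the (x0, y0) reference point introduced earlier (illustrative only):

```python
import numpy as np

def recompose(dn2: np.ndarray, di2: np.ndarray, ref: tuple) -> np.ndarray:
    """Sketch of step 104: superimpose the decoded data of interest Di''
    onto the decoded environment data Dn'' at the common reference frame."""
    x0, y0 = ref
    dr = dn2.copy()                                       # start from Dn''
    dr[y0:y0 + di2.shape[0], x0:x0 + di2.shape[1]] = di2  # overwrite common pixels
    return dr
```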
- the reproduction data Dr thus created can be exploited later or restored on a 1D rendering interface of the electronic device 10.
- the method 100 further comprises a step 105 for triggering the reproduction, by a man-machine rendering interface 1D of the electronic device 10, of all or part of the restitution data Dr previously developed.
- a reproduction may consist in displaying the trajectory restitution data Dr in the form of a graph representing the location of the pupillary center as a function of time.
- such a reproduction may consist in displaying the image reproduction data Dr in the form of an image representing a reconstruction of the study zone Ze, as shown with reference to FIG. 5.
- the curve of the eyelid of the observed eye is rendered in the form of a staircase of pixels in the part lying outside the white square, corresponding to the decoded environment data Dn''.
- within the white square, the curve of the eyelid is restored very smoothly. Two different image resolutions can thus be observed.
- the invention provides for the insertion of a demarcation line, as represented by the white square in FIG. 5.
- a demarcation line may, by way of non-limiting example, be inserted into the image reproduction data Dr by replacing the value of the pixels of the environment data Dn'' bordering the data of interest Di'' by a value of 256, describing a high gray level such as white.
- the detection of the boundary pixels can be done according to known image processing methods.
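- A sketch of such an insertion (the text speaks of a value of 256; this 8-bit sketch uses 255 as the white level; illustrative only):

```python
import numpy as np

def draw_demarcation(dr: np.ndarray, ref: tuple, size: tuple, white: int = 255):
    """Draw a one-pixel white frame around the zone of interest in Dr."""
    x0, y0 = ref
    h, w = size
    dr[y0, x0:x0 + w] = white           # top edge
    dr[y0 + h - 1, x0:x0 + w] = white   # bottom edge
    dr[y0:y0 + h, x0] = white           # left edge
    dr[y0:y0 + h, x0 + w - 1] = white   # right edge
    return dr
```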
- the invention can provide for superimposing on the restitution of the composite image additional metadata, such as, by way of non-limiting examples, the acquisition period t of the captured image and the location of the pupillary center, represented by a cross in FIG. 5.
- the electronic object 20 may comprise a plurality of capture means 25, such as, by way of non-limiting examples, a matrix image sensor and a gyroscope, producing respectively first study data De1 relating to the eyeball and second study data De2 relating to the direction of rotation of the head of a patient, said first and second study data De1, De2 being captured during the same acquisition period ti.
- the method 200 for transmitting a composite data stream and the method 100 for interpreting such a stream are then respectively implemented by the processing unit 21 of the object 20, from said first and second study data De1 and De2, and by the processing unit 11 of the device 10, from the first and second decoded study data De1'' and De2''.
- the encoding functions F1, F2, Fx and decoding functions F1', F2', Fx' associated with the first and second data of interest Di1, Di2 and/or environment data Dn1, Dn2, respectively resulting from the first and second study data De1 and De2, can advantageously be identical or distinct, depending on the nature of the data exchanged.
- the step 104 of the method 100 for exploiting the decoded study data De1'' and De2'' can therefore consist in producing first and second reproduction data Dr1 and Dr2 respectively from the first and second decoded study data De1'' and De2''.
- the step 105 for triggering the restitution of all or part of the reproduction data Dr1, Dr2 by a rendering interface 1D can therefore consist in embedding the reproduction data Dr2 in the reproduction data Dr1.
- such an embedding may consist in inserting, into the restitution of the composite image resulting from the reproduction data Dr1, a representation of an arrow translating the direction of movement of the patient's head.
- certain processing operations implemented by the processing unit 21 of the electronic object 20 can be delegated and thus implemented, in place of the electronic object 20, by the electronic device 10.
- the electronic device 10 can inform the object 20 of the positioning of the zone of interest Zi during the following acquisition or impose it from the beginning of the implementation of the method 200.
- Said method 100 comprises, for that purpose, a step 101, prior to the step 102 for receiving a message M1, for developing a control request Rc1 comprising the area-of-interest parameters P1 defined above, designating the data of interest Di, within the study data produced by the capture means 25 of the electronic object 20, that the object 20 will have to discriminate.
- the method 200 for producing a composite data stream therefore comprises a step 201 for receiving, via the communication means 24 of said object 20, said control request Rc1, then decoding it and extracting said area-of-interest parameters P1.
- Step 203, for discriminating in the study data the data of interest Di and the environment data Dn, therefore consists in implementing a discrimination function according to the content of said area-of-interest parameters P1.
- the step 101 for producing a control request Rc1 may also consist in producing a second control request Rc2 comprising encoding parameters for all or part of the study data De.
- such a request Rc2 may comprise first encoding parameters E1 and second encoding parameters E2 specific to the data of interest Di and the environment data Dn.
- Step 201 for receiving and decoding a control request in the method 200 implemented by the processing unit 21 of the electronic object 20 may therefore consist in extracting the content of said encoding parameters E1 and E2 from the second control request Rc2.
- Step 204, for encoding the data of interest Di according to a first encoding function F1 and the environment data Dn according to a second encoding function F2, then consists in implementing said first and second encoding functions F1 and F2 respectively according to the content of said first and second encoding parameters E1, E2.
- said first and second control requests Rc1 and Rc2 may consist of a single request Rc3 comprising the area-of-interest parameters P1 and the first and second encoding parameters E1, E2 specific to the data of interest Di and the environment data Dn.
- the electronic object 20 can transmit a video stream to the electronic device 10.
- the method 200 for developing a composite data stream is therefore implemented iteratively, for example every 20 milliseconds or every 5 milliseconds.
- the control requests Rc1 and/or Rc2 and/or Rc3 may comprise an attribute characterizing the frequency of capture by the capture means 25 and of transmission by the communication means 24 of the study data De.
- such a frequency can, for example, be equal to 50 Hz or 200 Hz.
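- Pulling the control-request content together, a hedged sketch of a combined request Rc3 (field names and example values are assumptions; only the parameters P1, E1, E2 and a frequency attribute come from the description above):

```python
from dataclasses import dataclass, field

@dataclass
class RequestRc3:
    """Sketch of a single control request Rc3 combining P1, E1, E2 and a
    capture/transmission frequency attribute."""
    x: int                                  # P1: X coordinate of the reference pixel
    y: int                                  # P1: Y coordinate of the reference pixel
    w: int                                  # P1: width of the delimiting figure
    h: int                                  # P1: height of the delimiting figure
    e1: dict = field(default_factory=dict)  # encoding parameters E1 (data of interest Di)
    e2: dict = field(default_factory=dict)  # encoding parameters E2 (environment data Dn)
    freq_hz: float = 50.0                   # capture and transmission frequency, e.g. 50 or 200 Hz

rc3 = RequestRc3(x=300, y=220, w=144, h=144,
                 e1={"standard": "RLE"}, e2={"standard": "JPEG"}, freq_hz=200.0)
```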
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- General Physics & Mathematics (AREA)
- Ophthalmology & Optometry (AREA)
- Pathology (AREA)
- Human Computer Interaction (AREA)
- Discrete Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562255821P | 2015-11-16 | 2015-11-16 | |
FR1654875A FR3043818B1 (en) | 2015-11-16 | 2016-05-31 | SYSTEM FOR PRODUCING, TRANSMITTING AND INTERPRETING A COMPOSITE FLOW AND ASSOCIATED METHODS |
PCT/FR2016/052961 WO2017085396A1 (en) | 2015-11-16 | 2016-11-15 | System for formulating, emitting and interpreting a composite stream and associated method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3376944A1 (en) | 2018-09-26 |
Family
ID=58669393
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16828759.7A Withdrawn EP3376944A1 (en) | 2015-11-16 | 2016-11-15 | System for formulating, emitting and interpreting a composite stream and associated method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20190290199A1 (en) |
EP (1) | EP3376944A1 (en) |
FR (1) | FR3043818B1 (en) |
WO (1) | WO2017085396A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109993115B (en) * | 2019-03-29 | 2021-09-10 | 京东方科技集团股份有限公司 | Image processing method and device and wearable device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6120461A (en) * | 1999-08-09 | 2000-09-19 | The United States Of America As Represented By The Secretary Of The Army | Apparatus for tracking the human eye with a retinal scanning display, and method thereof |
US7773670B1 (en) * | 2001-06-05 | 2010-08-10 | AT&T Intellectual Property II, L.P. | Method of content adaptive video encoding |
US20110077548A1 (en) * | 2004-04-01 | 2011-03-31 | Torch William C | Biosensors, communicators, and controllers monitoring eye movement and methods for using them |
US9994228B2 (en) * | 2010-05-14 | 2018-06-12 | Iarmourholdings, Inc. | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment |
FR2978639B1 (en) * | 2011-07-29 | 2015-09-25 | Cassidian Sas | METHODS OF COMPRESSING AND DECOMPRESSING IMAGES |
FR2999373B1 (en) * | 2012-12-12 | 2018-04-06 | Harmonic Inc. | METHOD FOR DYNAMICALLY ADAPTING THE CODING OF AN AUDIO AND / OR VIDEO STREAM TRANSMITTED TO A DEVICE |
JP2014207110A (en) * | 2013-04-12 | 2014-10-30 | 株式会社日立ハイテクノロジーズ | Observation apparatus and observation method |
- 2016
- 2016-05-31 FR FR1654875A patent/FR3043818B1/en not_active Expired - Fee Related
- 2016-11-15 WO PCT/FR2016/052961 patent/WO2017085396A1/en active Application Filing
- 2016-11-15 US US15/776,344 patent/US20190290199A1/en not_active Abandoned
- 2016-11-15 EP EP16828759.7A patent/EP3376944A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
FR3043818A1 (en) | 2017-05-19 |
US20190290199A1 (en) | 2019-09-26 |
WO2017085396A1 (en) | 2017-05-26 |
FR3043818B1 (en) | 2019-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11698674B2 (en) | Multimodal inputs for computer-generated reality | |
CN108805047B (en) | Living body detection method and device, electronic equipment and computer readable medium | |
WO2018083211A1 (en) | Streaming virtual reality video | |
CN106797459A (en) | The transmission of 3 D video | |
EP3612912B1 (en) | Method for reading a video stream | |
EP2104925A1 (en) | Method and device for the real time imbedding of virtual objects in an image stream using data from a real scene represented by said images | |
KR102284266B1 (en) | Method for vr sickness assessment considering neural mismatch model and the apparatus thereof | |
WO2014018296A1 (en) | Securing information using entity detection | |
CN113257372B (en) | Oral health management related system, method, device and equipment | |
FR2933794A1 (en) | METHOD AND DEVICE FOR STORING AND / OR TRANSMITTING MEDICAL DATA, METHOD AND DEVICE FOR VIEWING MEDICAL DATA, COMPUTER PROGRAM PRODUCTS, SIGNALS AND CORRESPONDING DATA CARRIER | |
FR3043818B1 (en) | SYSTEM FOR PRODUCING, TRANSMITTING AND INTERPRETING A COMPOSITE FLOW AND ASSOCIATED METHODS | |
CA2965332C (en) | Method for collecting image data for producing immersive video and method for viewing a space on the basis of the image data | |
CN114374832B (en) | Control method and device for virtual reality experience, user equipment and network equipment | |
FR3054062A1 (en) | SYSTEM AND METHOD FOR ONBOARD CAPTURE AND 3D / 360 ° REPRODUCTION OF THE MOVEMENT OF AN OPERATOR IN ITS ENVIRONMENT | |
CN112115852A (en) | Living body detection method using RGB infrared camera | |
CN110096955A (en) | Monitoring method, device, system and storage medium | |
WO2022075349A1 (en) | Image processing device, image processing method, and non-transitory computer readable medium whereon image processing program is stored | |
US11758104B1 (en) | Systems and methods for predictive streaming of image data for spatial computing | |
US20240205464A1 (en) | Systems and methods for configuring adaptive streaming of content items based on user comfort levels | |
EP4349006A1 (en) | Mixed-reality communication method, communication system, computer program, and information medium | |
WO2024170830A1 (en) | Method for communicating data on a system for virtually trying on an accessory by a digitally represented living being | |
KR20240070798A (en) | Apparatus and method for surveillance | |
EP4166067A1 (en) | On-board device for synchronised collection of brain waves and environmental data | |
KR20230105229A (en) | Apparatus and method for analyzing a situation in an image acquired through a closed-circuit television | |
FR2907298A1 (en) | Movable subject's e.g. person, video image e.g. face image, transmitting method for e.g. video game application, involves obtaining synthesis image by applying parameters and transmitting parameters to telephone for reconstituting image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20180517 |
| AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20220601 |
Effective date: 20220601 |