WO2018211711A1 - Behavior analysis device, behavior analysis program, behavior analysis system, marker control device, and marker control program - Google Patents

Behavior analysis device, behavior analysis program, behavior analysis system, marker control device, and marker control program

Info

Publication number
WO2018211711A1
WO2018211711A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
information
light emission
time
behavior analysis
Prior art date
Application number
PCT/JP2017/018924
Other languages
English (en)
Japanese (ja)
Inventor
篤史 入來
服部 泰
幸士 生田
佳秀 田森
Original Assignee
特定非営利活動法人ニューロクリアティブ研究会
Priority date
Filing date
Publication date
Application filed by 特定非営利活動法人ニューロクリアティブ研究会
Priority to PCT/JP2017/018924
Publication of WO2018211711A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques

Definitions

  • the present invention relates to a behavior analysis device, a behavior analysis program, a behavior analysis system, a marker control device, and a marker control program.
  • time-axis information is important for such behavior analysis. Because the time-axis information must be accurate, the measuring apparatus must be equipped with a highly accurate clock. Furthermore, when there are a plurality of animals, the time axes of the animals must be aligned in the behavior analysis of each animal, and the time axes must be synchronized between the measuring devices.
  • one of the objects of the present invention is to easily perform behavior analysis of a measurement target.
  • a behavior analysis device is provided that includes an acquisition unit that acquires video information; an extraction unit that extracts a marker from the video information acquired by the acquisition unit, corresponding to each data position on the time axis of the video information; a marker position acquisition unit that acquires the position in the video of the marker extracted by the extraction unit; a measurement position acquisition unit that acquires the position of a measurement target using the position of the marker acquired by the marker position acquisition unit; a time information acquisition unit that acquires time information based on the state of the marker extracted by the extraction unit; and a generation unit that generates data in which the position of the measurement target acquired by the measurement position acquisition unit and the time information acquired by the time information acquisition unit are associated with each data position on the time axis of the video information.
  • the behavior analysis device may further include a biological/environmental information acquisition unit that acquires biological information or environmental information, and the generation unit may generate data in which the biological information or environmental information, the position of the measurement target acquired by the measurement position acquisition unit, and the time information acquired by the time information acquisition unit are associated with each data position on the time axis of the video information.
  • the marker state may be a physical quantity that changes with time.
  • the marker may include a light emitting unit, and the physical quantity may include the light emission intensity of the light emitting unit.
  • the marker may include a light emitting unit, and the physical quantity may include an emission wavelength of the light emitting unit.
  • the marker may have a shape changing portion, and the physical quantity may include a shape change amount.
  • the video information may include information captured by a plurality of cameras.
  • the marker state may be grasped by combining the states of two or more markers.
  • a behavior analysis program is also provided that causes a computer to acquire video information, extract a marker from the video information corresponding to each data position on the time axis of the video information, acquire the position of the marker in the video, acquire the position of a measurement target using the position of the marker, acquire time information based on the state of the marker, and generate data in which the position of the measurement target and the time information are associated with each data position on the time axis of the video information.
  • a marker control device that controls light emission of a marker having a light emitting unit is also provided, the marker control device including a time acquisition unit that acquires time information; a pattern determination unit that determines a light emission pattern of the light emitting unit corresponding to the time information acquired by the time acquisition unit; and a light emission signal output unit that outputs, to the marker, a light emission signal for causing the light emitting unit to emit light with the light emission pattern determined by the pattern determination unit.
  • the pattern determination unit may determine the light emission pattern of the light emitting unit so that it changes more slowly than the image update frequency or shutter speed of a camera mounted on the measurement target.
  • a marker control program is also provided that causes a computer controlling light emission of a marker having a light emitting unit to acquire time information, determine a light emission pattern of the light emitting unit corresponding to the time information, and output to the marker a light emission signal for causing the light emitting unit to emit light with the light emission pattern.
  • a behavior analysis system including the behavior analysis device of the present invention and the marker control device of the present invention is provided.
  • FIG. 1 is a diagram showing the configuration of the behavior analysis system according to the first embodiment of the present invention.
  • the behavior analysis system 100 includes a marker control device 120, a marker 130, a measurement device 140, and an analysis device 160.
  • the marker control device 120 controls the light emission pattern of the marker 130.
  • the measuring device 140 is attached to a measurement target and captures the marker 130 to acquire video information.
  • the measurement device 140 measures a measurement target or environment and acquires biological information / environment information.
  • the biological information of the measurement target to be acquired is, for example, an electroencephalogram, heartbeat, respiration, body temperature, pupil diameter, line of sight, limb movement, and the like.
  • the analysis device 160 analyzes the video information and acquires the position of the measurement target and the measurement time.
  • the analysis device 160 generates data in which biological information / environment information is associated with the position of the measurement target and the measurement time.
  • the measurement target may be an animal, such as a monkey or a person, that moves in three-dimensional space with six degrees of freedom, or may be an animal that moves in two dimensions. There may also be a plurality of measurement targets.
  • FIG. 2 is a diagram illustrating a state in which the marker and the measurement device according to the behavior analysis system of the first embodiment of the present invention are arranged in a space.
  • a plurality of markers 130 are arranged at arbitrary locations in the space 200.
  • the space 200 is assumed to be a rectangular parallelepiped room.
  • the measuring device 140 is attached to a measurement target, for example, the head.
  • FIG. 3 is an enlarged view of the marker according to the behavior analysis system of the first embodiment of the present invention.
  • a plurality of LEDs (A, B, C,...) 132 are arranged on the marker 130 of ID1.
  • the light emission pattern of the LEDs 132 is controlled by the marker control device 120, and time information is obtained from the light emission pattern. Therefore, the number of LEDs 132 only needs to be large enough for the time information to be expressed by a combination of light emission patterns.
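  • as a rough illustration of this sizing (the numbers are assumptions, not values from this description): if each LED is driven as a simple on/off bit, N LEDs can express 2^N distinct states, so 16 LEDs updated once per millisecond could uniquely label about 65 seconds of video before the code repeats; longer recordings would call for more LEDs, additional intensity or wavelength levels, or a repeating code combined with coarse frame counting.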
  • the LED 132 desirably emits light other than visible light so as not to affect the measurement target.
  • the marker 130 may include information for specifying the orientation of the marker 130. For example, an image indicating the upper-left corner may be placed at the upper left of the marker and an image indicating the lower-right corner at the lower right. The orientation of the marker 130 may also be specified by the arrangement of the LEDs 132 or by the light emission pattern. As will be described later, when the marker 130 in the captured video is analyzed, its orientation can be identified, so the arrangement order of the LEDs 132 and the like can be determined accurately.
  • FIG. 4 is a diagram illustrating a marker control device according to the behavior analysis system of the first embodiment of the present invention.
  • the marker control device 120 includes a CPU (Central Processing Unit) 122, a memory 124, and an interface 126.
  • the memory 124 is a storage device such as a non-volatile memory, and stores various programs such as a program for controlling the marker 130.
  • the interface 126 is a terminal or the like for connecting the marker 130 to the marker control device 120. Note that the interface 126 is not limited to being connected to the marker 130 by wire, and may have a communication function for connecting wirelessly.
  • the CPU 122 is an arithmetic processing circuit or the like, and executes programs stored in the memory 124 to realize various functions in the marker control device 120.
  • FIG. 5 is a functional block diagram showing the operation of the marker control device according to the behavior analysis system of the first embodiment of the present invention.
  • the marker control device 120 implements the functions of the time acquisition unit 300, the pattern determination unit 310, and the light emission signal output unit 320 by executing the above program.
  • the time acquisition unit 300 acquires and outputs time information. This time information may be acquired from an internal timer such as a clock signal.
  • the pattern determination unit 310 determines the light emission patterns of the plurality of LEDs 132 based on the time information output from the time acquisition unit 300.
  • the calculation method for converting the time information into the light emission pattern may be determined in advance.
  • the time information may be a sequential reference time (a time set inside the apparatus) or a relative time (the elapsed time from a certain point) with respect to the reference time.
  • the light emission pattern may have a time resolution of 1 millisecond or less.
  • the light emission pattern representing the time information may be generated using information that can be read at a specific instant, such as the light emission intensity (luminance) or the emission wavelength (color, light frequency, or a non-visible region such as infrared or ultraviolet), or using information that can be read over time, such as a luminance change (blinking, blinking frequency, and so on).
  • the light emission pattern is desirably generated in consideration of the image update frequency and the shutter speed of the camera 141 to be used, so that the camera 141 can acquire the information without ambiguity. For example, if the light emission pattern changes faster than the image update frequency or shutter speed of the camera 141, the camera 141 cannot capture the whole light emission pattern, so accurate time information corresponding to the light emission pattern cannot be obtained. It is therefore desirable to make the light emission pattern change more slowly than the image update frequency or shutter speed of the camera 141. Information about the image update frequency and shutter speed of the camera 141 may be stored in the memory 124 in advance.
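  • a minimal sketch of such a pattern determination step follows; the LED count, frame rate, hold time, and function names are assumptions chosen for illustration, not values taken from this description. The time counter is quantized so that each code word is held longer than one camera frame period.

```python
# Illustrative sketch of a pattern-determination step: encode a time counter
# as on/off states of N LEDs and hold each code word longer than one camera
# frame period so the camera can capture every state unambiguously.
# LED count, frame rate, and names are assumptions, not this patent's design.

NUM_LEDS = 16          # assumed number of on/off LEDs on the marker
CAMERA_FPS = 120       # assumed image update frequency of camera 141
HOLD_FRAMES = 2        # hold each code word for at least two frame periods

def emission_pattern(time_ms: int) -> list[bool]:
    """Map elapsed time in milliseconds to an on/off state for each LED."""
    hold_ms = HOLD_FRAMES * 1000 // CAMERA_FPS      # duration of one code word
    code = (time_ms // hold_ms) % (1 << NUM_LEDS)   # quantize, then wrap around
    return [bool(code >> i & 1) for i in range(NUM_LEDS)]

if __name__ == "__main__":
    # The code word changes only every ~16 ms here, slower than one 120 fps frame.
    print(emission_pattern(0))    # all LEDs off
    print(emission_pattern(20))   # LED 0 on, the rest off
```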
  • the light emission signal output unit 320 outputs a light emission signal related to the light emission pattern to each of the LEDs 132 of the marker 130.
  • the LEDs 132 of the marker 130 emit light according to the generated light emission pattern in response to the output light emission signal, whereby the time information is expressed by the marker 130.
  • the above-described operation of the marker control device is repeated while the measurement target is being measured.
  • the marker control device 120 may further generate a light emission pattern indicating the ID and position information of each marker 130 and transmit a light emission instruction related to the light emission pattern to the LED 132 of each marker 130.
  • the ID may be anything such as 1, 2, 3,... As long as each marker 130 can be distinguished.
  • the marker control device 120 may also generate a light emission pattern representing a wide range of other environmental information, such as temperature, humidity, wind direction and speed, food and water intake, cognitive-behavioral characteristics, and experimental conditions such as medication, and transmit a light emission instruction according to that light emission pattern to the LEDs 132.
  • Information on experimental conditions such as temperature, humidity, wind direction and speed, food and water intake, cognitive-behavioral characteristics, and medication may be acquired from various sensors or may be input in advance.
  • a light emission pattern that causes the marker 130 to express these pieces of information at a preset time resolution, for example a time resolution of 1 millisecond or less, is generated, and a light emission instruction corresponding to the light emission pattern is transmitted to each LED 132 of the marker 130 so that each LED 132 emits light.
  • the marker 130 may be an object other than an LED, such as a display.
  • any marker 130 can be used as long as it has a physical quantity, such as color, shape, size, or position, that changes with time and whose change can be associated with time by the marker control device 120. In other words, the marker control device 120 may express time information on the marker 130 by means of a physical quantity, such as color, shape, size, or position, that changes with time.
  • FIG. 6 is a diagram illustrating a measurement apparatus according to the behavior analysis system of the first embodiment of the present invention.
  • the measuring apparatus 140 includes a camera 141, an electrode 142, a memory 148, an interface 144, and a button 146.
  • the camera 141 captures the field of view from the measurement target, and is used to image the markers 130, a part of the body of the measurement target wearing the measurement device, or, when there are a plurality of measurement targets, other measurement targets.
  • the camera 141 can be set to a specific image update frequency or shutter speed, and it is desirable that the camera 141 can capture visible light and invisible light such as infrared and ultraviolet over as wide a range as possible.
  • the camera 141 may be a single camera capable of wide-angle imaging such as an omnidirectional camera. Although an example in which one camera 141 is provided in the measurement apparatus 140 is described in FIG. 6, the present invention is not limited to this, and the measurement apparatus 140 may be provided with a plurality of cameras 141 so that a wide-angle image can be acquired.
  • the electrode 142 acquires biological information to be measured (in this example, an electroencephalogram).
  • a plurality of electrodes 142 may be provided.
  • the memory 148 is a storage device such as a nonvolatile memory, and stores information from the camera 141, information from the electrode 142, and the like.
  • the interface 144 is a terminal for connecting the camera 141 and the electrode 142 to the memory 148. Note that the interface 144 is not limited to being connected to the camera 141 and the electrode 142 by wire, and may have a communication function for connecting wirelessly.
  • the button 146 is a setting / operation device such as a button for controlling the ON / OFF operation of the camera 141 and the electrode 142.
  • the measuring device 140 is attached to a measurement target.
  • the measurement device 140 may be attached to the head of the measurement target in order to acquire information on the movement of the head of the measurement target and the electroencephalogram.
  • FIG. 7 is a diagram showing an analysis apparatus according to the behavior analysis system of the first embodiment of the present invention.
  • the analysis device 160 includes a CPU 162, a memory 164, and an interface 166.
  • the memory 164 is a storage device such as a non-volatile memory, and stores various programs such as a program for analyzing video information and converting the analysis result into data.
  • the interface 166 is a terminal or the like for connecting the memory 148 in which information about the measurement target living body acquired by the measuring apparatus 140 and video information is stored to the analyzing apparatus 160. Note that the interface 166 is not limited to being connected to the memory 148 by a wire, and may have a communication function for wireless connection.
  • the CPU 162 is an arithmetic processing circuit or the like, and executes programs stored in the memory 164 to realize various functions in the analysis device 160.
  • the analysis device 160 may be provided in the same computer as the marker control device 120 or may be provided in another computer.
  • FIG. 8 is a functional block diagram showing the operation of the analysis apparatus according to the behavior analysis system of the first embodiment of the invention.
  • the analysis device 160 executes the above-described program, whereby the functions of the video data acquisition unit 400, the extraction unit 410, the marker position acquisition unit 420, the measurement position acquisition unit 430, the time information acquisition unit 440, the electroencephalogram information acquisition unit 450, and the generation unit 460 are realized.
  • the video data acquisition unit 400 acquires the video data acquired by the measurement device 140 from the memory 148.
  • the extraction unit 410 extracts the marker 130 from the video data acquired by the video data acquisition unit 400.
  • the marker position acquisition unit 420 acquires position information of the marker 130 in the video data.
  • the measurement position acquisition unit 430 acquires position information of the measurement target (the measurement device) in the space, that is, information about the position of the measurement target and its rotation angle, from the relationship between the in-video coordinates of the marker 130 of each ID acquired by the marker position acquisition unit 420 and the coordinates of that marker in the space 200. The position information (the coordinates in the space 200, i.e., the room) at which the marker 130 of each ID is arranged may be registered in advance, so that the coordinates in the space 200 of the marker 130 of each ID can be obtained from the light emission pattern of that marker.
  • the accuracy of the acquired position information of the measurement target can be improved by photographing, with the camera 141, a plurality of markers 130 that are as far apart as possible. The camera 141 is therefore desirably a single camera capable of wide-angle photography, such as the omnidirectional camera described above, but a plurality of cameras with different optical axes may also be used.
  • the measurement position acquisition unit 430 can further acquire movement information of the measurement target, for example information on the orientation of the face of the measurement target when the measurement device 140 including the camera 141 is mounted on the head of the measurement target.
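  • one possible realization of this position acquisition, offered only as a sketch (this description does not prescribe a particular algorithm), is to solve a perspective-n-point problem from the pre-registered room coordinates of the markers and their pixel coordinates in the frame; the marker coordinates, camera intrinsics, and names below are placeholders.

```python
# Sketch: recover the camera (i.e. measurement-device) pose in the room from
# the known 3D coordinates of the markers and their observed pixel positions.
# Uses OpenCV's solvePnP; marker coordinates and intrinsics are placeholders.
import numpy as np
import cv2

# Pre-registered room coordinates (metres) of markers, keyed by marker ID.
MARKER_WORLD = {1: (0.0, 0.0, 2.5), 2: (4.0, 0.0, 2.5),
                3: (4.0, 3.0, 2.5), 4: (0.0, 3.0, 2.5)}

K = np.array([[800.0, 0.0, 640.0],     # assumed camera intrinsics
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])

def camera_pose(detections: dict[int, tuple[float, float]]):
    """detections maps marker ID -> pixel (u, v); at least 4 markers needed."""
    ids = [i for i in detections if i in MARKER_WORLD]
    obj = np.array([MARKER_WORLD[i] for i in ids], dtype=np.float64)
    img = np.array([detections[i] for i in ids], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, None)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)            # rotation of the room in camera frame
    position = (-R.T @ tvec).ravel()      # camera position in room coordinates
    return position, R
```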
  • the time information acquisition unit 440 acquires time information from the marker 130 in the video data, and acquires the time when the video data was captured.
  • the LEDs 132 of the marker 130 are controlled so as to emit light according to the light emission pattern representing the time information generated by the marker control device 120. Therefore, information relating to the time when the video data was captured can be acquired from the light emission state of the marker 130 in the video data at that time. In order to grasp the information related to the shooting time, the video data needs to include enough of the LEDs 132 for the time information to be acquired.
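  • as a sketch of this time acquisition step (mirroring the assumed encoder shown earlier, not an algorithm given in this description), the decoder reads the on/off state of every LED in the frame and reconstructs the code word:

```python
# Sketch of the inverse step: reconstruct the time stamp of a frame from the
# observed on/off states of the marker's LEDs, mirroring the assumed encoder
# above (16 LEDs, 16 ms code words). Not an algorithm from this description.

NUM_LEDS = 16
HOLD_MS = 16   # duration of one code word, as assumed in the encoder sketch

def decode_time_ms(led_states: list[bool]) -> int:
    """Return the start time (ms) of the code word expressed by the LEDs."""
    if len(led_states) != NUM_LEDS:
        raise ValueError("need the state of every LED to decode the time")
    code = sum(1 << i for i, on in enumerate(led_states) if on)
    return code * HOLD_MS

# Example: the pattern emitted at t = 20 ms decodes back to the 16 ms word.
assert decode_time_ms([True] + [False] * 15) == 16
```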
  • the electroencephalogram information acquisition unit 450 acquires the electroencephalogram information of the measurement target acquired by the measurement apparatus 140 from the memory 148.
  • when environmental information, such as temperature, humidity, wind speed, food and water intake, cognitive-behavioral characteristics, and experimental conditions such as medication, is expressed by the marker 130 in the video data, the analysis device 160 may further include an environmental information acquisition unit that acquires from the marker 130 the environmental information related to the time when the video data was captured.
  • when a part of the body of the measurement target (a feature point that can be extracted, such as a joint) appears in the video data, the analysis device 160 can acquire information on the movement of that body part relative to the head.
  • when other measurement targets appear in the video data, the analysis device 160 can also acquire information on their movements. For example, the analysis device 160 can acquire the movement information of another measurement target that has a social interaction, such as struggle, cooperation, or joy, with the measurement target, and the movement information of another measurement target that is present in its vicinity.
  • by extracting the marker 130, the body part of the measurement target, and the other measurement targets from the video data, the analysis device 160 can acquire the position and direction from which the video data was captured, the movement of the body part of the measurement target relative to the head, and the movements of the other measurement targets, together with time information, environmental information, and biological information such as electroencephalograms. The generation unit 460 then generates data in which the position and orientation of the measurement target, the relative movement of the body part and head of the measurement target, and the movements of the other measurement targets are associated with the time at which they were captured, the environment at that time, and biological information such as the electroencephalogram of the measurement target at that time, and stores the data in the memory 164.
  • using the data, the measurer can easily analyze the relationship between the movement of the measurement target, the environment in which the measurement target is placed, and biological information such as the electroencephalogram.
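  • as a sketch of how such associated data might be assembled (the column names and sampling rates below are invented for illustration), per-frame pose records whose time stamps were decoded from the markers can be joined with biological signals sampled at their own rate:

```python
# Sketch of the generation step: join per-frame pose records (time decoded
# from the markers) with EEG samples recorded at their own rate, producing
# one table keyed by the video's time axis. Column names are invented.
import pandas as pd

frames = pd.DataFrame({
    "time_ms": [0, 16, 32, 48],                 # decoded from marker state
    "x": [1.0, 1.1, 1.2, 1.2],                  # measurement-target position
    "y": [0.5, 0.5, 0.6, 0.7],
    "head_yaw_deg": [10.0, 12.0, 15.0, 15.0],
})
eeg = pd.DataFrame({
    "time_ms": [0, 4, 8, 12, 16, 20, 24, 28, 32, 36, 40, 44, 48],
    "eeg_uv": [3.1, 2.9, 3.4, 3.0, 2.7, 2.8, 3.2, 3.3, 3.0, 2.9, 3.1, 3.0, 2.8],
})

# For each frame, take the nearest EEG sample at or before the frame time.
combined = pd.merge_asof(frames, eeg, on="time_ms", direction="backward")
print(combined)
```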
  • the position information and movement information of the measurement target can thus be acquired and stored while being linked to time information, environmental information, and biological information such as electroencephalograms, so analysis that associates such information with behavior becomes easier.
  • conventionally, a system for associating a video with time and position (such as a clock or GPS) and a system for associating the video information are required, so there is a limit to how small the equipment used can be made.
  • since information such as time and position is embedded in the video itself, the camera can be made extremely small and cameras can be added without limit, and information such as time and position can be reconstructed later from the video.
  • with this behavior analysis system, it is not necessary to design the behavior to be monitored and the analysis method before data acquisition; the analysis can be designed retroactively, after the data have been acquired, according to the characteristics of the acquired data.
  • FIG. 9 is a diagram showing a state of time adjustment of a plurality of data in the behavior analysis according to the first embodiment of the present invention.
  • when the shooting start time 510 of shooting data 1 of measurement target 1 differs from the shooting start time 520 of shooting data 2 of measurement target 2, the time axes of data 1 and data 2 need to be matched, and conventionally time information had to be added separately to data 1 and data 2. In addition, when the time information added separately to data 1 and data 2 is combined, it is difficult to achieve highly accurate synchronization because of deviations in absolute time between the measuring devices.
  • in contrast, the position information and movement information of a plurality of measurement targets can be acquired and stored in the video data while being associated with time information. Since the times of the plurality of videos do not need to be adjusted afterwards and can be reliably aligned, the accuracy of the behavior analysis of a plurality of monitoring targets can be improved.
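  • a brief sketch of this point (reusing the invented decoder and column name from the earlier examples): because every frame carries a decodable time stamp, two independently started recordings can be aligned simply by matching decoded times, with no clock synchronization between devices.

```python
# Sketch: align two independently started recordings by the time stamps that
# were decoded from the markers in each frame, instead of synchronizing the
# device clocks. Reuses the invented column name "time_ms".
import pandas as pd

def align(recording_1: pd.DataFrame, recording_2: pd.DataFrame) -> pd.DataFrame:
    """Inner-join two per-frame tables on their marker-decoded time stamps."""
    return recording_1.merge(recording_2, on="time_ms",
                             suffixes=("_subject1", "_subject2"))

rec1 = pd.DataFrame({"time_ms": [16, 32, 48], "x": [1.0, 1.1, 1.2]})
rec2 = pd.DataFrame({"time_ms": [32, 48, 64], "x": [2.0, 2.1, 2.2]})
print(align(rec1, rec2))   # rows exist only where both recordings overlap
```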
  • the behavior analysis system according to the second embodiment differs in that the light-emitting LEDs 732 are dispersed over a plurality of markers 730, and time information and the like can be grasped by photographing the markers 730 that contain the dispersed LEDs 732. In the first embodiment, the light-emitting LEDs were gathered into one marker and the time information and the like were acquired from the video information of that single marker; in other respects, the system is the same as the behavior analysis system 100 according to the first embodiment.
  • FIG. 10 is an example of a marker according to the behavior analysis system of the second embodiment of the present invention
  • FIG. 11 is another example of the marker according to the behavior analysis system of the second embodiment of the present invention.
  • the light-emitting LEDs 732 are dispersed among different markers 730, and the time information can be grasped by capturing all of the markers 730 that contain the LEDs 732 necessary for representing the time information and the like.
  • FIG. 10 shows an example in which the type A LED 732 and the type B LED 732 are arranged in the ID1 marker 730, the type B LED 732 and the type C LED 732 are arranged in the ID2 marker 730, and the type C LED 732 and the type A LED 732 are arranged in the ID3 marker 730.
  • in this case, the time information may be acquired by extracting the ID1 marker 730 from the video data, or it may be acquired by extracting from the video data both the ID2 marker 730, which includes the type B LED 732, and the ID3 marker 730, which includes the type A LED 732.
  • FIG. 11 shows an example in which the type A LED 732 is arranged at the ID1 marker 730, the type B LED 732 is arranged at the ID2 marker 730, and the type C LED 732 is arranged at the ID3 marker 730.
  • in this case, the time information is acquired by extracting from the video data both the ID1 marker 730, which includes the type A LED 732, and the ID2 marker 730, which includes the type B LED 732.
  • Information indicating each ID and the type and arrangement of the LEDs 732 provided in the marker 730 with that ID is registered in advance. For example, for the ID1 marker 730, information that the type A LED 732 is arranged on the left side and the type B LED 732 is arranged on the right side is registered.
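  • a sketch of how such a registry could be used in the second embodiment (the registry contents, required LED types, and names are assumptions): the decoder gathers LED readings from whichever registered markers are visible in the frame and checks that every LED type needed for the time code is covered.

```python
# Sketch for the second embodiment: the required LED types may be spread over
# several markers, so the decoder gathers readings from whichever markers are
# visible and checks that every required type is covered. Registry contents,
# required types, and names are assumptions.

# Pre-registered: marker ID -> LED types it carries (as in FIG. 10).
MARKER_LEDS = {1: ("A", "B"), 2: ("B", "C"), 3: ("C", "A")}
REQUIRED_TYPES = ("A", "B")   # assumed types needed to express the time code

def gather_states(visible: dict[int, dict[str, bool]]) -> dict[str, bool]:
    """visible maps marker ID -> {LED type: on/off} for markers in the frame.

    Returns one state per required LED type, or raises if a type is missing.
    """
    states: dict[str, bool] = {}
    for marker_id, leds in visible.items():
        for led_type, on in leds.items():
            if led_type in MARKER_LEDS.get(marker_id, ()):
                states.setdefault(led_type, on)
    missing = [t for t in REQUIRED_TYPES if t not in states]
    if missing:
        raise ValueError(f"LED types {missing} not visible in this frame")
    return {t: states[t] for t in REQUIRED_TYPES}

# Either marker ID1 alone, or markers ID2 and ID3 together, suffice.
print(gather_states({1: {"A": True, "B": False}}))
print(gather_states({2: {"B": False, "C": True}, 3: {"C": True, "A": True}}))
```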
  • the analysis device or the marker control device of the behavior analysis system according to the embodiment of the present invention may be implemented by software.
  • for example, the behavior analysis system may be embodied as a behavior analysis program that causes a computer to acquire video information, extract a marker from the video information, acquire information on the position of the marker, acquire information on the position and orientation of the measurement target using the position of the marker, acquire information on the time from the state of the marker, and generate data that associates the information on the position and orientation of the measurement target with the information on the time.
  • likewise, the behavior analysis system may be embodied in the form of a marker control program that causes a computer to acquire time information, generate a light emission pattern corresponding to the time information, and output to the marker a light emission signal corresponding to the light emission pattern.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Dentistry (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A behavior analysis device is disclosed, comprising: an acquisition unit for acquiring video information; an extraction unit for extracting a marker from the video information acquired by the acquisition unit, corresponding to each data position on the time axis of the video information; a marker position acquisition unit for acquiring the position, in the video, of the marker extracted by the extraction unit; a measurement position acquisition unit for acquiring the position of an object to be measured using the position of the marker acquired by the marker position acquisition unit; a time information acquisition unit for acquiring time information on the basis of the state of the marker extracted by the extraction unit; and a generation unit for generating data in which the position of the object to be measured acquired by the measurement position acquisition unit and the time information acquired by the time information acquisition unit are thereby associated with each other at each data position on the time axis of the video information.
PCT/JP2017/018924 2017-05-19 2017-05-19 Dispositif d'analyse de comportement, programme d'analyse de comportement, système d'analyse de comportement, dispositif de commande de marqueur et programme de commande de marqueur WO2018211711A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/018924 WO2018211711A1 (fr) 2017-05-19 2017-05-19 Dispositif d'analyse de comportement, programme d'analyse de comportement, système d'analyse de comportement, dispositif de commande de marqueur et programme de commande de marqueur

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/018924 WO2018211711A1 (fr) 2017-05-19 2017-05-19 Dispositif d'analyse de comportement, programme d'analyse de comportement, système d'analyse de comportement, dispositif de commande de marqueur et programme de commande de marqueur

Publications (1)

Publication Number Publication Date
WO2018211711A1 (fr) 2018-11-22

Family

ID=64274321

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/018924 WO2018211711A1 (fr) 2017-05-19 2017-05-19 Dispositif d'analyse de comportement, programme d'analyse de comportement, système d'analyse de comportement, dispositif de commande de marqueur et programme de commande de marqueur

Country Status (1)

Country Link
WO (1) WO2018211711A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005069977A (ja) * 2003-08-27 2005-03-17 Nippon Telegr & Teleph Corp <Ntt> 物体移動経路トラッキング装置およびトラッキング方法
JP2006033329A (ja) * 2004-07-15 2006-02-02 Advanced Telecommunication Research Institute International 光学マーカシステム
JP2009033366A (ja) * 2007-07-25 2009-02-12 Advanced Telecommunication Research Institute International 光学マーカシステム
JP2012208874A (ja) * 2011-03-30 2012-10-25 Fujifilm Corp 症状入力支援装置、支援方法及びプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020184170A (ja) * 2019-05-07 2020-11-12 株式会社デンソー 画像解析装置、電子制御装置、及び画像解析方法
JP7263904B2 (ja) 2019-05-07 2023-04-25 株式会社デンソー 画像解析装置、電子制御装置、及び画像解析方法

Similar Documents

Publication Publication Date Title
EP3606410B1 Procédés, dispositifs et systèmes d'évaluation de surface anatomique
US11176669B2 (en) System for remote medical imaging using two conventional smart mobile devices and/or augmented reality (AR)
JP6664512B2 (ja) アイブレインインターフェースシステムのキャリブレーション方法、及びシステム内のスレーブデバイス、ホストデバイス
US20160196667A1 (en) System and Method for Tracking
WO2015123771A1 (fr) Suivi de gestes et commande en réalité augmentée et virtuelle
US9551566B2 (en) Coordinate measuring device
JP6101878B1 (ja) 診断装置
US11776146B2 (en) Edge handling methods for associated depth sensing camera devices, systems, and methods
KR20170013271A (ko) 재활 지원 시스템
US20160073854A1 (en) Systems and methods using spatial sensor data in full-field three-dimensional surface measurement
US11831993B2 (en) Information processing apparatus, information processing system, device for position and posture acquisition, and device information acquisition method
WO2021111879A1 (fr) Procédé de génération de modèle d&#39;apprentissage, programme, système d&#39;aide à la compétence, dispositif de traitement d&#39;informations, procédé de traitement d&#39;informations et processeur d&#39;endoscope
US10783376B2 (en) Information processing apparatus
JP2018014078A5 (fr)
JP7335899B2 (ja) 測定システム、測定装置、測定方法、及びプログラム
WO2018211711A1 (fr) Dispositif d&#39;analyse de comportement, programme d&#39;analyse de comportement, système d&#39;analyse de comportement, dispositif de commande de marqueur et programme de commande de marqueur
JP2021018508A5 (fr)
EP3072107B1 (fr) Suiveur optique compact
Vagvolgyi et al. Wide-angle, monocular head tracking using passive markers
US10389444B2 (en) Image processing apparatus, image processing system, image processing method, and recording medium
Nocerino et al. Low-cost human motion capture system for postural analysis onboard ships
CN107007997B (zh) 图像处理装置、测定装置、图像处理系统、图像处理方法以及记录介质
JP4547547B2 (ja) 頭部姿勢推定装置、頭部姿勢推定方法及び頭部姿勢推定処理プログラム
JP2018074910A (ja) 生体モニタシステム及び生体モニタ方法
JP2005322077A6 (ja) 頭部姿勢推定装置、頭部姿勢推定方法及び頭部姿勢推定処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17910244

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17910244

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP