WO2023136175A1 - Information acquisition device and information processing system - Google Patents


Info

Publication number
WO2023136175A1
Authority
WO
WIPO (PCT)
Prior art keywords: sensing, data, information, user, processing system
Application number
PCT/JP2022/048616
Other languages
English (en)
Japanese (ja)
Inventor
孝文 柳元
律子 金野
一視 平野
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Application filed by Sony Group Corporation
Publication of WO2023136175A1 publication Critical patent/WO2023136175A1/fr

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60: ICT specially adapted for the management or operation of medical equipment or devices

Definitions

  • the present disclosure relates to an information acquisition device and an information processing system.
  • Patent Document 1 discloses a system for navigating the position of a diagnostic device used on the patient side in online medical care (remote medical care).
  • One aspect of the present disclosure provides technology that enables sensing of vital sounds at an appropriate auscultatory position.
  • An information acquisition device according to one aspect of the present disclosure includes a plurality of sensing elements arranged two-dimensionally so that each element senses vital sounds at the position facing it.
  • An information processing system according to one aspect of the present disclosure includes an information acquisition device including a plurality of sensing elements arranged two-dimensionally so that each element senses vital sounds at the position facing it, and a processing unit that generates, based on the sensing data of each of the plurality of sensing elements, navigation information for the auscultation position at which the plurality of sensing elements are applied to the user's body.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an information processing system according to an embodiment.
  • FIGS. 2 and 3 are diagrams showing examples of a schematic configuration of a sensor.
  • FIG. 4 is a diagram showing an example of functional blocks of the information processing system.
  • FIG. 5 is a diagram showing an example of calculation of a navigation direction.
  • FIG. 6 is a diagram showing an example of processing.
  • FIG. 7 is a diagram showing an example of presentation of navigation information and the like.
  • FIG. 8 is a diagram showing an example of processing.
  • FIGS. 9 and 10 are diagrams showing examples of presentation of navigation information and the like.
  • FIG. 11 is a flowchart showing an example of the flow of vital sound sensing.
  • FIG. 12 is a diagram showing an example of an article provided with the information acquisition device.
  • FIG. 13 is a diagram showing an example of a schematic configuration of the information acquisition device.
  • FIG. 14 is a diagram showing an example of functional blocks of an information processing system.
  • FIG. 15 is a diagram showing an example of presentation of accumulated data.
  • FIG. 16 is a diagram illustrating an example of a schematic configuration of an information processing system.
  • FIG. 17 is a diagram showing an example of functional blocks of an information processing system.
  • FIG. 18 is a block diagram showing an example of a hardware configuration.
  • Vital sounds such as breath sounds and heart sounds of patients suffering from allergic respiratory diseases, heart diseases, pulmonary diseases, etc. are usually auscultated by a doctor by placing a stethoscope on an appropriate part of the patient's body.
  • In online medical care, however, the doctor is not close to the patient.
  • In that case, the stethoscope sensor may deviate from a position suitable for sensing vital sounds, making accurate sensing of vital sounds difficult.
  • In the following description, the user of the information acquisition device 11 is referred to as user U.
  • a user U receives online medical treatment.
  • A medical worker, such as a doctor, who treats the user U is referred to as medical staff C.
  • the medical staff C remotely diagnoses the user U by using the medical staff terminal 3 .
  • the information acquisition device 11 has the function of a sensor that senses the user's U vital sounds. Examples of vital sounds are breath sounds, lung sounds, heart sounds, bowel sounds, and the like.
  • the information acquisition device 11 can also be called a stethoscope sensor or a sensing unit having a function as a stethoscope sensor.
  • FIGS. 2 and 3 are diagrams showing examples of the schematic configuration of the sensor.
  • the information acquisition device 11 includes a distal end portion 112 provided with a sensor array 111 , a body portion 113 , and an environmental sound sensor 114 .
  • the figure also shows an XYZ coordinate system.
  • the XY plane direction corresponds to the array direction of the sensor array 111 .
  • the body portion 113 and the tip portion 112 are positioned in this order in the Z-axis positive direction.
  • FIG. 2 schematically shows the external shape of the information acquisition device 11 when viewed obliquely.
  • FIG. 3 schematically shows the external shape of the information acquisition device 11 when viewed from the front (when viewed in the Z-axis negative direction).
  • the sensor array 111 includes a plurality of sensing elements 111a arranged two-dimensionally (on the XY plane).
  • The multiple sensing elements 111a are arranged in a honeycomb pattern.
  • The number, shape, etc. of the sensing elements 111a are not limited to the examples shown in FIGS. 2 and 3. Note that the sensor array 111 and the plurality of sensing elements 111a may be read interchangeably as long as there is no contradiction.
  • the sensing element 111a senses the vital sound of the user U while the sensor array 111 is in contact with the user's U body.
  • sensing element 111a may include a diaphragm that vibrates in response to sound pressure.
  • The sensing element 111a generates signal data having a time waveform (time-series data).
  • the signal data obtained by the sensing element 111a is also called "sensing data". As long as there is no contradiction, the sensing data may be appropriately read as signal data, signals, or the like.
  • Each of the plurality of sensing elements 111a senses vital sounds at the position facing it.
  • The sensing elements 111a are configured such that no sound interference occurs between adjacent sensing elements 111a.
  • For example, the diaphragm of each sensing element 111a is configured to vibrate independently of the diaphragms of the adjacent sensing elements 111a.
  • the sensing data of each of the plurality of sensing elements 111a can be different sensing data.
  • the environmental sound sensor 114 includes, for example, a microphone, and senses environmental sounds. Examples of environmental sounds are music, air conditioner sounds, conversation sounds, and the like. The environmental sound becomes noise with respect to the vital sound for sensing purposes.
  • Environmental sound sensor 114 is provided in a portion other than sensor array 111 . In this example, the environmental sound sensor 114 is provided on the body portion 113, more specifically, on the portion of the body portion 113 on the side opposite to the distal end portion 112 (Z-axis negative direction side).
  • the user terminal 12 includes a communication unit 13, a user interface unit 14, a processing unit 15, and a storage unit 16.
  • Examples of information stored in the storage unit 16 include an application program 161 (application software), a trained model 162 and user data 163 .
  • the communication unit 13 communicates with other devices. For example, the communication unit 13 receives sensing data from the information acquisition device 11 . The communication unit 13 also transmits vital sound data, which will be described later, to the medical staff terminal 3 .
  • the user interface unit 14 receives the operation of the user terminal 12 by the user U and presents information to the user U.
  • the processing unit 15 functions as a control unit that controls each element of the user terminal 12 and executes various processes.
  • the processing unit 15 executes the application program 161 .
  • An application used by the user U is provided by executing the application program 161 .
  • An example of such an application is an application for online medical care.
  • Diagnosis includes sensing of user U's vital sounds.
  • the information acquisition device 11 described above is used for sensing vital sounds.
  • the processing unit 15 provides various functions by processing the sensing data of each of the plurality of sensing elements 111a. Examples of functions are the navigation of auscultation positions and the generation of vital sound data for auscultation, which will be described in turn below.
  • the processing unit 15 of the user terminal 12 generates navigation information of the auscultation position based on the sensing data of each of the plurality of sensing elements 111a.
  • the navigation information is information for guiding the auscultation position to a position suitable for auscultation of vital sounds for sensing purposes (hereinafter also referred to as "target auscultation position").
  • the navigation information may include the navigation direction.
  • the navigation direction is information for directing the sensor array 111 of the information acquisition device 11 to the target auscultation position (the direction toward the target auscultation position).
  • the processing unit 15 calculates the navigation direction based on the sensing data of each of the plurality of sensing elements 111a.
  • the processing unit 15 calculates an index related to the vital sound for sensing purposes in each of the plurality of sensing elements 111a, and calculates the navigation direction based on the calculated index.
  • the index of the sensing element 111a positioned near the target auscultation position may be calculated to be larger than the index of the sensing element 111a positioned far from the target auscultation position.
  • Examples of the index are the magnitude of the target vital sound included in the sensing data of the sensing element 111a, the ratio of the target vital sound to all the vital sounds included in the sensing data, and the frequency component (for example, its magnitude or ratio) of the target vital sound included in the sensing data.
  • the sensing data of each of the plurality of sensing elements 111a includes not only vital sounds for sensing purposes (eg, heartbeat sounds), but also other vital sounds (eg, breathing sounds, lung sounds, bowel sounds, etc.).
  • the processing unit 15 detects (extracts, etc.) the magnitude, frequency component, etc. of each vital sound included in the sensing data for each of the plurality of sensing elements 111a.
  • The detection method is not particularly limited; for example, each vital sound included in the sensing data may be detected using a known technique.
  • a trained model 162 may be used as described below.
  • The processing unit 15 calculates the above index based on the detection result. For example, the magnitude (amplitude value, etc.) of the detected target vital sound is used as the magnitude of the target vital sound. The magnitude of the target vital sound relative to the total magnitude of all vital sounds included in the sensing data of the sensing element 111a is used as the ratio of the target vital sound. The detected frequency component of the target vital sound is used as is as the frequency component of the target vital sound.
  • The processing unit 15 calculates the navigation direction based on the calculated index of each of the plurality of sensing elements 111a. For example, in the array direction (XY plane direction) of the sensor array 111, the processing unit 15 calculates, as the navigation direction, the direction from sensing elements 111a with smaller indices toward sensing elements 111a with larger indices. Note that the index of a sensing element 111a positioned near the target auscultation position may instead be calculated so as to be smaller than the index of a sensing element 111a positioned far from it, in which case the direction from sensing elements 111a with larger indices toward sensing elements 111a with smaller indices is calculated as the navigation direction.
  • FIG. 5 is a diagram showing an example of calculating the navigation direction.
  • Each sensing element 111a is shown with hatching according to the size of the index.
  • In this example, the index of a sensing element 111a positioned on the upper right side of the sensor array 111 (the X-axis positive and Y-axis positive direction side) is larger than the index of a sensing element 111a positioned on the lower left side (the X-axis negative and Y-axis negative direction side).
  • the processing unit 15 calculates the direction from the lower left of the sensor array 111 to the upper right of the sensor array 111 as the navigation direction. In the figure, the calculated navigation direction is schematically indicated by an outline arrow.
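  • As an illustration only (the patent does not specify the computation), the navigation direction of FIG. 5 can be realized as an index-weighted centroid over the element coordinates. The following Python sketch assumes a band-energy ratio as the index, a target band of roughly 20 to 150 Hz for heart sounds, and known XY coordinates for each sensing element 111a; none of these values come from the text.

    import numpy as np

    def band_energy(x, fs, lo, hi):
        """Energy of signal x in the [lo, hi) Hz band via an FFT band sum."""
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return spec[(freqs >= lo) & (freqs < hi)].sum()

    def element_index(x, fs, band=(20.0, 150.0)):
        """Assumed index: ratio of target-band energy to total energy."""
        total = band_energy(x, fs, 0.0, fs / 2)
        return band_energy(x, fs, *band) / max(total, 1e-12)

    def navigation_direction(signals, coords, fs):
        """signals: (n_elements, n_samples) sensing data of the elements 111a.
        coords: (n_elements, 2) XY position of each element in the array plane.
        Returns a unit vector pointing from low-index toward high-index elements."""
        idx = np.array([element_index(s, fs) for s in signals])
        w = idx / max(idx.sum(), 1e-12)
        center = coords.mean(axis=0)
        d = (w[:, None] * (coords - center)).sum(axis=0)
        n = np.linalg.norm(d)
        return d / n if n > 0 else np.zeros(2)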
  • the learned model 162 may be used for part or all of the calculation of the navigation direction.
  • the processing unit 15 uses the sensing data of each of the plurality of sensing elements 111a and the learned model 162 to calculate the navigation direction.
  • The trained model 162 may be generated by machine learning using training data so as to output data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • the processing unit 15 acquires data corresponding to the navigation direction by inputting data corresponding to sensing data of each of the plurality of sensing elements 111a into the learned model 162 .
  • the learned model 162 may also be used for detecting each vital sound based on sensing data, calculating an index, calculating a navigation direction based on the index, and the like.
  • the trained model 162 may output data corresponding to each vital sound included in each sensing data when data corresponding to each sensing data of the plurality of sensing elements 111a is input.
  • the trained model 162 may output data corresponding to indices of the plurality of sensing elements 111a when data corresponding to the sensing data of the plurality of sensing elements 111a is input.
  • the learned model 162 may output data corresponding to the navigation direction when data corresponding to the indices of the plurality of sensing elements 111a are input.
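  • The text leaves the model architecture open. As a minimal sketch of the last variant above (indices in, direction out), a tiny fixed-weight network can stand in for the trained model 162; the weights W1, b1, W2, b2 are assumed to have been learned elsewhere, for example from the user data 163.

    import numpy as np

    class DirectionModel:
        """Stand-in for the trained model 162: maps a per-element index vector
        to a unit navigation direction in the XY plane."""
        def __init__(self, W1, b1, W2, b2):
            self.W1, self.b1, self.W2, self.b2 = W1, b1, W2, b2

        def __call__(self, indices):
            h = np.tanh(indices @ self.W1 + self.b1)  # hidden layer
            d = h @ self.W2 + self.b2                 # raw XY direction
            n = np.linalg.norm(d)
            return d / n if n > 0 else d              # unit navigation direction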
  • the training data may include data obtained from medical examinations (auscultation, etc.) of many patients performed using the information acquisition device 11 or a device having a configuration similar to that of the information acquisition device 11.
  • the training data may include data obtained by the user U using the information processing system 100 together with such general-purpose data or instead of such general-purpose data.
  • Data obtained by the user U using the information processing system 100 is stored (saved or accumulated) in the storage unit 16 as user data 163 .
  • As training data for the trained model 162, for example, all data from which input data and output data for the trained model 162 can be obtained may be stored in the storage unit 16 as the user data 163.
  • Machine learning using training data including the user data 163 provides a trained model 162 optimized for the user U's use of the information acquisition device 11, the use environment, and the like.
  • the trained model 162 may be generated by machine learning using training data including the general-purpose data described above, and then updated by additional machine learning using training data including the accumulated user data 163.
  • This allows the trained model 162 to be used even before the user data 163 is accumulated.
  • the learned model 162 may be generated by machine learning using only training data including accumulated user data 163.
  • In that case, a trained model 162 optimized specifically for the user U can be obtained, compared with using training data that includes general-purpose data.
  • the possibility of optimizing the navigation information for each user U is further increased.
  • navigation may be performed according to instructions from the medical staff C or the like.
  • the generation of the learned model 162 may be performed by the processing unit 15, or may be performed by a device (server device, etc.) external to the user terminal 12.
  • the navigation direction is calculated as described above.
  • the processing unit 15 generates navigation information including the calculated navigation direction.
  • the navigation information is presented to the user U by the user interface unit 14 as described later.
  • The processing unit 15 may generate vital sound data by adding the sensing data of each of the plurality of sensing elements 111a. Furthermore, the processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate vital sound data suitable for auscultation of the target vital sounds.
  • FIG. 6 is a diagram showing an example of processing.
  • Examples of the processing include weight calculation processing P1, weighting processing P2, addition processing P3, noise reduction processing P4, and navigation information generation processing P5.
  • The weight calculation process P1, the weighting process P2, the addition process P3, and the noise reduction process P4 are particularly related to the generation of vital sound data.
  • The navigation information generation process P5 generates the above-described navigation information; for example, it calculates the navigation direction as described above with reference to FIG. 5.
  • In the weight calculation process P1, weights (coefficients, etc.) for the sensing data of each of the plurality of sensing elements 111a are calculated.
  • In the weighting process P2, the corresponding sensing data is weighted by the weight calculated in the weight calculation process P1 (multiplied by the coefficient, etc.).
  • In the addition process P3, the weighted sensing data are added together.
  • In the noise reduction process P4, noise contained in the data obtained by the addition in the addition process P3 is reduced. The data whose noise has been reduced by the noise reduction process P4 is obtained as the vital sound data.
  • In the weight calculation process P1, the weight may be calculated based on the addition result of the addition process P3. For example, the weights are calculated such that the weight of the sensing data of a sensing element 111a that contains relatively much of the target vital sound (for example, one whose index described above is large) is larger than the weight of the sensing data of a sensing element 111a that does not.
  • Sensing data from the environmental sound sensor 114 may also be used in the weight calculation process P1.
  • the weight of each sensing data is calculated such that the weight of the sensing data of the sensing element 111a that contains relatively little environmental sound is greater than the weight of the sensing data of the sensing element 111a that does not.
  • As a result, the environmental sound, i.e., noise contained in the data obtained by the addition in the addition process P3, is reduced.
  • The sensing data of the environmental sound sensor 114 may also be used in the noise reduction process P4, so that vital sound data with reduced environmental sound, i.e., noise, is generated.
  • Various known techniques, such as noise cancellation, may be used for this purpose.
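  • A minimal sketch of the P1 to P4 chain, under stated assumptions: the weights of P1 are taken proportional to the per-element index from the earlier sketch, and P4 is implemented as plain spectral subtraction against the environmental sound sensor 114. The patent names neither algorithm.

    import numpy as np

    def vital_sound_data(signals, indices, env, alpha=1.0):
        """signals: (n_elements, n_samples); indices: per-element index values;
        env: (n_samples,) reference from the environmental sound sensor 114."""
        w = indices / max(indices.sum(), 1e-12)     # P1: index-proportional weights
        mixed = (w[:, None] * signals).sum(axis=0)  # P2 + P3: weighted addition
        S = np.fft.rfft(mixed)                      # P4: spectral subtraction
        N = np.fft.rfft(env)
        mag = np.maximum(np.abs(S) - alpha * np.abs(N), 0.0)
        return np.fft.irfft(mag * np.exp(1j * np.angle(S)), n=len(mixed))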
  • the learned model 162 may be used for the weight calculation process P1 described above.
  • the trained model 162 outputs data corresponding to the weight of each sensing data, for example, when data corresponding to sensing data of each of the plurality of sensing elements 111a is input.
  • the processing unit 15 acquires data corresponding to the weight of each sensing data by inputting the data corresponding to the sensing data of each of the plurality of sensing elements 111a into the learned model 162 .
  • The trained model 162 in this case may also be generated by machine learning using training data including the user data 163. This increases the possibility of generating vital sound data optimized for each user U.
  • In this way, vital sound data suitable for auscultation of the target vital sounds is generated.
  • Part or all of the processing by the processing unit 15 may be executed by the information acquisition device 11 .
  • The processing in that case may be executed by a processing unit (processor or the like, not shown) mounted in the body portion 113 of the information acquisition device 11, for example.
  • the processing unit 15 of the user terminal 12 may be appropriately read as the processing unit within the information acquisition device 11 .
  • FIG. 7 is a diagram showing an example of presentation of navigation information and the like.
  • the user interface unit 14 is exemplified as a display screen (display unit, etc.) of the user terminal 12 .
  • the waveform of the vital sound is displayed in the upper portion of the display screen.
  • vital sounds may be output.
  • Navigation information is displayed in the lower portion of the display screen.
  • the navigation information includes the appearance of the information acquisition device 11 and the navigation direction from the information acquisition device 11 to the target auscultation position.
  • the navigation direction is displayed as an arrow G extending from the information acquisition device 11 as a base end.
  • the user U can bring the sensor array 111 of the information acquisition device 11 closer to the target auscultation position.
  • the user U moves the information acquisition device 11 in contact with his/her body along the direction of the displayed arrow G.
  • the processing unit 15 repeats generation of navigation information and vital sound data, and the information is updated and displayed.
  • the communication unit 13 transmits the vital sound signal to the medical staff terminal 3.
  • the medical staff terminal 3 includes a communication unit 31, a user interface unit 32, and a storage unit 33.
  • Medical information 331 is exemplified as information stored in the storage unit 33 .
  • the medical information 331 includes, for example, information such as an electronic medical record of the user U, and is used for medical care of the user U by the medical staff C, and the like.
  • the communication unit 31 communicates with other devices and the like. For example, the communication unit 31 receives a vital sound signal from the information acquisition device 1 .
  • the user interface unit 32 receives operations of the medical staff terminal 3 by the medical staff C and presents information to the medical staff C. For example, the user interface unit 32 presents a vital sound (sound output, etc.). The medical staff C can treat the user U based on the presented vital sounds.
  • The movement, movement amount, etc. of the information acquisition device 11 may also be sensed, which enables generation of more detailed navigation information. Description will be made with reference to FIG. 8.
  • FIG. 8 is a diagram showing an example of processing.
  • the illustrated information acquisition device 11 further includes a motion sensor 115 .
  • the motion sensor 115 senses the movement, movement amount, etc. of the information acquisition device 11 .
  • Examples of the motion sensor 115 are an acceleration sensor, an angle sensor (gyro sensor), and the like.
  • the position of the sensor array 111 of the information acquisition device 11 is also calculated based on the sensing data of the motion sensor 115.
  • the position is calculated, for example, based on the moving direction, the moving distance, etc. from the known reference position.
  • the calculated position of the information acquisition device 11 is also included in the navigation information and presented by the user interface unit 14 .
  • the movement amount (distance) to the target auscultation position, the movement direction, and the like can be presented as specific numerical values, or can be presented together with the current position of the information acquisition device 11 . Presenting detailed navigation information increases the possibility of improving navigation accuracy.
  • The weight calculated by the weight calculation process P1 may also be reflected in the position calculation in the navigation information generation process P5.
  • For example, data on the amount of change between the weight calculated in the (n-1)-th process and the weight calculated in the n-th process (n is a positive integer) may be used to assist in calculating the position of the information acquisition device 11.
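  • A dead-reckoning sketch of the position calculation, assuming the motion sensor 115 delivers XY acceleration samples and that the reference position is known (for example, the first target auscultation position). Drift correction and the weight-change assist mentioned above are omitted.

    import numpy as np

    def track_position(accel_xy, dt, reference_xy=(0.0, 0.0)):
        """accel_xy: (n_samples, 2) acceleration in m/s^2; dt: sample period in s.
        Double-integrates acceleration from the known reference position."""
        vel = np.cumsum(accel_xy, axis=0) * dt                    # velocity
        pos = np.asarray(reference_xy) + np.cumsum(vel, axis=0) * dt
        return pos  # (n_samples, 2) estimated positions of the sensor array 111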
  • The navigation information may include an image of the body. Description will be made with reference to FIGS. 9 and 10.
  • FIGS. 9 and 10 are diagrams showing examples of presentation of navigation information and the like.
  • an image of the body is displayed in the lower portion of the display screen.
  • the displayed image of the body also includes the target auscultation position and characteristic parts of the body such as the clavicle, sternum, and the like.
  • The target auscultation positions are indicated by circles numbered 1 to 10 on the body.
  • the user U applies the sensor array 111 of the information acquisition device 11 to the position of his/her own body corresponding to the characteristic part of the body shown in the image. Note that this position may be indicated by the medical staff C.
  • An arrow G (navigation direction) pointing to the first target auscultation position is displayed, and the user U moves (the sensor array 111 of) the information acquisition device 11 in that direction.
  • the medical staff C can diagnose the user U based on the vital sounds appropriately sensed at the target auscultatory position.
  • This first target auscultation position may be set, for example, to the reference position described above.
  • the position of the sensor array 111 of the information acquisition device 11 calculated based on the amount of movement from the reference position and the like is also displayed. Vital sounds are appropriately sensed at each target auscultation position.
  • In the example of FIG. 10, the user U is a pregnant woman, and the target auscultation position for sensing fetal vital sounds is indicated by a circle on her body.
  • an arrow G pointing to the target auscultation position is displayed.
  • the user U moves (the sensor array 111 of) the information acquisition device 11 in that direction.
  • Vital sounds are sensed while the sensor array 111 of the information acquisition device 11 is applied to the target auscultatory position.
  • the medical staff C can diagnose the user U based on the vital sounds appropriately sensed at the target auscultatory position.
  • FIG. 11 is a flowchart showing an example of the flow of vital sound sensing.
  • In step S1, the initial position of the auscultation position is set. For example, as described above, a characteristic part of the body to which the sensor array 111 of the information acquisition device 11 is to be applied is displayed as an image, or the auscultation position is indicated by the medical staff C. The user U applies the sensor array 111 of the information acquisition device 11 to the corresponding position on his/her own body.
  • In step S2, navigation information is presented.
  • the processing unit 15 of the user terminal 12 generates navigation information based on the sensing data of each of the plurality of sensing elements 111a.
  • the user interface unit 14 presents navigation information.
  • the user U moves the sensor array 111 of the information acquisition device 11 according to the presented navigation information. As the user moves, new navigation information is repeatedly generated and presented. In addition, generation of vital sound data, transmission to the medical staff terminal 3, and the like are also performed.
  • The operation of step S2 is repeated until the sensor array 111 of the information acquisition device 11 reaches the target auscultation position.
  • In step S3, the auscultation position is determined, and vital sounds are sensed at the target auscultation position. Sensing is thus performed at an appropriate auscultation position, and properly generated vital sound data is obtained.
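  • The S1 to S3 loop can be sketched as follows, reusing element_index and navigation_direction from the sketch after FIG. 5 above; read_signals and present stand in for the sensor read-out and the user interface unit 14, and the on-target threshold is an arbitrary placeholder, not a value from the text.

    def run_navigation(read_signals, present, fs, coords, threshold=0.9):
        """Loop of steps S2 and S3: present navigation information until the
        highest per-element index suggests the target position is reached."""
        while True:
            signals = read_signals()                  # current sensing data
            idx = [element_index(s, fs) for s in signals]
            if max(idx) >= threshold:                 # S3: position determined
                return signals                        # sense vital sounds here
            present(navigation_direction(signals, coords, fs))  # S2: guide user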
  • the information acquisition device 11 may be provided on an article so that the sensor array 111 is in contact with the user's U back.
  • An article provided with the information acquisition device 11 in this way may also be a component of the information processing system 100 .
  • articles are furniture such as chairs and beds.
  • FIG. 12 is a diagram showing an example of an article provided with an information acquisition device.
  • the illustrated article is a chair 8 .
  • the chair 8 includes a support portion 81 that supports the information acquisition device 11 in the backrest portion.
  • the support unit 81 supports the information acquisition device 11 so that the sensor array 111 of the information acquisition device 11 touches the back of the user U while the user U is leaning against the backrest of the chair 8 .
  • the displayed navigation information includes, in addition to the image of the back, the target auscultation position and characteristic parts of the back, such as the shoulder blades and spine.
  • the target auscultation position is indicated by a circle.
  • the support part 81 is configured to be movable in the XY plane direction.
  • The user U operates the direction keys K displayed in the lower part of the display screen of the user terminal 12 to move the support section 81 and, with it, the sensor array 111 of the information acquisition device 11 supported by the support section 81.
  • the chair 8 and the user terminal 12 may be configured to be able to communicate with each other by, for example, wired or short-range wireless communication so that the chair 8 can be operated from the user terminal 12 .
  • an arrow G (navigation direction) pointing to the target auscultation position is displayed.
  • the user U moves the support portion 81 in that direction.
  • the sensor array 111 of the information acquisition device 11 is moved together with the support portion 81 to the target auscultation position, and vital sounds are sensed in that state.
  • the medical staff C can diagnose the user U based on appropriately sensed vital sounds.
  • the movement of the sensor array 111 of the information acquisition device 11 by the support section 81 may be automatically controlled by the processing section 15 of the user terminal 12 without intervention of the user U's operation.
  • The information acquisition device 11 itself may present the navigation information. Description will be made with reference to FIG. 13.
  • FIG. 13 is a diagram showing an example of a schematic configuration of an information acquisition device.
  • FIG. 13 schematically shows the external shape of the information acquisition device 11 when viewed from the back (when viewed in the Z-axis positive direction).
  • The information acquisition device 11 includes a light emitting unit 116.
  • The light emitting unit 116 is provided on a portion of the tip portion 112 or the body portion 113 other than the main surface 112a of the tip portion 112, so as to be visible to the user U when the information acquisition device 11 is viewed from the rear.
  • the light-emitting portion 116 is provided in a ring shape on the surface of the distal end portion 112 opposite to the main surface 112a.
  • the light emitting section 116 includes a plurality of light emitting elements 116a arranged in a ring shape.
  • Examples of the light emitting element 116a are an LED (Light Emitting Diode) and an OLED (Organic Light-Emitting Diode). Lighting and extinguishing (blinking) of each of the plurality of light emitting elements 116a can be controlled individually.
  • In this example, one light emitting element 116a on the upper left side (the X-axis positive direction and Y-axis positive direction side) is lit.
  • a direction (schematically illustrated as an arrow G) from the center position of the ring shape toward the lit light emitting element 116a is presented as the navigation direction. Besides presenting the navigation direction, for example, by changing the blinking pattern of the light emitting element 116a, it is possible to present the distance to the target auscultation position.
  • the light emitting unit 116 provided in the information acquisition device 11 can present navigation information to the user U as described above.
  • the auscultation position can be navigated without using the user interface unit 14 of the user terminal 12 .
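  • One way to drive the ring, as a sketch: light the LED nearest to the navigation direction and shorten the blink period as the target gets closer. The LED count, LED 0 sitting in the X-axis positive direction, and the distance-to-period mapping are all assumptions, not values from the text.

    import numpy as np

    def led_for_direction(direction_xy, n_leds=12):
        """Index of the LED on the light emitting unit 116 nearest to the
        navigation direction; LEDs are numbered counterclockwise from +X."""
        angle = np.arctan2(direction_xy[1], direction_xy[0]) % (2 * np.pi)
        return int(round(angle / (2 * np.pi / n_leds))) % n_leds

    def blink_period_s(distance_mm, near=0.2, far=1.0, scale_mm=50.0):
        """Blink faster as the target auscultation position gets closer."""
        return near + (far - near) * min(distance_mm / scale_mm, 1.0)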
  • data including vital sound data may be saved and accumulated. This data may be accumulated in the user terminal 12 or may be accumulated in an external device. Description will be made with reference to FIG.
  • FIG. 14 is a diagram showing an example of functional blocks of an information processing system.
  • An information processing device 2 is exemplified as an external device of the information acquisition device 1 .
  • the information processing device 2 is configured to be able to communicate with the information acquisition device 1 and the medical staff terminal 3 via the network N (FIG. 1).
  • data including vital sound data generated by the processing unit 15 is stored in the storage unit 16 as accumulated data AD.
  • the vital sound data included in the accumulated data AD may be processed data.
  • the communication unit 13 of the user terminal 12 also transmits the vital sound data to the information processing device 2 .
  • Information processing device 2 includes communication unit 21 , estimation unit 22 , and storage unit 23 .
  • the communication unit 21 communicates with other devices and the like.
  • the communication unit 21 receives a vital sound signal from the user terminal 12 .
  • Accumulated data AD including the vital sound signal is also stored in the storage unit 23 of the information processing device 2 .
  • The information processing device 2 performs various analyses of the accumulated data AD. An example of the analysis is estimation of the medical condition of the user U or the like, which is performed by the estimation unit 22.
  • the estimation unit 22 estimates an index related to the user U's medical condition based on the evaluation result of the feature amount calculated from the vital sound data included in the accumulated data AD.
  • An example of the feature quantity is vital stability (stability of user U's heartbeat, etc.) and the like.
  • Various algorithms may be used for estimation.
  • the estimating unit 22 calculates feature amounts from the vital sound signal using various feature amount calculation algorithms (learned models may be used). Then, the estimation unit 22 estimates the index by evaluating the calculated feature amount. Examples of indicators are the degree of progress of a medical condition, the effectiveness of rehabilitation, and the like.
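  • As one concrete (but assumed) feature amount, heartbeat stability can be scored as the coefficient of variation of inter-beat intervals obtained by simple peak picking on the signal envelope; the actual algorithms of the estimation unit 22 are not specified by the text.

    import numpy as np

    def vital_stability(vital, fs, min_interval_s=0.4):
        """Lower return value = more stable heartbeat. Peaks are picked on the
        envelope with a refractory gap; returns None if too few beats found."""
        env = np.abs(vital)
        thresh = env.mean() + 2 * env.std()
        min_gap = int(min_interval_s * fs)
        peaks, last = [], -min_gap
        for i in range(1, len(env) - 1):
            if (env[i] >= thresh and env[i] >= env[i - 1]
                    and env[i] >= env[i + 1] and i - last >= min_gap):
                peaks.append(i)
                last = i
        if len(peaks) < 3:
            return None
        ibi = np.diff(peaks) / fs                # inter-beat intervals (s)
        return float(ibi.std() / ibi.mean())     # coefficient of variation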
  • the accumulated data AD may also include the estimation result (analysis result) of the estimation unit 22 .
  • the accumulated data AD is transmitted to the medical staff terminal 3 and reflected in the medical information 331 (electronic medical record, etc.) stored in the storage unit 33 .
  • the accumulated data AD may be presented to the user U by the user terminal 12 or presented to the medical staff C by the medical staff terminal 3. It becomes possible to visualize the history of symptoms of the user U, and the like.
  • The medical staff C can check the condition of the user U at any time, whereas previously it could be checked only when the user visited the hospital. Not only the medical staff C but also the user U himself/herself can grasp the condition and its changes between hospital visits. It is also possible to predict a tendency toward aggravation based on the symptom data and its changes, and to alert the medical staff C and the user U.
  • FIG. 15 is a diagram showing an example of presentation of accumulated data.
  • An example of presentation of accumulated data AD by the user interface unit 14 of the user terminal 12 is shown.
  • On the upper side of the display screen, a graph of feature values for each day over a period of, for example, several months is displayed. Vital sounds of a selected day, in this example breathing sounds, heartbeat sounds, etc., can be played back and checked.
  • a similar presentation may be made by the user interface unit 32 of the medical staff terminal 3 .
  • the community member terminal 4 is a terminal used by member M.
  • the member M is a member belonging to the same community as the user U, such as a family member of the user U, another patient having the same disease as the user U, or the like.
  • the life insurance/health insurance terminal 5, product/service provider terminal 6, and analysis terminal 7 will be described later.
  • FIG. 17 is a diagram showing an example of functional blocks of an information processing system.
  • the information processing device 2 further includes a recommendation section 24 and a processing section 25 .
  • patient information 231, community information 232, algorithm DB 233, learned model 234, recommendation information 235, and anonymously processed information 236 are stored in the storage unit 23 of the information processing device 2.
  • the patient information 231 includes information about the user U who is a patient.
  • Examples of the patient information 231 are user U's disease information, hospital visit history information, rehabilitation history information, the above-described accumulated data AD (FIG. 14), and the like.
  • the disease information includes user U's disease name and the like.
  • the hospital visit history information is information about the user U's past hospital visits.
  • Rehabilitation history information is information related to past rehabilitation performed by user U. These pieces of information are provided, for example, from the medical staff terminal 3 or the like.
  • the community information 232 is information about the community to which the user U belongs, and includes information about the members M and the community member terminals 4 .
  • the algorithm DB 233, the learned model 234, the recommended information 235, and the anonymously processed information 236 will be described later.
  • the communication unit 21 and the estimation unit 22 are as described above.
  • the information including the estimation result of the estimation unit 22 and the accumulated data AD is referred to as "result information”.
  • the communication unit 21 transmits the result information to other devices and terminals, in this example, the user terminal 12 , the medical worker terminal 3 , the community member terminal 4 and the life/health insurance terminal 5 . Note that the recommendation unit 24 and processing unit 25 of the information processing device 2 will be described later.
  • the result information is presented by the user interface unit 14.
  • the user U can know various indicators regarding his/her own physical function, behavior, disease, and the like.
  • Content to be presented to the user U may be included in the result information transmitted from the information processing device 2 to the user terminal 12 . Individual content can be presented based on the estimation results.
  • The result information is presented by the user interface unit 32 on the medical staff terminal 3 as well.
  • the medical staff C can know various indexes related to the user U's physical function, behavior, disease, and the like.
  • individual content such as a rehabilitation menu customized by the medical staff C so as to suit the user U, for example, may be generated using the user interface unit 32 .
  • The content is transmitted to the user terminal 12 by the communication unit 31 and presented to the user U.
  • the community member terminal 4 includes a communication unit 41 and a user interface unit 42.
  • the communication unit 41 communicates with other devices and the like. For example, the communication unit 41 receives result information from the information processing device 2 .
  • the user interface unit 42 receives operations of the community member terminal 4 by the member M and presents information to the member M. For example, the result information from the information processing device 2 is presented and shared with the member M as well.
  • The result information transmitted from the information processing device 2 to the community member terminal 4 may include content to be presented to the member M. Individual content can be presented based on the estimation results.
  • the life insurance/health insurance terminal 5 is a terminal used by insurance companies, health insurance companies, and the like.
  • Life insurance/health insurance terminal 5 includes communication unit 51 , analysis unit 52 , and storage unit 53 .
  • Customer/employee information 531 is exemplified as information stored in the storage unit 53 .
  • Customer/employee information 531 includes information about user U's life insurance, health insurance, and the like.
  • the communication unit 51 receives result information from the information processing device 2 .
  • the analysis unit 52 analyzes the result information and specifies (calculates, etc.) insurance premiums and rewards.
  • the identification may involve the work, judgment, etc. of an employee of a life insurance company or a health insurance company. A reduction in insurance premiums, a change to a limited plan, or the like may also be performed.
  • The communication unit 51 transmits the specified premium/reward information to the user terminal 12. The premium/reward information is presented by the user interface unit 14 of the user terminal 12.
  • the product/service provider terminal 6 is a terminal used by companies that provide products/services. Examples of products include wheelchairs, walking aids, rehabilitation equipment, health food and health equipment. An example of a service is a health application or the like that can be executed on the user terminal 12 .
  • the product/service provider terminal 6 includes a communication unit 61 and a user interface unit 62.
  • product/service information that associates the anonymized result information with products and services is input or generated via the user interface unit 62 .
  • the communication unit 61 transmits product/service information to the information processing device 2 .
  • the communication unit 21 of the information processing device 2 receives product/service information from the product/service provider terminal 6 .
  • the recommendation unit 24 and the processing unit 25 of the information processing device 2 will be explained.
  • the recommendation unit 24 generates recommendation information 235 including information on products and services to be recommended to the user U based on the product/service information from the product/service provider terminal 6 .
  • the communication unit 21 transmits the recommendation information 235 to the user terminals 12 and the community member terminals 4 .
  • the recommendation information 235 is presented by the user interface unit 14 of the user terminal 12 or by the user interface unit 42 of the community member terminal 4 .
  • the processing unit 25 generates anonymously processed information 236 by anonymizing the result information.
  • the anonymously processed information 236 describes the anonymized personal information and the result information in association with each other.
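  • A sketch of that association under assumed field names (the patent does not define the data format): direct identifiers are replaced by a keyed one-way pseudonym so that records can be linked without naming the user U.

    import hashlib
    import hmac

    def anonymize(result_info, secret_key):
        """result_info: dict with assumed keys; secret_key: bytes held by the
        information processing device 2. Returns anonymously processed data."""
        pseudonym = hmac.new(secret_key,
                             result_info["user_id"].encode(),
                             hashlib.sha256).hexdigest()[:16]
        return {
            "pseudonym": pseudonym,                 # links records, not user U
            "estimates": result_info["estimates"],  # estimation results
            "features": result_info["features"],    # feature amounts
            # name, address, exact birth date, etc. are intentionally dropped
        }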
  • the communication unit 21 of the information processing device 2 transmits the anonymously processed information 236 to the analysis terminal 7.
  • the analysis terminal 7 is, for example, a terminal used by a company that provides the above products/services, a pharmaceutical company that conducts clinical development, and the like.
  • the analysis terminal 7 includes a communication section 71 , an analysis section 72 and a user interface section 73 .
  • the communication unit 71 receives the anonymously processed information 236 from the information processing device 2.
  • The analysis unit 72 performs data analysis based on the anonymously processed information 236.
  • the analysis may involve the work, judgment, etc., of employees of the company or the like.
  • the user interface unit 73 presents information related to data analysis. Examples of analysis include analysis of user groups for products such as health foods and health equipment, data analysis for clinical development, and the like.
  • the anonymously processed information 236 can be used for various services, such as marketing analysis by manufacturers, analysis of percentage of users with specific symptoms, age, gender, etc., and monitoring of symptoms of patients taking specific drugs.
  • In the above description, as an example, the sensing data of the information acquisition device 11 is transmitted to the information processing device 2 as vital sound data via the user terminal 12.
  • However, part or all of the sensing data of the information acquisition device 11 may be transmitted directly from the information acquisition device 11 to the information processing device 2 without going through the user terminal 12, and the vital sound data may be generated in the information processing device 2.
  • the information processing device 2 may also generate the navigation information. That is, part or all of the functions of the processing unit 15 of the user terminal 12 may be implemented in the information processing device 2 .
  • these functions and information may be provided in a server device or the like (service provider server device) managed by the user of the corresponding terminal.
  • the life insurance/health insurance terminal 5, product/service provider terminal 6, and analysis terminal 7 communicate with corresponding server devices and the like to use their functions. It is possible to reduce the processing load on the information processing device 2 and to simplify the functions to reduce the cost.
  • Some of the functions of the terminal may be provided in a server device or the like managed by the user of the corresponding terminal.
  • the function of the analysis unit 52 of the life insurance/health insurance terminal 5 may be provided in a server device or the like managed by an insurance company, a health insurance company, or the like.
  • the life insurance/health insurance terminal 5 communicates with the server device or the like to use its functions.
  • the function of the analysis unit 72 of the analysis terminal 7 may be provided in a server device or the like managed by a pharmaceutical company or the like.
  • the analysis terminal 7 communicates with the server device or the like to use its functions. It is possible to reduce the processing load on the life insurance/health insurance terminal 5 and the analysis terminal 7, and to reduce the cost by simplifying the functions.
  • FIG. 18 is a block diagram showing an example of the hardware configuration.
  • the user terminal 12 will be described below as an example. The same explanation can be given for the information acquisition device 11, the information processing device 2, the medical staff terminal 3, the community member terminal 4, the life insurance/health insurance terminal 5, the product/service provider terminal 6, the analysis terminal 7, and the like.
  • Various types of processing are realized by cooperation between software and hardware described below.
  • the user terminal 12 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
  • the user terminal 12 also includes a bridge 904 , an external bus 904 b , an interface 905 , an input device 906 , an output device 907 , a storage device 908 , a drive 909 , a connection port 911 , a communication device 913 and a sensor 915 .
  • The user terminal 12 may have a processing circuit such as a DSP or an ASIC in place of, or together with, the CPU 901.
  • The CPU 901 functions as an arithmetic processing device and a control device, and controls overall operations in the user terminal 12 according to various programs.
  • the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs, calculation parameters, and the like used by the CPU 901 .
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901 can embody the processing unit 15 of the user terminal 12, for example.
  • the CPU 901, ROM 902 and RAM 903 are interconnected by a host bus 904a including a CPU bus and the like.
  • the host bus 904a is connected via a bridge 904 to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus.
  • host bus 904a, bridge 904 and external bus 904b need not necessarily have separate configurations from each other and may be implemented in a single configuration (eg, one bus).
  • the input device 906 is implemented by a device such as a mouse, keyboard, touch panel, button, microphone, switch, lever, etc., through which information is input by the practitioner.
  • The input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports the operation of the user terminal 12.
  • the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the practitioner using the above input means and outputs the signal to the CPU 901 . By operating the input device 906, the practitioner can input various data to the user terminal 12 and instruct processing operations.
  • the output device 907 is formed by a device capable of visually or audibly notifying the practitioner of the acquired information.
  • Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices and lamps, audio output devices such as speakers and headphones, and printer devices.
  • the storage device 908 is a device for storing data.
  • the storage device 908 is implemented by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 908 stores programs executed by the CPU 901, various data, and various data acquired from the outside.
  • the storage device 908 can embody the storage unit 16 or the like of the user terminal 12, for example.
  • The drive 909 is a reader/writer for storage media, and is built into or externally attached to the user terminal 12.
  • the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903 .
  • Drive 909 can also write information to a removable storage medium.
  • connection port 911 is an interface connected to an external device, and is a connection port with an external device capable of data transmission by, for example, USB (Universal Serial Bus).
  • the communication device 913 is, for example, a communication interface formed of a communication device or the like for connecting to the network 920 (corresponding to the network N in FIG. 1, for example).
  • the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like.
  • This communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices, for example, according to a predetermined protocol such as TCP/IP.
  • The communication device 913 can embody the communication unit 13 of the user terminal 12, for example.
  • the sensor 915 may include at least part of the sensors included in the information acquisition device 11 .
  • the network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920 .
  • the network 920 may include the Internet, a telephone line network, a public line network such as a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
  • Network 920 may also include a dedicated line network such as IP-VPN (Internet Protocol-Virtual Private Network).
  • a hardware configuration example capable of realizing the functions of the user terminal 12 has been shown above.
  • Each component described above may be implemented using general-purpose members, or may be implemented by hardware specialized for the function of each component. Therefore, it is possible to appropriately change the hardware configuration to be used according to the technical level at which the present disclosure is implemented.
  • A computer-readable recording medium storing such a computer program can also be provided. Examples of the recording medium include magnetic disks, optical disks, magneto-optical disks, and flash memories. The above computer program may also be distributed via a network, for example, without using a recording medium.
  • the technical category that embodies the above technical idea is not limited.
  • the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus.
  • the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
  • the information acquisition device 11 includes a plurality of sensing elements 111a arranged two-dimensionally so as to sense vital sounds at opposing positions.
  • navigation of the auscultation position can be performed by processing the sensing data of each of the plurality of sensing elements 111a as described above. Therefore, it is possible to sense vital sounds at an appropriate auscultatory position.
  • The information acquisition device 11 may include a tip portion 112 having a main surface 112a on which the plurality of sensing elements 111a are provided, and a body portion 113 that supports the tip portion 112 from the side opposite to the main surface 112a. The tip portion 112 and the body portion 113 may have a size that the user U can hold and move by hand (for example, about the same size as the headpiece of a stethoscope), so that the user U can easily handle the stethoscope sensor of the information acquisition device 11.
  • the information processing system 100 described with reference to FIGS. 1 to 7 is also one of the disclosed technologies.
  • The information processing system 100 includes the information acquisition device 11 with the plurality of sensing elements 111a described above, and a processing unit 15 that generates, based on the sensing data of each of the plurality of sensing elements 111a, navigation information of the auscultation position where the plurality of sensing elements 111a are applied to the body of the user U.
  • The navigation information may include a navigation direction for orienting the information acquisition device 11 toward a target auscultation position suitable for auscultation of vital sounds for sensing purposes, and the processing unit 15 may compute the navigation direction. For example, such navigation information can be provided for sensing vital sounds at appropriate auscultatory positions.
  • The processing unit 15 may calculate the navigation direction based on indices related to the vital sounds for sensing purposes in each of the plurality of sensing elements 111a. The indices may include at least one of the volume of the vital sound for sensing purposes included in the sensing data, the ratio of the vital sound for sensing purposes to all vital sounds included in the sensing data, and the frequency components of the vital sound for sensing purposes included in the sensing data. For example, navigation directions can be calculated based on such indices, as in the sketch below.
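As a non-authoritative illustration of how such an index-based calculation could work (the disclosure does not specify an algorithm; the function name, coordinate convention, and centroid heuristic below are all assumptions), a minimal sketch weights each sensing element's position on the array by its index value:

```python
import numpy as np

def estimate_navigation_direction(positions, index_values):
    """Illustrative sketch: estimate the direction toward the target
    auscultation position from a per-element index, e.g. the volume of
    the vital sound for sensing purposes in each element's sensing data.

    positions    -- (N, 2) array of element coordinates on the main surface
    index_values -- (N,) non-negative index values, one per sensing element
    """
    positions = np.asarray(positions, dtype=float)
    weights = np.asarray(index_values, dtype=float)
    weights /= weights.sum()                       # normalize indices to weights
    centroid = weights @ positions                 # index-weighted centroid
    direction = centroid - positions.mean(axis=0)  # offset from array center
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 0 else np.zeros(2)
```

Under this heuristic, if the loudest vital sounds are picked up by elements on one side of the array, the returned unit vector points toward that side, i.e. the direction in which the device should be moved.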
  • The processing unit 15 may calculate the navigation direction using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.
  • When data corresponding to the sensing data of each of the sensing elements 111a are input, the trained model 162 may output data corresponding to the navigation direction.
  • A navigation direction can also be calculated using such a trained model 162.
  • The trained model 162 in that case may be a trained model generated by machine learning using training data including the user data 163 obtained by the user U using the information processing system 100.
  • The navigation direction can then be calculated using a trained model 162 optimized for the user U's usage conditions of the information acquisition device 11, the usage environment, and the like; a minimal sketch of such a model follows.
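The disclosure does not fix a model type, so the following is only a hypothetical stand-in for the trained model 162, assuming a small feed-forward network over per-element feature vectors (class name, architecture, and feature count are all illustrative):

```python
import torch
import torch.nn as nn

class NavigationDirectionModel(nn.Module):
    """Hypothetical stand-in for the trained model 162: maps features
    derived from each sensing element's data (e.g. volume, vital-sound
    ratio, dominant frequency) to a 2-D navigation direction."""

    def __init__(self, num_elements: int, feats_per_element: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_elements * feats_per_element, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # (dx, dy) toward the target auscultation position
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d = self.net(x)
        return d / (d.norm(dim=-1, keepdim=True) + 1e-8)  # unit direction
```

Fine-tuning such a model on training data that includes the user data 163 would correspond to the per-user optimization described above.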
  • The information acquisition device 11 may include the motion sensor 115, and the processing unit 15 may generate navigation information including the positions of the plurality of sensing elements 111a based on the sensing data of the motion sensor. This increases the possibility of improving navigation accuracy.
  • the information processing system 100 may include the user interface section 14 that presents the user U with navigation information. According to the presented navigation information, the user U can bring the plurality of sensing elements 111a of the information acquisition device 11 closer to the target auscultatory position.
  • The information processing system 100 may include the tip portion 112 described above, the body portion 113 described above, and a light-emitting unit 116 provided on a portion of the tip portion 112 and the body portion 113 other than the main surface 112a of the tip portion 112. The light-emitting unit 116 may present the user U with navigation information; navigation information can also be presented by such a light-emitting unit 116.
  • the processing unit 15 may generate vital sound data based on the sensing data of each of the plurality of sensing elements 111a.
  • The processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate vital sound data suitable for auscultation of vital sounds for sensing purposes.
  • The information acquisition device 11 may include the environmental sound sensor 114, and the processing unit 15 may perform the weighted addition of the sensing data of each of the plurality of sensing elements 111a based on the sensing data of the environmental sound sensor 114.
  • The processing unit 15 may also generate vital sound data with reduced noise based on the sensing data of the environmental sound sensor. For example, by generating vital sound data in this manner, vital sounds suitable for auscultation can be obtained; a sketch of such processing follows.
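A minimal sketch of the weighted addition with an environmental-sound-based noise-reduction step, assuming simple spectral subtraction (the disclosure does not fix a noise-reduction method; the function and parameter names are illustrative):

```python
import numpy as np

def make_vital_sound_data(element_signals, weights, env_signal=None,
                          noise_gain=1.0):
    """Illustrative sketch: weighted addition of per-element sensing data,
    optionally denoised with the environmental sound sensor 114 signal.

    element_signals -- (N, T) array, one row per sensing element 111a
    weights         -- (N,) weights emphasizing elements best placed for
                       the vital sound for sensing purposes
    env_signal      -- optional (T,) environmental-noise reference
    """
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    mixed = weights @ np.asarray(element_signals, dtype=float)  # weighted addition

    if env_signal is not None:
        # crude spectral subtraction: remove the scaled noise magnitude
        # spectrum while keeping the mixed signal's phase
        spec = np.fft.rfft(mixed)
        noise = np.fft.rfft(np.asarray(env_signal, dtype=float))
        mag = np.maximum(np.abs(spec) - noise_gain * np.abs(noise), 0.0)
        mixed = np.fft.irfft(mag * np.exp(1j * np.angle(spec)),
                             n=mixed.shape[-1])
    return mixed
```

Production-grade denoising would more likely use adaptive filtering or a learned model, but the weighted-sum-plus-noise-reference structure above matches the processing described in this passage.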
  • The processing unit 15 may calculate the weight of each of the plurality of sensing elements 111a using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.
  • the trained model 162 may output data corresponding to the weight of each of the plurality of sensing elements 111a when the data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • A weight can also be calculated using such a trained model 162.
  • The trained model 162 in that case may also be a trained model generated by machine learning using training data including the user data 163, which increases the possibility of optimizing the vital sound data for each user U (see the sketch below).
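By analogy with the navigation-direction model sketched earlier, a weight-predicting variant of the trained model 162 could look like the following (again purely hypothetical in architecture and naming):

```python
import torch
import torch.nn as nn

class ElementWeightModel(nn.Module):
    """Hypothetical model that outputs one weight per sensing element 111a
    for the weighted addition of their sensing data."""

    def __init__(self, num_elements: int, feats_per_element: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_elements * feats_per_element, 64),
            nn.ReLU(),
            nn.Linear(64, num_elements),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softmax keeps the weights positive and summing to one
        return torch.softmax(self.net(x), dim=-1)
```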
  • the information processing system 100 may include a storage unit for accumulating vital sound data (storage unit 16, storage unit 23, etc. for storing accumulated data AD).
  • The accumulated data AD can be used in various ways, for example by presenting it or analyzing it.
  • the information processing system 100 may include an article (for example, the chair 8) provided with the information acquisition device 11 such that the plurality of sensing elements 111a are in contact with the back of the user U. Auscultation of the back of the user U can also be easily performed.
  • the present technology can also take the following configuration.
  • (1) An information acquisition device comprising a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions.
  • (2) The information acquisition device according to (1), comprising: a tip portion having a main surface on which the plurality of sensing elements are provided; and a body portion that supports the tip portion from the side opposite to the main surface, wherein the tip portion and the body portion have a size that allows a user to hold and move them by hand.
  • (3) An information processing system comprising: an information acquisition device including a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions; and a processing unit that generates, based on the sensing data of each of the plurality of sensing elements, navigation information of an auscultation position where the plurality of sensing elements are applied to the user's body.
  • (4) The information processing system according to (3), wherein the navigation information includes a navigation direction for directing the information acquisition device to a target auscultation position suitable for auscultation of vital sounds for sensing purposes, and the processing unit calculates the navigation direction.
  • (5) The information processing system according to (4), wherein the processing unit calculates the navigation direction based on an index related to the vital sound for the purpose of sensing in each of the plurality of sensing elements, and the index includes at least one of: the volume of the vital sound for sensing purposes included in the sensing data; the ratio of the vital sound for sensing purposes to all vital sounds contained in the sensing data; and the frequency components of the vital sound for sensing purposes included in the sensing data.
  • (6) The information processing system according to (4), wherein the processing unit calculates the navigation direction using the sensing data of each of the plurality of sensing elements and a trained model, and the trained model outputs data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements is input.
  • the processing unit generates vital sound data based on the sensing data of each of the plurality of sensing elements.
  • (12) The processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements so as to generate the vital sound data suitable for auscultation of vital sounds for sensing purposes.
  • The information acquisition device includes an environmental sound sensor, and the processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements based on the sensing data of the environmental sound sensor.
  • The information acquisition device includes an environmental sound sensor, and the processing unit generates the vital sound data with reduced noise based on the sensing data of the environmental sound sensor.
  • 100 Information Processing System; 11 Information Acquisition Device; 111 Sensor Array; 111a Sensing Element; 112 Tip Part; 112a Main Surface; 113 Body Part; 114 Environmental Sound Sensor; 115 Motion Sensor; 116 Light Emitting Part; 116a Light Emitting Element; 12 User Terminal; 13 Communication Part; 14 User Interface Part; 15 Processing Part; 16 Storage unit; 161 Application program; 162 Learned model; 163 User data; 2 Information processing device; 21 Communication unit; 22 Estimation unit; 23 Storage unit; 231 Patient information; 232 Community information; 233 Algorithm DB; 234 Trained model; 235 Recommended information; 236 Anonymously processed information; 24 Recommendation unit; 25 Processing unit; 3 Medical staff terminal; 31 Communication unit; 32 User interface unit; 33 Storage unit; 331 Medical information; 4 Community member terminal; 41 Communication unit; 42 User interface unit; 5 Life/health insurance terminal; 51 Communication unit; 52 Analysis unit; 53 Storage unit; 531 Customer/employee information; 6 Product/service provider terminal; 61 Communication unit; 62 User interface unit; 7 Analysis terminal; 71 Communication unit; 72 Analysis unit; 73 User interface unit; 8 Chair; 81 Support; G Arrow (navigation direction); AD Accumulated data; C Medical

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

This information acquisition device (11) comprises multiple sensing elements (111a) arranged two-dimensionally such that each element senses a vital sound at an opposing position.
PCT/JP2022/048616 2022-01-12 2022-12-28 Dispositif d'acquisition d'informations et système de traitement d'informations WO2023136175A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022003065 2022-01-12
JP2022-003065 2022-01-12

Publications (1)

Publication Number Publication Date
WO2023136175A1 true WO2023136175A1 (fr) 2023-07-20

Family

ID=87279145

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/048616 WO2023136175A1 (fr) 2022-01-12 2022-12-28 Dispositif d'acquisition d'informations et système de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023136175A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002165292A (ja) * 2000-11-27 2002-06-07 Nippon Telegr & Teleph Corp <Ntt> 多チャネル音響信号収集装置
JP2009273817A (ja) * 2008-05-19 2009-11-26 Shigehiro Kuroki 生体反応記録装置ならびに生体反応記録方法
WO2011114669A1 (fr) * 2010-03-18 2011-09-22 パナソニック株式会社 Dispositif d'inspection de sons biologiques
US20180160907A1 (en) * 2018-01-26 2018-06-14 Shiv Prakash Verma Digital healthcare practice system for digital citizens
CN112515698A (zh) * 2020-11-24 2021-03-19 英华达(上海)科技有限公司 听诊系统及其控制方法
JP2022006758A (ja) * 2020-06-25 2022-01-13 オンキヨー株式会社 聴診システム、聴診器、及び、方法
JP2023016317A (ja) * 2021-07-21 2023-02-02 オンキヨー株式会社 聴診器、及び、聴診システム

Similar Documents

Publication Publication Date Title
US10307104B2 (en) Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US9805339B2 (en) Method for monitoring and improving health and productivity of employees using a computer mouse system
US10206625B2 (en) Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US9949640B2 (en) System for monitoring employee health
US9808156B2 (en) Systems, computer medium and computer-implemented methods for monitoring and improving biomechanical health of employees
US9693734B2 (en) Systems for monitoring and improving biometric health of employees
US9526455B2 (en) Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US20130012802A1 (en) Systems, Computer Medium and Computer-Implemented Methods For Monitoring and Improving Cognitive and Emotive Health of Employees
JP2013524332A (ja) アンケートを最適化する方法及び装置
JP2006230679A (ja) 健康管理装置
TW200526174A (en) Analysis of auscultatory sounds using single value decomposition
CN100391405C (zh) 人体平衡功能训练系统
JP2014524800A (ja) モバイルデバイスを使用して従業員の健康を監視するためのシステム、コンピュータ媒体、およびコンピュータにより実行される方法
JP2008176816A (ja) 健康状態によるアバター映像の生成方法及び装置
KR20190058858A (ko) 스마트장치를 이용한 심혈관 질환의 진단정보 제공방법 및 이를 위한 심음 애플리케이션
JP2004157941A (ja) ホームケア・システムおよびそのサーバ,ならびにホームケア・システムに用いられるおもちゃ装置
JP2019003570A (ja) 健康管理装置、健康管理方法、および健康管理プログラム
WO2023136175A1 (fr) Dispositif d'acquisition d'informations et système de traitement d'informations
CN112740332A (zh) 用于对循环系统的状态的评价进行辅助的评价辅助系统和评价辅助方法
US20220151582A1 (en) System and method for assessing pulmonary health
WO2023058391A1 (fr) Procédé de traitement d'informations, système de traitement d'informations et dispositif de traitement d'informations
WO2023223380A1 (fr) Système de traitement d'informations, procédé de traitement d'informations, dispositif de traitement d'informations, dispositif de mesure et programme informatique
de Silva et al. Telemedicine-Remote Sensory Interaction with Patients for Medical Evaluation and Diagnosis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22920690

Country of ref document: EP

Kind code of ref document: A1