WO2023223476A1 - Information processing system and information processing method - Google Patents

Information processing system and information processing method

Info

Publication number
WO2023223476A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
information
biometric
model
biometric data
Prior art date
Application number
PCT/JP2022/020725
Other languages
French (fr)
Japanese (ja)
Inventor
英弟 謝
彦鵬 張
夢▲せん▼ 許
雨佳 劉
道久 井口
Original Assignee
株式会社Vrc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Vrc filed Critical 株式会社Vrc
Priority to JP2022529951A priority Critical patent/JPWO2023223476A1/ja
Priority to PCT/JP2022/020725 priority patent/WO2023223476A1/en
Priority to PCT/JP2023/018529 priority patent/WO2023224083A1/en
Publication of WO2023223476A1 publication Critical patent/WO2023223476A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons

Definitions

  • the present invention relates to a technology for processing 3D data representing a 3D model.
  • Patent Document 1 discloses a technique for storing and utilizing 3D data representing a 3D model of a user in a database.
  • the present invention provides 3D data that can be used in a variety of applications.
  • One aspect of the present disclosure provides an information processing system including: a 3D data acquisition unit that acquires 3D data representing a 3D model of a living body; a biological data acquisition unit that acquires biological data representing biological information acquired from the living body; a storage means for storing the 3D data and the biological data in a database in association with each other; a reception means for receiving a request including an identifier of a 3D model; an extraction means for extracting, from the database, biometric data corresponding to the identifier included in the request; and a transmitting means for transmitting the extracted biometric data to the source of the request.
  • the biological information may be information acquired at the same time as the image for generating the 3D model.
  • the biometric data may indicate a spatial distribution of the biometric information in the living body.
  • the biological information may include temperature distribution in the biological body.
  • the biological information may include the skin color of the living body.
  • the biological information may include medical information about the living body.
  • the biometric data may indicate a change in the biometric information over time.
  • the biometric data may include time information corresponding to the time when the biometric information was acquired.
  • the 3D data may indicate changes in the 3D model over time.
  • the biological information may indicate center-of-gravity sway of the biological body.
  • Another aspect of the present disclosure provides an information processing method including: a step of acquiring 3D data representing a 3D model of a living body; a step of acquiring biometric data representing biometric information acquired from the living body; a step of recording the 3D data and the biometric data in a database in association with each other; a step of receiving a request including an identifier of the 3D model; a step of extracting, from the database, biometric data corresponding to the identifier included in the request; and a step of transmitting the extracted biometric data to the source of the request.
  • FIG. 1 is a diagram showing an overview of a 3D data system 1 according to an embodiment.
  • FIG. 2 is a diagram illustrating the functional configuration of the 3D data system 1.
  • FIG. 3 is a diagram illustrating the hardware configuration of a 3D scanner 20.
  • FIG. 4 is a diagram illustrating the hardware configuration of a server 10.
  • FIG. 5 is a diagram illustrating the hardware configuration of a user terminal 30.
  • FIG. 6 is a sequence chart illustrating the operation of the 3D data system 1.
  • FIG. 7 is a diagram illustrating data recorded in a database 111.
  • 31...Storage means, 32...Requesting means, 33...Receiving means, 39...Controlling means, 101...CPU, 102...Memory, 103...Storage, 104...Communication IF, 201...Frame, 202...Lighting, 203...Camera, 204...Depth sensor, 205...Thermography camera, 206...Computer, 207...Display, 301...CPU, 302...Memory, 303...Storage, 304...Communication IF, 305...Display, 306...Input device
  • FIG. 1 is a diagram showing an overview of a 3D data system 1 according to an embodiment.
  • the 3D data system 1 is an information processing system that provides 3D data services.
  • 3D data and biological information are stored in association with each other.
  • 3D data refers to data representing a 3D model.
  • a 3D model is a model that represents the three-dimensional shape of an object in a virtual three-dimensional space.
  • the object is a living organism, such as a human being.
  • the biological information is information regarding the living body of the object, and includes, for example, at least one of body composition, skin color, skin unevenness, temperature, blood pressure, pulse rate, and arterial blood oxygen saturation.
  • the biological information may include at least one of spatial distribution and temporal changes on the body surface or inside the body of the target object.
  • the 3D data system 1 includes a server 10, a 3D scanner 20, and a user terminal 30.
  • the server 10 is a server that provides 3D data services.
  • the 3D scanner 20 photographs an object and generates 3D data.
  • 3D scanner 20 further measures biometric data of the object.
  • the 3D scanner 20 transmits the generated 3D data and the measured biometric data to the server 10.
  • the server 10 stores 3D data and biometric data in association with each other.
  • the server 10, 3D scanner 20, and user terminal 30 are connected via a network 90.
  • Network 90 is a computer network such as the Internet.
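The association at the heart of the system can be sketched as follows. This is a minimal in-memory illustration with hypothetical names (`AssociationStore`, `handle_request`), not the actual server implementation:

```python
# Minimal sketch (hypothetical names): a server-side store that keeps
# 3D data and biometric data associated under one 3D-model identifier.
class AssociationStore:
    def __init__(self):
        self._records = {}  # identifier -> {"3d": ..., "bio": ...}

    def store(self, identifier, three_d_data, biometric_data):
        # Storage control: record both items in association with each other.
        self._records[identifier] = {"3d": three_d_data, "bio": biometric_data}

    def handle_request(self, identifier):
        # Acceptance + extraction: return the biometric data corresponding
        # to the identifier contained in the request (None if unknown).
        record = self._records.get(identifier)
        return None if record is None else record["bio"]

store = AssociationStore()
store.store("user-001",
            three_d_data={"mesh": "..."},
            biometric_data={"thermograph": "..."})
print(store.handle_request("user-001"))  # -> {'thermograph': '...'}
```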
  • FIG. 2 is a diagram illustrating the functional configuration of the 3D data system 1.
  • the 3D data system 1 includes a storage means 11, an acquisition means 12, an acquisition means 13, a storage control means 14, an acceptance means 15, an extraction means 16, a transmission means 17, a control means 19, a storage means 21, a photographing means 22, a measurement means 23, a measurement means 24, a generation means 25, a transmission means 26, a control means 29, a storage means 31, a request means 32, a receiving means 33, and a control means 39.
  • the storage means 11, the acquisition means 12, the acquisition means 13, the storage control means 14, the acceptance means 15, the extraction means 16, the transmission means 17, and the control means 19 are implemented in the server 10.
  • the storage means 21, the photographing means 22, the measurement means 23, the measurement means 24, the generation means 25, the transmission means 26, and the control means 29 are implemented in the 3D scanner 20, and the storage means 31, the request means 32, the receiving means 33, and the control means 39 are implemented in the user terminal 30.
  • the storage means 11 stores various data.
  • the data stored by the storage means 11 includes a database 111.
  • the database 111 is a database that records 3D data and biometric information.
  • the acquisition means 12 acquires 3D data representing a 3D model of the object.
  • the acquisition means 13 acquires biometric data indicating biometric information of the object.
  • the storage control means 14 records the 3D data and the biometric data in the database 111 in association with each other.
  • the accepting means 15 accepts requests from the user terminal 30, for example. This request includes identification information of the 3D model.
  • the extraction means 16 extracts biometric data corresponding to the identification information included in the request from the database 111.
  • the transmitting means 17 transmits the extracted biometric data to the source of the request.
  • the control means 19 performs various controls.
  • the storage means 21 stores various data.
  • the photographing means 22 photographs the object.
  • "photographing" a target object means acquiring a two-dimensional image of the target object.
  • the measuring means 23 measures the distance from the photographing means 22 to the surface of the object. By measuring the distance, information regarding the three-dimensional shape of the object can be obtained.
  • the measuring means 24 measures biological information of the object.
  • the generating means 25 generates 3D data of the object using the image photographed by the photographing means 22 and the distance measured by the measuring means 23.
  • the transmitting means 26 transmits the 3D data and biometric data to the server 10.
  • the control means 29 performs various controls.
  • the storage means 31 stores various data.
  • the request means 32 sends a request to the server 10. This request requests the transmission of 3D data and biometric data.
  • the receiving means 33 receives the 3D data and biometric data transmitted from the server 10 in response to this request.
  • the control means 39 performs various controls.
  • FIG. 3 is a diagram illustrating the hardware configuration of the 3D scanner 20.
  • the 3D scanner 20 has a frame 201, a light 202, a camera 203, a depth sensor (or distance sensor) 204, a thermography camera 205, a computer 206, and a display 207.
  • the frame 201 is a structure for forming a photography room (or photography space).
  • the frame 201 has a basic skeleton in the shape of, for example, a rectangular parallelepiped, a cube, a pentagonal prism, a hexagonal prism, or a similar shape. The interior of this solid serves as the photography room.
  • Lighting 202 illuminates the inside of the photographing room.
  • the camera 203 photographs the object in the photographing room (that is, acquires a two-dimensional image or photograph of the object).
  • Depth sensor 204 measures the distance to the surface of the object. More specifically, depth sensor 204 measures the spatial distribution of distances to the surface of the object.
  • One camera 203 and one depth sensor 204 constitute a sensor unit.
  • the sensor unit is fixed to the frame 201, for example.
  • 3D scanner 20 includes multiple sensor units. The plurality of sensor units are arranged so as to photograph the object from different directions.
  • the thermography camera 205 is a device that measures thermography, that is, heat distribution, of an object. Thermographic camera 205 may be included in the sensor unit.
  • the 3D scanner 20 may have the same number of thermography cameras 205 as the cameras 203 at approximately the same positions as the cameras 203.
  • the thermography camera 205 may be provided independently of the sensor unit. That is, the position of thermography camera 205 may be different from the position of camera 203, or the number thereof may be different.
  • Computer 206 processes the images taken by camera 203, the distance measured by depth sensor 204, and the heat distribution measured by thermography camera 205. Specifically, computer 206 generates 3D data representing a 3D model of the object. Computer 206 transmits the generated 3D data to server 10.
  • the computer 206 is, for example, a general-purpose computer having a processor, memory, storage, and communication IF, and is equipped with a function of generating 3D data by executing a program.
  • Display 207 is controlled by computer 206 and presents information to a user (eg, a human object).
  • the camera 203 is an example of the photographing means 22.
  • Depth sensor 204 is an example of measuring means 23.
  • a thermographic camera 205 is an example of the measuring means 24. That is, a thermograph measured by the thermography camera 205 is an example of biological information.
  • the computer 206 stores in its storage a program for generating 3D data (hereinafter referred to as "3D data generation program").
  • while the processor is executing the 3D data generation program, at least one of the memory and the storage is an example of the storage means 21, the processor is an example of the generation means 25 and the control means 29, and the communication IF is an example of the transmission means 26.
  • FIG. 4 is a diagram illustrating the hardware configuration of the server 10.
  • the server 10 includes a CPU (Central Processing Unit) 101, a memory 102, a storage 103, and a communication IF 104.
  • the CPU 101 is a processor that performs various calculations according to programs.
  • the memory 102 is a main storage device that functions as a work area when the CPU 101 executes a program, and includes, for example, RAM (Random Access Memory).
  • the storage 103 is an auxiliary storage device that stores various data and programs, and includes, for example, an SSD (Solid State Drive) or an HDD (Hard Disc Drive).
  • the communication IF 104 is a device that communicates with other devices according to a predetermined communication standard (eg, Ethernet (registered trademark)), and includes, for example, a NIC (Network Interface Card).
  • the programs stored in the storage 103 include a program for causing a computer to function as a server in the 3D data system 1 (hereinafter referred to as a "server program").
  • while the CPU 101 is executing the server program, the CPU 101 is an example of the acquisition means 12, the acquisition means 13, the storage control means 14, the reception means 15, the extraction means 16, and the control means 19, and the communication IF 104 is an example of the transmission means 17.
  • FIG. 5 is a diagram illustrating the hardware configuration of the user terminal 30.
  • the user terminal 30 is a computer including a CPU 301, a memory 302, a storage 303, a communication IF 304, a display 305, and an input device 306, and is, for example, a smartphone or a personal computer.
  • the CPU 301 is a processor that performs various calculations according to programs.
  • the memory 302 is a main storage device that functions as a work area when the CPU 301 executes a program, and includes, for example, a RAM.
  • the storage 303 is an auxiliary storage device that stores various data and programs, and includes, for example, an SSD or an HDD.
  • the communication IF 304 is a device that communicates with other devices according to a predetermined communication standard (for example, WiFi (registered trademark)), and includes, for example, a wireless chip.
  • the display 305 is a device that displays information, and includes, for example, an organic EL display.
  • Input device 306 is a device for inputting information to user terminal 30, and includes, for example, a touch screen, a keyboard, or a pointing device.
  • the programs stored in the storage 303 include a program for causing a computer to function as a client in the 3D data system 1 (hereinafter referred to as a "client program").
  • while the CPU 301 is executing the client program, at least one of the memory 302 and the storage 303 is an example of the storage means 31, the CPU 301 is an example of the requesting means 32 and the control means 39, and the communication IF 304 is an example of the receiving means 33.
  • FIG. 6 is a sequence chart illustrating the operation of the 3D data system 1.
  • the 3D scanner 20 is installed in a commercial facility, a sports gym, a hospital, etc., and the user can use the 3D scanner 20 for a fee or for free.
  • the 3D scanner 20 photographs the user as a target object. Specifically, for example, it is as follows.
  • a user who wants to generate 3D data enters the imaging room.
  • the 3D scanner 20 has a human sensor (not shown) in the imaging room, and when it detects that a person has entered the imaging room, the computer 206 displays, on the display 207, a screen that guides the user through imaging for generating 3D data.
  • This screen includes, for example, guidance on where the user should stand, and guidance on the pose or posture the user should take during shooting. Further, this screen may include a UI object that prompts the user to input the user ID.
  • the user ID is identification information that identifies a user in the 3D data system 1. The user enters the user ID from the touch screen or keyboard.
  • alternatively, an image code indicating the user ID may be displayed on a terminal device (for example, a smartphone, not shown) owned by the user, and the computer 206 may read the image code with a camera (not shown) and accept it as the input of the user ID.
  • the computer 206 displays a countdown until shooting on the display 207.
  • the computer 206 controls the camera 203 to photograph the user in the photographing room in accordance with this countdown.
  • the computer 206 controls the depth sensor 204 to measure the distance to the object at the same timing as the camera 203 photographs the object.
  • the computer 206 stores image data indicating an image (a still image in this example) captured by the camera 203 and distance data indicating the distance measured by the depth sensor 204 in a storage.
  • the 3D scanner 20 measures biological information of the object.
  • a thermograph of the object is measured as biological information.
  • the computer 206 controls the thermography camera 205 to measure the thermograph of the object at the same timing (ie, approximately at the same time) as the object is photographed by the camera 203.
  • the computer 206 stores thermograph data (an example of biological data) indicating a thermograph measured by the thermography camera 205 in a storage.
  • the 3D scanner 20 generates 3D data of the object. Specifically, for example, it is as follows.
  • the computer 206 generates a 3D model of the object from the image data and distance data according to the 3D data generation program. Specifically, the computer 206 generates a 3D model by pasting an image indicated by the image data as a texture onto a three-dimensional shape calculated from the distance data.
  • the biometric data indicates the surface distribution of biometric information (i.e., surface temperature) on the object.
  • the computer 206 pastes the thermograph onto the three-dimensional shape of the object (an element body to which no image texture is pasted) in the same way as pasting a photographed image of the object onto a three-dimensional shape.
  • the computer 206 thus creates two 3D models: one in which an image of the object (a visible-light image, i.e., a color distribution) is pasted onto the object's element body, and one in which the temperature distribution of the object is pasted onto the object's element body.
  • the 3D data representing these two 3D models may be generated as separate data files, or may be included in a single data file.
  • This 3D data includes a user ID and a time stamp.
  • the timestamp is an example of time information indicating the date and time when the 3D model was generated.
  • This 3D data may further include identification information of the 3D scanner 20.
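The generation step described above (a three-dimensional shape calculated from distance data, with the photographed image and the thermograph pasted on as textures) can be reduced to a simplified sketch. The pinhole back-projection, the intrinsics `fx, fy, cx, cy`, and all function names here are illustrative assumptions, not the patent's actual algorithm:

```python
def backproject(depth, fx, fy, cx, cy):
    """Turn a depth map (dict of (u, v) pixel -> distance in metres) into 3D
    points with a pinhole camera model: a stand-in for calculating a
    three-dimensional shape from the distance data of the depth sensor 204."""
    points = {}
    for (u, v), z in depth.items():
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        points[(u, v)] = (x, y, z)
    return points

def attach_textures(points, color_image, thermograph):
    # "Pasting" the photographed image and the thermograph onto the shape:
    # each surface point carries a colour sample and a temperature sample.
    return {
        uv: {"xyz": xyz, "color": color_image[uv], "temp_c": thermograph[uv]}
        for uv, xyz in points.items()
    }

depth = {(320, 240): 2.0, (321, 240): 2.01}
points = backproject(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
model = attach_textures(points,
                        {(320, 240): (200, 180, 170), (321, 240): (201, 181, 171)},
                        {(320, 240): 36.5, (321, 240): 36.4})
print(model[(320, 240)]["xyz"])  # the central pixel lands on the optical axis
```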
  • In step S4, the 3D scanner 20 transmits the generated 3D data to the server 10.
  • the server 10 to which the 3D data is transmitted is defined in the 3D data generation program.
  • In step S5, the server 10 records the 3D data received from the 3D scanner 20 in the database 111.
  • FIG. 7 is a diagram illustrating data recorded in the database 111.
  • Database 111 has multiple records. Each record includes one or more 3D data for one user. Each record includes one or more subrecords. A timestamp T[i] is associated with each subrecord. Each subrecord includes 3D data DD[i], biometric data LL[i], and attribute information. Attribute information is information indicating attributes of the 3D data or biometric data. Here, as attributes, identification information ID[S, i] of the 3D scanner 20 that took the image and identification information ID[L, i] of the type of biometric data included in the 3D data are used.
  • 3D data and biometric data are not limited to being acquired at the same time, so one of the 3D data DD and biometric data LL in one subrecord may be empty.
  • a user generates 3D data at any timing. For example, a certain user generates new 3D data once a year. Another user generates new 3D data once a month. Still another user generates new 3D data irregularly.
  • the database 111 records a plurality of 3D data DD and biometric data LL generated at different times for each of a plurality of users.
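One hedged way to model the record/subrecord layout of the database 111 described above; the class and field names are hypothetical, chosen to mirror T[i], DD[i], LL[i], ID[S, i], and ID[L, i]:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SubRecord:
    # One entry under a user: timestamp T[i], 3D data DD[i], biometric
    # data LL[i], plus attribute information (scanner ID, biometric type).
    timestamp: str
    three_d_data: Optional[bytes] = None    # DD[i]; may be empty
    biometric_data: Optional[bytes] = None  # LL[i]; may be empty
    scanner_id: Optional[str] = None        # ID[S, i]
    biometric_type: Optional[str] = None    # ID[L, i]

@dataclass
class Record:
    user_id: str
    subrecords: list = field(default_factory=list)

db = {"U001": Record("U001")}
db["U001"].subrecords.append(
    SubRecord(timestamp="2022-05-17T10:00:00",
              three_d_data=b"...", biometric_data=b"...",
              scanner_id="SC-42", biometric_type="thermograph"))
# Data from a separate measuring device leaves the 3D data side empty,
# since the two are not necessarily acquired at the same time.
db["U001"].subrecords.append(
    SubRecord(timestamp="2022-06-01T08:30:00",
              biometric_data=b"...", biometric_type="blood_pressure"))
```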
  • the attribute information of 3D data is not limited to the above example.
  • the attribute information of the 3D data may include, for example, the location of the 3D scanner 20 that generated the 3D data (an address such as a city, town, or village name), the type of facility where the 3D scanner 20 is installed (a sports gym, a hospital, etc.), and information indicating at least one type of the user's behavior (exercising, eating, etc.) immediately before the image for generating the 3D model was photographed.
  • the 3D scanner 20 collects this information and sends it to the server 10.
  • steps S1 to S5 are processes for recording the 3D data and biometric data generated by the 3D scanner 20 in the database 111.
  • a process for using the 3D data and biometric data recorded in the database 111 from the user terminal 30 will be described.
  • this process will be explained following steps S1 to S5, but the process of recording 3D data and biometric data in the database 111 and the process of using the 3D data and biometric data recorded in the database 111 are performed independently of each other.
  • In step S6, the user terminal 30 transmits a 3D data transmission request to the server 10. Specifically, for example, it is as follows. The user starts the client program on the user terminal 30. The user terminal 30 transmits a 3D data transmission request to the server 10 according to the client program. This transmission request includes information that specifies the 3D data, for example, a user ID. Note that this user ID may be the user ID of the user operating the user terminal 30, or it may be another user ID. That is, the user of the user terminal 30 may request his or her own 3D data or may request the 3D data of another person.
  • Upon receiving the 3D data transmission request from the user terminal 30, the server 10 extracts attribute information of the requested 3D data from the database 111. The server 10 transmits a list of (at least part of) the extracted attribute information to the user terminal 30. The user terminal 30 displays the list sent from the server 10, together with a UI object that prompts the user to select 3D data and biometric data. The user selects the desired 3D data and biometric data. Upon accepting the user's selection, the user terminal 30 transmits a request to send the selected 3D data and biometric data to the server 10. This transmission request includes information specifying the selected 3D data and biometric data.
  • Upon receiving the request to send specific 3D data from the user terminal 30, the server 10 extracts the requested 3D data and biometric data from the database 111 (step S7). The server 10 transmits the extracted 3D data and biometric data to the user terminal 30, the source of the request, as a response to the transmission request (step S8).
  • processing using 3D data and biometric data includes, for example, displaying a 3D model and biometric information. In this way, the user of the user terminal 30 can visually confirm the 3D model and biometric information.
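The two-step exchange of steps S6 to S8 (a list of attribute information first, then the selected data) might be sketched like this; the function names and the flat record layout are illustrative assumptions:

```python
def list_attributes(db, user_id):
    # Step 1: the server answers a transmission request with a list of
    # attribute information for the requested user's data.
    return [{"timestamp": s["timestamp"], "type": s["type"]}
            for s in db.get(user_id, [])]

def fetch(db, user_id, timestamp):
    # Step 2: the selected 3D data and biometric data are extracted from
    # the database and returned to the source of the request.
    for s in db.get(user_id, []):
        if s["timestamp"] == timestamp:
            return s["three_d"], s["bio"]
    return None

db = {"U001": [
    {"timestamp": "2021-05", "type": "thermograph",
     "three_d": "model-2021", "bio": "thermo-2021"},
    {"timestamp": "2022-05", "type": "thermograph",
     "three_d": "model-2022", "bio": "thermo-2022"},
]}
print(list_attributes(db, "U001"))
print(fetch(db, "U001", "2022-05"))  # -> ('model-2022', 'thermo-2022')
```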
  • the 3D data system 1 can be used in the field of health care or medicine.
  • the user who is the object of the 3D model is a person to be managed or a patient.
  • the 3D scanner 20 is installed at a health care center or hospital. Thermographs are measured as biological information.
  • the user periodically (for example, once a year) generates a new 3D model and measures new biological information according to instructions from a staff member or a doctor. These data are recorded in the database 111.
  • the user terminal 30 is a terminal operated by a staff member or a doctor.
  • the staff or doctor inputs the user ID of the person to be managed or the patient into the user terminal 30.
  • the user terminal 30 acquires 3D data and biometric data of the person to be managed or the patient from the server 10 .
  • the 3D data and biometric data that the user terminal 30 acquires here are 3D data and biometric data from the most recent predetermined period (for example, the most recent three years).
  • the user terminal 30 displays a plurality of 3D models photographed at different times in such a way that changes over time can be seen. In one example, the user terminal 30 displays these multiple 3D models side by side along the time axis.
  • the user terminal 30 may emphasize and display changes over time (for example, the waist circumference becomes thicker year by year, the back becomes rounder year by year).
  • Display emphasizing time-series changes refers to, for example, displaying a portion where the change is greater than a standard in an appearance (color, size, or decoration) that is different from the others.
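The emphasis rule described above (display a portion whose change exceeds a standard in a different appearance) reduces to a simple comparison. The threshold, part names, and measurement values below are illustrative assumptions:

```python
def highlight_changes(measurements, threshold):
    """Mark body parts whose change over the period exceeds a standard,
    so the display can draw them in a different colour, size, or
    decoration. `measurements` maps a part name to values ordered by time."""
    marked = {}
    for part, series in measurements.items():
        change = series[-1] - series[0]
        marked[part] = abs(change) > threshold
    return marked

# Hypothetical yearly measurements: waist circumference (cm) and a
# back-roundness score.
flags = highlight_changes({"waist": [80.0, 84.0, 89.0],
                           "back": [4.0, 4.5, 5.0]},
                          threshold=5.0)
print(flags)  # waist changed by 9.0 -> emphasised; back by 1.0 -> not
```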
  • the user terminal 30 can switch between displaying a normal 3D model and a 3D model to which biometric information (for example, a thermograph) is attached, using a UI object such as a button. Since this thermograph is pasted onto the 3D model's base body, it is displayed at a specified magnification as seen from a viewpoint that is changed according to instructions from the user (a staff member or a doctor). The 3D model with biometric information pasted onto it is also displayed so that its changes over time can be seen, just like a normal 3D model. At this time, the user terminal 30 may emphasize and display changes over time (for example, parts where the temperature becomes lower or higher year by year).
  • parts of a normal 3D model with large changes may be highlighted and displayed.
  • parts where the biometric information changes greatly may be highlighted and displayed.
  • Biometric information is not limited to what is measured by the 3D scanner 20.
  • biometric information measured by a measuring device other than the 3D scanner 20 may be used.
  • the other measuring devices mentioned here are, for example, devices such as body composition monitors, weight scales, blood pressure monitors, pulse meters, or pulse oximeters, and the biological information measured includes medical information such as body composition, weight, blood pressure, pulse rate, or arterial oxygen saturation.
  • Body composition refers to the composition of the elements (fat, bones, and lean soft tissue) that make up the body.
  • this measuring device continuously, periodically, or temporarily connects with a terminal device (for example, a smartphone or a PC) owned by the user (here, the user who is the object of the 3D model) and transmits biometric data indicating the measured biometric information to the terminal device.
  • the terminal device stores biometric data received from the measuring device.
  • a time stamp is attached to this biometric data.
  • This terminal device continuously, periodically, or temporarily transmits the biometric data and user ID to the server 10 via a network such as the Internet.
  • Upon receiving the biometric data from the terminal device, the server 10 records the received biometric data in the record corresponding to the received user ID in the database 111.
  • the time when the 3D model was generated (that is, the time when the user was photographed) and the time when the biometric information was acquired do not necessarily match. Therefore, the biometric data obtained from the measuring device is recorded as standalone biometric data, not as a set with 3D data generated at a specific time. Note that if the time when the biometric information was acquired and the time when the 3D data was generated satisfy a predetermined condition (for example, the time difference between the two is less than or equal to a threshold), the server 10 may record the biometric data in the database 111 as a set with the 3D data. The server 10 transmits this biometric data (obtained by the measuring device) in response to a request from the user terminal 30.
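The pairing condition named above (record biometric data as a set with 3D data only when the time difference is within a threshold) might look like the following sketch; the one-hour threshold and the function name are illustrative assumptions:

```python
from datetime import datetime, timedelta

def pair_with_3d(bio_time, model_times, threshold=timedelta(hours=1)):
    """Return the generation time of the 3D data this biometric data should
    be recorded with, or None to record it as standalone biometric data.
    The condition is the one named in the text: the time difference between
    acquisition and generation is less than or equal to a threshold."""
    best = min(model_times, key=lambda t: abs(t - bio_time), default=None)
    if best is not None and abs(best - bio_time) <= threshold:
        return best
    return None

models = [datetime(2022, 5, 17, 10, 0), datetime(2021, 5, 12, 9, 30)]
print(pair_with_3d(datetime(2022, 5, 17, 10, 20), models))  # paired
print(pair_with_3d(datetime(2022, 6, 1, 8, 0), models))     # None: standalone
```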
  • the biometric information indicated by this biometric data does not have a spatial distribution, so it is not pasted onto the 3D model's base body. Even if it were pasted, the distribution would be uniform, so a single 3D model would differ only in the color of its base body.
  • such biometric information is instead displayed so that, for example, the time-series changes in blood pressure values can be recognized visually.
  • an image may be displayed in which biometric information corresponding to the period specified by a UI object (for example, a slide bar) specifying the period is pasted onto a single 3D model body.
  • the color of the 3D model changes to the color corresponding to the biological information at that time.
  • the 3D model also changes to a shape corresponding to that period.
  • the user can visually correlate and confirm the temporal change in the shape of the 3D model (that is, the temporal change in the body structure or posture) and the temporal change in the biological information.
  • the biometric information may be displayed on the user terminal 30 in a form other than being pasted onto the 3D model. For example, when a plurality of 3D models generated at different times are displayed side by side, a graph showing changes in biological information over time may be displayed above the 3D models.
  • a specific part of the object (for example, the face) may be measured with higher accuracy than other parts (for example, the body). Controlled light sources whose orientation, height, and intensity are adjustable are used for this measurement: for example, light is irradiated onto the skin from various angles and the reflected and scattered light is measured. In this way, the texture of the skin, that is, minute irregularities (pimples or wrinkles), can be measured.
  • Information on these fine irregularities is recorded as, for example, a normal map. In this case, this normal map is an example of biological information.
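A normal map encodes, per surface point, the direction the surface faces, which is one way to record minute irregularities. As a hedged sketch (pure Python, central differences over a hypothetical height map; not the patent's actual method):

```python
def normal_map(height, du=1.0, dv=1.0):
    """Approximate surface normals from a height map of skin micro-relief
    (a 2D list of heights). Central differences give the slope in each
    direction; the normal is the normalised (-dh/du, -dh/dv, 1)."""
    rows, cols = len(height), len(height[0])
    normals = [[None] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            hu = (height[i][min(j + 1, cols - 1)] - height[i][max(j - 1, 0)]) / (2 * du)
            hv = (height[min(i + 1, rows - 1)][j] - height[max(i - 1, 0)][j]) / (2 * dv)
            n = (-hu, -hv, 1.0)
            length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
            normals[i][j] = tuple(c / length for c in n)
    return normals

# A perfectly flat patch has normals pointing straight out of the surface.
flat = normal_map([[0.0, 0.0], [0.0, 0.0]])
print(flat[0][0])
```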
  • A highly accurate skin texture is measured using, for example, a lighting device different from that used for photographing to generate the 3D model. That is, the 3D scanner 20 may use two systems of illumination: one for photographing to generate the 3D model, and the other for measurement to record the detailed texture of the skin as biometric information.
  • the 3D scanner 20 may measure multiple types of biological information from the object.
  • the biological information measured by the 3D scanner 20 may include both information that has a spatial distribution (such as a thermograph) and information that does not have a spatial distribution (such as blood pressure, pulse rate, and arterial oxygen saturation).
  • a chair may be provided in the photographing room of the 3D scanner 20; after photographing for 3D model generation is performed while the user stands in a place other than the chair, biometric information may be measured while the user sits in the chair.
  • This chair is equipped with, for example, a body composition monitor, weight scale, blood pressure monitor, pulse monitor, and pulse oximeter.
  • this chair may be provided with an X-ray device or an ultrasonic examination device to take an X-ray image or an echo image (these are examples of biological information having a spatial distribution).
  • the chair may have the ability to draw blood and perform blood tests.
  • blood test results are an example of biological information.
  • biological information may be measured by either a non-invasive measuring device or an invasive measuring device.
  • the chair may have a robotic arm for palpating the user. This robot arm presses a predetermined part of the user (for example, the abdomen) and outputs the reaction force from that part as biological information.
  • although this device is described as a chair here for convenience, it is not limited to a chair type and may have any shape.
  • Biological information is not limited to thermographs.
  • the biological information may include findings by a doctor (for example, "suspected hypertension" or "suspected diabetes"). Because the biological information includes the doctor's findings, the findings and the 3D model of the patient at the time corresponding to the findings are recorded together in the database 111.
  • the biometric information may include information indicating the emotions of the user who is the object of the 3D model (such as "happy," "sad," "disappointed," or "feels good").
  • the user's own emotions are input, for example, by the user making a selection from among options presented by the 3D scanner 20 at the time of photographing for 3D model generation. Because the biometric information includes the user's emotions, the user's emotions are recorded in the database 111 along with the other biometric information and the user's 3D model at the corresponding time.
  • the biological information may be information indicating the sway of the center of gravity when the user, who is the object of the 3D model, assumes a predetermined posture (for example, an upright posture).
  • the 3D scanner 20 has a center of gravity oscillation meter.
  • a center of gravity sway meter is a device that measures the sway of the center of gravity of a person standing on it. When measuring the center of gravity sway, the locus of the center of gravity position is first measured for a predetermined period of time (for example, 60 seconds) with the eyes open and then for a predetermined period of time with the eyes closed.
  • the measurement results include the total trajectory distance of the center of gravity sway, the sway area, the deviation of the center of gravity sway, the Romberg ratio, or the power spectrum.
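The sway measures listed above can be illustrated with a short sketch. Assuming the center-of-gravity sway meter yields a sequence of (x, y) positions sampled over the measurement period, the total trajectory length, a simple rectangular sway area, and the Romberg ratio (eyes-closed value divided by eyes-open value) could be computed as follows; the function names are illustrative assumptions:

```python
import numpy as np

def sway_metrics(xy):
    """Compute basic sway metrics from an (N, 2) array of center-of-gravity
    positions sampled over the measurement period."""
    xy = np.asarray(xy, dtype=float)
    steps = np.diff(xy, axis=0)
    total_path = float(np.sum(np.linalg.norm(steps, axis=1)))
    # Rectangular sway area: a simple bounding-box envelope of the trajectory.
    width, height = xy.max(axis=0) - xy.min(axis=0)
    return {"total_path": total_path, "rect_area": float(width * height)}

def romberg_ratio(open_eyes_xy, closed_eyes_xy, key="total_path"):
    """Romberg ratio: the eyes-closed metric divided by the eyes-open metric."""
    return sway_metrics(closed_eyes_xy)[key] / sway_metrics(open_eyes_xy)[key]

open_track = [(0, 0), (1, 0), (1, 1)]    # trajectory length 2.0
closed_track = [(0, 0), (2, 0), (2, 2)]  # trajectory length 4.0
print(romberg_ratio(open_track, closed_track))  # 2.0
```

The deviation of the center of gravity and the power spectrum mentioned above would be computed analogously (mean position offset; FFT of the position signal), and are omitted here for brevity.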
  • the 3D data system 1 can be used in the field of fitness.
  • the user who is the object of the 3D model is a member of a fitness club or a sports gym.
  • the 3D scanner 20 is installed in a fitness club or sports gym. Thermographs are measured as biological information.
  • the user generates a new 3D model and measures new biological information before and after exercise according to instructions from the staff. These data are recorded in the database 111.
  • the staff of the fitness club or sports gym can judge from this biological information whether the exercise performed by the user is applying an appropriate load to the target muscles. Furthermore, by checking the shape of the 3D model, that is, changes in body shape over time, the staff can determine whether the exercise is having an effect.
  • the server 10 may have a function as an analysis means for analyzing biological information.
  • imaging for 3D model generation and thermograph measurement are performed almost simultaneously. Therefore, it is considered that there is a correlation between the skin condition (color and unevenness) of the object obtained during photography for 3D model generation and the thermograph obtained at the same time.
  • the analysis means analyzes the correlation between the skin condition and the thermograph using a technique such as correlation analysis, regression analysis, or machine learning. This correlation may be analyzed for one specific user, or may be analyzed for a user group including a plurality of users.
  • the server 10 stores information indicating the correlation between the skin condition and the thermograph (for example, a relational expression or a machine learning model between the two) in the storage means 11.
  • the analysis means may analyze the correlation between changes over time in the biological information and changes over time in the 3D model.
  • for example, over a predetermined period (for example, the last three years), an analysis result can be obtained showing that changes in a predetermined region of the 3D model follow a predetermined trend (for example, waist circumference increased by x% or more).
  • the server 10 stores information indicating this analysis result in the storage means 11.
  • the server 10 may also have a function as a prediction means that predicts, from the changes in the 3D model over time up to the present, what the health condition of a certain user is likely to be in the future.
  • the control means 19 is an example of this analysis means and prediction means.
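As one hedged illustration of the correlation analysis mentioned above, the analysis means could relate a skin-condition value (for example, a redness measure) to the temperature measured at the same location, using a Pearson correlation coefficient and a simple linear regression. The data below are synthetic and the function name is an assumption, not part of this disclosure:

```python
import numpy as np

def skin_temperature_model(redness, temperature):
    """Fit temperature ~ a * redness + b and report the Pearson correlation,
    as one possible concrete form of the correlation analysis."""
    r = float(np.corrcoef(redness, temperature)[0, 1])
    a, b = np.polyfit(redness, temperature, deg=1)  # slope, intercept
    return r, float(a), float(b)

# Synthetic example: warmer skin regions are measured as slightly redder.
rng = np.random.default_rng(0)
redness = rng.uniform(0.2, 0.9, size=200)
temperature = 30.0 + 8.0 * redness + rng.normal(0.0, 0.3, size=200)

r, a, b = skin_temperature_model(redness, temperature)
print(f"correlation={r:.2f}, slope={a:.1f}")
```

The fitted relational expression (slope and intercept) or a trained machine-learning model standing in its place is what would be stored in the storage means 11, per user or per user group.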
  • Target: in the embodiment described above, an example in which the target is a human has been described, but the target may be an animal other than a human, such as a pet or livestock.
  • the database 111 records data indicating changes in the 3D model over time, as well as spatial distribution and changes over time of biological information.
  • the database 111 may not include data indicating changes in the 3D model over time for a certain user.
  • the database 111 may record only one of the spatial distribution and temporal change of biological information.
  • the database 111 may record, for a certain user, only 3D data showing a 3D model at a single time and biometric data showing the spatial distribution of the user's biometric information (without information on changes over time).
  • the hardware configuration of the 3D data system 1 is not limited to that illustrated in the embodiment.
  • the 3D data system 1 may have any hardware configuration as long as it can realize the required functions.
  • a plurality of physical devices may work together to function as the server 10.
  • the server 10 may be a physical server or a virtual server (including a so-called cloud server).
  • the correspondence between functional elements and hardware is not limited to that illustrated in the embodiment.
  • at least some of the functions described as being implemented in the 3D scanner 20 or the user terminal 30 in the embodiment may be implemented in the server 10.
  • the generating means 25 may be implemented in the server 10.
  • the 3D scanner 20 transmits to the server 10 a 3D model generation request that includes the image data captured by the camera 203, the distance measured by the depth sensor 204, and the thermograph measured by the thermography camera 205.
  • the configuration of the 3D scanner 20 is not limited to that illustrated in the embodiment. Any type of 3D scanner may be used, such as one that uses a smartphone instead of a camera, one in which a fixed camera photographs a rotating object, or one in which a camera rotating around a fixed object photographs it from all sides.
  • the sequence chart shown in FIG. 6 merely shows an example of processing, and the operation of the 3D data system 1 is not limited thereto. Some of the illustrated processes may be omitted, the order may be changed, or new processes may be added.
  • in the embodiment, an example has been described in which a list of the data recorded in the database 111 is first transmitted to the user terminal 30, and the data selected at the user terminal 30 is then transmitted from the server 10 to the user terminal 30.
  • the procedure for transmitting 3D data and biometric data from the server 10 to the user terminal 30 is not limited to this.
  • the application program of the user terminal 30 or the server program of the server 10 may automatically select the 3D data and biometric data to be transmitted to the user terminal 30.
  • the server 10 may transmit only one of the 3D data and the biometric data to the user terminal 30.
  • the method of specifying the 3D data and biometric data transmitted from the server 10 to the user terminal 30 is not limited to using a user ID.
  • for example, an attribute value may be selected at the server 10 or the user terminal 30, and 3D data and biometric data having the selected attribute value may be extracted from the database 111 and transmitted to the user terminal 30.
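A minimal sketch of such attribute-based extraction follows, using a hypothetical in-memory stand-in for the database 111; all field names here are illustrative assumptions:

```python
# Hypothetical in-memory stand-in for the database 111: each row associates
# a 3D model identifier with 3D data, biometric data, and attribute values.
DATABASE_111 = [
    {"model_id": "m1", "user_id": "u1", "attrs": {"captured": "2022-05"},
     "three_d_data": b"mesh-1", "biometric_data": {"pulse": 62}},
    {"model_id": "m2", "user_id": "u2", "attrs": {"captured": "2022-06"},
     "three_d_data": b"mesh-2", "biometric_data": {"pulse": 75}},
]

def extract_by_attribute(db, name, value):
    """Return the 3D data and biometric data of every row whose attribute
    matches the selected value (an alternative to lookup by user ID)."""
    return [(row["three_d_data"], row["biometric_data"])
            for row in db if row["attrs"].get(name) == value]

print(len(extract_by_attribute(DATABASE_111, "captured", "2022-06")))  # 1
```

In a real deployment this filtering would of course be a database query rather than a list comprehension; the sketch only fixes the shape of the operation.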
  • the application program executed by the CPU 101 may be provided by downloading via a network such as the Internet, or may be provided recorded on a computer-readable non-transitory recording medium such as a DVD-ROM.


Abstract

This information processing system is provided with: a 3D data acquiring means for acquiring 3D data that show a 3D model of a living body; a biological data acquiring means for acquiring biological data that show biological information acquired from the living body; a storing means for storing the 3D data and the biological data in a database while correlating the 3D data and the biological data with each other; a receiving means for receiving a request including an identifier for the 3D model; an extracting means for extracting biological data corresponding to the identifier included in the request from the database; and a transmitting means for transmitting the extracted biological data to a transmission source of the request.

Description

Information processing system and information processing method
The present invention relates to a technology for processing 3D data representing a 3D model.

Techniques that utilize 3D data of objects are known. For example, Patent Document 1 discloses a technique for storing and utilizing 3D data representing a 3D model of a user in a database.

Patent No. 6791530

In the technology described in Patent Document 1, assuming various applications of 3D data, there is room for improvement in the data recorded in the database.

In contrast, the present invention provides 3D data that can be used in a variety of applications.
One aspect of the present disclosure provides an information processing system including: 3D data acquisition means that acquires 3D data representing a 3D model of a living body; biometric data acquisition means that acquires biometric data representing biometric information acquired from the living body; storage means that stores the 3D data and the biometric data in a database in association with each other; reception means that receives a request including an identifier of a 3D model; extraction means that extracts biometric data corresponding to the identifier included in the request from the database; and transmission means that transmits the extracted biometric data to the source of the request.
The biometric information may be information acquired at the same time as the images for generating the 3D model.

The biometric data may indicate a spatial distribution of the biometric information in the living body.

The biometric information may include a temperature distribution in the living body.

The biometric information may include the skin color of the living body.

The biometric information may include medical information about the living body.

The biometric data may indicate a change in the biometric information over time.

The biometric data may include time information corresponding to the time when the biometric information was acquired.

The 3D data may indicate changes in the 3D model over time.

The biometric information may indicate center-of-gravity sway of the living body.
Another aspect of the present disclosure provides an information processing method including: a step of acquiring 3D data representing a 3D model of a living body; a step of acquiring biometric data representing biometric information acquired from the living body; a step of storing the 3D data and the biometric data in a database in association with each other; a step of receiving a request including an identifier of a 3D model; a step of extracting biometric data corresponding to the identifier included in the request from the database; and a step of transmitting the extracted biometric data to the source of the request.
According to the present invention, it is possible to use 3D data in a variety of applications.
FIG. 1 is a diagram showing an overview of a 3D data system 1 according to an embodiment.
FIG. 2 is a diagram illustrating the functional configuration of the 3D data system 1.
FIG. 3 is a diagram illustrating the hardware configuration of the 3D scanner 20.
FIG. 4 is a diagram illustrating the hardware configuration of the server 10.
FIG. 5 is a diagram illustrating the hardware configuration of the user terminal 30.
FIG. 6 is a sequence chart illustrating the operation of the 3D data system 1.
FIG. 7 is a diagram illustrating data recorded in the database 111.
DESCRIPTION OF SYMBOLS: 1... 3D data system, 10... Server, 11... Storage means, 12... Acquisition means, 13... Acquisition means, 14... Storage control means, 15... Reception means, 16... Extraction means, 17... Transmission means, 19... Control means, 20... 3D scanner, 21... Storage means, 22... Photographing means, 23... Measurement means, 24... Measurement means, 25... Generation means, 26... Transmission means, 29... Control means, 30... User terminal, 31... Storage means, 32... Request means, 33... Receiving means, 39... Control means, 101... CPU, 102... Memory, 103... Storage, 104... Communication IF, 201... Frame, 202... Lighting, 203... Camera, 204... Depth sensor, 205... Thermography camera, 206... Computer, 207... Display, 301... CPU, 302... Memory, 303... Storage, 304... Communication IF, 305... Display, 306... Input device
1. Configuration

FIG. 1 is a diagram showing an overview of a 3D data system 1 according to an embodiment. The 3D data system 1 is an information processing system that provides 3D data services. In the 3D data system 1, 3D data and biological information are stored in association with each other. 3D data refers to data representing a 3D model. A 3D model is a model that represents the three-dimensional shape of an object in a virtual three-dimensional space. In this example, the object is a living body, for example, a human being. Biological information is information regarding the living body of the object, and includes, for example, at least one of body composition, skin color, skin unevenness, temperature, blood pressure, pulse rate, and arterial blood oxygen saturation. The biological information may include at least one of a spatial distribution on the body surface or inside the body of the object and a change over time.
The 3D data system 1 includes a server 10, a 3D scanner 20, and a user terminal 30. The server 10 is a server that provides 3D data services. The 3D scanner 20 photographs an object and generates 3D data. In this example, the 3D scanner 20 further measures biometric data of the object. The 3D scanner 20 transmits the generated 3D data and the measured biometric data to the server 10. The server 10 stores the 3D data and the biometric data in association with each other. The server 10, the 3D scanner 20, and the user terminal 30 are connected via a network 90. The network 90 is a computer network such as the Internet.
FIG. 2 is a diagram illustrating the functional configuration of the 3D data system 1. The 3D data system 1 includes storage means 11, acquisition means 12, acquisition means 13, storage control means 14, reception means 15, extraction means 16, transmission means 17, control means 19, storage means 21, photographing means 22, measurement means 23, measurement means 24, generation means 25, transmission means 26, control means 29, storage means 31, request means 32, receiving means 33, and control means 39. Of these, the storage means 11, acquisition means 12, acquisition means 13, storage control means 14, reception means 15, extraction means 16, transmission means 17, and control means 19 are implemented in the server 10; the storage means 21, photographing means 22, measurement means 23, measurement means 24, generation means 25, transmission means 26, and control means 29 are implemented in the 3D scanner 20; and the storage means 31, request means 32, receiving means 33, and control means 39 are implemented in the user terminal 30.
In the server 10, the storage means 11 stores various data. In this example, the data stored by the storage means 11 includes a database 111. The database 111 is a database that records 3D data and biometric information. The acquisition means 12 acquires 3D data representing a 3D model of the object. The acquisition means 13 acquires biometric data indicating biometric information of the object. The storage control means 14 records the 3D data and the biometric data in the database 111 in association with each other. The reception means 15 accepts a request from, for example, the user terminal 30. This request includes identification information of a 3D model. The extraction means 16 extracts biometric data corresponding to the identification information included in the request from the database 111. The transmission means 17 transmits the extracted biometric data to the source of the request. The control means 19 performs various controls.
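The server-side flow described above — the storage control means 14 associating 3D data and biometric data under one model identifier, and the reception, extraction, and transmission means 15 to 17 serving requests — can be pictured with a minimal Python sketch. The class and method names (`Server10Sketch`, `store`, `handle_request`) are assumptions for illustration only, not part of this disclosure:

```python
class Server10Sketch:
    """Minimal sketch of the server 10: store 3D data and biometric data in
    association, then serve requests carrying a 3D model identifier."""

    def __init__(self):
        self.database_111 = {}  # model_id -> {"3d": ..., "bio": ...}

    def store(self, model_id, three_d_data, biometric_data):
        # Storage control means 14: record both items under one identifier.
        self.database_111[model_id] = {"3d": three_d_data, "bio": biometric_data}

    def handle_request(self, request):
        # Reception means 15 / extraction means 16 / transmission means 17.
        record = self.database_111.get(request["model_id"])
        if record is None:
            return {"to": request["source"], "error": "unknown identifier"}
        return {"to": request["source"], "biometric_data": record["bio"]}

server = Server10Sketch()
server.store("model-42", three_d_data=b"mesh-bytes",
             biometric_data={"thermograph": "spatial-map"})
reply = server.handle_request({"model_id": "model-42",
                               "source": "user-terminal-30"})
print(reply["to"])  # user-terminal-30
```

The sketch deliberately returns the reply addressed to the request's source, mirroring the transmission means sending the extracted biometric data back to the requester.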
In the 3D scanner 20, the storage means 21 stores various data. The photographing means 22 photographs the object. Here, "photographing" an object means acquiring a two-dimensional image of the object. The measurement means 23 measures the distance from the photographing means 22 to the surface of the object. By measuring the distance, information on the three-dimensional shape of the object can be obtained. The measurement means 24 measures biometric information of the object. The generation means 25 generates 3D data of the object using the images photographed by the photographing means 22 and the distances measured by the measurement means 23. The transmission means 26 transmits the 3D data and the biometric data to the server 10. The control means 29 performs various controls.
In the user terminal 30, the storage means 31 stores various data. The request means 32 transmits a request to the server 10. This request asks for transmission of 3D data and biometric data. The receiving means 33 receives the 3D data and biometric data transmitted from the server 10 in response to this request. The control means 39 performs various controls.
FIG. 3 is a diagram illustrating the hardware configuration of the 3D scanner 20. The 3D scanner 20 has a frame 201, lighting 202, a camera 203, a depth sensor (or distance sensor) 204, a thermography camera 205, a computer 206, and a display 207. The frame 201 is a structure for forming a photographing room (or photographing space). The frame 201 has a basic skeleton in the shape of, for example, a rectangular parallelepiped, a cube, a pentagonal prism, a hexagonal prism, or a similar shape. The interior of this solid becomes the photographing room. The lighting 202 illuminates the inside of the photographing room. The camera 203 photographs the object in the photographing room (that is, acquires a two-dimensional image, or photograph, of the object). The depth sensor 204 measures the distance to the surface of the object. More specifically, the depth sensor 204 measures the spatial distribution of distances to the surface of the object. One camera 203 and one depth sensor 204 constitute a sensor unit. The sensor unit is fixed to, for example, the frame 201. In one example, the 3D scanner 20 includes multiple sensor units. The plurality of sensor units are arranged so as to photograph the object from different directions. The thermography camera 205 is a device that measures a thermograph, that is, the heat distribution, of the object. The thermography camera 205 may be included in the sensor unit. That is, the 3D scanner 20 may have the same number of thermography cameras 205 as cameras 203, at approximately the same positions as the cameras 203. Alternatively, the thermography cameras 205 may be provided independently of the sensor units. That is, the positions of the thermography cameras 205 may differ from the positions of the cameras 203, and their numbers may differ. The computer 206 processes the images taken by the cameras 203, the distances measured by the depth sensors 204, and the heat distribution measured by the thermography cameras 205. Specifically, the computer 206 generates 3D data representing a 3D model of the object. The computer 206 transmits the generated 3D data to the server 10. The computer 206 is, for example, a general-purpose computer having a processor, memory, storage, and a communication IF, and the function of generating 3D data is implemented by executing a program. The display 207 is controlled by the computer 206 and presents information to the user (for example, the human who is the object).
The correspondence with the functional configuration of FIG. 2 is as follows. The camera 203 is an example of the photographing means 22. The depth sensor 204 is an example of the measurement means 23. The thermography camera 205 is an example of the measurement means 24. That is, a thermograph measured by the thermography camera 205 is an example of biometric information. The computer 206 stores in its storage a program for generating 3D data (hereinafter referred to as the "3D data generation program"). In a state where the processor is executing the 3D data generation program, at least one of the memory and the storage is an example of the storage means 21, the processor is an example of the generation means 25 and the control means 29, and the communication IF is an example of the transmission means 26.
FIG. 4 is a diagram illustrating the hardware configuration of the server 10. The server 10 includes a CPU (Central Processing Unit) 101, a memory 102, a storage 103, and a communication IF 104. The CPU 101 is a processor that performs various calculations according to programs. The memory 102 is a main storage device that functions as a work area when the CPU 101 executes a program, and includes, for example, a RAM (Random Access Memory). The storage 103 is an auxiliary storage device that stores various data and programs, and includes, for example, an SSD (Solid State Drive) or an HDD (Hard Disk Drive). The communication IF 104 is a device that communicates with other devices according to a predetermined communication standard (for example, Ethernet (registered trademark)), and includes, for example, a NIC (Network Interface Card).
In this example, the programs stored in the storage 103 include a program for causing a computer to function as a server in the 3D data system 1 (hereinafter referred to as the "server program"). In a state where the CPU 101 is executing the server program, at least one of the memory 102 and the storage 103 is an example of the storage means 11; the CPU 101 is an example of the acquisition means 12, the acquisition means 13, the storage control means 14, the reception means 15, the extraction means 16, and the control means 19; and the communication IF 104 is an example of the transmission means 17.
FIG. 5 is a diagram illustrating the hardware configuration of the user terminal 30. The user terminal 30 is a computer including a CPU 301, a memory 302, a storage 303, a communication IF 304, a display 305, and an input device 306, and is, for example, a smartphone or a personal computer. The CPU 301 is a processor that performs various calculations according to programs. The memory 302 is a main storage device that functions as a work area when the CPU 301 executes a program, and includes, for example, a RAM. The storage 303 is an auxiliary storage device that stores various data and programs, and includes, for example, an SSD or an HDD. The communication IF 304 is a device that communicates with other devices according to a predetermined communication standard (for example, WiFi (registered trademark)), and includes, for example, a wireless chip. The display 305 is a device that displays information, and includes, for example, an organic EL display. The input device 306 is a device for inputting information to the user terminal 30, and includes, for example, a touch screen, a keyboard, or a pointing device.
In this example, the programs stored in the storage 303 include a program for causing a computer to function as a client in the 3D data system 1 (hereinafter referred to as the "client program"). In a state where the CPU 301 is executing the client program, at least one of the memory 302 and the storage 303 is an example of the storage means 31, the CPU 301 is an example of the request means 32 and the control means 39, and the communication IF 304 is an example of the receiving means 33.
2. Operation
FIG. 6 is a sequence chart illustrating the operation of the 3D data system 1. In this example, the 3D scanner 20 is installed in a commercial facility, a sports gym, a hospital, or the like, and users can use the 3D scanner 20 for a fee or free of charge.
In step S1, the 3D scanner 20 photographs the user, who is the object. Specifically, this proceeds, for example, as follows. A user who wants to generate 3D data enters the imaging room. The 3D scanner 20 has a human presence sensor (not shown) in the imaging room; when it detects that a person has entered the room, the computer 206 displays, on the display 207, a screen that guides the user through the imaging for 3D data generation. This screen includes, for example, guidance on where the user should stand and on the pose or posture the user should take during imaging. The screen may also include a UI object that prompts the user to input a user ID. The user ID is identification information that identifies a user in the 3D data system 1. The user enters the user ID from the touch screen or keyboard. Alternatively, the user may display an image code indicating the user ID on a terminal device the user carries (for example, a smartphone, not shown), and the computer 206 may read the image code with a camera (not shown) to accept the input of the user ID. When the user is ready, the computer 206 displays a countdown until imaging on the display 207. In accordance with this countdown, the computer 206 controls the camera 203 to photograph the user in the imaging room. The computer 206 controls the depth sensor 204 to measure the distance to the object at the same timing as the camera 203 photographs the object. The computer 206 stores, in its storage, image data indicating the image captured by the camera 203 (a still image in this example) and distance data indicating the distance measured by the depth sensor 204.
In step S2, the 3D scanner 20 measures biological information of the object. In this example, a thermograph of the object is measured as the biological information. Specifically, the computer 206 controls the thermography camera 205 to measure a thermograph of the object at the same timing as (that is, approximately simultaneously with) the camera 203 photographing the object. The computer 206 stores, in its storage, thermograph data (an example of biometric data) indicating the thermograph measured by the thermography camera 205.
In step S3, the 3D scanner 20 generates 3D data of the object. Specifically, the computer 206 generates a 3D model of the object from the image data and the distance data according to the 3D data generation program: it generates the 3D model by pasting the image indicated by the image data, as a texture, onto the three-dimensional shape calculated from the distance data. In this example, the biometric data also indicates a surface distribution of biological information (namely, surface temperature) on the object. The computer 206 pastes the thermograph onto the three-dimensional shape of the object (the base body to which no image texture has been pasted) in the same way as it pastes the photographed image onto the shape. That is, the computer 206 generates data corresponding to two 3D models: a 3D model in which an image of the object (a visible-light image, i.e., a color distribution) is pasted onto the object's base body, and a 3D model in which the temperature distribution of the object is pasted onto the base body. The 3D data representing these two 3D models may be generated as separate data files or may be included in a single data file. This 3D data includes a user ID and a timestamp. The timestamp is an example of time information indicating the date and time when the 3D model was generated. The 3D data may further include identification information of the 3D scanner 20.
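The two-texture generation in step S3 can be sketched as follows. This is a minimal illustration, not the actual implementation described here: it assumes a pinhole camera model for back-projecting the depth measurements, and all function and field names are hypothetical.

```python
import numpy as np

def depth_to_vertices(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, in metres) into 3D vertices
    using an assumed pinhole camera model with focal lengths (fx, fy)
    and principal point (cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def make_textured_models(vertices, image, thermograph):
    """Pair the bare shape (the "base body") with two textures, yielding
    data for the two 3D models described above: one with the visible-light
    image and one with the temperature distribution."""
    return {
        "vertices": vertices,
        "color_texture": image,          # visible-light 3D model
        "thermal_texture": thermograph,  # temperature-distribution 3D model
    }
```

Whether the two models are stored as one file or two is, as noted above, an implementation choice; the sketch keeps them in a single structure for simplicity.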
In step S4, the 3D scanner 20 transmits the generated 3D data to the server 10. The server 10 to which the 3D data is transmitted is defined in the 3D data generation program. In step S5, the server 10 records the 3D data received from the 3D scanner 20 in the database 111.
FIG. 7 is a diagram illustrating data recorded in the database 111. The database 111 has a plurality of records. Each record contains one or more pieces of 3D data for one user and includes one or more subrecords. A timestamp T[i] is associated with each subrecord. Each subrecord includes 3D data DD[i], biometric data LL[i], and attribute information. The attribute information indicates attributes of the 3D data or the biometric data; here, the identification information ID[S, i] of the 3D scanner 20 that performed the imaging and the identification information ID[L, i] of the type of biometric data included in the 3D data are used as attributes. As will be described later, the 3D data system 1 is not limited to cases where 3D data and biometric data are acquired at the same time, so in a given subrecord either the 3D data DD or the biometric data LL may be empty. Users generate 3D data at arbitrary timings: for example, one user generates new 3D data once a year, another once a month, and yet another at irregular intervals. In this way, the database 111 records, for each of a plurality of users, a plurality of pieces of 3D data DD and biometric data LL generated at different times.
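The record layout of FIG. 7 could be represented as in the following sketch. The storage format is not specified here, so the dataclasses and field names below are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SubRecord:
    timestamp: str                        # timestamp T[i]
    dd: Optional[bytes] = None            # 3D data DD[i]; may be empty
    ll: Optional[bytes] = None            # biometric data LL[i]; may be empty
    scanner_id: Optional[str] = None      # attribute ID[S, i]: which 3D scanner
    biometric_type: Optional[str] = None  # attribute ID[L, i]: type of biometric data

@dataclass
class UserRecord:
    user_id: str
    subrecords: List[SubRecord] = field(default_factory=list)
```

As noted above, a subrecord may hold 3D data without biometric data or vice versa, which is why both `dd` and `ll` are optional.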
Note that the attribute information of the 3D data is not limited to the above example. It may include, for example, information indicating at least one of: the location of the 3D scanner 20 that generated the 3D data (an address such as a city, town, or village name), the type of facility where the 3D scanner 20 is installed (a sports gym, a hospital, or the like), and the user's activity immediately before the imaging for 3D model generation (exercise, a meal, or the like). The 3D scanner 20 collects this information and transmits it to the server 10.
Referring again to FIG. 6, the processing in steps S1 to S5 records the 3D data and biometric data generated by the 3D scanner 20 in the database 111. The following describes processing in which the 3D data and biometric data recorded in the database 111 are used from the user terminal 30. Although this processing is described here as following steps S1 to S5, the processing that records 3D data and biometric data in the database 111 and the processing that uses the recorded data are performed independently of each other.
In step S6, the user terminal 30 transmits a 3D data transmission request to the server 10. Specifically, the user starts the client program on the user terminal 30, and the user terminal 30 transmits the transmission request to the server 10 according to the client program. This transmission request includes information that specifies the 3D data, for example, a user ID. This user ID may be the user ID of the user operating the user terminal 30 or may be another user's ID; that is, the user of the user terminal 30 may request his or her own 3D data or another person's 3D data.
Upon receiving the 3D data transmission request from the user terminal 30, the server 10 extracts attribute information of the requested 3D data from the database 111 and transmits a list of (at least part of) the extracted attribute information to the user terminal 30. The user terminal 30 displays the list received from the server 10 together with a UI object that prompts the user to select 3D data and biometric data. The user selects the desired 3D data and biometric data. Upon accepting the user's selection, the user terminal 30 transmits to the server 10 a transmission request for the selected 3D data and biometric data; this request includes information specifying the selected data.
Upon receiving the transmission request for the specific 3D data from the user terminal 30, the server 10 extracts the requested 3D data and biometric data from the database 111 (step S7). The server 10 transmits the extracted 3D data and biometric data to the user terminal 30, the source of the request, as a response to the transmission request (step S8).
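Steps S7 and S8 amount to a lookup keyed by the identifying information in the request. A minimal sketch, assuming the database is an in-memory dict of subrecords keyed by user ID; the storage backend is not specified here, and all names are illustrative.

```python
def handle_transmission_request(database, request):
    """Extract the 3D data and biometric data selected in the request
    (step S7) and return them as the response payload (step S8)."""
    record = database.get(request["user_id"], [])
    wanted = set(request["timestamps"])  # the subrecords the user selected
    selected = [sr for sr in record if sr["timestamp"] in wanted]
    return {
        "3d_data": [sr["dd"] for sr in selected],
        "biometric_data": [sr["ll"] for sr in selected],
    }
```

A real server would return this payload over the network to the requesting terminal; the sketch only shows the extraction logic.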
Upon receiving the 3D data and biometric data from the server 10, the user terminal 30 performs processing using them (step S9). This processing includes, for example, displaying the 3D model and the biological information. In this way, the user of the user terminal 30 can visually check the 3D model and the biological information.
3. Usage Examples
The basic operation of the 3D data system 1 has been described above, but various modifications are possible to the system configuration, the device configuration, the types of biological information, the relationship between 3D data and biometric data, and so on. Some specific usage examples of the 3D data system 1 are described below.
3-1. Use in health care or medicine
(1) Periodic measurement
The 3D data system 1 can be used in the field of health care or medicine. The users who are the objects of the 3D models are persons under health management or patients. The 3D scanner 20 is installed in a health care center or a hospital, and a thermograph is measured as the biological information. Under instructions from staff or a doctor, the user periodically (for example, once a year) generates a new 3D model and has new biological information measured. These data are recorded in the database 111.
The user terminal 30 is a terminal operated by a staff member or a doctor, who inputs the user ID of the managed person or patient into it. The user terminal 30 then acquires that person's 3D data and biometric data from the server 10; here, these are the 3D data and biometric data from the most recent predetermined period (for example, the last three years). The user terminal 30 displays the plurality of 3D models, captured at different times, in a way that makes the changes over time visible. In one example, the user terminal 30 displays these 3D models side by side along a time axis. At this time, the user terminal 30 may display the time-series changes in an emphasized manner (for example, a waist that grows thicker year by year, or a back that grows more rounded year by year). Displaying a change in an emphasized manner means, for example, displaying a portion whose change exceeds a reference with an appearance (color, size, or decoration) different from the rest.
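The emphasis rule just described (render differently any portion whose change exceeds a reference) reduces to a simple comparison. A sketch with assumed per-part girth measurements, since the concrete metric is left open above:

```python
def parts_to_emphasize(series_by_part, reference):
    """Return the body parts whose change across the displayed scans
    exceeds the reference, so the terminal can render them with a
    different color, size, or decoration. `series_by_part` maps an
    (assumed) part name to its time-ordered measurements."""
    return [part for part, series in series_by_part.items()
            if max(series) - min(series) > reference]
```

For example, a waist girth that grew from 78 cm to 86 cm over three scans would exceed a 5 cm reference and be emphasized, while a nearly constant chest girth would not.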
The user terminal 30 can switch, via a UI object such as a button, between displaying the normal 3D model and displaying the 3D model onto which the biological information (for example, the thermograph) has been pasted. Since the thermograph is pasted onto (the base body of) the 3D model, an image is displayed at a specified magnification and from a viewpoint that changes according to the instructions of the user (the staff member or doctor). The 3D model with the biological information pasted on it is, like the normal 3D model, displayed so that its changes over time are visible. At this time, the user terminal 30 may display the time-series changes in an emphasized manner (for example, areas whose temperature falls or rises year by year). Alternatively, while displaying the 3D model with the biological information pasted on, the terminal may emphasize the portions of the normal 3D model that changed significantly; or, conversely, while displaying the normal 3D model, it may emphasize the portions where the biological information changed significantly.
(2) Cooperation with other measuring devices
The biological information is not limited to information measured by the 3D scanner 20. In addition to, or instead of, the biological information measured by the 3D scanner 20, biological information measured by a measuring device other than the 3D scanner 20 may be used. Such measuring devices are, for example, a body composition monitor, a weight scale, a blood pressure monitor, a pulse meter, or a pulse oximeter, and the measured biological information is medical information such as body composition, weight, blood pressure, pulse, or arterial oxygen saturation. Body composition refers to the composition of the elements that make up the body (fat, bone, and lean soft tissue). The measuring device communicates continuously, periodically, or temporarily with a terminal device (for example, a smartphone or a PC) carried by the user (here, the user who is the object of the 3D model) and transmits biometric data indicating the measured biological information to that terminal device. The terminal device stores the biometric data received from the measuring device; a timestamp is attached to this data. The terminal device transmits the biometric data and the user ID to the server 10, continuously, periodically, or temporarily, via a network such as the Internet. Upon receiving the biometric data from the terminal device, the server 10 records it in the record of the database 111 corresponding to the received user ID.
In this example, the time at which the 3D model was generated (that is, the time at which the user was photographed) and the time at which the biological information was acquired do not necessarily match. The biometric data obtained from the measuring device is therefore recorded on its own, not as a set with 3D data generated at a specific time. However, if the time at which the biological information was acquired and the time at which the 3D data was generated satisfy a predetermined condition (for example, the condition that the time difference between them is at or below a threshold), the server 10 may record the biometric data in the database 111 as a set with that 3D data. The server 10 transmits this biometric data (obtained by the measuring device) in response to a request from the user terminal 30.
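The pairing condition above (record the biometric data as a set with the 3D data when the time difference is at or below a threshold) might look like the following; the one-hour threshold is an assumed example value, not one specified here.

```python
from datetime import datetime, timedelta

def matching_scan(bio_time, scan_times, max_gap=timedelta(hours=1)):
    """Return the timestamp of the 3D scan closest to the biometric
    measurement if the gap is within max_gap; otherwise None, in which
    case the biometric data is recorded on its own."""
    if not scan_times:
        return None
    closest = min(scan_times, key=lambda t: abs(t - bio_time))
    return closest if abs(closest - bio_time) <= max_gap else None
```

If this returns a timestamp, the server would attach the biometric data to that scan's subrecord; otherwise it stores the biometric data unpaired.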
Unlike the thermograph example, the biological information indicated by this biometric data does not have a spatial distribution, so it is not pasted onto the base body of the 3D model. Even if it were pasted on, the distribution would be uniform, so viewed as a single 3D model it would differ from the base body only in color. However, when the user terminal 30 displays, side by side, a plurality of 3D models generated at different times, giving the base body of each model a different color according to, for example, its blood pressure value makes the time-series change in blood pressure visually recognizable.
Alternatively, the user terminal 30 may display an image in which the biological information corresponding to a time specified via a UI object for specifying times (for example, a slide bar) is pasted onto the base body of a single 3D model. In this example, moving the slide bar changes the color of the 3D model to the color corresponding to the biological information at that time, and the 3D model also changes to the shape corresponding to that time. This allows the user to visually correlate the change over time in the shape of the 3D model (that is, the change in body shape or posture) with the change over time in the biological information.
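Tinting the base body according to a scalar value such as blood pressure, as described above, is essentially a linear color mapping. A sketch in which the range limits (systolic 90 to 180 mmHg) are assumed example values:

```python
def value_to_rgb(value, low=90.0, high=180.0):
    """Map a scalar biometric value (e.g. systolic blood pressure) to a
    blue-to-red RGB tint for the 3D model's base body: blue at or below
    `low`, red at or above `high`, linearly interpolated in between."""
    t = min(max((value - low) / (high - low), 0.0), 1.0)
    return (int(255 * t), 0, int(255 * (1 - t)))
```

When the slide bar selects a different time, the terminal would re-tint the model with the color for that time's value.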
Further alternatively, the user terminal 30 may display the biological information in a form that is not pasted onto the 3D model. For example, when a plurality of 3D models generated at different times are displayed side by side, a graph showing the change in the biological information over time may be displayed above them.
(3) Use of skin color (or complexion), skin texture, and the like
Assuming a situation in which 3D models are generated periodically at a health care center or hospital, the imaging is expected to be performed periodically with the same device. Imaging with the same 3D scanner 20 every time means that the illumination (lighting) conditions can be kept almost identical every time. This in turn means that skin color (or complexion), in the medical sense, can be recorded under the same conditions. For example, changes in the skin caused by changes in physical condition, such as jaundice, eczema, a pale face, or a darkened face, can be recorded under almost identical lighting conditions. If the user is instructed to pose with the tongue out (or with a particular facial expression) when the 3D model is generated, the color of the tongue can also be recorded.
For a specific part of the object (for example, the face), the skin texture may be measured with higher precision than for other parts (for example, the body), either at the same time as the imaging for 3D model generation or separately from it. This measurement uses, for example, a plurality of controlled light sources whose direction, height, and intensity are adjustable. Using the controlled light sources, light is shone onto the skin from various angles, for example, and the reflected and scattered light is measured. Such measurement can capture the texture of the skin, that is, fine irregularities (pimples or wrinkles). The information on these fine irregularities is recorded, for example, as a normal map, which in this case is an example of biological information. The high-precision skin texture is measured, for example, using a lighting device different from that used for the imaging for 3D model generation. That is, the 3D scanner 20 may have two lighting systems, one for the imaging for 3D model generation and one for the measurement that records the detailed skin texture as biological information, and may switch between them.
(4) Acquisition of biological information by the 3D scanner 20
The 3D scanner 20 may measure a plurality of types of biological information from the object. In this case, the biological information measured by the 3D scanner 20 may include both information that has a spatial distribution (for example, a thermograph) and information that does not (for example, blood pressure, pulse, and arterial oxygen saturation). For example, a chair may be provided in the imaging room of the 3D scanner 20, and after the imaging for 3D model generation is performed with the user standing away from the chair, the biological information may be measured with the user seated in it. The chair is equipped with, for example, a body composition monitor, a weight scale, a blood pressure monitor, a pulse meter, and a pulse oximeter. Alternatively, the chair may be provided with an X-ray device or an ultrasonic examination device so that an X-ray image or an echo image (examples of biological information having a spatial distribution) can be captured. Alternatively, the chair may have a function of drawing blood and performing a blood test, in which case the blood test results are an example of the biological information. In this way, the biological information may be measured by either a non-invasive or an invasive measuring device. Further alternatively, the chair may have a robot arm for palpating the user; the robot arm presses a predetermined part of the user (for example, the abdomen) and outputs the reaction force from that part as biological information. Although this device is described here as a chair for convenience, it is not limited to a chair shape and may have any shape.
(5) Expansion of biological information
The biological information is not limited to thermographs. In addition to, or instead of, information measured by a device such as a thermography camera or a blood pressure monitor, the biological information may include a doctor's findings (for example, "suspected hypertension" or "suspected diabetes"). When the biological information includes a doctor's findings, those findings are recorded in the database 111 together with the 3D model of the patient at the corresponding time.
Alternatively, the biological information may include information indicating the emotion of the user who is the object of the 3D model ("happy," "sad," "discouraged," "feeling good," and so on). The user's emotion is input, for example, by the user selecting from options presented by the 3D scanner 20 at the time of the imaging for 3D model generation. When the biological information includes the user's emotion, that emotion is recorded in the database 111 together with the other biological information and the 3D model of that user at the corresponding time.
Further alternatively, the biological information may be information indicating the sway of the center of gravity when the user who is the object of the 3D model assumes a predetermined posture (for example, an upright posture). In this case, the 3D scanner 20 has a stabilometer, a device that measures the sway of the center of gravity of a person standing on it. In the measurement, the trajectory of the center-of-gravity position is measured first with the eyes open for a predetermined time (for example, 60 seconds) and then with the eyes closed for a predetermined time. The measurement results include the total trajectory length of the sway, the sway area, the deviation of the sway center, the Romberg ratio, or the power spectrum.
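Some of the stabilometry measures listed above can be computed directly from the sampled center-of-gravity trajectory. A minimal sketch: the total trajectory length is the sum of segment lengths, while a bounding-rectangle area is used here as a simplified stand-in for the true sway area (clinical stabilometry often uses an enveloping or elliptical area), and the Romberg ratio is taken as the eyes-closed to eyes-open ratio of a measure.

```python
import math

def total_trajectory_length(path):
    """Sum of distances between consecutive center-of-gravity samples."""
    return sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))

def sway_area(path):
    """Bounding-rectangle area of the trajectory (a deliberate
    simplification of the sway area mentioned above)."""
    xs = [p[0] for p in path]
    ys = [p[1] for p in path]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def romberg_ratio(eyes_closed_value, eyes_open_value):
    """Ratio of an eyes-closed measure to the same eyes-open measure."""
    return eyes_closed_value / eyes_open_value
```

In practice the two recording phases (eyes open, then eyes closed) would each produce a path, and the ratio would be taken on, for example, their total trajectory lengths.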
3-2. Use in fitness
The 3D data system 1 can be used in the field of fitness. The users who are the objects of the 3D models are members of a fitness club or a sports gym, where the 3D scanner 20 is installed. A thermograph is measured as the biological information. Under instructions from staff, the user generates a new 3D model and has new biological information measured before and after exercise. These data are recorded in the database 111.
Recording biological information before and after exercise makes it possible to identify, for example, the parts of the body that are warmer after exercise, that is, the parts where metabolism has increased. From this biological information, the staff of the fitness club or sports gym can judge, for example, whether the exercise the user performed applied an appropriate load to the target muscles. Furthermore, by also checking the shape of the 3D model, that is, the change in body shape over time, the staff can judge whether the exercise is producing results.
4. Modifications
The present invention is not limited to the embodiment described above, and various modifications are possible. Some modifications are described below. Two or more of the items described below may be applied in combination.
(1) Analysis of Biological Information

The server 10 may have a function as analysis means for analyzing biological information. In the embodiment described above, the imaging for 3D model generation and the thermograph measurement are performed almost simultaneously. Therefore, a correlation is considered to exist between the skin condition (color and unevenness) of the object obtained in the imaging for 3D model generation and the thermograph obtained at the same time. The analysis means analyzes the correlation between the skin condition and the thermograph using a technique such as correlation analysis, regression analysis, or machine learning. This correlation may be analyzed for one specific user, or for a user group including a plurality of users. The server 10 stores information indicating the correlation between the skin condition and the thermograph (for example, a relational expression or a machine learning model relating the two) in the storage means 11.
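The regression-analysis option above can be sketched minimally as follows; the choice of a scalar per-region "redness" feature and the sample values are illustrative assumptions, not part of the embodiment:

```python
import numpy as np

# Hypothetical paired samples: a scalar skin "redness" value per body
# region, and the temperature (deg C) measured at the same region.
redness = np.array([0.20, 0.35, 0.50, 0.65, 0.80])
temp_c  = np.array([32.5, 33.0, 33.5, 34.0, 34.5])

# Fit temp = a * redness + b by least squares; the pair (a, b) is one
# form of the "relational expression" storable in the storage means 11.
a, b = np.polyfit(redness, temp_c, deg=1)

def predict_temp(r):
    """Predict surface temperature from a redness value."""
    return a * r + b
```

Replacing the linear fit with a trained machine-learning model would correspond to storing a model object instead of the coefficients.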
Alternatively, the analysis means may analyze the correlation between changes over time in the biological information and changes over time in the 3D model. Through this analysis, the following kind of result can be obtained: for example, in a group of users whose biological information (specifically, for example, a doctor's findings) indicates metabolic syndrome from a certain time onward, changes in a predetermined region of the 3D model show a predetermined trend (for example, waist circumference increased by x% or more) during a predetermined period immediately preceding the time that biological information was obtained (for example, the last three years). The server 10 stores information indicating this analysis result in the storage means 11.
By using this information (information indicating the correlation or information indicating the analysis result), the server 10 can have a function as prediction means that predicts the likely future health condition of a certain user from the changes over time of that user's 3D model up to the present. In the example of FIG. 2, the control means 19 is an example of this analysis means and prediction means.
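A minimal sketch of such a prediction step, under the simplifying assumption that the stored analysis result is just a threshold on waist-circumference growth over the recorded window (the threshold and data are illustrative):

```python
def metabolic_risk(waist_history_cm, growth_threshold_pct=5.0):
    """Flag a user as at risk if waist circumference grew by more than
    `growth_threshold_pct` percent from the first to the last record
    of the given time series (oldest first)."""
    first, last = waist_history_cm[0], waist_history_cm[-1]
    growth_pct = (last - first) / first * 100.0
    return growth_pct > growth_threshold_pct

# Example: 80 cm -> 86 cm is a 7.5 % increase over the window
at_risk = metabolic_risk([80.0, 82.0, 86.0])
```

In the described system, the waist-circumference series itself would be derived from the time series of 3D models recorded in the database 111.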
(2) Target

In the above-described embodiment, an example in which the target is a human has been described, but the target may be an animal other than a human, such as a pet or livestock.
(3) Data Recorded in the Database

In the example embodiment, the database 111 records data indicating changes in the 3D model over time, as well as the spatial distribution and changes over time of the biological information. However, the database 111 need not include data indicating changes in the 3D model over time for a given user. The database 111 may also record only one of the spatial distribution and the changes over time of the biological information. For example, the database 111 may record, for a given user, only 3D data showing the 3D model at a single time and biometric data showing the spatial distribution of that user's biometric information, with no information on changes over time.
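As an illustrative sketch only (the embodiment does not prescribe a storage engine or schema), the association between 3D data and biometric data in the database 111 could be modeled with two tables sharing a 3D-model identifier; all table and column names are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE model_3d (
        model_id   TEXT PRIMARY KEY,   -- identifier of the 3D model
        user_id    TEXT,
        scanned_at TEXT,               -- time of the scan
        mesh_blob  BLOB                -- the 3D data itself
    );
    CREATE TABLE biometric (
        model_id    TEXT REFERENCES model_3d(model_id),
        kind        TEXT,              -- e.g. 'thermograph', 'skin_color'
        measured_at TEXT,              -- supports change-over-time records
        payload     BLOB               -- e.g. spatial-distribution data
    );
""")

# Store one scan together with its simultaneously measured thermograph.
conn.execute("INSERT INTO model_3d VALUES ('m1', 'u1', '2022-05-18', x'00')")
conn.execute(
    "INSERT INTO biometric VALUES ('m1', 'thermograph', '2022-05-18', x'00')")

# Extraction as in claim 1: biometric data for a requested model identifier.
rows = conn.execute(
    "SELECT kind FROM biometric WHERE model_id = ?", ("m1",)).fetchall()
```

Multiple `biometric` rows per `model_id` (distinguished by `measured_at`) would represent the change-over-time records; a single row would represent the single-time variant described above.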
(4) Others

The hardware configuration of the 3D data system 1 is not limited to that illustrated in the embodiment. The 3D data system 1 may have any hardware configuration as long as it can realize the required functions. For example, a plurality of physical devices may work together to function as the server 10. The server 10 may be a physical server or a virtual server (including a so-called cloud server). Further, the correspondence between functional elements and hardware is not limited to that illustrated in the embodiment. For example, at least some of the functions described in the embodiment as being implemented in the 3D scanner 20 or the user terminal 30 may instead be implemented in the server 10. For example, the generating means 25 may be implemented in the server 10. In this case, the 3D scanner 20 transmits to the server 10 a 3D model generation request that includes the image data captured by the camera 203, the distance measured by the depth sensor 204, and the thermograph measured by the thermography camera 205. Furthermore, the configuration of the 3D scanner 20 is not limited to that illustrated in the embodiment; a 3D scanner of any structure may be used, such as one that uses a smartphone instead of a camera, one that photographs a rotating object with a fixed camera, or one that photographs a fixed object with a camera rotating around it.
The sequence chart shown in FIG. 6 merely shows an example of processing, and the operation of the 3D data system 1 is not limited thereto. Some of the illustrated processes may be omitted, their order may be changed, or new processes may be added. In particular, the embodiment described an example in which, when 3D data and biometric data are transmitted from the server 10 to the user terminal 30, a list of the data recorded in the database 111 is first transmitted to the user terminal 30, and the data selected at the user terminal 30 is then transmitted from the server 10 to the user terminal 30. However, the procedure for transmitting 3D data and biometric data from the server 10 to the user terminal 30 is not limited to this. The application program of the user terminal 30 or the server program of the server 10 may automatically select the 3D data and biometric data to be transmitted to the user terminal 30. Alternatively, the server 10 may transmit only one of the 3D data and the biometric data to the user terminal 30. Furthermore, the method of specifying the 3D data and biometric data to be transmitted from the server 10 to the user terminal 30 is not limited to one using a user ID. 3D data and biometric data having an attribute value selected at the server 10 or the user terminal 30 may be extracted from the database 111 and transmitted to the user terminal 30.
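The receive/extract/transmit flow (reception means, extraction means, and transmission means) can be sketched with an in-memory stand-in for the database 111; the dictionary layout and field names are assumptions, and the actual transport is unspecified:

```python
# In-memory stand-in for the database 111:
# 3D-model identifier -> associated biometric data.
database = {
    "model-001": {"kind": "thermograph", "payload": [33.1, 34.2]},
}

def handle_request(request):
    """Accept a request containing a 3D-model identifier, extract the
    corresponding biometric data, and return it to the sender."""
    biometric = database.get(request["model_id"])
    if biometric is None:
        return {"to": request["sender"], "error": "unknown model identifier"}
    return {"to": request["sender"], "biometric": biometric}

reply = handle_request({"model_id": "model-001",
                        "sender": "user-terminal-30"})
```

The automatic-selection variant described above would simply replace the literal `model_id` in the request with one chosen by the application or server program.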
The application program executed by the CPU 101 may be provided by download via a network such as the Internet, or may be provided in a state recorded on a computer-readable non-transitory recording medium such as a DVD-ROM.

Claims (11)

  1.  An information processing system comprising:
      3D data acquisition means for acquiring 3D data representing a 3D model of a living body;
      biometric data acquisition means for acquiring biometric data indicating biometric information acquired from the living body;
      storage means for storing the 3D data and the biometric data in a database in association with each other;
      reception means for accepting a request including an identifier of a 3D model;
      extraction means for extracting biometric data corresponding to the identifier included in the request from the database; and
      transmission means for transmitting the extracted biometric data to a source of the request.
  2.  The information processing system according to claim 1, wherein the biometric information is information acquired simultaneously with an image for generating the 3D model.
  3.  The information processing system according to claim 1, wherein the biometric data indicates a spatial distribution of the biometric information in the living body.
  4.  The information processing system according to claim 3, wherein the biometric information includes a temperature distribution in the living body.
  5.  The information processing system according to claim 3, wherein the biometric information includes a skin color of the living body.
  6.  The information processing system according to claim 3, wherein the biometric information includes medical information about the living body.
  7.  The information processing system according to claim 1, wherein the biometric data indicates a change in the biometric information over time.
  8.  The information processing system according to claim 7, wherein the biometric data includes time information corresponding to a time at which the biometric information was acquired.
  9.  The information processing system according to claim 7, wherein the 3D data indicates a change in the 3D model over time.
  10.  The information processing system according to claim 1, wherein the biometric information indicates center-of-gravity sway of the living body.
  11.  An information processing method comprising:
      acquiring 3D data representing a 3D model of a living body;
      acquiring biometric data indicating biometric information acquired from the living body;
      storing the 3D data and the biometric data in a database in association with each other;
      accepting a request including an identifier of a 3D model;
      extracting biometric data corresponding to the identifier included in the request from the database; and
      transmitting the extracted biometric data to a source of the request.
PCT/JP2022/020725 (priority date 2022-05-18, filed 2022-05-18): Information processing system and information processing method, published as WO2023223476A1

Priority Applications (3)

- JP2022529951A (priority date 2022-05-18, filed 2022-05-18), published as JPWO2023223476A1
- PCT/JP2022/020725 (priority date 2022-05-18, filed 2022-05-18), published as WO2023223476A1: Information processing system and information processing method
- PCT/JP2023/018529 (priority date 2022-05-18, filed 2023-05-18), published as WO2023224083A1: Information processing system and information processing method

Applications Claiming Priority (1)

- PCT/JP2022/020725 (priority date 2022-05-18, filed 2022-05-18), published as WO2023223476A1: Information processing system and information processing method

Publications (1)

- WO2023223476A1

Family ID: 88834878

Family Applications (2)

- PCT/JP2022/020725 (priority date 2022-05-18, filed 2022-05-18), published as WO2023223476A1: Information processing system and information processing method
- PCT/JP2023/018529 (priority date 2022-05-18, filed 2023-05-18), published as WO2023224083A1: Information processing system and information processing method

Country Status (2)

- JP (1): JPWO2023223476A1
- WO (2): WO2023223476A1

Citations (4)

* Cited by examiner, † Cited by third party

- JP2017176803A * (priority 2016-09-07, published 2017-10-05), 株式会社3D body Lab: Human body model providing system, human body model deforming method, and computer program
- CN109727675A * (priority 2018-12-24, published 2019-05-07), 深圳创维-Rgb电子有限公司: A kind of health status detection method, system and television set based on television set
- US20200196940A1 * (priority 2018-12-20, published 2020-06-25), Spiral Physical Therapy, Inc.: Digital platform to identify health conditions and therapeutic interventions using an automatic and distributed artificial intelligence system
- US20210097759A1 * (priority 2019-09-26, published 2021-04-01), Amazon Technologies, Inc.: Predictive personalized three-dimensional body models

Family Cites Families (2)

* Cited by examiner, † Cited by third party

- JP7254742B2 * (priority 2020-03-26, published 2023-04-10), Hoya株式会社: Program, information processing method, information processing device, and diagnosis support system
- JP7104951B1 * (priority 2021-12-21, published 2022-07-22), 株式会社ジィ・シィ企画: Healthcare system


Also Published As

- WO2023224083A1, published 2023-11-23
- JPWO2023223476A1, published 2023-11-23


Legal Events

- ENP (Entry into the national phase): Ref document number 2022529951; Country of ref document: JP; Kind code of ref document: A
- 121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number 22942681; Country of ref document: EP; Kind code of ref document: A1