CN106650217B - Information processing apparatus and information processing method


Info

Publication number
CN106650217B
CN106650217B (application CN201610908922.1A)
Authority
CN
China
Prior art keywords
subject
information
information processing
selection
display
Prior art date
Legal status
Active
Application number
CN201610908922.1A
Other languages
Chinese (zh)
Other versions
CN106650217A
Inventor
菅田裕纪
野崎晃
田中泰洋
Current Assignee
Canon Marketing Japan Inc
Canon IT Solutions Inc
Original Assignee
Canon Marketing Japan Inc
Canon IT Solutions Inc
Priority date
Filing date
Publication date
Priority claimed from JP2015213489A (published as JP2017080199A)
Priority claimed from JP2015213492A (published as JP2017080201A)
Priority claimed from JP2015213491A (published as JP2017084194A)
Priority claimed from JP2015213496A (published as JP2017080203A)
Priority claimed from JP2015213490A (published as JP6803111B2)
Application filed by Canon Marketing Japan Inc, Canon IT Solutions Inc filed Critical Canon Marketing Japan Inc
Priority to CN201910679377.7A (published as CN110391017A)
Publication of CN106650217A
Application granted
Publication of CN106650217B

Classifications

    • G06F3/0482: Interaction with lists of selectable items, e.g. menus (GUI interaction techniques)
    • G06F3/04845: GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04855: Interaction with scrollbars (scrolling or panning)
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G16H15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G16H20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H30/40: ICT specially adapted for the handling or processing of medical images, e.g. editing
    • G16H50/30: ICT specially adapted for medical diagnosis, simulation or data mining, for calculating health indices or individual health risk assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an information processing apparatus and an information processing method. The information processing apparatus includes: a comparison moving-image selection receiving unit that receives selection of a moving image to be compared; a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; and a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side in a layout corresponding to the item associated with the moving image selected by the comparison moving-image selection receiving unit.

Description

Information processing apparatus and information processing method
Technical Field
The invention relates to an information processing apparatus and an information processing method.
Background
In recent years, the effects of nursing care and rehabilitation training have been measured using motion capture technology, which digitally records the movements of a person or object.
One motion capture method attaches markers to the subject, photographs the subject with cameras from a plurality of directions, and converts the subject's movement into digital data by detecting the markers. Another method uses an infrared sensor to measure the distance from the sensor to the subject without markers, and records the subject's movement based on the measured values.
Patent document 1 discloses a mechanism for using a motion capture technique in a rehabilitation training field.
Patent document 1: japanese laid-open patent publication No. 2015-61577
Disclosure of Invention
Patent document 1 describes a mechanism that generates skeleton information from the coordinate information of each joint of a subject acquired by motion capture, and displays that skeleton information on a screen to show the subject's motion.
With the mechanism of patent document 1, to compare the subject's own past motion, or another person's motion, with the current motion, each image and moving image must be checked one by one, which makes it difficult to grasp differences in detail.
Further, when the backgrounds differ, for example, the difference in backgrounds makes it hard to see fine differences in the subject's movement.
To grasp differences in detail, the size and position of the persons being compared must be finely adjusted to match; this adjustment work is very time-consuming.
Therefore, an object of the present invention is to provide a mechanism that makes it easy to grasp the difference from a comparison target when comparing with another image or moving image.
Another object of the present invention is to provide a mechanism that reduces the time and effort required to adjust the size and position of the persons being compared.
The present invention relates to an information processing apparatus comprising: a comparison moving-image selection receiving unit that receives selection of a moving image to be compared; a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; and a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side in a layout corresponding to the item associated with the moving image selected by the comparison moving-image selection receiving unit.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a layout determining unit that determines a layout according to the movement direction of the subject; and a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side according to the layout determined by the layout determining unit.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side; and an inversion instruction receiving unit that receives an instruction to display the skeleton information of the subject, or the skeleton information of the comparison target, with its left and right inverted, wherein the display unit displays, with its left and right inverted, the skeleton information for which the instruction was received.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target superimposed on each other; a position adjustment instruction receiving unit that receives an instruction to adjust the positions of the skeleton information of the subject and the skeleton information of the comparison target; and a position adjustment unit that, when the instruction for position adjustment is received by the position adjustment instruction receiving unit, adjusts the position by matching any one of the joint coordinates of the subject with the corresponding joint coordinate of the skeleton information of the comparison target.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target superimposed on each other; a size adjustment instruction receiving unit that receives an instruction to adjust the sizes of the skeleton information of the subject and the skeleton information of the comparison target; and a size adjustment unit that, when the size adjustment instruction is received by the size adjustment instruction receiving unit, adjusts the size by matching the distance between any pair of the subject's joints with the corresponding inter-joint distance of the skeleton information of the comparison target.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side; a setting receiving unit that receives settings of the display format of the screen displayed by the display unit; and a storage unit that stores the setting contents received by the setting receiving unit.
The present invention also relates to an information processing apparatus including: a coordinate acquisition unit that acquires coordinate information of the joints of a subject; a skeleton information generating unit that generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit; a display unit that displays the skeleton information of the subject and the skeleton information of the comparison target side by side; a selection receiving unit that receives a selection as to whether or not to display the background image containing the subject on the screen displayed by the display unit; and a determination unit that determines whether or not to display the background image containing the subject according to the selection received by the selection receiving unit.
The present invention also relates to an information processing method in an information processing apparatus, including: a comparison moving-image selection receiving step in which the comparison moving-image selection receiving unit of the information processing apparatus receives selection of a moving image to be compared; a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step in which the skeleton information generating unit of the information processing apparatus generates skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; and a display step of displaying, on the display unit of the information processing apparatus, the skeleton information of the subject and the skeleton information of the comparison target side by side in a layout corresponding to the item associated with the moving image selected in the comparison moving-image selection receiving step.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step in which the skeleton information generating unit of the information processing apparatus generates skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a layout determination step in which the layout determination unit of the information processing apparatus determines a layout according to the movement direction of the subject; and a display step of displaying, on the display unit of the information processing apparatus, the skeleton information of the subject and the skeleton information of the comparison target side by side according to the layout determined in the layout determination step.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step in which the skeleton information generating unit of the information processing apparatus generates skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a display step of displaying the skeleton information of the subject and the skeleton information of the comparison target side by side on the display unit of the information processing apparatus; and an inversion instruction receiving step of receiving an instruction to display the skeleton information of the subject, or the skeleton information of the comparison target, with its left and right inverted, wherein the display step displays, with its left and right inverted, the skeleton information for which the instruction was received.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step in which the skeleton information generating unit of the information processing apparatus generates skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a display step of displaying, on the display unit of the information processing apparatus, the skeleton information of the subject and the skeleton information of the comparison target superimposed on each other; a position adjustment instruction receiving step in which the position adjustment instruction receiving unit of the information processing apparatus receives an instruction to adjust the positions of the skeleton information of the subject and the skeleton information of the comparison target; and a position adjustment step of, when the instruction for position adjustment is received in the position adjustment instruction receiving step, adjusting the position by matching any one of the joint coordinates of the subject with the corresponding joint coordinate of the skeleton information of the comparison target.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step of generating skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a display step of displaying, on the display unit of the information processing apparatus, the skeleton information of the subject and the skeleton information of the comparison target superimposed on each other; a size adjustment instruction receiving step in which the size adjustment instruction receiving unit of the information processing apparatus receives an instruction to adjust the sizes of the skeleton information of the subject and the skeleton information of the comparison target; and a size adjustment step in which, when the size adjustment instruction is received in the size adjustment instruction receiving step, the size adjustment unit of the information processing apparatus adjusts the size by matching the distance between any pair of the subject's joints with the corresponding inter-joint distance of the skeleton information of the comparison target.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step of generating skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a display step of displaying the skeleton information of the subject and the skeleton information of the comparison target side by side on the display unit of the information processing apparatus; a setting receiving step in which the setting receiving unit of the information processing apparatus receives settings of the display format of the screen displayed in the display step; and a storage step of storing the setting contents received in the setting receiving step in the storage unit of the information processing apparatus.
The present invention also relates to an information processing method in an information processing apparatus, including: a coordinate acquisition step in which the coordinate acquisition unit of the information processing apparatus acquires coordinate information of the joints of a subject; a skeleton information generating step of generating skeleton information of the subject based on the coordinate information acquired in the coordinate acquisition step; a display step of displaying the skeleton information of the subject and the skeleton information of the comparison target side by side on the display unit of the information processing apparatus; a selection receiving step in which the selection receiving unit of the information processing apparatus receives, on the screen displayed in the display step, a selection as to whether or not to display the background image containing the subject; and a determination step in which the determination unit of the information processing apparatus determines whether or not to display the background image containing the subject according to the selection received in the selection receiving step.
According to the present invention, when comparing with another image or moving image, the difference from the comparison target can be grasped easily.
Further, according to the present invention, the time and effort required to adjust the size and position of the persons being compared are reduced.
Drawings
Fig. 1 is a diagram showing a functional configuration of an information processing apparatus 101.
Fig. 2 is a diagram showing a hardware configuration of the information processing apparatus 101.
Fig. 3 is a flowchart showing a process of performing measurement while photographing a subject.
Fig. 4 is a flowchart showing a measurement process of each item in the case of using the motion information recorded in advance.
Fig. 5 is a flowchart showing the report making process.
Fig. 6 is a diagram showing an example of the subject list.
Fig. 7 is a diagram showing an example of coordinate information of each joint.
Fig. 8 is a diagram showing the situation of the subject acquired by the motion information acquisition unit 151.
Fig. 9 is a diagram showing an example of the skeletal information of the subject formed from the coordinate information of each joint.
Fig. 10 is a diagram showing an example of a screen for accepting a selection of a subject from a user to be displayed on the display 212 of the information processing apparatus 101 in the present invention.
Fig. 11 is a diagram showing an example of processing executed by the CPU201 of the information processing apparatus 101 when the selection of the item of "normal walking" is accepted from the user in step S301 in the present invention.
Fig. 12 is a diagram showing an example of a screen displayed on the display 212 of the information processing apparatus 101 when selection of an item of "normal walking" is accepted from the user in step S301 in the present invention.
Fig. 13 is a diagram showing an example of processing executed by the CPU201 of the information processing apparatus 101 in the case where selection of the item "stand with eyes open" is accepted from the user in step S301 in the present invention.
Fig. 14 is a diagram showing an example of a screen displayed on the display 212 of the information processing apparatus 101 when the information processing apparatus 101 starts photographing of the subject in step S302 in the present invention.
Fig. 15 is a diagram showing an example of processing for extracting the measurement result by the CPU201 of the information processing apparatus 101 in step S502 in the present invention.
Fig. 16 is a diagram showing an example of processing for the CPU201 of the information processing apparatus 101 to create a report in step S503 in the present invention.
Fig. 17 is a diagram showing an example of a screen displayed on the display 212 of the information processing apparatus 101 when the measurement result is extracted in the present invention.
Fig. 18 is a diagram showing an example of a screen displayed on the display 212 of the information processing apparatus 101 after the measurement result is extracted in the present invention.
Fig. 19 is a diagram showing an example of a measurement result management table in which measurement results are stored in the storage device 204 of the information processing apparatus 101 in the present invention.
Fig. 20 is a diagram showing an example of a report generated and output to a printing apparatus or the like by the CPU201 of the information processing apparatus 101 in the present invention.
Fig. 21 is an example of a screen in which a video image captured and acquired by the motion information acquiring unit 151 and skeleton information are combined.
Fig. 22 is an example of a screen in which the area of the region where the skeletal information is displayed is adjusted.
Fig. 23 is an example of a screen in which the moving image of the evaluation target and the moving image of the comparison target are displayed side by side vertically.
Fig. 24 is an example of a screen in which the moving image of the evaluation target and the moving image of the comparison target are displayed side by side horizontally.
Fig. 25 is a diagram showing an example of a table in which a layout is registered for each item.
Fig. 26 is a diagram showing the upper moving image on the screen of fig. 23 inverted left and right.
Fig. 27 is an example of a screen in which the moving image of the comparison target is displayed superimposed.
Fig. 28 is an example of a screen in which only skeleton information is displayed without displaying a background or the like.
Fig. 29 is a flowchart showing a process of evaluating the action of the subject and displaying a message for the subject and an assistant.
Fig. 30 is a diagram showing an example of the evaluation condition.
Fig. 31 is a diagram showing an example of a message table.
(Description of reference numerals)
101: an information processing apparatus.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a diagram showing a functional configuration of an information processing apparatus 101.
The motion information acquisition unit 151 detects the movement of a person (in the present invention, a subject, i.e., a person receiving nursing care or rehabilitation training) in the space where the care or rehabilitation training is performed, and acquires motion information of the subject.
Specifically, the motion information acquiring unit 151 is a sensor group known as Kinect (registered trademark), and incorporates an RGB camera, a depth sensor, an audio sensor, and the like, and is capable of recognizing the position, the movement, the audio, the face, and the like of an object (in the present invention, a subject (a person receiving nursing care or rehabilitation training)).
The depth sensor is a sensor that measures the distance to the subject using infrared rays. According to this aspect, the distance from the sensor to the subject can be measured without adding a marker or the like to the subject.
Each piece of data acquired by the motion information acquiring unit 151 is stored in the storage unit 165 as moving-image data.
The motion information acquired by each sensor of the motion information acquiring unit 151 is sent to the motion information analyzing unit 152, and data analysis is performed.
The motion information analysis unit 152 analyzes each piece of data acquired by the motion information acquisition unit 151, and converts the motion of the subject into coordinate information.
Fig. 8 shows the situation of the subject acquired by the motion information acquisition unit 151. The motion information acquiring unit 151 detects the movement of the subject and matches it against a human body pattern stored in advance to obtain the coordinates of the subject's body surface. Then, the coordinates of each joint forming the subject's skeleton are calculated, and the skeleton information of the subject is generated from the calculated coordinate information.
Fig. 7 shows an example of coordinate information of each joint. Fig. 9 shows an example of the skeletal information of the subject formed from the coordinate information of each joint.
By calculating the coordinate information of each joint and detecting changes in those coordinates, movements such as the subject raising a foot or bending a knee can be recognized.
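As an illustration (not the patent's actual code), the following Python sketch detects a foot-raise from per-frame joint coordinates; the joint names, the y-up metre coordinate convention, and the threshold are assumptions made for the example.

```python
from typing import Dict, Tuple

# Joint name -> (x, y, z) position in metres, one mapping per frame.
Joints = Dict[str, Tuple[float, float, float]]

def foot_raised(prev: Joints, curr: Joints, threshold: float = 0.05) -> bool:
    """Report a foot-raise when either heel's y-coordinate rises more than
    `threshold` metres between two frames (names/threshold are illustrative)."""
    for joint in ("heel_left", "heel_right"):
        if curr[joint][1] - prev[joint][1] > threshold:
            return True
    return False
```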
The measurement unit 153 performs measurement of the items based on the motion information analyzed by the motion information analysis unit 152 and the settings corresponding to the items.
The comparison moving-image selection receiving unit 154 receives selection of a moving image, stored in the storage unit 165, to be compared with the moving image being measured.
The coordinate acquisition unit 155 acquires the coordinate information of the joint of the subject converted by the motion information analysis unit 152.
The skeleton information generating unit 156 generates skeleton information of the subject based on the coordinate information acquired by the coordinate acquisition unit 155.
The display unit 157 displays the skeleton information of the subject and the skeleton information of the comparison target side by side, based on the layout corresponding to each measurement item and the layout determined by the layout determination unit 158. The display unit 157 can also display the skeleton information of the subject and the skeleton information of the comparison target superimposed on each other.
The layout determination unit 158 determines the layout of the moving-image data used to display the skeleton information based on the movement direction of the subject; specifically, it determines whether the moving-image data of the subject and the moving-image data of the comparison target are arranged vertically or horizontally.
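A minimal sketch of such a layout decision, under the assumption that the movement direction is estimated from the subject's displacement across the frame; mapping horizontal movement to a vertical (stacked) arrangement follows the normal-walking example of fig. 23, but the rule here is only illustrative.

```python
from typing import Tuple

def decide_layout(start: Tuple[float, float], end: Tuple[float, float]) -> str:
    """Pick a side-by-side arrangement from the subject's on-screen motion.

    Mostly horizontal motion (e.g. walking across the frame) produces wide
    panels, so the two moving images are stacked vertically; mostly vertical
    motion is arranged horizontally instead.
    """
    dx = abs(end[0] - start[0])
    dy = abs(end[1] - start[1])
    return "vertical" if dx >= dy else "horizontal"
```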
The inversion instruction receiving unit 159 receives an instruction to display the skeleton information of the subject, or the skeleton information of the comparison target, with its left and right inverted. When the inversion instruction is received, the display unit 157 displays the skeleton information for which the instruction was received with its left and right inverted.
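One plausible implementation of the inversion, mirroring each joint about a vertical axis and swapping left/right joint labels so the mirrored skeleton remains anatomically consistent (the naming scheme is an assumption):

```python
def mirror_skeleton(joints: dict, axis_x: float) -> dict:
    """Reflect every joint about the vertical line x = axis_x and swap
    the left/right suffixes of joint names."""
    mirrored = {}
    for name, (x, y, z) in joints.items():
        if name.endswith("_left"):
            name = name[: -len("_left")] + "_right"
        elif name.endswith("_right"):
            name = name[: -len("_right")] + "_left"
        mirrored[name] = (2 * axis_x - x, y, z)
    return mirrored
```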
The position adjustment instruction receiving unit 160 receives an instruction to adjust the positions of the skeleton information of the subject and the skeleton information of the comparison target.
When the position adjustment instruction is received by the position adjustment instruction receiving unit 160, the position adjustment unit 161 adjusts the positions of the two skeletons by matching any one of the joint coordinates of the subject with the corresponding joint coordinate of the skeleton information of the comparison target.
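A minimal sketch of this position adjustment, assuming it translates the comparison skeleton so that a chosen anchor joint coincides with the subject's corresponding joint (the anchor name is a placeholder):

```python
def align_positions(subject: dict, target: dict, anchor: str = "hip_center") -> dict:
    """Translate every joint of the comparison skeleton by the offset between
    the two anchor joints, so the anchors coincide after adjustment."""
    sx, sy, sz = subject[anchor]
    tx, ty, tz = target[anchor]
    dx, dy, dz = sx - tx, sy - ty, sz - tz
    return {name: (x + dx, y + dy, z + dz) for name, (x, y, z) in target.items()}
```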
The size adjustment instruction receiving unit 162 receives an instruction to adjust the sizes of the skeleton information of the subject and the skeleton information of the comparison target.
When the instruction is received by the size adjustment instruction receiving unit 162, the size adjustment unit 163 adjusts the size by matching the distance between any pair of the subject's joints with the corresponding inter-joint distance of the skeleton information of the comparison target. When an instruction to enlarge or reduce the skeleton information is received by the size adjustment instruction receiving unit 162, the size is adjusted by enlarging or reducing the skeleton information accordingly.
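A matching sketch of the size adjustment, assuming the comparison skeleton is scaled about a reference joint so that one chosen inter-joint distance equals the subject's (joint names are placeholders):

```python
import math

def scale_to_match(subject: dict, target: dict,
                   joint_a: str = "hip_center", joint_b: str = "head") -> dict:
    """Scale the comparison skeleton so its joint_a-joint_b distance equals
    the subject's; scaling is done about joint_a so that joint stays fixed."""
    ratio = (math.dist(subject[joint_a], subject[joint_b])
             / math.dist(target[joint_a], target[joint_b]))
    ox, oy, oz = target[joint_a]
    return {name: (ox + (x - ox) * ratio,
                   oy + (y - oy) * ratio,
                   oz + (z - oz) * ratio)
            for name, (x, y, z) in target.items()}
```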
The setting receiving unit 164 receives various settings (such as the comparison moving-image combination, layout, position, and size) on the screen that displays the skeleton information of the subject and the skeleton information of the comparison target.
Then, the set information is stored in the storage unit 165.
The selection receiving unit 166 receives a selection of displaying or not displaying the background image on the display screen.
The determination unit 167 determines whether or not to display the background video in accordance with the content received by the selection receiving unit 166.
Fig. 2 is a block diagram showing an example of the hardware configuration of the information processing apparatus 101 according to the embodiment of the present invention.
As shown in fig. 2, in the information processing apparatus 101, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a storage device 204, an input controller 205, an audio input controller 206, a video controller 207, a memory controller 208, and a communication I/F controller 209 are connected via a system bus 200.
The CPU201 generally controls each device and a controller connected to the system bus 200.
The ROM202 or the storage device 204 holds the BIOS (Basic Input/Output System) and OS (Operating System), which are control programs executed by the CPU201, a computer-readable program for implementing the information processing method, and various necessary data (including data tables).
The RAM203 functions as a main memory, a work area, and the like of the CPU 201. The CPU201 loads a necessary program or the like from the ROM202 or the storage device 204 to the RAM203 when executing processing, and executes the loaded program, thereby realizing various operations.
The input controller 205 controls input from an input device such as the keyboard/touch panel 210. The input device is not limited to a keyboard; it may be a mouse, or a touch panel such as a multi-touch screen capable of detecting positions touched by a plurality of fingers.
The user can perform various instructions by pointing at and pressing (touching with a finger or the like) an icon, a cursor, or a button displayed on the touch panel.
The input device is also used to input a communication destination from among the various available communication devices.
The sound input controller 206 controls input from the microphone 211. Voice recognition can be performed on the voice input from the microphone 211.
The video controller 207 controls display on an external output device such as the display 212. The display may be integrated with the main body, as in a notebook personal computer. The external output device is not limited to a display and may be, for example, a projector. A device capable of receiving touch operations can also accept input as the keyboard/touch panel 210.
The video controller 207 can control a video memory (VRAM) for performing display control, and as a video memory region, a part of the RAM203 can be used, or a dedicated video memory can be separately provided.
In the present invention, the following are provided: a 1st video memory area used for display when the user operates the information processing apparatus; and a 2nd video memory area used for display superimposed on the display content of the 1st video memory area when a predetermined screen is displayed. The number of video memory areas is not limited to 2; there may be more, as long as the resources of the information processing apparatus allow.
The memory controller 208 controls access to the external memory 213. As the external memory, an external storage device (hard disk) that stores a boot program, various applications, font data, user files, editing files, various data, and the like, a Floppy Disk (FD), a compact flash (registered trademark) memory that is connected to a PCMCIA card slot via an adapter, or the like can be used.
The communication control process by the network is executed by connecting/communicating with an external device via the communication I/F controller 209 and the network 214. For example, communication using TCP/IP, communication using a telephone line such as ISDN, and communication using a 3G line of a mobile phone can be performed.
Further, the storage device 204 is a medium for persistently storing information, and the manner is not limited to a storage device such as a hard disk. For example, the medium may be an SSD (Solid State Drive).
In addition, the storage device can also be used as a temporary storage area for the various processes performed in the communication terminal.
Next, a process of performing measurement while photographing a subject will be described with reference to a flowchart shown in fig. 3. Further, the processing shown in the flowchart of fig. 3 is processing in which the CPU201 of the information processing apparatus 101 reads out and executes a predetermined program.
In step S301, a selection of a subject and a selection of an item are received from a user.
The subject information is registered in the subject list shown in fig. 6. The measurement items are: "one-leg standing with eyes open", which measures how long the subject can stand on one leg with eyes open; "chair stand", which measures how many sit-to-stand motions can be performed within a fixed time; "maximum stride", which measures the maximum step length; "normal walking", which measures the time required to walk a distance of 5 m; and "TUG (Timed Up & Go)", which measures the time taken to stand up from a chair, walk a 3 m round trip, and sit back down on the chair.
Upon receiving a selection of an item, a setting file and a program necessary for measurement of the item are read.
The subject information may be selected from the already registered information or may be newly registered.
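As a rough illustration of the settings read when an item is selected in step S301, a configuration might take the following form; apart from the 5 m walking course, the 3 m TUG round trip, and the 24-second one-leg-standing threshold mentioned elsewhere in this description, every key and value is a placeholder assumption.

```python
# Hypothetical per-item measurement settings; the structure and names are assumptions.
ITEM_SETTINGS = {
    "one-leg standing with eyes open": {"pass_seconds": 24.0},    # see step S1307
    "chair stand":                     {"window_seconds": 30.0},  # placeholder window
    "maximum stride":                  {},
    "normal walking":                  {"course_m": 5.0, "limit_seconds": 6.0},  # limit is a placeholder
    "TUG (Timed Up & Go)":             {"round_trip_m": 3.0},
}
```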
In step S302, a photographing instruction is received from the user. Upon receiving the photographing instruction, the motion information acquisition unit 151 of the information processing apparatus 101 starts photographing the subject. That is, an image including the subject is acquired. Further, motion information, which is information obtained by each sensor (coordinate information of each joint of the subject, etc.), is acquired.
In step S303, when there are a plurality of persons in the photographed image, a selection is received as to which person is the subject.
The processing in step S303 will be described in detail below with reference to fig. 10.
Fig. 10 is a screen on which the information processing apparatus 101 accepts a selection from the user as to which person is the subject in step S303, and the screen of fig. 10 is displayed on the display 212 of the information processing apparatus 101.
In 1003 of fig. 10, the moving image acquired by the motion information acquiring unit 151 of the information processing apparatus 101 is displayed.
In 1001 of fig. 10, the status of the person (the subject or an assistant) acquired by the motion information acquiring unit 151 is displayed together with the skeleton information specified by the coordinate information of each joint.
First, when the person switching button 1005 is pressed by operating the mouse pointer 1004, the skeleton information of the subject is displayed in 1001 based on the information acquired by the motion information acquisition unit 151.
Next, the user specifies the subject among the persons 1001 shown in fig. 10 by operating the mouse pointer 1004.
Then, the user operates the mouse pointer 1004 to select, from the drop-down list 1002, who the person selected with the mouse pointer 1004 is.
In the pull-down list 1002, the names of the subjects included in the subject list (fig. 6) managed by the information processing apparatus 101 are displayed in advance.
In step S304, the coordinate information of each joint of the subject is acquired from the motion information acquired in step S302. When a recorded video is played back, the coordinate information of each joint is acquired from the motion information included in the video data being reproduced. Skeleton information of the subject is then generated based on the coordinate information.
Fig. 7 shows an example of coordinate information of each joint.
In step S305, the video acquired in step S303 and the skeleton information obtained from the coordinate information of each joint acquired in step S304 are combined and displayed on the display 212.
Fig. 21 shows an example of a screen in which a video image captured by the motion information acquisition unit 151 and skeleton information are combined.
In 2101 of fig. 21, the video and the skeleton information are displayed combined. In 2102, the skeleton information as seen from above the subject is displayed, and in 2103, the skeleton information as seen from the subject's side is displayed.
The skeleton information displayed in the areas 2102 and 2103 is calculated and generated based on the coordinate information of each joint acquired by the motion information acquiring unit 151.
The areas of the display regions 2102 and 2103 are adjusted so that they are just large enough to display the skeleton information. For example, the screen shown in fig. 21 is a screen without this adjustment; the skeleton information displayed at 2103 hardly moves, yet a large margin is left around it.
Fig. 22 is an example of the screen after the size adjustment. The blank area is reduced by fitting the display region to the skeleton information.
By suppressing useless blank space, the 3 pieces of skeleton information can be placed closer together and are therefore easier to observe.
As a method of determining the display range, for example, the rectangle that just contains the skeleton information can be computed and displayed with a predetermined width added as a margin.
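A sketch of that bounding-rectangle method; the margin width is illustrative.

```python
def display_bounds(joints: dict, margin: float = 0.1):
    """Return (x_min, y_min, x_max, y_max) of the smallest rectangle containing
    every joint, padded by `margin` on each side, so little blank space remains."""
    xs = [p[0] for p in joints.values()]
    ys = [p[1] for p in joints.values()]
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)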
In step S306, measurement corresponding to the item selected in step S301 is performed using the skeleton information analyzed in step S304.
The details of the specific processing will be described with reference to fig. 11, 13, and 29.
Next, with reference to fig. 11, in the present invention, a process executed by the CPU201 of the information processing apparatus 101 when the selection of the item "normal walking" is accepted from the user in step S301 will be described.
In the processing of fig. 11, for example, fig. 12 is a screen displayed on the display 212 of the information processing apparatus 101.
Here, fig. 12 is explained.
The measurement start (end) coordinates 1201 in fig. 12 are the preset (x, y, z) coordinates at which the measurement of normal walking starts and ends. Counting of seconds starts at the moment the x-coordinate of the subject's toe overlaps the x-coordinate of a measurement start (end) coordinate 1201 for the first time, and ends at the moment it overlaps a measurement start (end) coordinate 1201 for the second time.
For example, when the subject moves in from the left side of the screen in fig. 12, counting of seconds starts when the x-coordinate of the left measurement start (end) coordinate 1201 overlaps the x-coordinate of the subject's toe, and ends when the x-coordinate of the right measurement start (end) coordinate 1201 overlaps the x-coordinate of the subject's toe.
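A simplified sketch of this timing logic, assuming the subject moves left to right and the toe's x-coordinate is sampled once per frame; the actual process of fig. 11 also handles the opposite direction and frames in which the toe cannot be specified.

```python
def walking_timer(frames, start_x: float, end_x: float):
    """Time the walk between two measurement lines.

    `frames` yields (timestamp_seconds, toe_x) samples; counting starts when
    the toe first reaches start_x and stops when it reaches end_x. Returns
    the elapsed seconds, or None if both lines were never crossed.
    """
    t_start = None
    for t, toe_x in frames:
        if t_start is None and toe_x >= start_x:
            t_start = t                # first line reached: start counting
        elif t_start is not None and toe_x >= end_x:
            return t - t_start         # second line reached: stop counting
    return None
```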
In terms of the characteristics of the motion information acquiring unit 151, the accuracy of determining the coordinates of the joint is low in the range from both the left and right ends of the imaging range to a predetermined distance. Therefore, the measurement start (end) coordinates 1201 are set to positions (coordinates) that are a little distant from the left and right ends of the screen in fig. 12.
Fig. 12 also shows a time line 1202, which is displayed when recorded data obtained by video capture is being evaluated.
Reference numeral 1203 in fig. 12 denotes the evaluation time specifying unit. The user can specify an arbitrary time to evaluate by moving the evaluation time specifying unit 1203 with the mouse or a similar operation. Evaluating recorded data in this way is used when the user wants to evaluate the subject's movement in a section other than that between the measurement start (end) coordinates 1201.
The explanation returns to fig. 11.
In step S1101, the information processing apparatus 101 specifies the coordinates of the toe of the subject from the image of the frame to be currently processed.
In step S1102, the information processing apparatus 101 determines whether the coordinates of the subject's toe could be specified in step S1101 from the image of the frame currently being processed. If they could be specified, the process moves to step S1103; if not, the present process ends.
The reason why the process of step S1102 is executed is as follows. The motion information acquisition unit 151 specifies the position coordinates of each joint of the subject on the assumption that the subject faces the motion information acquisition unit 151. Therefore, when the subject turns sideways to the motion information acquisition unit 151, as in normal walking, it is difficult to acquire the coordinate information of each joint in steps S304 and S305 of fig. 3. Since the processing from step S1103 onward cannot be executed without the joint coordinate information, step S1102 determines whether that processing is executable.
In step S1103, the information processing apparatus 101 determines whether or not the coordinates of the toe of the subject specified in step S1101 are outside the 2 measurement start (end) coordinates 1201 set in advance.
When the coordinates of the toe of the subject determined in step S1101 are outside the 2 measurement start (end) coordinates 1201 set in advance, the information processing apparatus 101 ends the present process. When the coordinates of the toe of the subject specified in step S1101 are coordinates other than the outside (i.e., on the coordinates or on the inside) of the 2 measurement start (end) coordinates 1201 set in advance, the information processing apparatus 101 shifts the process to step S1104.
In step S1104, the information processing apparatus 101 determines whether or not the coordinate of the toe of the subject specified in step S1101 is any one of the 2 measurement start (end) coordinates 1201 set in advance.
When the coordinate of the toe of the subject specified in step S1101 is at any one of the 2 measurement start (end) coordinates 1201 set in advance, the information processing apparatus 101 shifts the process to step S1105. When the coordinates of the toe of the subject specified in step S1101 are not at any one of the 2 measurement start (end) coordinates 1201 set in advance, the information processing apparatus 101 determines that the coordinates of the toe of the subject specified in step S1101 are between (inside) the 2 measurement start (end) coordinates 1201 set in advance. Then, the process proceeds to step S1109.
In step S1105, the information processing apparatus 101 determines whether this is the first time the toe has reached one of the 2 preset measurement start (end) coordinates 1201 identified in step S1104.
If it is the first time, the information processing apparatus 101 advances the process to step S1106. If it is not the first time (i.e., it is the second time), the process moves to step S1107.
In step S1106, the information processing apparatus 101 starts counting the number of seconds, and ends the present process.
In step S1107, the information processing apparatus 101 ends the counting of seconds started in step S1106. Then, in step S1108, the movement of the subject is evaluated, and the evaluation is stored in the measurement result management table (fig. 19).
Describing the process of step S1108 more specifically, the information processing apparatus 101 first obtains the walking distance as the distance in the X direction between the measurement start point (X1, Y1, Z1) and the measurement end point (X2, Y2, Z2), that is, the absolute value of X1 - X2 (in meters).
The measurement start point referred to here means the measurement start (end) coordinates 1201 on the left side of the screen in fig. 12 when the subject has arrived from the left side of the screen in fig. 12 (the measurement start (end) coordinates 1201 on the right side when the subject has arrived from the right side), or the coordinates of the toe of the subject included in the image of the frame when the determination in step S1109 is "no".
In addition, the measurement end point referred to here means a measurement start (end) coordinate 1201 on the right side of the screen of fig. 12 in the case where the subject moves from the left side of the screen of fig. 12 (means a measurement start (end) coordinate 1201 on the left side in the case where the subject moves from the right side).
In this embodiment, the walking distance is basically calculated on the assumption that the subject walks straight across the frame.
However, the subject may not walk straight and may drift diagonally (toward the front or the back of the screen in fig. 12). Therefore, as another example, a method of calculating the walking distance that takes this diagonal drift into account is also proposed: the walking distance is calculated from both the X-coordinates and the Z-coordinates of the measurement start point (X1, Y1, Z1) and the measurement end point (X2, Y2, Z2).
Next, the information processing apparatus 101 determines the walking speed of the subject by dividing the walking distance determined previously by the time taken until the end of counting in step S1107.
Finally, the information processing apparatus 101 calculates the time that walking 5 m would take at the walking speed obtained above, and judges whether normal walking was performed correctly according to whether that time is longer or shorter than a preset time. The value of 5 m is the distance required to judge the subject's normal walking motion.
By performing the above calculation, it is possible to judge whether the subject performed the normal walking motion correctly even in, for example, a narrow room in which a full 5 m cannot be walked.
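The calculations of steps S1107-S1108 might be sketched as follows; the x/z distance corresponds to the diagonal-drift variant described above, and the preset time limit is a placeholder.

```python
import math

def evaluate_normal_walk(start, end, elapsed_s: float,
                         limit_s: float, course_m: float = 5.0):
    """Extrapolate the measured walking speed to the 5 m course.

    start/end are (x, y, z) measurement points; distance uses x and z so a
    diagonal drift toward or away from the sensor is still counted.
    Returns (time needed for the course, whether it is within the limit).
    """
    walked_m = math.hypot(end[0] - start[0], end[2] - start[2])
    speed = walked_m / elapsed_s            # m/s
    time_for_course = course_m / speed      # seconds to cover 5 m at this speed
    return time_for_course, time_for_course <= limit_s
```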
In step S1109, the information processing apparatus 101 determines whether counting of seconds has already started.
If counting has already started, the information processing apparatus 101 ends the present process. If counting has not yet started, the process proceeds to step S1106.
A determination of "no" in step S1109 means that counting has not started even though the subject is inside the 2 preset measurement start (end) coordinates 1201. For example, if the coordinate information of the subject's joints could not be identified at the moment the subject's toe reached one of the 2 preset measurement start (end) coordinates 1201, the determination in step S1109 is "no".
In this case, counting of seconds starts in step S1106 from the point at which the subject's toe coordinates are recognized between the measurement start (end) coordinates 1201, so the subject's movement can be evaluated without re-shooting.
This concludes the description of fig. 11.
Next, a process executed by the CPU201 of the information processing apparatus 101 when the selection of the item "stand with eyes open" is accepted from the user in step S301 in the present invention will be described with reference to fig. 13.
In step S1301, the information processing apparatus 101 determines whether or not the foot of the subject is lifted.
More specifically, first, before executing the process of step S1301, the information processing apparatus 101 determines in advance the value of the y coordinate of the heel of both feet of the subject. Then, it is determined that both feet contact the ground by determining that the difference between the values of the coordinates of the heel of each foot is within a predetermined range.
Next, in step S1301, it is determined whether the y coordinate of either foot in the image of the frame currently being processed has changed by a predetermined amount or more from the heel y-coordinate values of both feet determined before step S1301 was executed. When the change is equal to or greater than the predetermined amount, the information processing apparatus 101 determines that the foot is lifted.
If it is determined in step S1301 that the subject has lifted his or her foot, the information processing apparatus 101 proceeds to the process of step S1302. If it is determined that the subject's foot is not lifted, the information processing apparatus 101 proceeds to the process of step S1304.
In step S1302, the information processing apparatus 101 determines whether or not the foot of the subject determined to be lifted in step S1301 is lifted to a predetermined height.
If it is determined in step S1302 that the lifted foot of the subject has been raised to the predetermined height, the information processing apparatus 101 starts counting seconds in step S1303. If it is determined in step S1302 that the lifted foot has not been raised to the predetermined height, the information processing apparatus 101 ends the present process.
More specifically, the determination is made based on whether the y coordinate of each foot of the subject has changed by a predetermined amount or more from the heel y-coordinate values of both feet determined before the processing of step S1301 was executed. When the change is equal to or larger than the predetermined amount, the information processing apparatus 101 determines that the foot has been lifted to the predetermined height (see fig. 30).
Note that the predetermined amount used in step S1302 is larger than the predetermined amount used in step S1301.
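A minimal sketch of this two-threshold judgment for steps S1301 and S1302 follows. The threshold values and the dict-based heel representation are assumptions, since the embodiment only calls them "predetermined" amounts.

```python
LIFT_DELTA = 0.03      # assumed amount for step S1301, in meters
HEIGHT_DELTA = 0.10    # assumed larger amount for step S1302

# 'baseline_y' and 'current_y' map 'left'/'right' to heel y coordinates;
# the baseline was captured while both feet were judged to be grounded.
def foot_lifted(baseline_y, current_y, delta=LIFT_DELTA):
    """Step S1301: has either heel risen by the predetermined amount?"""
    return any(current_y[f] - baseline_y[f] >= delta for f in ('left', 'right'))

def foot_at_height(baseline_y, current_y):
    """Step S1302: the same test with the larger predetermined amount."""
    return foot_lifted(baseline_y, current_y, delta=HEIGHT_DELTA)
```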
In step S1304, the information processing apparatus 101 determines whether the foot of the subject was determined to be lifted (yes in step S1301) in a frame preceding the frame currently being processed.
If the foot of the subject was determined to be lifted in a preceding frame (yes in step S1304), the process proceeds to step S1305. If the foot was not determined to be lifted in a preceding frame (no in step S1304), it is determined that the foot of the subject is not lifted, and the present process is terminated.
In step S1305, the information processing apparatus 101 determines whether the foot of the subject is dropped.
More specifically, in step S1305, it is determined whether the y coordinate of each foot no longer differs by the predetermined amount or more from the heel y-coordinate values of both feet determined before step S1301 was executed; if it no longer differs, it is determined that the foot has been lowered.
If it is determined in step S1305 that the foot of the subject has been lowered, the process proceeds to step S1306, and the counting of seconds started in step S1303 is ended. If it is determined that the subject's foot has not been lowered, it is determined that the subject still has his or her foot raised, and the present process is terminated.
In step S1307, the information processing apparatus 101 evaluates the exercise ability of the subject based on whether the number of seconds counted between the start of counting in step S1303 and the end of counting in step S1306 is below or above a predetermined number of seconds (for example, 24 seconds). The evaluation result is stored in the measurement result management table (fig. 19).
This concludes the description of fig. 13.
In fig. 13, the information processing apparatus 101 automatically determines whether the subject has lifted or lowered his or her foot; however, as shown in fig. 14, the user (e.g., an evaluator) may also specify manually whether the subject has lifted or lowered the foot.
A mechanism by which the user (e.g., an evaluator) manually specifies the raising and lowering of the subject's foot will therefore be described below with reference to fig. 14.
Fig. 14 is a screen displayed on the display 212 of the information processing apparatus 101 when the information processing apparatus 101 starts photographing of the subject in step S302.
In 1401, the moving image acquired by the motion information acquisition unit 151 of the information processing apparatus 101 is displayed.
1402 is a lift-and-drop button. When the lift-and-drop button 1402 is pressed the first time by the user's operation of a mouse or the like, the processing of fig. 13 is stopped, it is determined that the subject lifted his or her foot at the moment of the press, and the counting of seconds is started.
When the lift-and-drop button 1402 is pressed the second time by the user's operation of a mouse or the like, the information processing apparatus 101 determines that the subject has lowered the foot and ends the counting of seconds. The exercise ability of the subject is then evaluated according to the counted number of seconds.
Further, in the present embodiment, the information processing apparatus 101 stops the processing of fig. 13 when the lift-and-drop button 1402 is pressed the first time. However, as another example, if the lift-and-drop button 1402 is pressed after the process of step S1303 in fig. 13, the information processing apparatus 101 may end the counting of seconds without performing the processes of steps S1304 and S1305. Conversely, if the lift-and-drop button 1402 is pressed before the process of step S1302, the information processing apparatus 101 may start the counting of seconds without performing the processes of steps S1301 and S1302, and then execute the processes from step S1304 onward.
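The basic behaviour of the lift-and-drop button 1402 (first press starts the count, second press ends it) could be sketched as follows; the class, names, and timing mechanism are assumptions for illustration, not the embodiment's implementation.

```python
import time

class LiftDropButton:
    """Sketch of button 1402: the first press starts the count of
    seconds, the second press ends it and yields the elapsed time."""
    def __init__(self):
        self.start = None
        self.seconds = None

    def press(self):
        if self.start is None:                    # 1st press: foot lifted
            self.start = time.monotonic()
        else:                                     # 2nd press: foot dropped
            self.seconds = time.monotonic() - self.start
            self.start = None
        return self.seconds
```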
As shown in fig. 14, the user (e.g., an evaluator) can manually specify the raising and lowering of the subject's foot. Thus, when the information processing apparatus 101 automatically determines the raising and lowering of the foot as in fig. 13, the determination result can be corrected manually, for example when the apparatus determines that the foot is not raised even though the subject has raised it, or determines that the foot is not lowered even though the subject has lowered it.
This concludes the description of fig. 14.
Next, a process of evaluating the movement of the subject and displaying a message for the subject and an assistant will be described with reference to fig. 29. The processing shown in the flowchart of fig. 29 is processing in which the CPU201 of the information processing apparatus 101 reads out and executes a predetermined control program.
In step S2901, it is determined whether the specified bone information satisfies a predetermined evaluation condition.
When the evaluation condition is satisfied (step S2901: yes), the process proceeds to step S2902.
If the evaluation condition is not satisfied (step S2901: no), the process proceeds to step S2903.
Fig. 30 shows an example of the evaluation condition. As shown in fig. 30, the evaluation condition and the evaluation content are associated with each other for each item.
For example, for the item "chair sitting", a standing movement is evaluated when the angle of the knee is 175 degrees or more, and a sitting movement is evaluated when the angle is 95 degrees or less. When one standing movement and one sitting movement have been performed, it is evaluated that the chair sitting movement was performed one time.
For the open-eye one-foot standing item, when the difference in height between the left and right feet is 5 cm or more, it is evaluated that the foot is lifted.
In step S2902, the evaluation value is updated, for example by addition. For the chair sitting item, one repetition is added. For the open-eye one-foot standing item, measurement of the one-foot standing time is started when the start condition is satisfied and ended when the end condition is satisfied.
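As an illustration of the fig. 30 conditions for the chair sitting item, the following sketch counts one repetition per stand-then-sit cycle. Only the 175-degree and 95-degree thresholds come from the text; the class and names are assumptions.

```python
STAND_ANGLE = 175.0   # degrees, from fig. 30
SIT_ANGLE = 95.0      # degrees, from fig. 30

class ChairSitCounter:
    """One repetition is counted each time a standing movement
    (knee angle >= 175) is followed by a sitting movement (<= 95)."""
    def __init__(self):
        self.standing = False
        self.repetitions = 0

    def update(self, knee_angle):
        if knee_angle >= STAND_ANGLE:
            self.standing = True
        elif knee_angle <= SIT_ANGLE and self.standing:
            self.repetitions += 1    # step S2902: add one time
            self.standing = False
```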
In step S2903, a message is determined based on the determined bone coordinates.
The message content is determined according to the message table shown in fig. 31. For example, for the item "chair sitting", when the angle of the knee is 175 degrees or more, the message "sit-up" is determined. When the subject shifts to the sitting movement while the angle of the knee is still less than 175 degrees (that is, when the knee angle starts to decrease before the subject has fully stood up), the message "please stand stably" is determined.
For the open-eye one-foot standing item, the message "please lift the foot" is determined while the foot is not lifted, the message "please lift the foot again" is determined when the lifted foot is too low, and the message "measuring" is determined while the foot is stably lifted.
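The fig. 31 message table could be represented as a simple lookup keyed by item and state, as sketched below; the state labels are assumptions, while the message strings follow the examples in the text.

```python
# Message strings follow the examples in the text; the item and state
# labels used as keys are assumptions.
MESSAGES = {
    ('chair sitting', 'standing'): 'sit-up',
    ('chair sitting', 'incomplete stand'): 'please stand stably',
    ('one-foot standing', 'foot not lifted'): 'please lift the foot',
    ('one-foot standing', 'foot too low'): 'please lift the foot again',
    ('one-foot standing', 'foot lifted'): 'measuring',
}

def decide_message(item, state):
    """Sketch of the fig. 31 lookup performed in step S2903."""
    return MESSAGES.get((item, state))
```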
By displaying such messages, the subject or an assistant can recognize whether the motion currently being performed is the one being evaluated, and what motion the subject should perform in order to be evaluated.
In step S2904, the message determined in step S2903 is displayed. An example is shown at 2201 in fig. 22.
Returning to the description of fig. 3.
The processing in steps S304 to S306 is performed each time the motion information acquisition unit 151 acquires a motion picture frame. By recognizing the bone information every time a frame is acquired, it is possible to detect a change in the coordinate information of each joint and recognize the motion.
In step S307, the CPU201 of the information processing apparatus 101 stores the result measured in step S306 (fig. 19).
Next, measurement processing of each item in the case of using the motion information recorded in advance will be described with reference to fig. 4.
Further, it is determined by the user's selection whether to perform the real-time measurement shown in fig. 3 (the process of performing the measurement while acquiring the motion information) or to perform the measurement using the motion information recorded in advance as shown in fig. 4.
In step S401, a selection of the moving image to be played back (a moving image obtained by photographing the motion of the subject to be measured) is received from the user, together with an instruction to play back the moving image.
The animation data includes data acquired by each sensor of the motion information acquiring unit 151.
In step S402, a selection of an animation to be compared with the animation selected in step S401 is received.
In step S403, various settings are accepted for the animation to be played back and the animation to be compared.
In step S404, the settings accepted in steps S402 and S403 are stored. The save processing is executed by pressing the save button 2801 in fig. 28.
In step S405, the animation to which the reproduction instruction has been given and the animation set as the comparison target are reproduced.
At this time, when the simultaneous playback button is pressed, the two moving images are played back simultaneously; when an individual playback button is pressed, only the corresponding moving image is played back.
Then, the processing after step S304 of fig. 3 is executed.
As shown in figs. 3 and 4, it is possible to switch between evaluating the motion in real time while photographing the subject and evaluating the motion after storing the animation obtained by photographing the subject. For example, when the user wants to give feedback to the subject on the spot, real-time evaluation can be set. When the user wants to first record a plurality of subjects and evaluate them afterwards, when the user (evaluator) wants to check a detailed movement by observing it repeatedly, when the user wants to evaluate a recorded animation under another item, when the user wants to confirm the angles of the subject's bones with the annotation function, or when a subject and an evaluator are both present, evaluation after storing the recorded animation can be set.
In addition, when the animation obtained by photographing the subject is stored and evaluated afterwards, there is the advantage that, if a false detection occurs, the user can manually correct the evaluation result without re-photographing.
Fig. 23 and 24 show an example of the comparison screen.
Fig. 23 is an example of a screen in which a moving image (skeleton information) of an evaluation target and a moving image (skeleton information) of a comparison target are displayed in parallel in the vertical direction.
Fig. 24 is an example of a screen in which a moving image (skeleton information) of the evaluation target and a moving image (skeleton information) of the comparison target are displayed in parallel in the horizontal direction.
The layout of the comparison screen (vertical arrangement or horizontal arrangement) is determined, for example, as follows: whether to use the vertical or the horizontal arrangement is registered for each item in advance, and the display follows the registered contents. Fig. 25 shows a table in which a layout is registered for each item.
As shown in fig. 25, a horizontal arrangement is registered for chair sitting and a vertical arrangement for normal walking. By registering in advance in this way, items in which the subject moves in the vertical direction ("chair sitting" and the like) can be set to the horizontal arrangement, and items in which the subject moves in the horizontal direction ("normal walking" and the like) can be set to the vertical arrangement. By determining the arrangement according to the direction of movement in this manner, a layout that is easy to compare can be displayed.
This method of registering the layout in advance is effective when the item is determined beforehand.
Information indicating which item was photographed is recorded in association with the animation data selected in steps S401 and S402, and the layout corresponding to the recorded item is specified using the table of fig. 25.
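A sketch of the fig. 25 lookup follows: the arrangement registered per item is consulted once the recorded item of the selected animation is known. The two entries shown are the examples from the text; the function name and the default arrangement are assumptions.

```python
LAYOUT_BY_ITEM = {
    'chair sitting': 'horizontal',   # subject moves vertically
    'normal walking': 'vertical',    # subject moves horizontally
}

def layout_for_item(item, default='vertical'):
    """Return the comparison-screen arrangement registered for an item."""
    return LAYOUT_BY_ITEM.get(item, default)
```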
There is also a method of determining the layout from the trajectory of the subject. For example, the layout is determined according to the subject's movement so that the two displays are arranged one above the other if the trajectory (moving direction) of the subject is the left-right direction, and side by side if the trajectory is the up-down direction.
By determining the layout from the trajectory of the subject in this manner, a layout that is easy to compare can be displayed even when the item is not specified in advance.
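The trajectory-based method might be sketched as follows, assuming the subject's per-frame x and y coordinates are available as sequences; comparing the horizontal and vertical extents of the trajectory picks the arrangement.

```python
def layout_from_trajectory(xs, ys):
    """Pick the arrangement from the extents of the subject's trajectory:
    mainly left-right movement -> displays stacked vertically,
    mainly up-down movement -> displays placed side by side."""
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    return 'vertical' if dx >= dy else 'horizontal'
```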
In addition, the left and right of the moving image can be reversed in the comparison screen.
Fig. 26 is a diagram showing a case where the upper animation in the screen of fig. 23 is reversed left and right. As shown in fig. 26, the person shot in the upper animation is oriented to the right in fig. 23, but oriented to the left in fig. 26.
An instruction to reverse the direction is received by checking the check box shown at 2601 in fig. 26. When "top screen inversion" in 2601 is checked, the upper screen is reversed; when "lower screen inversion" is checked, the lower screen is reversed. When both are checked, both the upper and lower screens are reversed.
By reversing in this way, for example even when the subject walked from left to right in a video taken 1 month ago but walks from right to left this time, the walking directions of the two videos can be aligned; this makes comparison easy and allows the difference between the two to be grasped in detail.
In addition, the images to be compared can be displayed not only in parallel but also in a superimposed manner on the comparison screen.
Fig. 27 shows an example of the overlay display. 2701 and 2702 in fig. 27 are images of comparison subjects. Further, 2703 is an image obtained by superimposing the images of 2701 and 2702.
In the comparison screen, the size and position of the image to be compared can be adjusted.
When "size" is checked in the automatic adjustment menu 2704 of fig. 27, the size is adjusted so that, for example, the length from the head to the waist matches that of the skeletal information of the comparison target. That is, the size adjustment is performed by matching the distance between two joints in the subject's bone information with the corresponding inter-joint distance in the bone information to be compared.
The adjustment can also be made manually by the user, not only automatically. In this case, the adjustment is performed by pressing the size adjustment buttons 2705 and 2706 in fig. 27: pressing the enlarge button 2705 enlarges the image, and pressing the shrink button 2706 shrinks it.
By performing the size adjustment in this manner, even when comparing people of different heights, or when the magnification differs from that of a person photographed in the past, the two can easily be compared and their difference grasped in detail.
When "position" is checked in the automatic adjustment menu 2704 of fig. 27, the position is adjusted so that, for example, the positions of the heads match in the bone information of the comparison targets. That is, the position adjustment is performed by matching one of the subject's joint coordinates with the corresponding joint coordinate in the bone information to be compared. In addition, the position may be adjusted so that the person is projected at the center of the composite image.
In addition, the position can be adjusted by the user performing a drag operation.
By adjusting the positions in this manner, the bone information to be compared can be placed at a desired position and compared side by side or in a superimposed state, which makes comparison easy and allows the difference between the two to be grasped in detail.
In addition, when a check is made in "both" of the automatic adjustment menu 2704, both the size and the position are adjusted.
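The automatic size and position adjustments could be sketched as below. The 2-D joint representation, the joint names 'head' and 'waist', and the function signatures are assumptions; the sketch only illustrates the matching of an inter-joint distance and of an anchor joint described above.

```python
# Joint dicts map joint names to (x, y) image coordinates.
def scale_factor(subject_joints, target_joints, a='head', b='waist'):
    """Size adjustment: scale the comparison target so its a-to-b
    distance matches the subject's."""
    def dist(joints):
        (x1, y1), (x2, y2) = joints[a], joints[b]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return dist(subject_joints) / dist(target_joints)

def position_offset(subject_joints, target_joints, anchor='head'):
    """Position adjustment: translate the comparison target so that
    the anchor joints coincide."""
    (sx, sy), (tx, ty) = subject_joints[anchor], target_joints[anchor]
    return sx - tx, sy - ty
```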
In addition, on the comparison screen, only the skeleton information can be displayed without displaying the background image including the subject.
Fig. 28 shows an example of a screen on which only skeleton information is displayed.
At the right end of the screen shown in fig. 28 there is a display switching menu; when an item such as "color" in this menu is checked, the background is displayed, and when the check is removed, only the skeleton information is displayed.
That is, a selection of whether to display a background image including the subject is received, and whether to display the background image including the subject is determined in accordance with the received content.
For example, when the animations to be compared were photographed in different rooms, the difference in background is distracting and it is difficult to compare the movements of the skeletons. In particular, in a composite image the different backgrounds are displayed overlapping each other and are therefore very difficult to view and compare. By eliminating the background, the comparison targets can be displayed clearly, and the difference between the two can be grasped in detail.
The contents set in the various screens (the combination of compared animations, screen layout, position, size, and the like) are stored as a comparison script and can be reused.
The information of the comparison scenario includes information of the motion picture file, a playback start position, a playback end position, a zoom state, an adjusted position of the composite video, and settings of left and right inversion.
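The comparison script might be modeled as a simple record like the following; the field names are assumptions mirroring the items just listed.

```python
from dataclasses import dataclass

@dataclass
class ComparisonScript:
    movie_files: tuple         # the compared animation files
    play_start: float          # playback start position
    play_end: float            # playback end position
    zoom: float                # zoom state
    composite_pos: tuple       # adjusted position of the composite video
    flip_top: bool = False     # left-right inversion of the upper screen
    flip_bottom: bool = False  # left-right inversion of the lower screen
```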
By storing the set contents as a comparison script in this way, the user is saved the time and labor of setting everything again when reviewing later.
Next, the report creation process will be described with reference to fig. 5.
In step S501, a selection of a subject is received from a user, and an output instruction of a measurement result (report) of the subject is received.
In step S502, the measurement result of the subject who has received the output instruction is extracted.
Here, the measurement result extraction process will be described in detail with reference to fig. 15.
In step S1501, the information processing apparatus 101 receives the setting of the extraction condition from the user. Fig. 17 shows an image of the extraction condition setting screen. In the screen of fig. 17, a start date 1701 and an end date 1702 are set as extraction conditions.
In step S1502, the information processing apparatus 101 receives a selection of an item from the user. In the screen of fig. 17, as item 1703, one is selected from open-eye one-foot standing, chair sitting, one-step maximum, normal walking, and TUG.
In step S1503, the information processing device 101 extracts the current and previous measurement results from the measurement result management table shown in fig. 19 in accordance with the extraction conditions and items received in steps S1501 and S1502. Specifically, this process is executed by pressing the extraction button 1704 in fig. 17: the measurement result management table is searched for the selected item 1703 within the range from the start date 1701 to the end date 1702, and the latest and the second-newest measurement results are extracted as the current and previous results.
By extracting the latest and second-newest measurement results within the designated period as the current and previous measurement results for each designated item (usually a plurality of items are designated), the measurement results for the report can be extracted without the user having to set a search condition for each item, search the measurement results, and select the current and previous results one by one.
Here, the measurement result management table will be described with reference to fig. 19. The measurement result management table includes, as columns, a measurement result ID 1901, a subject 1902, a measurement date and time 1903, an item 1904, a measurement value 1905, an evaluation score 1906, and an animation file 1907. The measurement result ID 1901 is a character string for uniquely identifying the measurement result. The subject 1902 is the subject who received the measurement. The measurement date and time 1903 is the date and time when measurement was started. The item 1904 is the item measured, one of open-eye one-foot standing, chair sitting, one-step maximum, normal walking, and TUG. The measurement value 1905 is a numerical value indicating the measurement result. The evaluation score 1906 is a score evaluated based on the measurement value 1905. The animation file 1907 is the file name of, or a link to, the animation data corresponding to the measurement result.
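Step S1503 amounts to a filtered, date-sorted lookup over this table. A sketch follows, with the table represented as a list of dicts whose keys loosely mirror the fig. 19 columns; this representation and the names are assumptions.

```python
from datetime import date

def extract_results(table, item, start, end):
    """Return the latest and second-newest results for 'item' whose
    date falls within [start, end], as (current, previous)."""
    hits = sorted(
        (r for r in table if r['item'] == item and start <= r['date'] <= end),
        key=lambda r: r['date'],
        reverse=True,
    )
    current = hits[0] if hits else None
    previous = hits[1] if len(hits) > 1 else None
    return current, previous

# e.g. extract_results(rows, 'normal walking',
#                      date(2016, 1, 1), date(2016, 10, 1))
```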
In step S1504, the information processing apparatus 101 displays the measurement result extracted in step S1503 on the display 212 or the like. Fig. 18 shows an image of the extraction result display screen. On the screen of fig. 18, a thumbnail image 1801 of the moving image data corresponding to the extracted measurement result is displayed.
Returning to the description of fig. 5.
In step S503, a report is generated based on the result extracted in step S502.
Here, the report creation process will be described in detail with reference to fig. 16.
The processing of steps S1601 to S1605 is repeatedly executed for each item for which selection has been accepted in step S1502.
In step S1601, the information processing apparatus 101 acquires the current and previous measurement results extracted in step S1503.
In step S1602, the information processing apparatus 101 determines whether or not the measurement result obtained in step S1601 is evaluated. Specifically, the determination is made based on whether or not a value is set in the evaluation score 1906 of the measurement result management table. If evaluated, the process proceeds to step S1604, and if not evaluated, the process proceeds to step S1603.
In step S1603, the information processing apparatus 101 evaluates the unevaluated measurement result, and registers the measurement value 1905 and the evaluation score 1906 in the measurement result management table.
In step S1604, the information processing apparatus 101 searches the frame images of the moving image corresponding to each measurement result and extracts a frame image at a timing characteristic of the item. The condition for determining the characteristic timing is defined in advance for each item. For example, for open-eye one-foot standing it is the timing at which the lifted foot is placed back on the ground, that is, the transition from the lifted state to the grounded state; for one-step maximum it is the timing at which the spread of the feet becomes largest, that is, the timing at which the lifted foot lands in the attempt judged to have the best record. Specific conditions are set based on the coordinate value of each joint, its change over time, the relationship between the coordinates of a plurality of joints, and so on. The characteristic frame image may be extracted in this process, or it may be extracted and stored at the time of evaluation and merely retrieved here. In the present embodiment, one characteristic frame image is extracted per item, but a plurality may be extracted depending on the condition.
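The frame search in step S1604 can be sketched as a scan with an item-specific predicate over consecutive joint snapshots; the frame representation and the foot-landing condition shown are assumptions.

```python
def characteristic_frame(frames, condition):
    """Scan (image, joints) pairs and return the image of the first
    frame where the item's characteristic-timing condition holds for
    the transition from the previous frame to this one."""
    previous = None
    for image, joints in frames:
        if previous is not None and condition(previous, joints):
            return image
        previous = joints
    return None

# e.g. a foot-landing condition (heel heights assumed in meters):
landed = lambda prev, cur: prev['heel_y'] > 0.05 and cur['heel_y'] <= 0.05
```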
In step S1605, the information processing apparatus 101 outputs, as report elements for each item, the evaluation result and a thumbnail of the characteristic frame image extracted in step S1604. When a plurality of characteristic frame images are extracted in step S1604, the corresponding thumbnail images may be output side by side, or they may be combined into one thumbnail image and output.
By outputting the evaluation result for each item together with a thumbnail of the characteristic frame image as the report elements in this way, not only the numerical measurement result but also the quality of the motion and the like can be grasped visually and intuitively.
In step S1606, report elements output for each item are collected, and information related to measurement, such as a report creation date, subject information, and measurement location information, is added to create a report.
Returning to the description of fig. 5.
In step S504, the CPU201 of the information processing apparatus 101 outputs the report created in step S503.
Fig. 20 shows an output example of the report created in this process and output in step S504. In the output example of fig. 20, the creation date, the implementer (subject), the place of implementation, and the person in charge are displayed in the report information column 2001; the comment from the evaluator is displayed in the integrated comment column 2002; and the spider-web graph of the evaluation scores for each item is displayed in the spider-web graph column 2003. In the pickup item field 2004, a thumbnail image and a skeleton image from the characteristic image extracted in step S1605 are displayed for the item selected by the user. The item category result field 2005 displays, for each item category, the current and previous execution dates, the results (measurement values), the degree of improvement determined from the current and previous evaluation scores, the evaluator's comments on the item, and thumbnail images of the characteristic frame images extracted in step S1605.
The program according to the present invention is a program for causing a computer to execute the processing shown in fig. 3 to 5, 11, 13, 15, 16, and 29. The program in the present invention may be a program for each of the processes in fig. 3 to 5, 11, 13, 15, 16, and 29.
As described above, it is needless to say that the object of the present invention can be achieved by supplying a recording medium on which a program for realizing the functions of the above-described embodiments is recorded to a system or an apparatus, and reading out and executing the program stored in the recording medium by a computer (or a CPU or MPU) of the system or the apparatus.
In this case, the program itself read from the recording medium realizes the new function of the present invention, and the recording medium on which the program is recorded constitutes the present invention.
Examples of the recording medium for supplying the program include a flexible disk, a hard disk, a magneto-optical disk, a CD-ROM, a CD-R, a DVD-ROM, a magnetic tape, a nonvolatile memory card, a ROM, an EEPROM, and a silicon disk.
It is needless to say that the functions of the above-described embodiments are realized not only by the computer executing the read program but also by the OS (operating system) or the like operating on the computer performing a part or all of the actual processing in accordance with the instructions of the program and realizing the functions of the above-described embodiments by the processing.
Further, the following is naturally included: after the program read from the recording medium is written in a memory provided in a function expansion board inserted into the computer or a function expansion unit connected to the computer, a part or all of actual processing is performed by a CPU or the like provided in the function expansion board or the function expansion unit in accordance with an instruction of the program code, and the functions of the above-described embodiments are realized by this processing.
The present invention may be applied to a system including a plurality of devices, or may be applied to an apparatus including one device. The present invention is also applicable to a case where the program is supplied to a system or an apparatus. In this case, by reading out the recording medium storing the program for achieving the present invention to the system or apparatus, the system or apparatus can enjoy the effects of the present invention.
Further, by downloading and reading the program for realizing the present invention from a server, a database, or the like on the network by using the communication program, the system or the apparatus can enjoy the effects of the present invention. Note that all configurations in which the above embodiments and modifications are combined are also included in the present invention.

Claims (18)

1. An information processing apparatus, comprising:
a comparison animation selection receiving unit for receiving the selection of the animation to be compared;
a coordinate acquisition unit that acquires coordinate information of a joint of a subject;
a bone information generating unit that generates bone information of the subject based on the coordinate information acquired by the coordinate acquiring unit;
a storage unit which stores an item and a layout for displaying an animation related to the item in association with each other; and
a display unit configured to display the skeletal information of the subject and the skeletal information to be compared in parallel in a layout stored in association with the item related to the animation selected by the comparison animation selection receiving unit.
2. The information processing apparatus according to claim 1, further comprising:
a selection receiving unit that receives a selection as to whether or not to display a background image including the subject on the screen displayed on the display unit; and
a determination unit configured to determine whether or not to display a background image including the subject, in accordance with the content of the selection accepted by the selection acceptance unit.
3. The information processing apparatus according to claim 2,
the determination means determines to display a background image including the subject when the selection of displaying the background image including the subject is accepted by the selection acceptance means,
the display unit displays an image in which a background image including the subject and the skeletal information are synthesized.
4. The information processing apparatus according to claim 2,
the determination unit determines not to display the background image including the subject when the selection reception unit receives a selection not to display the background image including the subject,
the display unit displays the skeletal information without displaying a background image including the subject.
5. An information processing apparatus, comprising:
a coordinate acquisition unit that acquires coordinate information of a joint of a subject;
a bone information generating unit that generates bone information of the subject based on the coordinate information acquired by the coordinate acquiring unit;
a layout determining unit that determines a layout according to an action direction of the subject; and
a display unit configured to display the bone information of the subject and the bone information to be compared in parallel according to the layout determined by the layout determination unit.
6. The information processing apparatus according to claim 5,
the layout determination means determines the layout so that the bone information of the subject and the bone information to be compared are displayed in a vertically aligned manner when the subject's movement is in the left-right direction, and the bone information of the subject and the bone information to be compared are displayed in a horizontally aligned manner when the subject's movement is in the up-down direction.
7. The information processing apparatus according to claim 5, further comprising:
a selection receiving unit that receives a selection as to whether or not to display a background image including the subject on the screen displayed on the display unit; and
a determination unit configured to determine whether or not to display a background image including the subject, in accordance with the content of the selection accepted by the selection acceptance unit.
8. The information processing apparatus according to claim 7,
the determination means determines to display a background image including the subject when the selection of displaying the background image including the subject is accepted by the selection acceptance means,
the display unit displays an image in which a background image including the subject and the skeletal information are synthesized.
9. The information processing apparatus according to claim 7,
the determination unit determines not to display the background image including the subject when the selection reception unit receives a selection not to display the background image including the subject,
the display unit displays the skeletal information without displaying a background image including the subject.
10. An information processing method in an information processing apparatus, comprising:
a comparison animation selection reception step in which a comparison animation selection reception unit of the information processing apparatus receives selection of an animation to be compared;
a coordinate acquisition step in which a coordinate acquisition unit of the information processing device acquires coordinate information of a joint of a subject;
a bone information generating step of generating bone information of the subject based on the coordinate information acquired in the coordinate acquiring step by bone information generating means of the information processing apparatus;
a storage step of storing items in association with a layout for displaying animation related to the items; and
a display step of displaying the bone information of the subject and the bone information to be compared in parallel in a layout stored in association with the item related to the animation selected in the comparison animation selection reception step.
11. The information processing method according to claim 10, further comprising:
a selection receiving step in which a selection of whether or not to display a background image including the subject is received by the selection receiving means of the information processing apparatus on the screen displayed in the display step; and
a determination step in which the determination means of the information processing device determines whether or not to display a background image including the subject, in accordance with the content of the selection received in the selection reception step.
12. The information processing method according to claim 11,
in the determining step, when the selection of displaying the background image including the subject is accepted in the selection accepting step, the background image including the subject is determined to be displayed,
in the display step, an image in which a background image of the subject and bone information are combined is displayed.
13. The information processing method according to claim 11,
in the determining step, when the selection of not displaying the background image including the subject is accepted in the selection accepting step, it is determined not to display the background image including the subject,
in the display step, the skeletal information is displayed without displaying a background image including the subject.
14. An information processing method in an information processing apparatus, comprising:
a coordinate acquisition step in which a coordinate acquisition unit of the information processing device acquires coordinate information of a joint of a subject;
a bone information generating step of generating bone information of the subject based on the coordinate information acquired in the coordinate acquiring step by bone information generating means of the information processing apparatus;
a layout determination step in which a layout determination means of the information processing apparatus determines a layout in accordance with an operation direction of the subject; and
a display step of displaying the bone information of the subject and the bone information to be compared in parallel, in the layout determined in the layout determination step, on a display unit of the information processing device.
15. The information processing method according to claim 14,
in the layout determination step, the layout is determined so that the bone information of the subject and the bone information to be compared are displayed in a vertically aligned manner when the movement of the subject is in the left-right direction, and the bone information of the subject and the bone information to be compared are displayed in a horizontally aligned manner when the movement of the subject is in the up-down direction.
16. The information processing method according to claim 14, further comprising:
a selection receiving step in which a selection of whether or not to display a background image including the subject is received by the selection receiving means of the information processing apparatus on the screen displayed in the display step; and
a determination step in which the determination means of the information processing device determines whether or not to display a background image including the subject, in accordance with the content of the selection received in the selection reception step.
17. The information processing method according to claim 16,
in the determining step, when the selection of displaying the background image including the subject is accepted in the selection accepting step, the background image including the subject is determined to be displayed,
in the display step, an image in which a background image of the subject and bone information are combined is displayed.
18. The information processing method according to claim 16,
in the determining step, when the selection of not displaying the background image including the subject is accepted in the selection accepting step, it is determined not to display the background image including the subject,
in the display step, the skeletal information is displayed without displaying a background image including the subject.
CN201610908922.1A 2015-10-29 2016-10-19 Information processing apparatus and information processing method Active CN106650217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910679377.7A CN110391017A (en) 2015-10-29 2016-10-19 Information processing unit and information processing method

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
JP2015213489A JP2017080199A (en) 2015-10-29 2015-10-29 Information processing device, information processing method and program
JP2015213492A JP2017080201A (en) 2015-10-29 2015-10-29 Information processing device, information processing method and program
JP2015-213496 2015-10-29
JP2015-213492 2015-10-29
JP2015213491A JP2017084194A (en) 2015-10-29 2015-10-29 Information processing device, information processing method, and program
JP2015-213491 2015-10-29
JP2015213496A JP2017080203A (en) 2015-10-29 2015-10-29 Information processing device, information processing method and program
JP2015-213490 2015-10-29
JP2015213490A JP6803111B2 (en) 2015-10-29 2015-10-29 Information processing equipment, information processing methods, programs
JP2015-213489 2015-10-29

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201910679377.7A Division CN110391017A (en) 2015-10-29 2016-10-19 Information processing unit and information processing method

Publications (2)

Publication Number Publication Date
CN106650217A CN106650217A (en) 2017-05-10
CN106650217B true CN106650217B (en) 2020-06-30

Family

ID=58855608

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201610908922.1A Active CN106650217B (en) 2015-10-29 2016-10-19 Information processing apparatus and information processing method
CN201910679377.7A Pending CN110391017A (en) 2015-10-29 2016-10-19 Information processing unit and information processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910679377.7A Pending CN110391017A (en) 2015-10-29 2016-10-19 Information processing unit and information processing method

Country Status (1)

Country Link
CN (2) CN106650217B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107274467A (en) * 2017-06-29 2017-10-20 厦门游亨世纪科技有限公司 A kind of model animation interlock method based on Unity3D
JP2021182175A (en) 2018-08-10 2021-11-25 ソニーグループ株式会社 Information processing apparatus, information processing method, and program
CN109411089A (en) * 2018-09-05 2019-03-01 广州维纳斯家居股份有限公司 Determination method, apparatus, intelligent elevated table and the storage medium of user health situation
CN111026277A (en) * 2019-12-26 2020-04-17 深圳市商汤科技有限公司 Interaction control method and device, electronic equipment and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4264368B2 (en) * 2004-02-24 2009-05-13 日本ナレッジ株式会社 Practical skill analysis system and program
JP4093226B2 (en) * 2004-11-08 2008-06-04 ソニー株式会社 Information processing apparatus and method, recording medium, and program
CN100534129C (en) * 2006-01-09 2009-08-26 上海乐金广电电子有限公司 System providing and editing action effect using video signal and its method
WO2009050846A1 (en) * 2007-10-16 2009-04-23 Panasonic Corporation Image display device and image display method
CN103155003B (en) * 2010-10-08 2016-09-28 松下电器产业株式会社 Posture estimation device and posture estimation method
CN102831380A (en) * 2011-06-15 2012-12-19 康佳集团股份有限公司 Body action identification method and system based on depth image induction
JP2015061577A (en) * 2013-01-18 2015-04-02 株式会社東芝 Movement-information processing device
JP6188332B2 (en) * 2013-01-21 2017-08-30 東芝メディカルシステムズ株式会社 Medical image display apparatus and program
EP3091737A3 (en) * 2013-02-14 2017-02-15 Panasonic Intellectual Property Management Co., Ltd. Digital mirror apparatus
EP3063616A1 (en) * 2013-10-30 2016-09-07 Barco Control Rooms GmbH Synchronization of videos in a display wall

Also Published As

Publication number Publication date
CN106650217A (en) 2017-05-10
CN110391017A (en) 2019-10-29

Similar Documents

Publication Publication Date Title
JP6842044B2 (en) Information processing equipment, information processing methods, programs
CN106650217B (en) Information processing apparatus and information processing method
US20190220659A1 (en) Information processing apparatus, information processing method, and storage medium
KR101514170B1 (en) Input device, input method and recording medium
JP5674465B2 (en) Image processing apparatus, camera, image processing method and program
JP6432592B2 (en) Information processing apparatus, information processing method, and program
KR20190045317A (en) Image processing apparatus, image generation method, and computer program
JP5613741B2 (en) Image processing apparatus, method, and program
JP6744559B2 (en) Information processing device, information processing method, and program
JP2017080197A (en) Information processing device, information processing method and program
US20150002518A1 (en) Image generating apparatus
JP6489117B2 (en) Information processing apparatus, information processing method, and program
JP5456175B2 (en) Video surveillance device
JP2011019627A (en) Fitness machine, method and program
JPWO2015141268A1 (en) Information processing apparatus, information processing method, and program
JP2020140719A (en) Information processor, information processing method, and program
JP2017080199A (en) Information processing device, information processing method and program
JP6803111B2 (en) Information processing equipment, information processing methods, programs
JP2017080203A (en) Information processing device, information processing method and program
JP2017080201A (en) Information processing device, information processing method and program
JP2017085427A (en) Information processing device, information processing method, and program
JP2017080198A (en) Information processing device, information processing method, and program
JP2017080202A (en) Information processing device, information processing method and program
JP5876121B2 (en) Image processing apparatus, method, and program
JP6631594B2 (en) Information processing apparatus, control method for information processing apparatus, and program

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant