WO2021241558A1 - System and operation method - Google Patents

System and operation method

Info

Publication number
WO2021241558A1
WO2021241558A1 (PCT/JP2021/019754)
Authority
WO
WIPO (PCT)
Prior art keywords
error
user
unit
limb
contact
Prior art date
Application number
PCT/JP2021/019754
Other languages
French (fr)
Japanese (ja)
Inventor
Shouren Huang
Yuji Yamakawa
Masatoshi Ishikawa
Original Assignee
The University of Tokyo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University of Tokyo
Priority to CN202180034660.XA priority Critical patent/CN115605932A/en
Priority to US17/924,379 priority patent/US20230186784A1/en
Publication of WO2021241558A1 publication Critical patent/WO2021241558A1/en

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • G09B 19/003: Repetitive work cycles; Sequence of movements
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • G09B 19/24: Use of tools

Definitions

  • the present invention relates to a system and an operation method.
  • there are many situations in which humans use their limbs to perform tasks involving predetermined movements; Patent Document 1 discloses a skill training device used for training such a predetermined movement.
  • since the skill training device disclosed in Patent Document 1 notifies the user, as information, when an operation deviates from the prescribed one, the user must consciously read that information. Therefore, the learning effect is low depending on the age and motivation of the user.
  • in view of the above, the present invention provides a technique for assisting a user in effectively learning a predetermined operation regardless of the user's age and motivation.
  • a system includes a first contact portion, a sensor portion, and a second contact portion.
  • the first contact portion is connected to the operated portion.
  • by coming into contact with the user's first limb, the target position defined by the operated portion is variably configured according to the movement of the first limb.
  • the sensor unit is configured to measure an error from a predetermined trajectory of the target position.
  • the second contact portion includes an error sensation presenting portion and is configured to come into contact with a second limb different from the user's first limb.
  • the error sensation presenting unit is configured to present an error to the user by applying a force sense or a tactile sense based on the error to the second limb.
  • the user can effectively learn a predetermined operation regardless of the user's age and motivation.
  • the program for realizing the software appearing in the present embodiment may be provided as a computer-readable non-transitory recording medium (non-transitory computer-readable medium), may be provided so as to be downloadable from an external server, or may be provided so that the program is started on an external computer and its functions are realized on a client terminal (so-called cloud computing).
  • the "unit" in the present embodiment may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing concretely realized by those hardware resources.
  • various information is handled in the present embodiment; this information is represented, for example, by the physical quantity of a signal value representing a voltage or current, by the high and low of signal values as a bit aggregate of binary numbers composed of 0s and 1s, or by quantum superposition (so-called qubits), and communication and computation can be executed on circuits in the broad sense.
  • a circuit in a broad sense is a circuit realized by at least appropriately combining circuits, circuitry, processors, memories, and the like. That is, it includes application-specific integrated circuits (ASICs), programmable logic devices (for example, simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field-programmable gate arrays (FPGAs)), and the like.
  • FIGS. 1 and 2 are schematic views showing the overall configuration of the system 1.
  • the user U can use the system 1 to perform training on a predetermined operation.
  • the training here may be training for a healthy user U to acquire a predetermined movement, or training for an injured user U for the purpose of rehabilitation.
  • the system 1 includes an image pickup device 2 (an example of a sensor unit), a control device 3, and a main device 4, which are electrically connected to one another.
  • the image pickup device 2 is a so-called vision sensor (camera) configured to capture information about the outside world; it is particularly preferable to use so-called high-speed vision with a high frame rate.
  • the image pickup device 2 (sensor unit) is configured to measure the error E from the predetermined trajectory of the target position TP. This will be described in more detail later.
  • the frame rate (acquisition rate) of the image pickup device 2 (sensor unit) is preferably 100 fps (hertz) or more; more specifically, it may be, for example, any value from 100 fps to 2000 fps in increments of 25 fps, or within a range between any two of these values.
  • the image pickup device 2 is connected to the communication unit 31 in the control device 3 described later by a telecommunication line (for example, a USB cable or the like), and is configured to be able to transfer the captured image IM to the control device 3.
  • a camera capable of measuring not only visible light but also bands imperceptible to humans, such as the ultraviolet and infrared regions, may also be adopted. By adopting such a camera, the system 1 according to the present embodiment can be used even in a dark field.
  • FIG. 3 is a block diagram showing the hardware configuration of the control device 3. As shown in FIG. 3, the control device 3 has a communication unit 31, a storage unit 32, and a control unit 33, and these components are electrically connected inside the control device 3 via a communication bus 30. Each component is described further below.
  • although wired communication means such as USB, IEEE 1394, Thunderbolt, and wired LAN network communication are preferable, the communication unit 31 may also include wireless LAN network communication, mobile communication such as 3G/LTE/5G, Bluetooth (registered trademark) communication, and the like as necessary. That is, it is more preferable to implement the unit as a set of these plural communication means. In this way, information and commands are exchanged between the control device 3 and other communicable devices.
  • the storage unit 32 stores the various information defined in the foregoing description. It can be implemented, for example, as a storage device such as a solid state drive (SSD), or as a memory such as a random access memory (RAM) that stores temporarily necessary information (arguments, arrays, etc.) related to program operations; a combination of these may also be used. The storage unit 32 also stores various programs readable by the control unit 33 described below, and further stores the time-series images IM captured by the image pickup device 2 and received by the communication unit 31.
  • the image IM is, for example, array information including pixel information of 8 bits for each of RGB.
  • the control unit 33 processes and controls the overall operation related to the control device 3.
  • the control unit 33 is, for example, a central processing unit (CPU) (not shown).
  • the control unit 33 realizes the various functions of the control device 3 by reading predetermined programs stored in the storage unit 32. That is, information processing by the software (stored in the storage unit 32) is concretely realized by the hardware (control unit 33), so that it can be executed as the functional units in the control unit 33 shown in FIG. 3.
  • although a single control unit 33 is shown in FIG. 3, the configuration is not limited to this; a plurality of control units 33 may be provided, one for each function, or a combination thereof may be used.
  • FIG. 4 is a schematic diagram showing a hardware configuration of the main device 4.
  • the main device 4 is a device in which the user U can operate the operated portion 43 using his or her limbs. The main device 4 also receives the control signal CS from the control device 3 and is driven in various ways in accordance with it.
  • the main device 4 includes a first contact portion 41 and a second contact portion 42.
  • the first contact portion 41 is connected to the operated portion 43.
  • the first contact portion 41 is in contact with the first limb HF1 of the user U, so that the target position TP defined by the operated portion 43 is variably configured according to the movement of the first limb HF1.
  • the range of the target position TP that the user U can move using the first contact portion 41 is referred to as the first range.
  • the second contact portion 42 includes an error sensation presenting unit 45 and is configured to come into contact with a second limb HF2 different from the first limb HF1 of the user U.
  • the error sensation presenting unit 45 is configured to present the error E to the user U by applying a force or tactile sensation based on the error E, measured via the image pickup device 2, to the second limb HF2.
  • the forms of the first contact portion 41 and the second contact portion 42 are not particularly limited; an appropriate form should be selected according to the usability of contacting the first limb HF1 or the second limb HF2. For example, if the first limb HF1 and the second limb HF2 are the left and right hands (left hand LH and right hand RH) of the user U, the first contact portion 41 and the second contact portion 42 should be configured so that they can be grasped by the left hand LH and the right hand RH, respectively.
  • the main device 4 further includes a position adjusting unit 44.
  • the position adjusting unit 44 is, for example, a stage that can be driven in the XY direction, and it is preferable that the operated portion 43 can be displaced in a second range smaller than the first range that can be operated by the user U. With such a configuration, the position adjusting unit 44 can adjust the target position TP at the operated portion 43 so as to correct the error E.
  • the lower of the frame rate of the image pickup device 2 and the drive rate of the position adjusting unit 44 functions as the control rate for correcting the error E; by making the two rates comparably high, the error E of the target position TP can be corrected by feedback control alone, without any prediction.
  • the drive rate of the position adjusting unit 44 is 100 hertz or more, similar to the image pickup apparatus 2.
  • the correction by the position adjusting unit 44 is similar to the camera shake correction of the camera, and supplementarily realizes an appropriate predetermined operation.
  • the user U may also train to correctly perform the predetermined operation without the position adjusting unit 44; in such a case a more demanding operation is required of the user U, but this does not preclude carrying out such training.
  • FIG. 5 is a block diagram showing a functional configuration of the control device 3 (control unit 33).
  • the control device 3 includes a reception unit 331, an image processing unit 332, a calculation unit 333, and a control signal generation unit 334.
  • the reception unit 331 receives information via the communication unit 31 or the storage unit 32 and is configured to read it into a working memory.
  • the reception unit 331 is configured to receive various information (image IM, displacement information of the position adjustment unit 44, etc.) from the image pickup device 2 and / or the main device 4 via the communication unit 31.
  • when the control device 3 is connected to other devices, the reception unit 331 may be implemented so as to receive information transmitted from those devices.
  • various information received by the reception unit 331 will be described as being stored in the storage unit 32.
  • after the reception unit 331 has received information and temporarily read it into the working memory, at least part of that information need not be stored in the storage unit 32; at least part of the information may also be stored in an external server other than the storage unit 32.
  • the image processing unit 332 is configured to read a program stored in the storage unit 32 and execute predetermined image processing on the image IM. For example, the image processing unit 332 executes image processing for identifying the line L, which is the predetermined trajectory, from the image IM. Details are described later.
  • the calculation unit 333 is configured to perform a predetermined calculation using the image IM whose image processing has been executed by the image processing unit 332. For example, the calculation unit 333 calculates the error vector v1 or the symmetry vector v2 from the image IM. Details will be described later.
  • the control signal generation unit 334 is configured to generate the control signal CS for controlling the main device 4. Specifically, the control signal generation unit 334 generates the control signal CS1 that drives the position adjusting unit 44 and the control signal CS2 that operates the error sensation presenting unit 45.
  • the value of the control signal CS may be defined by, for example, a voltage.
  • FIG. 6 is an activity diagram showing an operation method of the system 1.
  • for simplicity, assume that the user U is right-handed, the first limb HF1 is the right hand RH, and the second limb HF2 is the left hand LH. That is, the user U grasps the first contact portion 41 with the right hand RH and grasps the second contact portion 42 with the left hand LH (activity A101); grasping is an example of contact.
  • the user U operates the first contact portion 41 with the right hand RH to move the target position TP at the operated portion 43 along the line L which is a predetermined trajectory (activity A102).
  • such an operation is involved in, for example, cutting work, coating work, medical practice, and the like.
  • when the user U displaces the first contact portion 41, the target position TP is displaced accordingly.
  • the image pickup device 2 captures the target position TP and the line L, and the image IM is transmitted to the control device 3 (activity A103).
  • the reception unit 331 receives the image IM, which is stored in the storage unit 32.
  • FIG. 7 is a schematic diagram showing an example of an image IM in which the image processing unit 332 performs image processing.
  • the image processing unit 332 analyzes the image IM received by the reception unit 331 by image processing, and specifies the position of the line L in the image IM (activity A104). This is performed, for example, by binarizing the captured image IM by setting a threshold value for a predetermined parameter (brightness, etc.) related to the image. Then, the position of the line L can be specified by calculating the position of the center of gravity of the line L from the image IM.
  • the target position TP may be implemented as an intersection of the line of sight of the image pickup apparatus 2 and the defined surface P.
  • the image pickup device 2 is attached to the position adjusting unit 44. That is, the target position TP is the center of the image IM (image center CT) imaged by the image pickup apparatus 2.
  • image processing may be performed on a predetermined region ROI that is a part of the image IM. In particular, since the error E is corrected at a high control rate, the line L stays in the vicinity of a fixed position in the image IM (for example, the image center CT); by taking the region near this fixed position as the predetermined region ROI, the number of pixels to be processed can be reduced, which lowers the computational load of the control device 3 and maintains a high control rate.
  • the calculation unit 333 calculates an error vector v1 representing an error E between the target position TP (image center CT) and the line L (activity A105).
  • FIG. 8 is a schematic diagram showing the error vector v1. If the error E falls within the second range, which is the movable range of the position adjusting unit 44, the control signal generation unit 334 generates a control signal CS1 that corrects the error E and transmits it to the position adjusting unit 44 (activity A106). Further, the control signal generation unit 334 generates a control signal CS2 that presents the error E to the user U and transmits it to the error sensation presenting unit 45 (activity A107).
  • the control signal CS1 is transmitted via the communication unit 31 to the position adjusting unit 44 in the main device 4, which drives the position adjusting unit 44 and thereby corrects the error E.
  • the control method in this case is not particularly limited, but for example, P control, PD control, PID control and the like can be appropriately adopted. A preferable value may be set for each coefficient related to the control, if necessary.
  • likewise, when the control signal CS2 is transmitted via the communication unit 31 to the error sensation presenting unit 45 in the main device 4, the error sensation presenting unit 45 operates, whereby the error E can be presented to the user U.
  • on the other hand, if the error E does not fall within the second range, which is the movable range of the position adjusting unit 44, the control signal generation unit 334 may generate only the control signal CS2 that presents the error E to the user U, without generating the control signal CS1 that corrects the error E, and transmit it to the error sensation presenting unit 45 (activity A107).
  • the force or tactile sensation based on the error E is determined in proportion to the error vector v1 representing the error E. That is, in order to present the magnitude (degree) and direction of the error E to the user U, the force or tactile sensation should be applied to the user U as a vector proportional to the error vector v1 (the proportionality constant may be a positive or negative number, including 1).
  • the force sense or the tactile sense based on the error E is converted into a frequency suitable for human sense presentation and presented. By presenting the force or tactile sensation at a frequency perceptible to humans, the user U can grasp the state of the error E.
  • in summary, the operation method of this system 1 comprises first to fourth steps.
  • in the first step, the first limb HF1 of the user U is brought into contact with the first contact portion 41 of the system 1, and the second limb HF2 of the user U is brought into contact with the second contact portion 42 of the system 1.
  • in the second step, the target position TP defined by the operated portion 43 of the system 1 is moved by moving the first limb HF1 in contact with the first contact portion 41.
  • in the third step, the error E of the target position TP from the predetermined trajectory is measured.
  • in the fourth step, the error E is presented to the user U by applying a force or tactile sensation based on the error E to the second limb HF2 in contact with the second contact portion 42.
  • preferably, the first limb HF1 and the second limb HF2 are the left and right hands (left hand LH and right hand RH) or the left and right feet (left foot LF and right foot RF) of the user U.
  • Humans use the coordinated movement of both arms to perform various and complicated tasks. In order to move both human arms in a coordinated manner, it is thought that there is a brain mechanism that enables them to cooperate with each other while interfering with each other.
  • when a force or tactile sensation is applied to the left hand LH, the user U quickly adjusts the right hand RH in the direction that corrects the error E through left-right synchronized movement. Through such a control process, the user U can train and learn a predetermined motion more intuitively and effectively, regardless of the user U's age and motivation.
  • the force or tactile sensation based on the error E may be determined in proportion to the symmetry vector v2, obtained by mirroring the error vector v1 representing the error E with respect to a plane of symmetry, instead of the error vector v1 (see FIG. 8); a minimal sketch of this mirroring is given after this list.
  • the plane of symmetry here is a plane extending front to back from the center of the trunk of the user U. As in stretching exercises and the like, humans can naturally perform left-right symmetric movements about a plane extending front to back from the center of the trunk. Therefore, the error E may be presented to the user U by a force or tactile sensation proportional to the symmetry vector v2 instead of the error vector v1; the error vector v1 or the symmetry vector v2 may also be selected according to the preference of the user U.
  • the system 1 may be further enhanced in the following aspects.
  • the system 1 may further include a guide light irradiation unit (not shown).
  • the guide light irradiation unit may have a position fixed coaxially with, or relative to, the image pickup device 2 (sensor unit), and may be configured to be capable of irradiating guide light indicating the target position TP. Since the relative positions of the guide light irradiation unit and the image pickup device 2 are known at the time of design, the guide light can be projected onto the target position TP.
  • the image pickup apparatus 2 and the guide light irradiation unit are implemented as a coaxial optical system by using a beam splitter or the like. As a result, the user U can more intuitively grasp how the first contact portion 41 should be moved in order to displace the target position TP along a predetermined trajectory.
  • in the present embodiment, the target position TP is implemented as the intersection (image center CT) between the line of sight of the image pickup device 2 and the defined surface P, but this is merely an example and the invention is not limited to it.
  • for example, a cutting tool (for example, an end mill or a medical scalpel) may be attached at the operated portion 43, and the tip position of the cutting tool can be set as the target position TP. Here, the relative positions of the image pickup device 2 and the cutting tool are known at the time of design. According to such a modification, the user U can carry out training in cutting and medical treatment.
  • alternatively, a laser emitting portion (for processing) may be attached to the position adjusting portion 44 at the operated portion 43, and the irradiation position (on the defined surface P) of the laser emitted from the laser emitting portion can be set as the target position TP.
  • the relative positions of the image pickup apparatus 2 and the laser emitting portion are known at the time of design. According to such a modification, the user U can be trained in laser machining so that the desired object has a defined shape.
  • a coating portion configured to be able to apply paint or the like can be attached to the position adjusting portion 44 at the operated portion 43, and the tip position of the coating portion can be set to the target position TP.
  • the relative positions of the image pickup apparatus 2 and the coating tool are known at the time of design. According to such a modification, the user can carry out training in the coating process.
  • various objects, including the cutting tool, laser emitting portion, and coating tool described above, can serve as targets for determining the target position TP, and these may be freely attached and detached.
  • Another sensor may be used in place of or in combination with the image pickup apparatus 2.
  • a laser displacement sensor, an infrared sensor, or the like can be appropriately adopted.
  • the control device 3, which is a part of the system 1, may also be implemented on its own rather than as the system 1.
  • a program that causes the computer to function as the control device 3 may be implemented.
  • the system further includes a guide light irradiating unit having a position fixed coaxially with, or relative to, the sensor unit and capable of irradiating a guide light indicating the target position.
  • the first and second limbs are the left and right hands of the user, and the first and second contact portions are configured to be graspable by the left and right hands, respectively.
  • the force or tactile sensation based on the error is determined in proportion to the error vector representing the error.
  • the first and second limbs are the left and right hands or feet of the user, and the force or tactile sensation based on the error is determined in proportion to a symmetry vector obtained by moving the error vector representing the error symmetrically with respect to a plane of symmetry, where the plane of symmetry is a plane extending anterior-posteriorly from the center of the user's trunk.
  • the force or tactile sensation based on the error is converted into a frequency suitable for human sensory presentation and presented.
  • the sensor unit is an imaging unit configured to be capable of capturing information in the outside world.
  • the target position is the center of an image captured by the imaging unit.
  • the system further comprises a position adjusting unit capable of displacing the operated portion within a second range smaller than the first range in which the user can operate, and of adjusting the position of the operated portion so as to correct the error.
  • the acquisition rate of the sensor unit and the drive rate of the position adjustment unit are 100 hertz or more.
  • a method of operating the system comprising first to fourth steps: in the first step, the user's first limb is brought into contact with the first contact portion of the system, and the user's second limb is brought into contact with the second contact portion of the system; in the second step, the first limb in contact with the first contact portion is moved, thereby moving the target position defined by the operated portion of the system; in the third step, the error of the target position from the predetermined trajectory is measured; and in the fourth step, the error is presented to the user by applying a force or tactile sensation based on the error to the second limb in contact with the second contact portion. Of course, the invention is not limited to these aspects.
  • 1: System 2: Image pickup device 3: Control device 30: Communication bus 31: Communication unit 32: Storage unit 33: Control unit 331: Reception unit 332: Image processing unit 333: Calculation unit 334: Control signal generation unit 4: Main device 41: First contact portion 42: Second contact portion 43: Operated portion 44: Position adjusting unit 45: Error sensation presenting unit CS: Control signal CS1: Control signal CS2: Control signal CT: Image center E: Error HF1: First limb HF2: Second limb IM: Image L: Line LF: Left foot LH: Left hand P: Defined surface RF: Right foot RH: Right hand ROI: Predetermined region TP: Target position U: User v1: Error vector v2: Symmetry vector
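As referenced in the symmetry-vector item above, mirroring the error vector v1 into the symmetry vector v2 reduces to flipping one component. The following minimal Python sketch assumes a 2D workspace frame whose x axis runs left-right with the plane of symmetry at x = 0; this coordinate convention is an illustrative assumption, since the document does not fix a frame.

```python
import numpy as np

def symmetry_vector(v1: np.ndarray) -> np.ndarray:
    # Mirror the error vector v1 about the user's plane of symmetry
    # (assumed here to be the plane x = 0 of the workspace frame).
    v2 = v1.copy()
    v2[0] = -v2[0]  # flip only the left-right component
    return v2

v1 = np.array([3.0, -1.5])   # error vector pointing right and backward
v2 = symmetry_vector(v1)     # -> array([-3.0, -1.5]), mirrored to the left
```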

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Rehabilitation Tools (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

[Problem] To provide technology that effectively assists a user in learning a prescribed motion, irrespective of the user's age and motivation. [Solution] According to one embodiment of the present invention, a system is provided. This system comprises a first contact unit, a sensor unit, and a second contact unit. The first contact unit is connected to a point to be operated. By the first contact unit coming into contact with a first limb of a user, a target position that is defined by the point to be operated is variably configured in conformity with the motion of the first limb. The sensor unit is configured so as to measure an error from the prescribed trajectory of the target position. The second contact unit is provided with an error sensation presentation unit and is configured so as to come into contact with a second limb that is different from the first limb of the user. The error sensation presentation unit is configured so as to add a kinesthetic or tactile sense based on the error to the second limb and thereby present the error to the user.

Description

System and operation method
The present invention relates to a system and an operation method.
There are many situations in which humans use their limbs to perform tasks involving predetermined movements. Patent Document 1 discloses a skill training device used for training such a predetermined movement.
[Patent Document 1] Japanese Unexamined Patent Publication No. 2020-12858
Since the skill training device disclosed in Patent Document 1 notifies the user, as information, when an operation deviates from the prescribed one, the user must consciously read that information. Therefore, the learning effect is low depending on the age and motivation of the user.
In view of the above circumstances, the present invention provides a technique for assisting a user in effectively learning a predetermined operation regardless of the user's age and motivation.
According to one aspect of the present invention, a system is provided. This system includes a first contact portion, a sensor portion, and a second contact portion. The first contact portion is connected to the operated portion. By coming into contact with the user's first limb, the target position defined by the operated portion is variably configured according to the movement of the first limb. The sensor unit is configured to measure an error of the target position from a predetermined trajectory. The second contact portion includes an error sensation presenting portion and is configured to come into contact with a second limb different from the user's first limb. The error sensation presenting unit is configured to present the error to the user by applying a force or tactile sensation based on the error to the second limb.
According to this, the user can effectively learn a predetermined operation regardless of the user's age and motivation.
[FIG. 1] A schematic diagram showing the overall configuration of the system 1.
[FIG. 2] A schematic diagram showing the overall configuration of the system 1.
[FIG. 3] A block diagram showing the hardware configuration of the control device 3.
[FIG. 4] A schematic diagram showing the hardware configuration of the main device 4.
[FIG. 5] A block diagram showing the functional configuration of the control device 3 (control unit 33).
[FIG. 6] An activity diagram showing the operation method of the system 1.
[FIG. 7] A schematic diagram showing an example of an image IM on which the image processing unit 332 performs image processing.
[FIG. 8] A schematic diagram showing the error vector v1.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. The various features shown in the embodiments below can be combined with one another.
The program for realizing the software appearing in the present embodiment may be provided as a computer-readable non-transitory recording medium (non-transitory computer-readable medium), may be provided so as to be downloadable from an external server, or may be provided so that the program is started on an external computer and its functions are realized on a client terminal (so-called cloud computing).
In the present embodiment, a "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing concretely realized by those hardware resources. The present embodiment also handles various information, which is represented, for example, by the physical quantity of a signal value representing a voltage or current, by the high and low of signal values as a bit aggregate of binary numbers composed of 0s and 1s, or by quantum superposition (so-called qubits); communication and computation can be executed on circuits in the broad sense.
A circuit in a broad sense is a circuit realized by at least appropriately combining circuits, circuitry, processors, memories, and the like. That is, it includes application-specific integrated circuits (ASICs), programmable logic devices (for example, simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field-programmable gate arrays (FPGAs)), and the like.
1. Hardware Configuration
This section describes the hardware configuration of the system 1 according to the embodiment.
1.1 System 1
FIGS. 1 and 2 are schematic diagrams showing the overall configuration of the system 1. As shown in FIG. 1, the user U can use the system 1 to train a predetermined operation. The training here may be training for a healthy user U to acquire a predetermined movement, or training for an injured user U for the purpose of rehabilitation. As shown in FIG. 2, the system 1 includes an image pickup device 2 (an example of a sensor unit), a control device 3, and a main device 4, which are electrically connected to one another.
1.2 Image pickup device 2
The image pickup device 2 is a so-called vision sensor (camera) configured to capture information about the outside world; it is particularly preferable to use so-called high-speed vision with a high frame rate.
The image pickup device 2 (sensor unit) is configured to measure the error E of the target position TP from a predetermined trajectory; this is described in more detail later. Preferably, the frame rate (acquisition rate) of the image pickup device 2 (sensor unit) is 100 fps (hertz) or more; more specifically, it may be, for example, any value from 100 fps to 2000 fps in increments of 25 fps, or within a range between any two of these values.
The image pickup device 2 is connected to the communication unit 31 of the control device 3 (described later) via a telecommunication line (for example, a USB cable) and is configured to transfer the captured image IM to the control device 3.
A camera capable of measuring not only visible light but also bands imperceptible to humans, such as the ultraviolet and infrared regions, may also be adopted. By adopting such a camera, the system 1 according to the present embodiment can be used even in a dark field.
1.3 Control device 3
FIG. 3 is a block diagram showing the hardware configuration of the control device 3. As shown in FIG. 3, the control device 3 has a communication unit 31, a storage unit 32, and a control unit 33, and these components are electrically connected inside the control device 3 via a communication bus 30. Each component is described further below.
Although wired communication means such as USB, IEEE 1394, Thunderbolt, and wired LAN network communication are preferable, the communication unit 31 may also include wireless LAN network communication, mobile communication such as 3G/LTE/5G, Bluetooth (registered trademark) communication, and the like as necessary. That is, it is more preferable to implement the unit as a set of these plural communication means. In this way, information and commands are exchanged between the control device 3 and other communicable devices.
The storage unit 32 stores the various information defined in the foregoing description. It can be implemented, for example, as a storage device such as a solid state drive (SSD), or as a memory such as a random access memory (RAM) that stores temporarily necessary information (arguments, arrays, etc.) related to program operations; a combination of these may also be used. The storage unit 32 also stores various programs readable by the control unit 33 described below. Further, the storage unit 32 stores the time-series images IM captured by the image pickup device 2 and received by the communication unit 31. Here, the image IM is, for example, array information including 8 bits of pixel information for each of R, G, and B.
The control unit 33 processes and controls the overall operation of the control device 3. The control unit 33 is, for example, a central processing unit (CPU; not shown). The control unit 33 realizes the various functions of the control device 3 by reading predetermined programs stored in the storage unit 32. That is, information processing by the software (stored in the storage unit 32) is concretely realized by the hardware (control unit 33), so that it can be executed as the functional units in the control unit 33 shown in FIG. 3. Although a single control unit 33 is shown in FIG. 3, the configuration is not limited to this; a plurality of control units 33 may be provided, one for each function, or a combination thereof may be used.
1.4 Main device 4
FIG. 4 is a schematic diagram showing the hardware configuration of the main device 4. The main device 4 is a device in which the user U can operate the operated portion 43 using his or her limbs. The main device 4 also receives the control signal CS from the control device 3 and is driven in various ways in accordance with it. The main device 4 includes a first contact portion 41 and a second contact portion 42.
As shown in FIG. 4, the first contact portion 41 is connected to the operated portion 43. By coming into contact with the first limb HF1 of the user U, the first contact portion 41 makes the target position TP defined by the operated portion 43 variable according to the movement of the first limb HF1. The range of the target position TP that the user U can move using the first contact portion 41 is referred to as the first range.
As shown in FIG. 4, the second contact portion 42 includes an error sensation presenting unit 45 and is configured to come into contact with a second limb HF2 different from the first limb HF1 of the user U. The error sensation presenting unit 45 is configured to present the error E to the user U by applying a force or tactile sensation based on the error E, measured via the image pickup device 2, to the second limb HF2.
The forms of the first contact portion 41 and the second contact portion 42 are not particularly limited; an appropriate form should be selected according to the usability of contacting the first limb HF1 or the second limb HF2. For example, if the first limb HF1 and the second limb HF2 are the left and right hands (left hand LH and right hand RH) of the user U, the first contact portion 41 and the second contact portion 42 should be configured so that they can be grasped by the left hand LH and the right hand RH, respectively.
The main device 4 further includes a position adjusting unit 44. The position adjusting unit 44 is, for example, a stage drivable in the XY directions, and is preferably capable of displacing the operated portion 43 within a second range smaller than the first range operable by the user U. With this configuration, the position adjusting unit 44 can adjust the target position TP at the operated portion 43 so as to correct the error E.
In the system 1 as a whole, the lower of the frame rate of the image pickup device 2 and the drive rate of the position adjusting unit 44 functions as the control rate for correcting the error E. In other words, by making the frame rate and the drive rate comparably high, the error E of the target position TP can be corrected by feedback control alone, without using any prediction. Preferably, therefore, the drive rate of the position adjusting unit 44 is 100 hertz or more, like that of the image pickup device 2.
Note that the correction by the position adjusting unit 44 need not be performed when the user U trains a predetermined operation. The correction by the position adjusting unit 44 is similar to the image stabilization of a camera and supplementarily realizes an appropriate predetermined operation. The user U should preferably train to perform the predetermined operation correctly even without the position adjusting unit 44; in that case a more demanding operation is required of the user U, but this does not preclude carrying out such training.
2. Functional Configuration
This section describes the functional configuration of the present embodiment. FIG. 5 is a block diagram showing the functional configuration of the control device 3 (control unit 33). With respect to the control unit 33 described above, the control device 3 includes a reception unit 331, an image processing unit 332, a calculation unit 333, and a control signal generation unit 334. Each component is described further below.
(Reception unit 331)
The reception unit 331 receives information via the communication unit 31 or the storage unit 32 and is configured to read it into a working memory. In particular, the reception unit 331 is configured to receive various information (the image IM, displacement information of the position adjusting unit 44, etc.) from the image pickup device 2 and/or the main device 4 via the communication unit 31. When the control device 3 is connected to other devices, the reception unit 331 may be implemented so as to receive information transmitted from those devices. In the present embodiment, the various information received by the reception unit 331 is described as being stored in the storage unit 32.
After the reception unit 331 has received information and temporarily read it into the working memory, at least part of that information need not be stored in the storage unit 32. Furthermore, at least part of the information may be stored in an external server other than the storage unit 32.
(Image processing unit 332)
The image processing unit 332 is configured to read a program stored in the storage unit 32 and execute predetermined image processing on the image IM. For example, the image processing unit 332 executes image processing for identifying the line L, which is the predetermined trajectory, from the image IM. Details are described later.
(Calculation unit 333)
The calculation unit 333 is configured to perform predetermined calculations using the image IM processed by the image processing unit 332. For example, the calculation unit 333 calculates the error vector v1 or the symmetry vector v2 from the image IM. Details are described later.
(Control signal generation unit 334)
The control signal generation unit 334 is configured to generate the control signal CS for controlling the main device 4. Specifically, the control signal generation unit 334 generates the control signal CS1 that drives the position adjusting unit 44 and the control signal CS2 that operates the error sensation presenting unit 45. The value of the control signal CS may be defined, for example, by a voltage.
3. Control Processing
This section describes the flow of the control processing of the system 1.
3.1 Operation method
FIG. 6 is an activity diagram showing the operation method of the system 1. For simplicity, assume that the user U is right-handed, the first limb HF1 is the right hand RH, and the second limb HF2 is the left hand LH. That is, the user U grasps the first contact portion 41 with the right hand RH and grasps the second contact portion 42 with the left hand LH (activity A101); grasping is an example of contact. The user U then operates the first contact portion 41 with the right hand RH to move the target position TP at the operated portion 43 along the line L, which is the predetermined trajectory (activity A102). Such an operation is involved in, for example, cutting work, coating work, medical practice, and the like.
When the user U displaces the first contact portion 41, the target position TP is displaced accordingly. At this time, the image pickup device 2 captures the target position TP and the line L, and the image IM is transmitted to the control device 3 (activity A103). In other words, the reception unit 331 receives the image IM, which is stored in the storage unit 32.
FIG. 7 is a schematic diagram showing an example of an image IM on which the image processing unit 332 performs image processing. The image processing unit 332 analyzes the image IM received by the reception unit 331 and identifies the position of the line L in the image IM (activity A104). This is done, for example, by binarizing the captured image IM with a threshold on a predetermined image parameter (brightness, etc.). The position of the line L can then be identified by calculating the position of the center of gravity of the line L from the image IM.
The target position TP may be implemented as the intersection of the line of sight of the image pickup device 2 and the defined surface P. Although not shown in FIG. 4, the image pickup device 2 is attached to the position adjusting unit 44; that is, the target position TP is the center of the image IM captured by the image pickup device 2 (image center CT).
As shown in FIG. 7, image processing may be performed on a predetermined region ROI that is a part of the image IM. In particular, since the error E is corrected at a high control rate, the line L stays in the vicinity of a fixed position in the image IM (for example, the image center CT); by taking the region near this fixed position as the predetermined region ROI, the number of pixels to be processed can be reduced. This lowers the computational load of the control device 3 and maintains a high control rate.
Subsequently, the calculation unit 333 calculates the error vector v1 representing the error E between the target position TP (image center CT) and the line L (activity A105). FIG. 8 is a schematic diagram showing the error vector v1. If the error E falls within the second range, which is the movable range of the position adjusting unit 44, the control signal generation unit 334 generates a control signal CS1 that corrects the error E and transmits it to the position adjusting unit 44 (activity A106). Further, the control signal generation unit 334 generates a control signal CS2 that presents the error E to the user U and transmits it to the error sensation presenting unit 45 (activity A107).
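As an illustration of activities A104 and A105, the ROI restriction, binarization, center-of-gravity extraction, and error-vector computation could be sketched in Python as follows. This is a minimal sketch under stated assumptions: a grayscale frame in which the line L is darker than the background, and illustrative values for the ROI half-width and the threshold, none of which are specified in the patent.

```python
import numpy as np

def error_vector(im: np.ndarray, roi_half: int = 32,
                 threshold: int = 128) -> np.ndarray:
    # Sketch of activities A104-A105 for one grayscale frame IM.
    h, w = im.shape
    cy, cx = h // 2, w // 2          # target position TP = image center CT
    # Process only a small ROI around CT: at a high control rate the
    # line L stays near the image center, so few pixels are needed.
    roi = im[cy - roi_half:cy + roi_half, cx - roi_half:cx + roi_half]
    mask = roi < threshold           # binarize on brightness
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                 # line not found inside the ROI
        return np.zeros(2)
    # Center of gravity of the binarized line, in full-image coordinates.
    line_x = xs.mean() + cx - roi_half
    line_y = ys.mean() + cy - roi_half
    return np.array([line_x - cx, line_y - cy])   # v1: from TP toward L
```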
In other words, the control signal CS1 is transmitted via the communication unit 31 to the position adjusting unit 44 in the main device 4, which drives the position adjusting unit 44 and thereby corrects the error E. The control method in this case is not particularly limited; for example, P control, PD control, PID control, and the like may be adopted as appropriate, with suitable values set for the control coefficients as needed. Likewise, the control signal CS2 is transmitted via the communication unit 31 to the error sensation presenting unit 45 in the main device 4, which operates the error sensation presenting unit 45, whereby the error E can be presented to the user U.
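For the correction itself, a discrete PID law (one of the control methods the text says may be adopted) could generate the stage command behind the control signal CS1. The gains and the control period below are placeholder values, since the patent only states that suitable coefficients should be chosen.

```python
import numpy as np

class PIDController:
    # Discrete PID mapping the error vector v1 to an XY-stage command
    # (for example, a voltage per axis); gains and dt are placeholders.
    def __init__(self, kp: float = 0.8, ki: float = 0.0,
                 kd: float = 0.05, dt: float = 0.002):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(2)
        self.prev_error = np.zeros(2)

    def update(self, v1: np.ndarray) -> np.ndarray:
        self.integral += v1 * self.dt
        derivative = (v1 - self.prev_error) / self.dt
        self.prev_error = v1.copy()
        return self.kp * v1 + self.ki * self.integral + self.kd * derivative
```

Setting ki = kd = 0 reduces this to the P control also mentioned above; adding only kd to kp gives PD control.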
On the other hand, if the error E does not fall within the second range, which is the movable range of the position adjusting unit 44, the control signal generation unit 334 may generate only the control signal CS2 that presents the error E to the user U, without generating the control signal CS1 that corrects the error E, and transmit it to the error sensation presenting unit 45 (activity A107).
 The force or tactile sensation based on the error E is determined in proportion to the error vector v1 representing the error E. That is, to present the magnitude (degree) and direction of the error E to the user U, the force or tactile sensation is applied to the user U as a vector proportional to the error vector v1 (the proportionality constant is a positive or negative number, including 1). In particular, by applying the force or tactile sensation to the left hand LH, which differs from the operating right hand RH, the error E can be presented to the user U without impairing the feel of the operation. Particularly preferably, the force or tactile sensation based on the error E is converted into a frequency suitable for human sensory presentation before being presented. Presenting the force or tactile sensation at a frequency perceptible to humans allows the user U to grasp the state of the error E.
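 One possible realization of this mapping, sketched below, scales the error vector by a signed gain and modulates the result at a fixed vibrotactile frequency; the 200 Hz carrier is an assumption chosen because human vibrotactile sensitivity is high around that band, not a value given in the text:

```python
import numpy as np

def haptic_command(v1: np.ndarray, t: float,
                   gain: float = 1.0, carrier_hz: float = 200.0) -> np.ndarray:
    """Force/tactile command CS2 for the error sensation presenting unit 45.

    Direction and magnitude are proportional to the error vector v1
    (the gain may be any signed constant, including 1); the amplitude
    is modulated at a frequency humans can perceive.
    """
    envelope = gain * v1                       # proportional to error E
    modulation = np.sin(2.0 * np.pi * carrier_hz * t)
    return envelope * modulation               # 2-D actuator drive signal
```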
 By repeating the control process described above at each control-rate cycle, the user U can train and learn a predetermined movement. In summary, the operation method of this system 1 comprises first to fourth steps. In the first step, the first limb HF1 of the user U is brought into contact with the first contact portion 41 of the system 1, and the second limb HF2 of the user U is brought into contact with the second contact portion 42 of the system 1. In the second step, the first limb HF1 in contact with the first contact portion 41 is moved, thereby moving the target position TP defined by the operated portion 43 of the system 1. In the third step, the error E of the target position TP from the predetermined trajectory is measured. In the fourth step, a force or tactile sensation based on the error E is applied to the second limb HF2 in contact with the second contact portion 42, thereby presenting the error E to the user U.
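 Reusing the helper sketches above, one control-rate iteration could be organized as follows; the sensor, adjuster, and haptics interfaces are hypothetical, and only the ordering of activities A105 to A107 is taken from the text:

```python
def control_cycle(sensor, corrector, haptics, adjuster, t, limit_px=50.0):
    """One iteration of the loop: measure, branch on the second range,
    correct (CS1) when possible, and always present the error (CS2)."""
    line_px = sensor.detect_line()                    # sense the line L
    if line_px is None:
        return
    v1 = error_vector(sensor.image_center, line_px)   # activity A105
    if within_second_range(v1, limit_px):
        adjuster.move(corrector.step(v1))             # activity A106: send CS1
    haptics.drive(haptic_command(v1, t))              # activity A107: send CS2
```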
3.2 Synchronous movement
 To supplement the above assumption: preferably, the first limb HF1 and the second limb HF2 are the left and right hands (left hand LH and right hand RH) or the left and right feet (left foot LF and right foot RF) of the user U. Humans accomplish varied and complex tasks through coordinated movement of both arms. To move both arms in coordination, the brain is thought to possess a mechanism that allows the two arms to cooperate even while interfering with each other. In particular, synchronous movement of both arms (for example, the tendency of both hands to fall into similar motions even when one tries to perform different movements simultaneously with the right hand RH and the left hand LH) is commonly observed in daily life, and synchronous control of both arms may be regarded as one of the most fundamental mechanisms of the brain.
 That is, when a force or tactile sensation is applied to the left hand LH, the left-right synchronous movement causes the user U to quickly adjust the right hand RH in the direction that corrects the error E. With such a control process, the user U can train and learn a predetermined movement more intuitively and effectively, regardless of the age and motivation of the user U.
 Note that the force or tactile sensation based on the error E may be determined in proportion not to the error vector v1 but to a symmetric vector v2 obtained by reflecting the error vector v1 about a plane of symmetry (see FIG. 8). The plane of symmetry here is a plane extending front to back through the center of the trunk of the user U. As stretching exercises illustrate, humans can naturally perform left-right symmetric movements about this plane. Therefore, the error E may be presented to the user U by a force or tactile sensation proportional to the symmetric vector v2 instead of the error vector v1. Furthermore, the system may be implemented so that the error vector v1 or the symmetric vector v2 can be selected according to the preference of the user U.
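 When the workspace x-axis runs left-right relative to the user (an assumption about how the device is oriented), the reflection that produces v2 reduces to negating one component:

```python
import numpy as np

def symmetric_vector(v1: np.ndarray) -> np.ndarray:
    """Mirror the error vector v1 about the user's sagittal plane.

    With the x-axis running left-right and the plane of symmetry
    extending front to back through the trunk center, the reflection
    simply negates the x component.
    """
    return np.array([-v1[0], v1[1]])
```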
4. Others
 The system 1 may be further elaborated in the following ways.
(1) The system 1 may further include a guide light irradiation unit (not shown). The guide light irradiation unit is preferably fixed coaxially with, or at a known relative position to, the image pickup apparatus 2 (sensor unit), and is configured to irradiate a guide light indicating the target position TP. Since the relative position between the guide light irradiation unit and the image pickup apparatus 2 is known at design time, the target position TP can be projected by the guide light irradiation unit. Preferably, the image pickup apparatus 2 and the guide light irradiation unit are implemented as a coaxial optical system using a beam splitter or the like. This allows the user U to grasp more intuitively how the first contact portion 41 should be moved in order to displace the target position TP along the predetermined trajectory.
(2) In the embodiment described above, the target position TP is implemented as the intersection (image center CT) of the line of sight of the image pickup apparatus 2 and the defined plane P, but this is merely an example. For instance, a cutting tool (for example, an end mill or a medical scalpel) may be attached to the position adjusting unit 44 at the operated portion 43, and the tip position of the cutting tool set as the target position TP. In this case, the relative position of the image pickup apparatus 2 and the cutting tool is known at design time. With such a modification, the user U can train in cutting work or medical procedures.
(3) Further, a laser emitting unit (for machining) may be attached to the position adjusting unit 44 at the operated portion 43, and the irradiation position (on the defined plane P) of the laser emitted from the laser emitting unit set as the target position TP. In this case, the relative position of the image pickup apparatus 2 and the laser emitting unit is known at design time. With such a modification, the user U can train in laser machining so that a desired object attains a specified shape.
(4) Further, an applicator configured to apply paint or the like may be attached to the position adjusting unit 44 at the operated portion 43, and the tip position of the applicator set as the target position TP. In this case, the relative position of the image pickup apparatus 2 and the applicator is known at design time. With such a modification, the user can train in a coating process.
(5) Various attachments, including the cutting tool, laser emitting unit, and applicator described above, can serve as the object that determines the target position TP, and the system may be implemented so that they can be freely attached and detached.
(6) Another sensor may be used in place of, or together with, the image pickup apparatus 2; for example, a laser displacement sensor or an infrared sensor may be adopted as appropriate.
(7) The control device 3 may be implemented on its own, rather than as part of the system 1.
(8) A program that causes a computer to function as the control device 3 may be implemented.
 Furthermore, the following aspects may be provided.
 In the system, a guide light irradiation unit is further provided, the guide light irradiation unit being fixed coaxially with or at a known relative position to the sensor unit and configured to irradiate a guide light indicating the target position.
 In the system, the first and second limbs are the left and right hands of the user, and the first and second contact portions are configured to be graspable by the left and right hands, respectively.
 In the system, the force or tactile sensation based on the error is determined in proportion to an error vector representing the error.
 In the system, the first and second limbs are the left and right hands or feet of the user, and the force or tactile sensation based on the error is determined in proportion to a symmetric vector obtained by reflecting an error vector representing the error about a plane of symmetry, the plane of symmetry being a plane extending front to back through the center of the user's trunk.
 In the system, the force or tactile sensation based on the error is converted into a frequency suitable for human sensory presentation before being presented.
 In the system, the sensor unit is an imaging unit configured to capture information about the outside world.
 In the system, the target position is the center of an image captured by the imaging unit.
 In the system, a position adjusting unit is further provided, the position adjusting unit being capable of displacing the operated portion within a second range smaller than a first range over which the user can operate, and configured to adjust the position of the operated portion so as to correct the error.
 In the system, the acquisition rate of the sensor unit and the drive rate of the position adjusting unit are 100 hertz or higher.
 A method of operating a system, comprising first to fourth steps, wherein: in the first step, a first limb of a user is brought into contact with a first contact portion of the system, and a second limb of the user is brought into contact with a second contact portion of the system; in the second step, the first limb in contact with the first contact portion is moved, thereby moving a target position defined by an operated portion of the system; in the third step, an error of the target position from a predetermined trajectory is measured; and in the fourth step, a force or tactile sensation based on the error is applied to the second limb in contact with the second contact portion, thereby presenting the error to the user.
 Of course, the aspects are not limited to these.
 Finally, various embodiments of the present invention have been described, but these are presented as examples and are not intended to limit the scope of the invention. The novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and its equivalents.
1: System
2: Image pickup apparatus
3: Control device
30: Communication bus
31: Communication unit
32: Storage unit
33: Control unit
331: Reception unit
332: Image processing unit
333: Calculation unit
334: Control signal generation unit
4: Main device
41: First contact portion
42: Second contact portion
43: Operated portion
44: Position adjusting unit
45: Error sensation presenting unit
CS: Control signal
CS1: Control signal
CS2: Control signal
CT: Image center
E: Error
HF1: First limb
HF2: Second limb
IM: Image
L: Line
LF: Left foot
LH: Left hand
P: Defined plane
RF: Right foot
RH: Right hand
ROI: Predetermined region
TP: Target position
U: User
v1: Error vector
v2: Symmetric vector

Claims (11)

  1. A system comprising:
     a first contact portion, a sensor unit, and a second contact portion, wherein
     the first contact portion is connected to an operated portion and, by coming into contact with a first limb of a user, is configured to vary a target position defined by the operated portion in accordance with a movement of the first limb,
     the sensor unit is configured to measure an error of the target position from a predetermined trajectory,
     the second contact portion includes an error sensation presenting unit and is configured to come into contact with a second limb of the user different from the first limb, and
     the error sensation presenting unit is configured to present the error to the user by applying a force or tactile sensation based on the error to the second limb.
  2. The system according to claim 1, further comprising a guide light irradiation unit, wherein
     the guide light irradiation unit is fixed coaxially with or at a known relative position to the sensor unit, and is configured to irradiate a guide light indicating the target position.
  3. The system according to claim 1 or 2, wherein
     the first and second limbs are the left and right hands of the user, and
     the first and second contact portions are configured to be graspable by the left and right hands, respectively.
  4. The system according to any one of claims 1 to 3, wherein
     the force or tactile sensation based on the error is determined in proportion to an error vector representing the error.
  5. The system according to any one of claims 1 to 3, wherein
     the first and second limbs are the left and right hands or feet of the user, and
     the force or tactile sensation based on the error is determined in proportion to a symmetric vector obtained by reflecting an error vector representing the error about a plane of symmetry, the plane of symmetry being a plane extending front to back through the center of the user's trunk.
  6. The system according to claim 4 or 5, wherein
     the force or tactile sensation based on the error is converted into a frequency suitable for human sensory presentation before being presented.
  7. The system according to any one of claims 1 to 5, wherein
     the sensor unit is an imaging unit configured to capture information about the outside world.
  8. The system according to claim 7, wherein
     the target position is the center of an image captured by the imaging unit.
  9. The system according to any one of claims 1 to 8, further comprising a position adjusting unit, wherein
     the position adjusting unit is capable of displacing the operated portion within a second range smaller than a first range over which the user can operate, and
     is configured to adjust the position of the operated portion so as to correct the error.
  10. The system according to claim 9, wherein
     the acquisition rate of the sensor unit and the drive rate of the position adjusting unit are 100 hertz or higher.
  11. A method of operating a system, comprising first to fourth steps, wherein:
     in the first step, a first limb of a user is brought into contact with a first contact portion of the system, and a second limb of the user is brought into contact with a second contact portion of the system;
     in the second step, the first limb in contact with the first contact portion is moved, thereby moving a target position defined by an operated portion of the system;
     in the third step, an error of the target position from a predetermined trajectory is measured; and
     in the fourth step, a force or tactile sensation based on the error is applied to the second limb in contact with the second contact portion, thereby presenting the error to the user.
PCT/JP2021/019754 2020-05-26 2021-05-25 System and operation method WO2021241558A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180034660.XA CN115605932A (en) 2020-05-26 2021-05-25 System and method of operation
US17/924,379 US20230186784A1 (en) 2020-05-26 2021-05-25 System and operation method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020091522A JP7351523B2 (en) 2020-05-26 2020-05-26 System and operating method
JP2020-091522 2020-05-26

Publications (1)

Publication Number Publication Date
WO2021241558A1 true WO2021241558A1 (en) 2021-12-02

Family

ID=78744475

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/019754 WO2021241558A1 (en) 2020-05-26 2021-05-25 System and operation method

Country Status (4)

Country Link
US (1) US20230186784A1 (en)
JP (1) JP7351523B2 (en)
CN (1) CN115605932A (en)
WO (1) WO2021241558A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009244428A (en) * 2008-03-28 2009-10-22 Brother Ind Ltd Operation training system
JP2011059219A (en) * 2009-09-08 2011-03-24 Nagoya Institute Of Technology Technical skill experience system
US20180369637A1 (en) * 2014-12-12 2018-12-27 Enflux Inc. Training systems with wearable sensors for providing users with feedback
JP2017023223A (en) * 2015-07-16 2017-02-02 国立大学法人埼玉大学 Bi-directional remote control system using functional electric stimulation
JP2017134116A (en) * 2016-01-25 2017-08-03 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP2020012858A (en) * 2018-07-13 2020-01-23 株式会社日立製作所 Skill training device and skill training method

Also Published As

Publication number Publication date
JP2021189223A (en) 2021-12-13
CN115605932A (en) 2023-01-13
US20230186784A1 (en) 2023-06-15
JP7351523B2 (en) 2023-09-27

Similar Documents

Publication Publication Date Title
CN109498384B (en) Massage part identification, positioning and massage method, device and equipment
US10820945B2 (en) System for facilitating medical treatment
US11262844B2 (en) Rehabilitation robot, rehabilitation system, rehabilitation method and rehabilitation device
WO2018014824A1 (en) Intelligent physical therapy robot system and operation method therefor
US11612803B2 (en) Bilateral limb coordination training system and control method
CN106214163B (en) Recovered artifical psychological counseling device of low limbs deformity correction postoperative
JP7107960B2 (en) Systems, apparatus, methods and machine-readable media for estimating the position and/or orientation of a handheld personal care device with respect to a user
CN111870268A (en) Method and system for determining target position information of beam limiting device
WO2021241558A1 (en) System and operation method
KR20180109385A (en) Wearable Device for rehabilitating dizziness
WO2019073689A1 (en) Information processing device, information processing method, and program
KR20110049703A (en) Surgical robot system and laparoscope handling method thereof
Desai et al. Controlling a wheelchair by gesture movements and wearable technology
KR101114234B1 (en) Surgical robot system and laparoscope handling method thereof
CN114569410A (en) Control method and device for rehabilitation robot training mode and storage medium
JP7126276B2 (en) Robot-assisted device and robot-assisted system.
CN113796963A (en) Mechanical arm control method with force sensing feedback adjustment function and control terminal
CN113752257B (en) Mechanical arm track correction method based on position feedback information and control terminal
Haufe et al. Reference trajectory adaptation to improve human-robot interaction: A database-driven approach
CN111860213A (en) Augmented reality system and control method thereof
CN112750108B (en) Massage apparatus control method, system and computer readable storage medium
Li et al. A Wearable Computer Vision System With Gimbal Enables Position-, Speed-, and Phase-Independent Terrain Classification for Lower Limb Prostheses
WO2020158727A1 (en) System, method, and program
CN112115746B (en) Human body action recognition device and method and electronic equipment
Hadi et al. Brain computer interface (BCI) for controlling path planning mobile robots: a review

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21814279

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21814279

Country of ref document: EP

Kind code of ref document: A1