WO2021193421A1 - Information processing device, information processing method, program, and information processing system

Information processing device, information processing method, program, and information processing system

Info

Publication number
WO2021193421A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
tactile
tactile presentation
control unit
virtual object
Prior art date
Application number
PCT/JP2021/011342
Other languages
English (en)
Japanese (ja)
Inventor
諒 横山
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2021193421A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present technology relates to an information processing device, an information processing method, a program, and an information processing system.
  • In recent years, technologies that present a virtual space to the user, such as AR (Augmented Reality), have become widespread.
  • Patent Document 1 discloses a technique for notifying the user whether or not the pen tip of an electronic pen is in contact with a virtual plane by vibrating the electronic pen when the trajectory of the pen tip is displayed on the virtual plane.
  • However, Patent Document 1 merely informs the user of the state of the pen tip relative to the virtual plane by vibrating the electronic pen. A technique capable of even better tactile presentation is therefore desired.
  • One of the purposes of this technology is to provide an information processing device, an information processing method, a program, and an information processing system capable of performing excellent tactile presentation.
  • A first aspect of the present technology is an information processing device including a control unit that generates control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and the real object that operates the virtual object.
  • Another aspect is an information processing method in which a processor performs control to generate control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and the real object that operates the virtual object.
  • Another aspect is a program that causes a computer to realize a control function that generates control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and the real object that operates the virtual object.
  • Yet another aspect is an information processing system including a display device that displays a virtual object, a tactile presentation device that presents tactile sensations, and an information processing device connected to the display device and the tactile presentation device, in which the information processing device generates control information for controlling the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information.
  • FIG. 1 is a diagram showing a configuration example of an information processing system according to the present embodiment.
  • FIG. 2 is a block diagram showing a configuration example of the information processing device.
  • FIG. 3 is a functional block diagram showing an example of a functional configuration controlled by the control unit.
  • FIG. 4 is a diagram showing a configuration example for grasping the position of the operating body.
  • FIG. 5 is a diagram showing another configuration example for grasping the position of the operating body.
  • FIG. 6 is a diagram showing still another configuration example for grasping the position of the operating body.
  • FIG. 7 is a diagram for explaining an operation example of the enlargement / reduction UI.
  • FIG. 8 is a flowchart showing an example of the processing flow by the control unit.
  • FIG. 9 is a diagram showing a specific example of using the information processing system.
  • 1. Embodiment
      1-1. Configuration of the information processing system
      1-2. Configuration of the information processing device
      1-3. Functions of the information processing device
      1-4. Specific examples of tactile presentation
      1-5. Processing of the information processing device
      1-6. Specific example of the information processing system
    2. Summary
  • FIG. 1 shows a configuration example of an information processing system according to the present embodiment.
  • the information processing system 1 shown in FIG. 1 includes a display device 2, a tactile presentation device 3, an information processing device 4, and an information sharing server S.
  • The information processing device 4 is connected to the display device 2, the tactile presentation device 3, and the information sharing server S, and is configured to be capable of information communication with each of them.
  • The connection may be wired or wireless; the communication method is not limited.
  • the display device 2 has a function of displaying an image.
  • the display device 2 is composed of a device including a display unit such as a display panel.
  • the image includes a still image and a moving image (video).
  • the display device 2 displays a virtual space such as an AR (Augmented Reality) space, a VR (Virtual Reality) space, and an MR (Mixed Reality) space based on the image information provided by the information processing device 4. That is, the display device 2 expresses a virtual space by displaying an image.
  • the virtual space is a virtual three-dimensional space constructed by information processing executed by the information processing device 4. Examples of the content displayed on the display device 2 include games using virtual space, live distribution, sports broadcasting, navigation, education, tourist information, shopping, and other hands-on content.
  • Examples of the display device 2 include a mobile terminal (for example, a smartphone, a smart tablet, a mobile phone, a portable game machine, etc.) and a wearable display device (for example, a head-mounted display (HMD), AR glasses, VR glasses, etc.).
  • the display device 2 may be a stationary display.
  • the tactile presentation device 3 has a tactile presentation function for presenting a tactile sensation to a user.
  • The tactile presentation device 3 is composed of a device including a tactile presentation unit such as a vibrator, an electrotactile device, an electrical muscle stimulation device, or a Peltier element.
  • the tactile presentation device 3 presents the user with a tactile sensation regarding a phenomenon in the virtual space (details will be described later) based on the control information provided by the information processing device 4.
  • Sensations are roughly classified into three types: special senses such as sight and hearing; visceral sensations such as visceral pain; and somatic sensations, which include tactile sensation (in the narrow sense), pressure sensation, and vibration sensed through the skin, mucous membranes, and deep tissues such as muscles, tendons, and joints.
  • In the present specification and drawings, "tactile sensation" means tactile sensation in the broad sense, that is, this somatic sensation. The tactile presentation function mentioned above thus refers to a function that gives the user such somatic sensations.
  • The tactile presentation unit mentioned above may be of any type as long as it can give this somatic sensation to the user.
  • Examples of the tactile presentation device 3 include the above-mentioned mobile terminal, a pen-type electronic device (a so-called AR pen), a grip-type electronic device such as a controller, and wearable electronic devices such as glove-type (so-called haptic gloves), bracelet-type, and ring-type devices.
  • The tactile presentation device 3 may have any configuration capable of presenting a tactile sensation to the user.
  • the information processing system 1 may include a plurality of tactile presentation devices 3 that can be used by one user.
  • The information processing device 4 has a function of controlling the display device 2 and the tactile presentation device 3, and a function of performing information communication with each of the display device 2, the tactile presentation device 3, and the information sharing server S.
  • Examples of the information processing device 4 include the above-mentioned mobile terminal, personal computer, game machine, and the like. The details of the information processing device 4 will be described later.
  • the information sharing server S has a configuration in which information such as image information and control information can be shared between the information processing device 4 and another information processing device (not shown). It should be noted that the configuration may be such that the clients communicate directly with each other without providing the information sharing server S. For example, in the case of the illustrated example, the information processing device 4 which is a client may be configured to directly communicate with another information processing device. Further, when sharing is not required, the information sharing server S may be omitted.
  • In the information processing system 1, two or more of the display device 2, the tactile presentation device 3, and the information processing device 4 may be integrally configured.
  • the display device 2 may be provided with the functional configuration of the information processing device 4, or the tactile presentation device 3 may be provided with the functional configuration of the information processing device 4.
  • the display device 2 may be provided with the functional configurations of both the information processing device 4 and the tactile presentation device 3, or the display device 2 may be provided with the functional configurations of the tactile presentation device 3.
  • the information processing system 1 may include an audio output device (not shown) having an audio output unit that outputs audio such as a speaker.
  • the audio output device may be configured separately from other devices, or may be integrally configured with other devices such as the display device 2. Examples of the audio output device include speakers, headphones, wireless earphones, and the like.
  • The display device 2 and the tactile presentation device 3 are not limited to being indirectly connected to the information sharing server S via the information processing device 4; they may be directly connected to it.
  • the display device 2, the tactile presentation device 3, the information processing device 4, and the information sharing server S may be connected to each other by using a network such as a LAN (Local Area Network).
  • As described above, the information processing system 1 is for making the user feel phenomena in the virtual space not only visually but also by touch. As a result, the user can enjoy an advanced virtual experience close to reality, as if having a real experience that cannot be obtained by sight alone.
  • FIG. 2 is a block diagram showing a configuration example of the information processing device 4.
  • the information processing device 4 includes a storage unit 5, a control unit 6, a communication unit 7, and a detection unit 8.
  • The storage unit 5 is composed of, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a hard disk, or the like.
  • the storage unit 5 stores information necessary for processing by the control unit 6, such as a program and data used in the program.
  • the control unit 6 is composed of, for example, a CPU (Central Processing Unit, that is, a processor) or the like.
  • the control unit 6 reads and executes the program stored in the storage unit 5. Then, the control unit 6 controls each component of the information processing device 4.
  • The program may be stored in an external storage such as a USB memory, may be provided via a network, or may be partially executed by another device via the network.
  • the communication unit 7 is composed of, for example, a communication device and communicates with each of the display device 2, the tactile presentation device 3, and the information sharing server S.
  • the communication unit 7 gives the information obtained from the information sharing server S, for example, image information, control information, and the like to the control unit 6. Further, the communication unit 7 gives the information (for example, image information, control information, etc.) obtained from the control unit 6 to the display device 2, the tactile presentation device 3, and the information sharing server S. Specifically, the communication unit 7 gives image information to the display device 2 and the information sharing server S, and gives control information to the tactile presentation device 3 and the information sharing server S.
  • the detection unit 8 is composed of, for example, an imaging device or the like, and provides the control unit 6 with detection information for detecting the above-mentioned phenomenon in the virtual space.
  • the detection unit 8 may be provided by the display device 2 or the tactile presentation device 3.
  • FIG. 3 is a functional block diagram showing an example of a functional configuration controlled by the control unit 6.
  • the control unit 6 mainly includes a display control unit 61, a transmission / reception control unit 62, an operating body recognition unit 63, and a tactile presentation control unit 64. These functional blocks function, for example, when the control unit 6 executes the above-mentioned program.
  • the display control unit 61 controls the operation of the display device 2 to display the virtual space on the display device 2. Specifically, the display control unit 61 generates image information for displaying the virtual space, and provides the generated image information to the display device 2. As a result, the display device 2 displays a virtual space (display object such as a virtual object or a real object) based on the image information. This image information is generated, for example, by executing a content program.
  • the tactile presentation program may be included in the content program or may be separate.
  • the transmission / reception control unit 62 shares the display of the virtual space among users. Specifically, the transmission / reception control unit 62 controls the communication unit 7 so as to send image information to the information sharing server S. As a result, the virtual space based on the image information of the information sharing server S can be displayed on another display device (not shown) as in the display device 2, and can be seen by another user.
  • the transmission / reception control unit 62 shares the tactile sensation among users. Specifically, the transmission / reception control unit 62 controls the communication unit 7 so as to send the control information generated by the tactile presentation control unit 64, which will be described later, to the information sharing server S. As a result, similarly to the tactile presentation device 3, another tactile presentation device (not shown) can be made to perform tactile presentation based on the control information of the information sharing server S, and can be perceived by another user.
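  • As an illustration of this sharing path, the transmission/reception control unit might post the generated control information to the information sharing server S, from which other users' clients retrieve it. This is a sketch only: the endpoint URL, payload shape, and the use of HTTP are assumptions, since the disclosure does not fix a protocol.

```python
import json
import urllib.request

SHARE_URL = "http://sharing-server.example/control-info"  # hypothetical endpoint

def share_control_info(user_id: str, effect: dict) -> None:
    """Send generated tactile control information to the information
    sharing server so other users' devices can reproduce it."""
    payload = json.dumps({"user": user_id, "effect": effect}).encode("utf-8")
    req = urllib.request.Request(SHARE_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=2.0).close()

# Example: share a click vibration so another user's device can replay it.
share_control_info("user-a", {"type": "ui_click", "amplitude": 0.8, "ms": 15})
```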
  • the operation body recognition unit 63 grasps the position of the real object (operation body) that operates the virtual object in the virtual space.
  • In the present embodiment, the operating body is separate from the tactile presentation device 3.
  • Note that the operating body 10 may itself include the functional configuration of a tactile presentation device; even in that case, the information processing system 1 includes a tactile presentation device 3 different from the operating body.
  • the virtual object means a virtual object (specifically, a display object without an entity) that is perceived by the user as if it exists in the real space.
  • virtual objects are represented by two-dimensional or three-dimensional computer graphics and placed in virtual space.
  • The virtual object includes a virtual object, a virtual UI, and the like. Virtual objects include not only those that are visible to the user but also those that are invisible (so-called transparent objects).
  • a real object means a real object (specifically, a display object with an entity) that actually exists in the real space.
  • the real object also includes the human body.
  • FIG. 4 is a diagram showing a configuration example for grasping the position of the operating body.
  • the operating body 10 shown in FIG. 4 is used by the user to cause (manipulate) the phenomenon of the virtual space.
  • Examples of the operating body 10 include a pen-shaped (for example, AR pen, etc.) operating device as shown in the figure, and a part or all of the body such as the user's hands, feet, and head.
  • The operation unit 11 is the portion of the operating body 10 that serves as the point of contact with a virtual object. For example, when the operating body 10 is a pen-type instrument, the operation unit 11 is the pen tip.
  • When the operating body 10 is the user's finger, the operation unit corresponds to the fingertip; when it is the user's hand, it corresponds to the palm or the like.
  • The operating body recognition unit 63 obtains the position of the operation unit 11 of the operating body 10 (distance and direction from a predetermined position, etc.) using the detection unit 8.
  • In the example shown in FIG. 4, the information processing device 4 serving as the main body is composed of a smartphone, and the detection unit 8 is composed of an imaging device built into the smartphone.
  • the operating body recognition unit 63 obtains the position of the operating unit 11 by detecting the operating unit 11 from the detection information (imaging information in this example) from the detecting unit 8.
  • For example, the detection unit 8 may be configured as a depth camera capable of acquiring depth information by ToF (Time of Flight) or the like. By detecting the operation unit 11, the three-dimensional coordinate values (x, y, z) in an XYZ coordinate system with a predetermined position as the reference point can be specified, so that the position of the operation unit 11 can be determined with good accuracy.
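  • As a minimal illustration of this kind of position estimation (an explanatory sketch, not part of the disclosure): given the pixel coordinates of the detected operation unit 11 and its depth value from the ToF camera, the camera-frame 3D position follows from the pinhole camera model.

```python
import numpy as np

def backproject(u: float, v: float, depth_m: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Convert a detected pixel (u, v) plus its ToF depth (meters)
    into camera-frame XYZ coordinates via the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example: pen tip detected at pixel (640, 360), 0.42 m away, using
# hypothetical intrinsics for the smartphone's depth camera.
tip_xyz = backproject(640, 360, 0.42, fx=600.0, fy=600.0, cx=640.0, cy=360.0)
```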
  • the operating body 10 shown in FIG. 4 is provided with an operating button 12 on the grip portion.
  • the operation button 12 is used, for example, for notifying the information processing apparatus 4 of an operation start (writing start) trigger.
  • the operation button 12 is not limited to an electric switch, but may be a mechanical switch.
  • For example, the structure may be such that when the operation button 12 is pressed, the operation unit 11 pops out and a reflective material is exposed.
  • By using a mechanical switch, no battery, electric circuit, or the like is required, and the operating body 10 can be made smaller and lighter.
  • FIG. 5 is a diagram showing another configuration example for grasping the position of the operating body 10.
  • In this example, the operating body 10 is composed of a pen-shaped instrument, and a plurality of markers (for example, invisible markers) 13 are attached to its surface. As described above, the detection unit 8 of the information processing device 4 is configured as an imaging device. The operating body recognition unit 63 can therefore estimate the angle, position, and so on of the operating body 10 by detecting the markers 13 from the detection information, and can specify the position of the operation unit 11 at the pen tip. At this time, for example, the orientation and angle of the operating body 10 may be obtained by varying the shape and size of the markers 13. In the case shown in FIG. 5 as well, the size and weight can be reduced as described above.
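  • One plausible implementation of this marker-based pose estimation (a sketch assuming OpenCV and a known marker geometry; the disclosure does not name a library) is to detect the marker corners in the image, solve the perspective-n-point problem, and then offset the recovered pose to the known pen-tip position.

```python
import numpy as np
import cv2

# Known 3D positions of four marker corners in the pen's own frame
# (meters) -- an assumed geometry for illustration only.
MARKER_OBJ_PTS = np.array([[-0.01, 0.00, 0.0],
                           [ 0.01, 0.00, 0.0],
                           [ 0.01, 0.02, 0.0],
                           [-0.01, 0.02, 0.0]], dtype=np.float32)
TIP_IN_PEN_FRAME = np.array([[0.0], [-0.12], [0.0]])  # assumed tip offset

def pen_tip_position(corners_px, camera_matrix, dist_coeffs):
    """Estimate the pen pose from detected marker corners (pixel
    coordinates), then express the tip offset in camera coordinates."""
    ok, rvec, tvec = cv2.solvePnP(MARKER_OBJ_PTS, corners_px,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)          # rotation vector -> 3x3 matrix
    return rot @ TIP_IN_PEN_FRAME + tvec  # tip position in camera frame
```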
  • FIG. 6 is a diagram showing still another configuration example for grasping the position of the operating body 10.
  • In the example shown in FIG. 6, the operating body 10 is the user's hand (fingers), and the fingertip is used as the operation unit 11.
  • By configuring the detection unit 8 as an imaging device, the operating body recognition unit 63 can specify the position of the operation unit 11 from the detection information of the detection unit 8. At this time, for example, the position of the fingertip can be specified from the shape of each finger of the user's hand.
  • The operating body 10 is not limited to a dedicated instrument; it may be any instrument the user can operate in the air, or the user's own body. The operation unit 11 only needs to be able to come close to a display object such as a virtual object and have its position grasped. That is, the operating body 10 and the operation unit 11 may be determined appropriately according to the operation content and the like, and position grasping is not limited to a specific method; known techniques can be used.
  • When the operating body 10 is an instrument such as an AR pen or a haptic glove, the operating body 10 itself may also serve as a tactile presentation device 3.
  • Even in this case, the information processing system 1 is provided with a tactile presentation device 3 different from the operating body 10, as described above.
  • the tactile presentation control unit 64 shown in FIG. 3 controls the operation of the tactile presentation device 3.
  • the tactile presentation control unit 64 detects a phenomenon in the virtual space according to the distance between the virtual object and the operating body 10, and generates control information according to the detected phenomenon. That is, the tactile presentation control unit 64 generates a control signal that controls the operation of the tactile presentation device 3 according to the distance between the virtual object and the operating body 10.
  • the tactile presentation control unit 64 causes the tactile presentation device 3 to present the tactile sensation representing the tactile sensation of the virtual object when the contact between the virtual object and the operating body 10 is detected. Further, the tactile presentation control unit 64 causes the tactile presentation device 3 to present a tactile sensation representing a state change of the virtual object. Further, the tactile presentation control unit 64 causes the tactile presentation device 3 to present a tactile sensation representing an operation using the operating body 10 on the virtual object.
  • Specifically, the tactile presentation control unit 64 compares the position of the operation unit 11 specified by the operating body recognition unit 63 with the placement position of the virtual object in the virtual space (for example, three-dimensional coordinate values in the XYZ coordinate system mentioned above), and detects contact by calculating the distance between the two. For example, if the distance is within a predetermined value, the two can be determined to be close; if the distance is zero, they can be determined to be in contact.
  • The setting of the distance used to determine contact (the contact/non-contact determination range) may be changed according to a predetermined condition; for example, this predetermined condition can be set to brightness.
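  • The distance-based contact determination described above can be sketched as follows (illustrative only; the function names, threshold values, and state labels are assumptions, not from the disclosure):

```python
import numpy as np

def classify_proximity(tip_xyz, obj_xyz, contact_eps: float = 0.005,
                       near_range: float = 0.05):
    """Classify the operating body's relation to a virtual object from
    the Euclidean distance between their 3D positions. The contact
    range (here an assumed 5 mm) may be widened or narrowed according
    to a predetermined condition such as brightness."""
    d = float(np.linalg.norm(np.asarray(tip_xyz) - np.asarray(obj_xyz)))
    if d <= contact_eps:
        return "contact", d
    if d <= near_range:
        return "near", d
    return "far", d

state, dist = classify_proximity([0.10, 0.20, 0.400], [0.10, 0.20, 0.404])
# -> ("contact", 0.004)
```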
  • For a click operation (selection, decision, etc.) on a virtual object or for an operation decision of a virtual UI (User Interface), tactile presentation is performed by generating a vibration or the like representing the operation.
  • The method of the click operation is not particularly limited; examples include pressing the operation button 12 (see FIG. 4) of the operating body 10.
  • As the vibration, for example, a vibration that feels like a crisp "click" may be generated.
  • the tactile presentation is performed by generating vibration or the like representing the operation during the drag / drag & drop operation.
  • For example, while a virtual object is being dragged, the tactile presentation device 3 may be made to generate a "tick"-like vibration reminiscent of graduations on a scale.
  • vibration or the like indicating that the virtual object is moving may be generated.
  • When a dragged virtual object collides with another object, a vibration or the like representing the collision may be generated. When a moving virtual object is released in the air, virtual gravity may act on it, and when it falls and collides with a real object such as a desk, a vibration or the like representing that collision may be generated. (A sketch of mapping such events to vibration effects follows below.)
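  • As one way to organize the event-to-vibration mapping described in the preceding items, the control information could be drawn from a simple lookup table. The following is a minimal sketch; the effect names and parameter values are assumptions for illustration and do not come from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VibrationEffect:
    amplitude: float        # normalized 0.0 .. 1.0
    duration_ms: int
    repeat: bool = False    # True for sustained effects while dragging

# Hypothetical mapping from detected virtual-space events to effects.
EFFECTS = {
    "ui_click":  VibrationEffect(0.8, 15),               # crisp "click"
    "drag_tick": VibrationEffect(0.3, 8),                # scale-like "tick"
    "dragging":  VibrationEffect(0.15, 30, repeat=True),
    "collision": VibrationEffect(1.0, 40),               # impact on drop
}

def control_info_for(event: str) -> Optional[VibrationEffect]:
    """Return control information for a detected virtual-space event."""
    return EFFECTS.get(event)
```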
  • In addition, a tactile sensation representing the drawing feel (texture) of the operating body 10 on the virtual plane is presented by the tactile presentation device 3 through vibration or the like.
  • At this time, a tactile sensation corresponding to the pen-type setting of the operating body 10 may be presented by vibration or the like. For example, a "scratchy" feel may be presented if a pencil is assumed, or a "squeaky" feel if a felt-tip pen is assumed.
  • When contact of the operating body with a virtual object is detected, the tactile sensation is presented by generating a vibration or the like that expresses the feel of the object.
  • At this time, a tactile sensation according to the material setting of the touched virtual object may be presented. For example, if a hard material such as plastic is assumed, a short-cycle vibration like a "click" is generated on impact; if a soft material is assumed, a continuous "squishy" vibration is generated in response to changes in how far the operation unit 11 penetrates the object.
  • The tactile presentation device 3 may also be made to present a tactile sensation whose strength corresponds to the collision speed of the operating body 10 with the virtual object. For example, the vibration is strong if the collision speed when touching the virtual object is high, and weak if it is low. As a result, the user can feel an impact corresponding to the strength of the collision (see the sketch below).
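  • Combining the two preceding items, the generated control information might parameterize the vibration's amplitude by collision speed and its waveform by the material setting. This is a minimal sketch; the frequencies, decay constants, and speed-to-amplitude mapping are assumptions, not values from the disclosure.

```python
import numpy as np

def impact_waveform(material: str, speed_m_s: float,
                    sample_rate=1000, duration_s=0.08):
    """Synthesize a vibration waveform whose amplitude scales with
    collision speed and whose shape depends on the material setting."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate)
    amp = np.clip(speed_m_s / 2.0, 0.1, 1.0)  # assumed speed-to-amplitude map
    if material == "hard":
        # short, sharply decaying "click" for hard materials like plastic
        return amp * np.exp(-t / 0.01) * np.sin(2 * np.pi * 250 * t)
    # soft materials: lower-frequency, slowly decaying continuous rumble
    return amp * np.exp(-t / 0.05) * np.sin(2 * np.pi * 80 * t)

wave = impact_waveform("hard", speed_m_s=1.2)
```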
  • the force sensation may be presented by generating vibration or the like representing the force received from the virtual object.
  • For example, when the virtual object is struck, a vibration may be generated that feels like receiving a reaction force on the side opposite to the struck direction.
  • When the virtual object is struck so as to rotate, a vibration or the like conveying the rotational force may be generated. Further, a vibration giving the feel of a rotational reaction force may be generated according to the display position of the struck virtual object on the screen. This makes it possible for the user to perceive the direction and movement of the force received from the virtual object.
  • When the tactile presentation device 3 is made to present a force sense, for example, when a virtual object is grasped, the force sense from the grasped virtual object may be presented to the hand opposite to the grasping hand. This is realized by having a tactile presentation device 3 different from the operating body 10 perform the presentation, which eliminates the need to vibrate the operating body 10 itself.
  • the tactile sensation is not limited to vibration, but may be presented by pressure, for example, tightening of a VR device or the like.
  • A tactile sensation indicating that contact or approach is prohibited may also be presented by the tactile presentation device 3.
  • For example, when the user approaches an object that is dangerous to touch (for example, a hot object such as fire), a tactile sensation indicating that the object is dangerous is presented before the user actually touches it.
  • Similarly, when the user approaches a fragile real object such as a vase, a tactile sensation warning against contact may be presented.
  • the tactile sensation presenting device 3 is made to present a tactile sensation corresponding to the gripping force of the user holding the operating body 10.
  • the tactile sensation is changed according to the strength of holding the operating body 10. This allows the user to select the desired tactile sensation.
  • Further, the tactile presentation device 3 may be made to present a tactile sensation whose strength corresponds to the distance between the detection unit 8 and the operating body 10. For example, a weak tactile presentation is performed when the distance is long, and a strong one when the distance is short. As a result, a sense of distance can be felt by touch (a sketch combining this with the grip-force modulation follows below).
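  • These two modulations (grip force on the operating body and distance from the detection unit) could be folded into a single intensity gain. The following sketch assumes simple linear maps and made-up ranges; the disclosure fixes no particular formula.

```python
def tactile_gain(grip_force_n: float, distance_m: float,
                 max_force_n: float = 20.0, max_range_m: float = 1.5) -> float:
    """Scale tactile intensity up with grip force and down with the
    distance between the detection unit and the operating body."""
    grip_gain = min(grip_force_n / max_force_n, 1.0)
    range_gain = max(1.0 - distance_m / max_range_m, 0.0)
    return grip_gain * range_gain

strength = tactile_gain(grip_force_n=8.0, distance_m=0.4)  # ~0.29
```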
  • the information processing system 1 may be configured to include a plurality of tactile presentation devices 3.
  • In this case, control information for controlling the operation of each of the plurality of tactile presentation devices 3 can be generated, and the tactile presentation device 3 used for presentation can be selected appropriately according to predetermined conditions.
  • For example, when the operating body 10 is an AR pen provided with a tactile presentation device 3, tactile feedback for UI operations and for the drawing feel of pen operations is provided by vibrating the tactile presentation device 3 of the AR pen, while tactile feedback for other phenomena is provided by vibrating or otherwise driving the tactile presentation device 3 on the main-body side rather than the AR pen.
  • the user can intuitively grasp the type of phenomenon in the virtual space from the tactile sense.
  • Further, when the battery of one tactile presentation device 3 (for example, the AR pen) runs low, another tactile presentation device 3 (for example, the smartphone on the main-body side) may take over, so that tactile presentation can be performed without worrying about the battery state (see the routing sketch below).
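  • Such per-event device selection with a battery fallback could be implemented along the following lines. This is a sketch only: the device names, battery field, event set, and threshold are all assumptions for illustration.

```python
from typing import Optional

class HapticDevice:
    def __init__(self, name: str, battery: Optional[float]):
        self.name = name
        self.battery = battery  # None for mains-powered devices

    def usable(self, min_level: float = 0.15) -> bool:
        return self.battery is None or self.battery > min_level

AR_PEN = HapticDevice("ar_pen", battery=0.08)        # nearly empty
MAIN_BODY = HapticDevice("smartphone", battery=0.90)

# Events whose feedback is preferably routed to the pen itself.
PEN_EVENTS = {"ui_click", "draw_texture"}

def route(event: str) -> HapticDevice:
    """Prefer the AR pen for pen-centric events, falling back to the
    main-body device when the pen's battery is too low."""
    if event in PEN_EVENTS and AR_PEN.usable():
        return AR_PEN
    return MAIN_BODY

device = route("draw_texture")  # falls back to the smartphone here
```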
  • A device that cannot present a tactile sensation may instead be made to output audio or display an image that expresses the tactile sensation.
  • The audio may be generated, for example, from the vibration waveform used for tactile presentation; conversely, the vibration for tactile presentation may be generated from an audio waveform representing the tactile sensation.
  • The image may express the sense of touch visually, for example, by shaking the display with a shake effect.
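  • Since vibration and audio are both waveforms, the conversion mentioned above can be as simple as resampling and normalizing. A minimal sketch, with assumed sample rates:

```python
import numpy as np

def vibration_to_audio(vib_wave: np.ndarray, vib_rate: int = 1000,
                       audio_rate: int = 44100) -> np.ndarray:
    """Resample a tactile vibration waveform to audio rate and
    normalize it so it can be played through a speaker instead."""
    n_out = int(len(vib_wave) * audio_rate / vib_rate)
    x_old = np.linspace(0.0, 1.0, len(vib_wave))
    x_new = np.linspace(0.0, 1.0, n_out)
    audio = np.interp(x_new, x_old, vib_wave)
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# Example: an 80 ms, 200 Hz vibration burst converted for playback.
vib = np.sin(2 * np.pi * 200 * np.linspace(0.0, 0.08, 80))
audio = vibration_to_audio(vib)
```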
  • FIG. 8 is a flowchart showing an example of the processing flow by the control unit 6. The order of the following processes can be changed as long as each process is not hindered.
  • the operating body recognition unit 63 recognizes the position, posture, and the like of the operating body 10 in the virtual space (step S1). That is, the position, posture, etc. of the operating body 10 (operating unit 11) such as the AR pen, the haptic glove, and the fingertip in the virtual space are recognized.
  • Next, the contact state between the operating body 10 (operation unit 11), which is a real object, and a virtual object such as an operation UI is identified (step S2), and it is determined whether or not there is contact (step S3). As described above, the presence or absence of contact can be determined from the distance between the operating body 10 (operation unit 11) and the virtual object.
  • When it is determined that there is contact, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S4).
  • For example, the tactile presentation control unit 64 vibrates an operating body 10 that includes a tactile presentation device 3, such as an AR pen or a haptic glove, to present tactile sensations regarding the operation feel of the virtual UI, the drawing feel, texture, temperature, and the like.
  • The tactile sensation may instead be presented by a tactile presentation device 3 other than the operating body 10.
  • In that case, the operating body 10 does not require a battery, an electric circuit, or the like, and can be made smaller and lighter.
  • After the tactile presentation in step S4, or when it is determined in step S3 that there is no contact, a state change of the virtual object, such as moving or falling to the floor, is identified (step S5), and it is determined whether or not there is a state change (step S6).
  • When it is determined that there is a state change, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S7).
  • the tactile presentation control unit 64 vibrates a tactile presentation device 3 such as a mobile terminal or a head-mounted display, which is different from the operating body 10, to perform tactile presentation regarding recoil such as tilting, impact of falling on the ground, or the like.
  • After the tactile presentation in step S7, or when it is determined in step S6 that there is no state change, the process by the control unit 6 ends.
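  • Put together, one pass of the S1–S7 flow might look like the following loop body. This is a sketch reusing hypothetical helpers; none of the function names come from the disclosure.

```python
def control_step(recognize_pose, identify_contact, identify_state_change,
                 present_contact_haptics, present_state_haptics):
    """Run one pass of the S1-S7 flow, with each stage injected as a
    callable so the sketch stays independent of any concrete device."""
    pose = recognize_pose()                 # S1: operating body position/posture
    contact = identify_contact(pose)        # S2/S3: distance-based contact check
    if contact:
        present_contact_haptics(contact)    # S4: e.g. vibrate the AR pen
    change = identify_state_change()        # S5/S6: moved, fell to the floor, etc.
    if change:
        present_state_haptics(change)       # S7: e.g. vibrate the main body
```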
  • the information processing system 1 will be described with reference to a specific example.
  • a case where the above-mentioned display device 2, the tactile presentation device 3, and the information processing device 4 are integrally configured will be described as an example.
  • As the content, content that allows the user to draw lines, characters, and pictures in the AR space will be described as an example.
  • FIG. 9 shows a specific example of using the information processing system 1.
  • In this example, a single smartphone (hereinafter simply referred to as the smartphone 100) is used.
  • the smartphone 100 integrally constitutes the above-mentioned display device 2, tactile presentation device 3, and information processing device 4. That is, the smartphone 100 includes a display device 2, a tactile presentation device 3, and an information processing device 4, which are integrally configured.
  • The smartphone 100 includes a display 21 as a component of the display device 2 described above. Further, the smartphone 100 includes a vibrator 31 as a component of the tactile presentation device 3 described above. Further, the smartphone 100 includes an imaging device 41 as a component of the detection unit 8 of the information processing device 4 described above. Further, the smartphone 100 includes an audio output device, with a speaker 101 as its component.
  • the smartphone 100 performs a process of displaying characters and pictures drawn by the user in the air in the virtual space displayed on the display 21 by executing the program stored in the storage unit 5 by the control unit 6 described above.
  • The user draws a picture on a virtual plane (not necessarily an exact plane) in the air using the AR pen 110, which the user holds as the operating body 10 (shown by a broken line in the figure).
  • The control unit 6 detects the trajectory of the pen tip of the AR pen 110 and performs control so that the drawn picture (a picture invisible in the real space) is projected onto the virtual space displayed on the display 21.
  • the smartphone 100 makes the user feel the object (display object) in the virtual space described above by executing the program stored in the storage unit 5 by the control unit 6 described above.
  • Specifically, the smartphone 100 presents a tactile sensation to a body part (the left hand) different from the part that operates the virtual object in the virtual space (the part of the body touching the AR pen 110).
  • For example, the feeling of writing when the user draws a picture in the air with the AR pen 110 in the right hand is presented tactilely, by the vibration of the vibrator 31 of the smartphone 100, to the left hand holding the smartphone 100. That is, the tactile presentation is performed not on the right hand holding the AR pen 110 but on the left hand holding the smartphone 100.
  • This tactile sensation makes the user feel, for example, as if drawing with a pen on an actual surface.
  • In addition, not only the sense of touch but also the senses of hearing and sight are used.
  • For example, a sound related to the writing feel (for example, a scratchy sound) is output from the speaker 101 so that the object in the virtual space can also be perceived by hearing.
  • Further, an image related to the writing feel (in the illustrated example, a wavy, jagged display) is shown on the display 21 so that it can be perceived visually.
  • the phenomenon in the virtual space can be felt visually, auditorily, and tactilely, and the user can have a more realistic and advanced virtual experience similar to that in the real space.
  • As described above, the control unit 6 performs control to generate control information for controlling the operation of the tactile presentation device 3 according to the distance between the virtual object and the real object that operates the virtual object.
  • Further, the information processing system 1 is provided with a tactile presentation device 3 different from the operating body 10, and the control unit 6 controls the operation of that tactile presentation device 3; tactile presentation on the operating body 10 itself therefore becomes unnecessary.
  • As a result, the operating body 10 does not require a battery, an electric circuit, or the like, and can be made smaller and lighter. Tactile presentation can also be performed for the user even when the operating body 10 is a part of the human body such as the user's hand.
  • the body part that becomes the user's operating body 10 and the part that comes into contact with the operating body 10 are not limited to the hands and fingers. The same applies to the portion where the tactile sensation is presented by the tactile sensation presenting device 3. For example, by making these two different, excellent tactile presentation becomes possible.
  • the present technology is not limited to the above-described embodiment, and various modifications based on the technical idea of the present technology are possible. For example, various modifications as described below are possible. In addition, one or a plurality of arbitrarily selected modifications may be appropriately combined in the following modification modes. In addition, the configurations, methods, processes, shapes, materials, numerical values, and the like of the above-described embodiments can be combined with each other as long as they do not deviate from the gist of the present technology.
  • In the above embodiment, the phenomenon in the virtual space is mainly caused by contact between the virtual object and the operating body 10, but the phenomenon is not limited to contact and may be triggered by approach within a predetermined distance.
  • That is, the tactile presentation can be performed without the operating body 10 contacting the virtual object; it may be performed when the operating body approaches within a predetermined distance.
  • Further, a temperature sensation may be presented by changing the temperature of a Peltier element provided in the tactile presentation device 3. As a result, the sense of reality can be further enhanced.
  • In the above embodiment, tactile presentation to one user has been described as the processing of the information processing device 4; however, the present technology is not limited to this, and the display of the display device 2 and the tactile presentation of the tactile presentation device 3 may be shared with another user's devices.
  • the transmission / reception control unit 62 may control the display device 2 so that the image information can be shared between the display device 2 and another display device (display device for another user) different from the display device 2.
  • Similarly, the tactile presentation device 3 and another tactile presentation device (a tactile presentation device for another user) different from it may be controlled so that the control information can be shared.
  • For example, tactile presentation regarding an operation of user A's operating body 10 may be performed by user A's operating body 10, which includes a tactile presentation device 3, while for user B it is performed by the tactile presentation device 3 on user B's main-body side rather than user B's operating body 10. Fine-tuned tactile presentation of this kind is possible.
  • (1) An information processing device including a control unit that generates control information that controls the operation of a tactile presentation device according to the distance between a virtual object and the real object that operates the virtual object.
  • (2) The information processing device according to (1), wherein the control unit controls the operation of a tactile presentation device different from the real object.
  • (3) The information processing device according to (1) or (2), wherein the virtual object is a virtual plane, the real object is an operating body that draws a locus on the virtual plane, and the control unit causes the tactile presentation device to present a tactile sensation representing the drawing feel of the operating body on the virtual plane.
  • The information processing device according to any one of (1) to (8), wherein the control unit causes the tactile presentation device to present a tactile sensation according to a setting of the pen type of the operating body.
  • The information processing device, wherein the control unit controls the tactile presentation device and another tactile presentation device different from the tactile presentation device so that the control information can be shared.
  • (11) The information processing device according to any one of (1) to (10), wherein the control unit generates control information for causing the tactile presentation device to present a sensation including any of tactile sensation in the narrow sense, pressure sensation, vibration sensation, position, movement, force sensation, temperature sensation, and pain sensation.
  • The information processing device according to any one of (1) to (11), wherein the control unit causes the tactile presentation device to present a tactile sensation having a strength corresponding to the collision speed of the real object with the virtual object when contact between the virtual object and the real object is detected.
  • The information processing device according to any one of (1) to (13), wherein the real object is an operating instrument.
  • the control unit changes the setting of the distance for determining that the virtual object and the real object are in contact with each other according to a predetermined condition.
  • the information processing device according to any one of (1) to (15), wherein the control unit causes the tactile presentation device to present a tactile sensation having a strength corresponding to a distance between the detection unit and the real object.
  • (17) An information processing method in which a processor performs control to generate control information that controls the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • (18) A program that causes a computer to realize a control function that generates control information that controls the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • (19) An information processing system including: a display device that displays a virtual object; a tactile presentation device that presents tactile sensations; and an information processing device connected to the display device and the tactile presentation device, wherein the information processing device generates control information that controls the operation of the tactile presentation device according to the distance between the virtual object and a real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information.

Abstract

The invention concerns an information processing device comprising a control unit for controlling the operation of a haptic presentation device according to the distance between a virtual object and a real object (operating body) that operates the virtual object. FIG. 1
PCT/JP2021/011342 2020-03-27 2021-03-19 Information processing device, information processing method, program, and information processing system WO2021193421A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020057037 2020-03-27
JP2020-057037 2020-03-27

Publications (1)

Publication Number Publication Date
WO2021193421A1 (fr)

Family

ID=77892522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011342 WO2021193421A1 (fr) 2021-03-19 Information processing device, information processing method, program, and information processing system

Country Status (1)

Country Link
WO (1) WO2021193421A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009011748A (ja) * 2007-07-09 2009-01-22 Nintendo Co Ltd ゲームプログラムおよびゲーム装置
JP2010287221A (ja) * 2009-05-11 2010-12-24 Univ Of Tokyo 力覚提示装置
JP2014222492A (ja) * 2013-05-14 2014-11-27 株式会社東芝 描画装置及び描画システム
JP2015212946A (ja) * 2014-05-05 2015-11-26 イマージョン コーポレーションImmersion Corporation ビューポートベースの拡張現実感触覚効果のためのシステムおよび方法
JP2016062593A (ja) * 2015-07-09 2016-04-25 キヤノン株式会社 情報処理装置、情報処理方法、プログラム
JP2018088260A (ja) * 2008-07-15 2018-06-07 イマージョン コーポレーションImmersion Corporation 受動及び能動モード間で触覚フィードバック機能を転換するシステム及び方法
WO2018116544A1 (fr) * 2016-12-19 2018-06-28 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2018193650A1 (fr) * 2017-04-18 2018-10-25 株式会社ソニー・インタラクティブエンタテインメント Dispositif de commande de vibration


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP