WO2017141518A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2017141518A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
information processing
processing apparatus
guide information
state
Prior art date
Application number
PCT/JP2016/085032
Other languages
English (en)
Japanese (ja)
Inventor
悠 石原
大森 和雄
Original Assignee
ソニー株式会社
Priority date
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2017141518A1


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Patent Document 1 discloses an apparatus that includes a projector that displays an operation menu on the floor, a range sensor that detects an object on the floor operating the displayed operation menu, and an actuator that performs processing corresponding to the detected position of the object.
  • Therefore, the present disclosure proposes a mechanism that can make the interaction between the operation subject and the apparatus smoother.
  • According to the present disclosure, there is provided an information processing apparatus including: a projection control unit that causes a projection unit to project, in accordance with a projection instruction for guide information, guide information for guiding an operation of an operation subject for which the operation subject requests guidance; and a processing control unit that performs processing based on a state in which the operation guided by the guide information is performed by the operation subject.
  • According to the present disclosure, there is also provided an information processing method including, using a processor, causing a projection unit to project, in accordance with a projection instruction for guide information, guide information for guiding an operation of an operation subject for which the operation subject requests guidance, and performing processing based on a state in which the operation guided by the guide information is performed by the operation subject.
  • According to the present disclosure, there is further provided a program for causing a computer to realize a projection control function of causing a projection unit to project, in accordance with a projection instruction for guide information, guide information for guiding an operation of an operation subject for which the operation subject requests guidance, and a processing control function of performing processing based on a state in which the operation guided by the guide information is performed by the operation subject.
  • FIG. 2 is a block diagram schematically illustrating an example of a functional configuration of an information processing system according to a first embodiment of the present disclosure.
  • FIG. 5 is a flowchart conceptually showing an example of processing of the information processing apparatus according to the embodiment.
  • FIG. 14 is a flowchart conceptually illustrating an example of processing of an information processing device according to a second embodiment of the present disclosure. Further figures illustrate processing examples of the information processing system according to the first application example and the second application example of that embodiment.
  • FIG. 3 is an explanatory diagram illustrating a hardware configuration of an information processing apparatus according to an embodiment of the present disclosure.
  • In this specification and the drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by adding different numbers after the same reference numeral; for example, a plurality of configurations having substantially the same function are differentiated as necessary, such as the device 10A and the device 10B.
  • However, when such constituent elements need not be distinguished, only the same reference numeral is given; for example, when the device 10A and the device 10B need not be distinguished, they are simply referred to as the device 10.
  • the apparatus 10 has a communication function and a projection function.
  • the communication function is a function for receiving information from an external device or transmitting information to an external device.
  • the projection function is a function for projecting an image around the device. For this reason, the apparatus 10 can project an image based on information obtained through communication.
  • 1A and 1B are diagrams for describing a technology related to the information processing apparatus 100 according to each embodiment of the present disclosure.
  • the apparatus 10 projects the image of the shogi piece on the area on the apparatus 10 side of the shogi board as shown by the broken line in FIG. 1A.
  • the user moves the piece on the shogi board and inputs to the communication terminal 20 which piece is moved to where.
  • the communication terminal 20 and the apparatus 10 communicate piece movement information.
  • the apparatus 10 changes the position of the projected piece image based on the received piece movement information. In this way, a shogi match between the user and the device 10 is realized.
  • another device 10 may be a shogi opponent on behalf of the user.
  • the device 10A and the device 10B project images of shogi pieces on each device side of the shogi board, respectively.
  • the device 10A determines which piece is moved to where, and changes the position of the projected piece image to the determined position.
  • the device 10A transmits the determined piece movement information to the device 10B.
  • the apparatus 10B changes the position of the projected piece image based on the received piece movement information. In this way, a shogi match between the devices 10 is realized.
  • the interaction between the operation subject and the apparatus may be complicated. For example, it may be a burden on the user to input state change information such as piece movement information one by one.
  • a communication function is required for realizing the interaction. For example, if there is no device or communication function corresponding to the communication with the device 10, it is difficult to use the interaction with the device 10.
  • the information processing apparatus 100 has a recognition function that recognizes the operation state of the operation subject.
  • 2A and 2B are diagrams for describing an overview of a function for recognizing an operation state of the information processing apparatus 100 according to each embodiment of the present disclosure.
  • the information processing apparatus 100 captures an area corresponding to a shogi board as shown by a dotted line in FIG. 2A and recognizes the user's operation state based on an image obtained by the imaging.
  • the information processing apparatus 100 grasps piece movement information based on the position of the piece shown in the image.
  • the information processing apparatus 100 changes the position of the projected piece image based on the grasped piece movement information.
  • the user only moves the piece and does not need to perform an input operation to provide the information to the information processing apparatus 100.
  • the information processing apparatus 100A determines which piece to move where and changes the position of the projected piece image to the determined position. However, the information processing apparatus 100A does not transmit the piece movement information to the information processing apparatus 100B as illustrated in FIG. 2B.
  • the information processing apparatus 100B captures an area corresponding to the shogi board, and recognizes the operation state of the information processing apparatus 100A based on an image obtained by the imaging. When it is recognized that the information processing apparatus 100A has moved the piece, the information processing apparatus 100B changes the position of the image of the piece to be projected based on the position of the piece shown in the image.
  • the information processing apparatus 100 has a recognition function for recognizing the operation state of the operation subject. For this reason, it is possible to reduce the operation or processing of the operation subject. Accordingly, it is possible to suppress complication of interaction between the operation subject and the apparatus.
  • an information processing system having the recognition function will be described in detail.
  • To distinguish between them, the information processing apparatus 100 according to the first and second embodiments is given a number corresponding to the embodiment at the end, as in the information processing apparatus 100-1 and the information processing apparatus 100-2. The same applies to the server 200 described later.
  • Although FIGS. 2A and 2B show an example in which the information processing apparatus 100 is a dog-shaped robot, the information processing apparatus 100 is not limited to this and may be an apparatus other than a robot.
  • First embodiment (basic form): A first embodiment of the present disclosure will be described.
  • processing corresponding to the state is executed.
  • FIG. 3 is a block diagram schematically illustrating an example of a functional configuration of the information processing system according to the first embodiment of the present disclosure.
  • the information processing system includes an information processing apparatus 100-1 and a server 200-1.
  • the information processing apparatus 100-1 includes an audio input unit 102, an imaging unit 104, a control unit 106, a projection unit 108, an audio output unit 110, and a communication unit 112. Note that the information processing apparatus 100-1 may be a portable apparatus.
  • the voice input unit 102 collects voice around the information processing apparatus 100-1. Specifically, the voice input unit 102 collects voice around the information processing apparatus 100-1 and generates information related to the collected voice (hereinafter also referred to as voice information). Note that the voice input unit 102 may collect sound other than voice. Also, the voice information may be acquired from an external voice input device via the communication unit 112.
  • the imaging unit 104 images the periphery of the information processing apparatus 100-1. Specifically, the imaging unit 104 captures an area related to the operation of the information processing apparatus 100-1 (hereinafter also referred to as an operation area), and generates information related to the image obtained by the imaging (hereinafter also referred to as captured image information).
  • the operation area is an area where an image is projected or can be projected by the projection unit 108. Note that the image obtained by imaging by the imaging unit 104 may be an image in which the action subject is shown, or may be an image in which the action subject is not shown.
  • the captured image information may be acquired from an external imaging device via the communication unit 112.
  • the control unit 106 controls the functions of the information processing apparatus 100-1 as a whole. Specifically, the control unit 106 controls processing of the voice input unit 102, the imaging unit 104, and the projection unit 108. For example, the control unit 106 controls processing contents of the voice input unit 102, the imaging unit 104, and the projection unit 108 by setting control parameters and the like.
  • control unit 106 acquires the recognition result for the voice information and the captured image information from the server 200-1.
  • the control unit 106 requests the server 200-1, via communication, to perform voice recognition processing and image recognition processing on the voice information input from the voice input unit 102 and the captured image information input from the imaging unit 104, respectively. Then, the control unit 106 receives the voice recognition result and the image recognition result from the server 200-1 via communication.
  • the control unit 106 causes the projection unit 108 to project the guide information in response to the guide information projection instruction.
  • the guide information is information for guiding the action of the action subject that the action subject requests for guidance.
  • the control unit 106 obtains information for projecting the guide information (hereinafter also referred to as material information) from the server 200-1 via communication. Then, the control unit 106 causes the projection unit 108 to project guide information based on the acquired material information. For example, the control unit 106 analyzes text information obtained by the speech recognition process, and determines whether the text information indicates a guide information projection instruction.
  • When it is determined that the text information indicates a guide information projection instruction, the control unit 106 requests the server 200-1, via communication, to provide material information related to the guide information indicated by the text information. When the material information is provided from the server 200-1, the control unit 106 causes the projection unit 108 to project guide information based on the provided material information.
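As a concrete illustration of the flow just described, the following is a minimal sketch of how recognized speech text might be mapped to a guide-information projection instruction and turned into a projection request. The keyword table and the injected callables (request_material_info, project) are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical mapping from utterance keywords to guide-information identifiers.
GUIDE_INSTRUCTIONS = {
    "shogi": "shogi_pieces",        # e.g. "I want to play shogi"
    "scan": "scan_area",            # e.g. "I want to scan this paper"
    "calligraphy": "writing_order", # e.g. "I want to practice calligraphy"
}

def interpret_projection_instruction(recognized_text):
    """Return a guide-information identifier if the recognized text indicates a
    projection instruction (direct or indirect), otherwise None."""
    lowered = recognized_text.lower()
    for keyword, guide_id in GUIDE_INSTRUCTIONS.items():
        if keyword in lowered:
            return guide_id
    return None

def handle_recognized_speech(recognized_text, request_material_info, project):
    """request_material_info and project stand in for the communication with the
    server 200-1 and for the projection unit 108, respectively (assumed names)."""
    guide_id = interpret_projection_instruction(recognized_text)
    if guide_id is None:
        return False                              # not a projection instruction
    material = request_material_info(guide_id)    # material information from the server
    project(material)                             # project guide information based on it
    return True
```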
  • the guide information projection instruction may be a direct instruction of the guide information projection or an indirect instruction.
  • the indirect instruction of the guide information projection may be a request or demand to the information processing apparatus 100 related to the guide information.
  • the material information may be stored in the information processing apparatus 100-1 or may be generated in the information processing apparatus 100-1.
  • the material information provided from the server 200-1 may be processed in the information processing apparatus 100-1.
  • the control unit 106 performs processing based on the target state when the state in which the operation guided by the guide information is performed by the operating subject (hereinafter also referred to as a target state) is recognized. Specifically, the control unit 106 determines whether the operation state recognized based on the captured image information related to the image in which the operation result of the operation subject is reflected is the target state. When it is determined that the operation state is the target state, the control unit 106 performs processing based on the target state. For example, the control unit 106 sets a target state when projection of guide information is started. Next, the control unit 106 determines whether the operation state has reached the target state at predetermined time intervals.
  • control unit 106 determines a change in the position of the action subject (or the position of the object moved by the action subject) based on the position information of the object obtained by the image recognition process. When it is determined that the position of the operation subject has changed to the target state, the control unit 106 determines that the operation state has reached the target state. Then, the control unit 106 determines that the operation state has reached the target state, and after performing processing based on the target state, sets the next target state.
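A hedged sketch of the target-state check described above: a target state is set when projection starts, and the recognized object position is compared with it at a fixed interval, allowing an error within a predetermined range. The function names, coordinate representation, and tolerance value are illustrative assumptions.

```python
import time

def position_reached(current, target, tolerance=10.0):
    """Judge reachability of the target state, allowing an error within a
    predetermined range (the tolerance is an assumed value)."""
    dx, dy = current[0] - target[0], current[1] - target[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance

def wait_for_target_state(object_id, target_position, recognize_positions,
                          interval_s=0.5, timeout_s=60.0):
    """Poll the image-recognition result (recognize_positions returns a mapping of
    object id -> (x, y)) until the tracked object reaches the target position."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        positions = recognize_positions()
        current = positions.get(object_id)
        if current is not None and position_reached(current, target_position):
            return True            # the operation state has reached the target state
        time.sleep(interval_s)     # re-check at predetermined time intervals
    return False
```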
  • the projection unit 108 projects an image around the information processing apparatus 100-1. Specifically, the projection unit 108 projects an image related to image information (hereinafter also referred to as projection image information) input from the control unit 106 around the information processing apparatus 100-1. For example, the projection unit 108 projects guide information input from the control unit 106 based on material information to a position designated by the control unit 106. Note that the projection unit 108 may project images separately on a plurality of regions. Further, the projection area may be fixed or variable. When the projection area is variable, the projection area may be controlled by the control unit 106.
  • the audio output unit 110 outputs audio to the outside of the information processing apparatus 100-1. Specifically, the audio output unit 110 outputs audio related to the audio information input from the control unit 106 to the outside of the information processing apparatus 100-1. Note that the audio information related to the output audio may be generated by the control unit 106 or acquired from an external device via the communication unit 112.
  • the communication unit 112 communicates with a device external to the information processing device 100-1. Specifically, the communication unit 112 transmits voice information, captured image information, and an information provision request to the server 200-1, and receives a voice recognition result, an image recognition result, and material information from the server 200-1. Note that the communication unit 112 may communicate with the server 200-1 using a wired communication method or may use a wireless communication method.
  • the server 200-1 includes a communication unit 202, a voice recognition unit 204, an image recognition unit 206, an information providing unit 208, and a storage unit 210.
  • the communication unit 202 communicates with a device external to the server 200-1. Specifically, the communication unit 202 receives voice information, captured image information, and an information provision request from the information processing apparatus 100-1, and transmits the voice recognition result, the image recognition result, and the material information to the information processing apparatus 100-1.
  • the communication unit 202 has substantially the same function as the communication unit 112.
  • the voice recognition unit 204 recognizes voice related to voice information. Specifically, the voice recognition unit 204 generates text information based on the voice information. For example, the speech recognition unit 204 estimates characters corresponding to the content of speech related to input speech information, and generates text information as a speech recognition result using the estimated characters. Note that the voice recognition unit 204 may specify a speaker who speaks voice information.
  • the image recognition unit 206 recognizes an image related to image information. Specifically, the image recognizing unit 206 recognizes the positional relationship between the objects shown in the image related to the image information. For example, the image recognition unit 206 recognizes an object appearing in an image related to input captured image information, and estimates the arrangement of the recognized object. Then, the image recognition unit 206 generates information regarding the recognized object and the arrangement of the object as an image recognition result. Note that a technology such as SLAM (Simultaneous Localization and Mapping) may be used for the image recognition processing. Further, the image recognition unit 206 may further perform face recognition processing.
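One possible shape for the image recognition result exchanged between the server 200-1 and the information processing apparatus 100-1 is sketched below; the field names are assumptions and only illustrate that the result carries the recognized objects and their arrangement.

```python
from dataclasses import dataclass, field

@dataclass
class RecognizedObject:
    label: str                       # e.g. "shogi_piece", "paper", "hand"
    position: tuple                  # estimated (x, y) position in the operation area
    size: tuple = (0.0, 0.0)         # estimated extent, if available

@dataclass
class ImageRecognitionResult:
    objects: list = field(default_factory=list)

    def positions_by_label(self, label):
        """Return the estimated positions of all recognized objects with the given label."""
        return [o.position for o in self.objects if o.label == label]
```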
  • the information providing unit 208 provides information to the information processing apparatus 100-1. Specifically, the information providing unit 208 selects information to be provided in response to an information provision request from the information processing apparatus 100-1, and provides the selected information to the information processing apparatus 100-1 via communication. For example, the information providing unit 208 acquires material information stored in the storage unit 210 in response to a material information provision request. Then, the information providing unit 208 causes the communication unit 202 to transmit the acquired material information to the information processing apparatus 100-1.
  • the storage unit 210 stores information used for processing of the server 200-1 and information provided to the information processing apparatus 100-1.
  • the storage unit 210 stores information used for voice recognition processing or image recognition processing, and image information or text information as material information.
  • FIG. 4 is a flowchart conceptually showing an example of processing of the information processing apparatus 100-1 according to the present embodiment.
  • the information processing apparatus 100-1 determines whether a guide information projection instruction has been acquired (step S302). Specifically, the control unit 106 causes the server 200-1 to perform voice recognition processing on the voice information input from the voice input unit 102, and acquires a voice recognition result. Then, the control unit 106 determines whether the acquired voice recognition result indicates a guide information projection instruction.
  • the information processing apparatus 100-1 sets a target state according to the guide information (step S304). Specifically, when it is determined that the voice recognition result indicates a guide information projection instruction, the control unit 106 sets an operation state corresponding to the guide information related to the projection instruction as a target state.
  • the information processing apparatus 100-1 recognizes the operating state (step S306). Specifically, the control unit 106 causes the server 200-1 to perform image recognition processing on the image information input from the imaging unit 104, and acquires an image recognition result.
  • the information processing apparatus 100-1 determines whether the recognized operation state is a target state (step S308). Specifically, the control unit 106 determines whether the operation state specified based on the acquired image recognition result matches the target state. Note that an error within a predetermined range may be allowed for the consistency with the target state (reachability to the target state).
  • the information processing apparatus 100-1 executes processing based on the target state (step S310). Specifically, when it is determined that the operation state specified based on the image recognition result matches the target state, the control unit 106 executes processing based on the target state.
  • If it is determined that the recognized operation state is not the target state, the information processing apparatus 100-1 returns the process to step S306.
  • the information processing apparatus 100-1 determines whether or not the projection of the guide information has ended (step S312). When it is determined that the projection of the guide information has ended, the processing ends. If it is not determined that the projection of the guide information has been completed, the process returns to step S304 and the process is repeated.
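The flow of FIG. 4 (steps S302 to S312) can be summarized as the following sketch, in which speech interpretation, image recognition, state comparison, and processing are injected as callables; all names are illustrative assumptions rather than the apparatus's actual interfaces.

```python
import time

def run_guide_session(get_projection_instruction,  # S302: returns a guide id or None
                      target_states_for,           # S304: ordered target states for a guide
                      recognize_operation_state,   # S306: current operation state
                      matches,                     # S308: operation state vs. target state
                      execute_process,             # S310: processing based on the target state
                      projection_finished,         # S312: has projection of the guide ended?
                      interval_s=0.5):
    guide_id = get_projection_instruction()
    if guide_id is None:
        return                                     # no projection instruction acquired
    for target in target_states_for(guide_id):     # set the target state for the guide info
        while not matches(recognize_operation_state(), target):
            time.sleep(interval_s)                 # keep recognizing the operation state
        execute_process(target)                    # operation state reached the target state
        if projection_finished():
            return
```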
  • FIG. 5 is a diagram for explaining a processing example of the information processing system according to the first application example of the present embodiment.
  • the user and the information processing apparatus 100-1 are arranged across the shogi board.
  • pieces as real objects (hereinafter, the piece 30 will be described as a representative) are arranged on the user side of the shogi board.
  • the pieces arranged on the user side may instead be projected by the information processing apparatus 100-1.
  • the information processing apparatus 100-1 determines whether an instruction to project guide information related to shogi has been acquired based on the user's voice input. For example, as shown at time t1 in FIG. 5, the control unit 106 obtains text information related to the text "I want to play shogi" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 interprets the user instruction "shogi match" based on the text information, and determines that the projection instruction of "shogi piece" has been acquired from the interpreted instruction.
  • the information processing apparatus 100-1 projects the guide information related to shogi onto the user's motion area. For example, as shown at time t2 in FIG. 5, the control unit 106 causes the projection unit 108 to project a shogi piece (hereinafter, the piece G1 will be described as a representative) onto the information processing apparatus 100-1 side of the shogi board shown in the image obtained by imaging by the imaging unit 104.
  • the information processing apparatus 100-1 sets, as a target state, a state in which a user operation is performed on guide information related to shogi projected on the motion area. Specifically, for example, the control unit 106 sets a state where the piece 30 is moved by the user as a target state.
  • the information processing apparatus 100-1 determines whether the operation state has reached the set target state. For example, the control unit 106 determines whether or not the piece 30 on the shogi board has moved based on the result of image recognition processing using image information obtained by imaging by the imaging unit 104. When the piece 30 is moved by the user as shown at time t3 in FIG. 5, the control unit 106 determines that the state in which the piece 30 has been moved by the user, that is, the target state has been reached.
  • the information processing apparatus 100-1 changes the projection content based on an operation related to the target state. For example, when the piece 30 is moved, the control unit 106 determines the next move based on the movement of the piece 30. Then, the control unit 106 determines the projection position of the piece G1 based on the determined next move, and causes the projection unit 108 to project the piece G1 at that projection position, as shown at time t4 in FIG. 5.
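A hedged sketch of one turn of this shogi example: when the recognized board state shows that the user has moved a physical piece, the apparatus decides its own next move and re-projects the moved virtual piece. The board representation and decide_next_move are assumptions; any shogi engine could stand behind the latter.

```python
def apply_move(board, move):
    """board maps square -> piece identifier; move is (from_square, to_square)."""
    src, dst = move
    new_board = dict(board)
    new_board[dst] = new_board.pop(src)
    return new_board

def shogi_turn(previous_board, recognize_board, decide_next_move, project_pieces):
    """Return the updated board, or the previous board if the target state
    (a move by the user) has not been reached yet."""
    board = recognize_board()                 # image recognition of the physical board
    if board == previous_board:
        return previous_board                 # no piece has been moved yet
    move = decide_next_move(board)            # next move for a projected (virtual) piece
    board = apply_move(board, move)
    project_pieces(board)                     # change the projection position of the piece
    return board
```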
  • the user can enjoy shogi with the information processing apparatus 100-1 without complicated operations.
  • FIG. 6 is a diagram for explaining another example of the process of the information processing system according to the first application example of the present embodiment. Note that description that is substantially the same as the description using the example of FIG. 5 is omitted.
  • the information processing apparatus 100-1A and the information processing apparatus 100-1B are arranged across the shogi board. Also, nothing is placed on the shogi board. The shogi board may not be prepared. In that case, at least one of the information processing apparatuses 100-1A or 100-1B may project a shogi board.
  • the information processing apparatus 100-1 determines whether an instruction to project guide information related to shogi has been acquired based on the user's voice input. For example, as shown at time t5 in FIG. 6, the information processing apparatuses 100-1A and 100-1B obtain text information from the server 200-1 using the voice information input from the voice input unit 102. Then, the control unit 106 determines that the projection instruction of "shogi piece" has been acquired from the user instruction "shogi match" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto the user's motion area. For example, as shown at time t6 in FIG. 6, the shogi piece G1A is projected onto the information processing apparatus 100-1A side of the shogi board by the information processing apparatus 100-1A, and the shogi piece G1B is projected onto the information processing apparatus 100-1B side of the shogi board by the information processing apparatus 100-1B.
  • the information processing apparatus 100-1 sets, as a target state, a state in which an operation related to the movement of a piece by the counterpart information processing apparatus 100-1 has been performed on the guide information projected on the motion area. Specifically, for example, when the information processing apparatus 100-1A moves first, the information processing apparatus 100-1B sets the state in which the piece G1A has been moved by the information processing apparatus 100-1A as the target state. Further, after moving the piece G1A, the information processing apparatus 100-1A sets the state in which the piece G1B has been moved by the information processing apparatus 100-1B as the target state.
  • the information processing apparatus 100-1 determines whether the operation state of the counterpart has reached the set target state. For example, the information processing apparatus 100-1B determines whether or not the piece G1A on the shogi board has moved. When the piece G1A projected by the information processing apparatus 100-1A is moved, as shown at time t7 in FIG. 6, the information processing apparatus 100-1B determines that the set operation of the information processing apparatus 100-1A has been performed.
  • the information processing apparatus 100-1 changes the projection content based on the operation. For example, when the piece G1A is moved, the information processing apparatus 100-1B determines the next move based on the movement of the piece G1A. Then, the information processing apparatus 100-1B determines the projection position of the piece G1B based on the determined next move, and projects the piece G1B at the determined position, as shown at time t8 in FIG. 6.
  • in this way, a shogi match between the information processing apparatuses 100-1 can be realized without burden on the user.
  • the game to which the information processing system according to this embodiment is applied may be another face-to-face game.
  • for example, the applied game may be chess, Othello, or Go.
  • the present embodiment may also be applied to a game with three or more players (three or more devices).
  • FIG. 7 is a diagram for explaining a processing example of the information processing system according to the second application example of the present embodiment.
  • the information processing apparatus 100-1 determines whether or not a guide information projection instruction related to scanning has been acquired based on a user's voice input. For example, as shown at time t9 in FIG. 7, the control unit 106 obtains text information related to the text "I want to scan this paper" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that the projection instruction of "scan area" has been acquired from the user instruction "scan paper" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto the scan area. For example, as shown at time t10 in FIG. 7, the control unit 106 causes the projection unit 108 to project a frame G2 indicating the scan area into the scan area of the information processing apparatus 100-1, together with text G3 that prompts the user to place the scan target in the frame G2.
  • the information processing apparatus 100-1 sets, as a target state, a state in which an operation related to a user's scan for guide information projected on a scan area is performed.
  • the control unit 106 sets, as a target state, a state in which the paper 50 to be scanned is arranged in a frame G2 indicating a scan area by the user.
  • the information processing apparatus 100-1 determines whether the operation state has reached the set target state. For example, the control unit 106 determines whether or not the scan target is arranged in the frame G2 indicating the scan area. As shown at time t11 in FIG. 7, when the user places the paper 50 in the projected frame G2, the control unit 106 recognizes, based on the image information, the state in which the paper 50 is placed in the frame G2, and determines that the set user operation has been performed.
  • the information processing apparatus 100-1 executes processing corresponding to the projection instruction based on the operation related to the target state. For example, when the paper 50 is placed in the frame G2, the control unit 106 ends the projection of the text G3, as shown at time t11 in FIG. 7. Next, the control unit 106 causes the imaging unit 104 to image the scan area indicated by the frame G2, thereby executing the scan corresponding to the projection instruction of "scan area". Then, the control unit 106 stores the image obtained by the imaging in the storage unit of the information processing apparatus 100-1, and causes the audio output unit 110 to output the voice "Captured", indicating that the scan has ended, as shown at time t12 in FIG. 7.
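The scan example (times t9 to t12) reduces to the small routine below, assuming injected callables for the target-state check, the prompt projection, imaging, storage, and audio output; the names are illustrative.

```python
def scan_when_paper_placed(paper_in_frame,     # image-recognition check of the target state
                           stop_prompt,        # end projection of the prompt text G3
                           capture_scan_area,  # image the area indicated by frame G2
                           store_image,        # store the result in the apparatus
                           speak):             # audio output unit
    if not paper_in_frame():
        return None                            # target state not reached yet
    stop_prompt()                              # corresponds to ending projection of text G3
    image = capture_scan_area()                # scan corresponding to the projection instruction
    store_image(image)
    speak("Captured")                          # voice indicating that the scan has ended
    return image
```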
  • FIG. 8 is a diagram for explaining a processing example of the information processing system according to the third application example of the present embodiment.
  • the information processing apparatus 100-1 determines whether an instruction to project guide information related to calligraphy has been acquired based on a user's voice input. For example, as shown at time t13 in FIG. 8, the control unit 106 obtains text information related to the text "I want to practice calligraphy" from the server 200-1 using the voice information input from the voice input unit 102. Then, the control unit 106 determines that projection instructions of "characters subject to calligraphy" and "writing order" have been acquired from the user instruction "support of calligraphy" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto an area used for the calligraphy (hereinafter also referred to as a calligraphy area). For example, the control unit 106 determines a calligraphy area around the information processing apparatus 100-1. As shown at time t14 in FIG. 8, the control unit 106 causes the projection unit 108 to project, onto the determined calligraphy area, the frame G4 indicating the calligraphy area, the character G5 ("T" in FIG. 8) that is the calligraphy object, and an image G6A indicating the order in which the character G5 is written (hereinafter also referred to as the writing order G6A). Note that the frame G4 may not be projected.
  • the calligraphy area may be paper prepared for calligraphy, and in that case, the control unit 106 may determine the calligraphy area by recognizing the paper.
  • the information processing apparatus 100-1 sets, as a target state, a state in which an operation of writing a user's character with respect to guide information projected on the calligraphy area is performed.
  • the control unit 106 sets, as a target state, a state in which a character portion corresponding to the writing order G6A is written in the frame G4 by the user. It is not necessary to actually write characters.
  • a state where a writing instrument such as a brush for writing characters or a finger is moved according to the writing order G6A may be set as the target state.
  • the information processing apparatus 100-1 determines whether the user's operation state has reached the set target state. For example, the control unit 106 determines whether the part of the character G5 corresponding to the projected writing order G6A has been written. As shown at time t15 in FIG. 8, when the part of the character G5 corresponding to the writing order G6A (the first stroke of "T") is written by the user, the control unit 106 determines that the state in which the part of the character G5 corresponding to the writing order G6A has been written is reached.
  • the information processing apparatus 100-1 projects the guide information that follows the guide information being projected. For example, when the part of the character G5 corresponding to the writing order G6A (the first stroke of "T") is written, the control unit 106 causes the projection unit 108 to project the next writing order G6B, as shown at time t16 in FIG. 8.
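The calligraphy example, and likewise the arrangement, storage, and wrapping examples that follow, all share one pattern: project a single guide step, wait until image recognition shows that the guided operation has been performed, then project the next step. A generic sketch of that pattern (names assumed):

```python
import time

def run_step_by_step_guide(steps, project_step, step_completed, interval_s=0.5):
    """steps is an ordered sequence of guide items (e.g. writing orders G6A, G6B, ...);
    project_step projects exactly one item; step_completed is the recognition check."""
    for step in steps:
        project_step(step)                     # only the current guide information is projected
        while not step_completed(step):
            time.sleep(interval_s)             # re-check the recognized operation state
```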
  • the character that is the calligraphy object may be any other character, such as a Chinese character or an Arabic character.
  • the information processing system according to the present embodiment may also be applied to the drawing order of symbols or figures other than characters.
  • the projection order information such as the drawing order may be included in the material information provided from the server 200-1.
  • the information processing apparatus 100-1 projects guide information indicating the position of the motion of the motion subject.
  • the control unit 106 causes the projection unit 108 to project, as guide information, only an image indicating the position where an ingredient is to be arranged, and when it is determined that the operation state has reached the state in which the ingredient is arranged at the placement position indicated by the projected image, the control unit 106 causes the projection unit 108 to project the next placement position.
  • FIG. 9 is a diagram for explaining a processing example of the information processing system according to the fourth application example of the present embodiment.
  • the information processing apparatus 100-1 determines whether an instruction to project guide information related to the arrangement of ingredients has been acquired based on the user's voice input. For example, as shown at time t17 in FIG. 9, the control unit 106 obtains text information related to the text "Show me an example of arrangement" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that the projection instruction of "placement position" has been acquired from the user instruction "presentation of ingredient arrangement" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto the area where the ingredients are arranged (hereinafter also referred to as the arrangement area).
  • the control unit 106 determines the arrangement area around the information processing apparatus 100-1.
  • the control unit 106 causes the projection unit 108 to project only the image G7A indicating the placement position (hereinafter also referred to as the placement position G7A), as shown at time t18 in FIG. 9.
  • the arrangement area may be a dish on which the ingredients are arranged, and in that case, the control unit 106 may determine the arrangement area by recognizing the dish.
  • the information processing apparatus 100-1 sets, as a target state, a state in which an operation for arranging the user's ingredients with respect to the guide information projected on the arrangement area is performed.
  • the control unit 106 sets, as a target state, a state in which ingredients corresponding to the placement position G7A are arranged at the projected placement position G7A.
  • the information processing apparatus 100-1 determines whether the user's operation state has reached the set target state. For example, the control unit 106 determines whether an ingredient has been arranged at the projected placement position G7A. As shown at time t19 in FIG. 9, when the ingredient 60A is placed at the placement position G7A by the user, the control unit 106 determines that the corresponding ingredient has been placed at the placement position G7A.
  • the information processing apparatus 100-1 projects the guide information that follows the guide information being projected. For example, when the ingredient 60A is placed at the placement position G7A, the control unit 106 causes the projection unit 108 to project only the next placement position G7B, as shown at time t20 in FIG. 9. Then, as shown at time t21 in FIG. 9, the user places the ingredient 60B at the placement position G7B.
  • the information processing apparatus 100-1 projects guide information indicating the position of the action subject, as in the fourth application example.
  • the control unit 106 causes the projection unit 108 to project, as guide information, an image indicating the storage position of an object, and when it is determined that the operation state has reached the state in which the object is stored at the storage position indicated by the projected image, the control unit 106 causes the projection unit 108 to project the next storage position.
  • FIG. 10 is a diagram for explaining a processing example of the information processing system according to the fifth application example of the present embodiment.
  • the information processing apparatus 100-1 determines whether a guide information projection instruction related to the storage of objects has been acquired based on a user's voice input. For example, as shown at time t22 in FIG. 10, the control unit 106 obtains text information related to the text "Tell me how to store" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that the projection instruction of "storage position" has been acquired from the user instruction "presentation of storage method" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto an area in which the object is stored (hereinafter also referred to as a storage area).
  • the control unit 106 determines a storage area around the information processing apparatus 100-1.
  • the control unit 106 causes the projection unit 108 to project an image G8A indicating the storage position of the object (hereinafter also referred to as storage position G8A) onto the determined storage area, as shown at time t23 in FIG.
  • the storage area may be an object having a storage function. In this case, the control unit 106 may determine the storage area by recognizing the object having the storage function.
  • the information processing apparatus 100-1 sets a state in which the user's object storage operation is performed on the guide information projected on the storage area as a target state. Specifically, for example, the control unit 106 sets, as a target state, a state in which an object corresponding to the storage position G8A is arranged at the storage position G8A to be projected.
  • the information processing apparatus 100-1 determines whether the user's operation state has reached the set target state. For example, the control unit 106 determines whether an object is placed at the storage position G8A to be projected. As shown at time t24 in FIG. 10, when the user places the object 62A at the storage position G8A, the control unit 106 determines that the corresponding object has been placed at the storage position G8A.
  • the information processing apparatus 100-1 projects the guide information that follows the guide information being projected. For example, when the object 62A is arranged at the storage position G8A, the control unit 106 causes the projection unit 108 to project the storage position G8B of the next object, as shown at time t25 in FIG. 10. Then, as shown at time t26 in FIG. 10, the user places the object 62B at the storage position G8B.
  • the information processing system according to the present embodiment may be applied to storing an object in a refrigerator or a bookshelf.
  • the information processing apparatus 100-1 projects guide information indicating the position of the motion of the motion subject in accordance with the shape of the object existing in the projection area.
  • the control unit 106 causes the projection unit 108 to project, as guide information, an image indicating a fold of a cloth for wrapping an object, and when it is determined that the operation state has reached the state in which the cloth is folded at the fold indicated by the projected image, the control unit 106 causes the projection unit 108 to project an image indicating the next fold.
  • FIG. 11 is a diagram for explaining a processing example of the information processing system according to the sixth application example of this embodiment.
  • the information processing apparatus 100-1 determines whether a guide information projection instruction relating to how to wrap an object has been acquired based on a user's voice input. For example, as shown at time t27 in FIG. 11, the control unit 106 obtains text information related to the text "Tell me how to wrap" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that the projection instruction of "fold of the wrapping cloth" has been acquired from the user instruction "presentation of how to wrap an object" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information along the shape of the object.
  • the control unit 106 recognizes an object and cloth placed around the information processing apparatus 100-1.
  • the control unit 106 causes the projection unit 108 to project an image G9A (hereinafter also referred to as a fold line G9A) indicating the fold of the cloth surrounding the object.
  • the information processing apparatus 100-1 sets, as a target state, a state in which an operation of folding the user's cloth for the projected guide information is performed.
  • the control unit 106 sets, as a target state, a state in which the cloth is folded at a fold line that matches the projected fold line G9A and the object is covered with the cloth.
  • the information processing apparatus 100-1 determines whether the user's operation state has reached the set target state. For example, the control unit 106 determines whether the cloth is folded along the projected fold G9A. As shown at time t29 in FIG. 11, when the cloth covering the object is folded along the fold G9A by the user, the control unit 106 determines that the cloth covering the object has been folded at the fold G9A.
  • the information processing apparatus 100-1 projects the guide information that follows the guide information being projected. For example, when the cloth is folded along the fold G9A, the control unit 106 causes the projection unit 108 to project the next fold G9B, as shown at time t30 in FIG. 11. Then, as shown at time t31 in FIG. 11, the user wraps the object by folding the cloth at the fold G9B.
  • FIG. 12 is a diagram for explaining a processing example of the information processing system according to the seventh application example of the present embodiment.
  • the information processing apparatus 100-1 determines whether or not an instruction to project guide information related to video reproduction has been acquired based on a user's voice input. For example, as shown at time t32 in FIG. 12, the control unit 106 obtains text information related to the text "I want to watch an XYZ program" from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that the projection instruction of "video playback button" has been acquired from the user instruction "video playback of XYZ program" interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto the viewing area of the user. For example, the control unit 106 determines an area where the user can view the video. Then, the control unit 106 causes the projection unit 108 to project the video reproduction button G10 onto the determined viewing area as shown at time t33 in FIG.
  • the information processing apparatus 100-1 sets a state where the user's touch operation is performed on the projected guide information as a target state. Specifically, for example, the control unit 106 sets a state in which the video playback button G10 is touched by the user as a target state.
  • the information processing apparatus 100-1 determines whether the operation state has reached the set target state. For example, the control unit 106 determines whether the projected video playback button G10 has been touched by the user. When the user's hand and the video playback button G10 are overlapped as shown at time t34 in FIG. 12, the control unit 106 determines that the video playback button G10 has been touched by the user.
  • the information processing apparatus 100-1 executes processing corresponding to the projection instruction based on the operation related to the target state. For example, when the video playback button G10 is touched, the control unit 106 ends the projection of the video playback button G10, as shown at time t35 in FIG. 12. Next, the control unit 106 starts projecting the video of the XYZ program corresponding to the projection instruction "video playback of XYZ program". When the user touches the video of the XYZ program, the control unit 106 stops the playback of the XYZ program and projects the video playback button G10 again, as shown at time t36 in FIG. 12.
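A sketch of the touch-operation check in this example: the video playback button G10 is treated as touched when the recognized hand position overlaps the projected button region. The rectangle representation and names are assumptions.

```python
def point_in_rect(point, rect):
    """rect = (x, y, width, height) in the same coordinates as the recognized hand position."""
    x, y = point
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

def playback_button_touched(hand_position, button_rect):
    """True when the user's hand and the projected video playback button overlap."""
    return hand_position is not None and point_in_rect(hand_position, button_rect)
```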
  • As described above, according to the first embodiment of the present disclosure, the information processing apparatus 100-1 causes the guide information that guides the operation of the operation subject for which the operation subject requests guidance to be projected in accordance with a projection instruction for the guide information, and, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, performs processing based on that state. Conventionally, when the operation for interaction becomes complicated, it becomes difficult for the operation subject to perform the operation, and the interaction may fail. According to the present embodiment, however, by guiding the operation related to the interaction, it is possible to prevent the user from getting lost in the operation or performing an incorrect operation. Therefore, the interaction between the operation subject and the apparatus can be made smoother.
  • when the action subject is a person, the correct action can be intuitively understood because the action is guided. Therefore, the stress of the user in the interaction can be reduced.
  • the guide information is projected onto an object that can be visually recognized by a plurality of moving subjects, so that the plurality of moving subjects can share the guide information. Therefore, a plurality of operation subjects can cooperate to perform the operation, and the operation can be speeded up or made efficient.
  • the processing based on the above state includes changing the projection content. Therefore, the user can automatically change the projection content by operating according to the guide information. Therefore, it is possible to reduce the trouble of changing the projection contents.
  • the change of the projection content includes changing the projection target to guide information for guiding the next operation of the operation guided by the projected guide information.
  • if all of the guide information were projected at once, the projection destination would be filled with guide information, and visibility might be lowered.
  • in addition, the viewing user might be confused.
  • with the projection content changing process, only part of the guide information is projected at a time, so visibility can be maintained and user confusion can be suppressed.
  • since the next guide information is projected when the user's operation ends, the user can intuitively understand the operation order. Therefore, usability can be maintained even when there are a plurality of pieces of guide information.
  • the guide information includes only information for inducing a single action to the action subject. For this reason, the operation subject can easily grasp the guided operation. Therefore, it is possible to suppress a possibility that an operation different from the operation in which the operation subject is guided is erroneously performed.
  • the processing based on the state includes processing corresponding to the projection instruction. Therefore, the user can cause the information processing apparatus 100-1 to perform a desired process by operating according to the guide information. Therefore, it is possible to suppress the possibility of erroneous operation for executing the desired process.
  • the guide information includes information indicating a position where the operation subject operates. For this reason, the accuracy of the operation performed can be improved by guiding the position of the operation. Therefore, it is possible to suppress the processing based on the operation from being performed due to the displacement of the operation position, and it is possible to suppress discomfort such as user irritation. In other words, it is possible to cause the information processing apparatus 100-1 to smoothly execute a desired process.
  • the correct operation position (for example, the arrangement position) can be presented to the user. Therefore, the user can learn the position of the correct operation.
  • the guide information includes information indicating the order in which the operation subject operates. For this reason, the accuracy of the operation to be performed can be improved by inducing the order of the operations. Accordingly, it is possible to prevent the processing based on the operation from being performed due to an error in the operation order, and it is possible to suppress discomfort such as user irritation. In other words, it is possible to cause the information processing apparatus 100-1 to smoothly execute a desired process.
  • the correct operation order (for example, the writing order of characters) can be presented to the user. Therefore, the user can learn the correct operation order.
  • the guide information is projected according to the shape of the object existing in the projection area.
  • the operation includes an operation on the projected guide information. For this reason, the operation subject can trigger the process corresponding to the projection content change or the projection instruction by operating on the projected image. Therefore, the processing corresponding to the projection content or the projection instruction can be operated intuitively. In addition, because the processing is performed in response to the operation, the operation subject can concentrate on the desired operation.
  • the state is recognized based on image information relating to an image in which the operation result of the operation subject is reflected. For this reason, the information processing apparatus 100-1 can recognize the operation state without the user inputting the operation state to the information processing apparatus 100-1, which reduces the user's effort.
  • the recognition of the state is performed by an external device of the information processing apparatus 100-1, and the recognition result is provided from the external device. For this reason, processing performed in the information processing apparatus 100-1 can be reduced. Therefore, the configuration of the information processing apparatus 100-1 can be simplified, the processing load can be reduced, and the processing speed can be increased.
  • information for projecting the guide information is provided from a device external to the information processing device 100-1. Therefore, the storage capacity required for the information processing apparatus 100-1 can be reduced as compared with the case where guide information or material information is stored in the information processing apparatus 100-1. Therefore, the cost of the information processing apparatus 100-1 can be reduced.
  • the projection instruction is input by the voice of the operation subject. For this reason, a projection instruction can be issued even when the user cannot free his or her hands. Therefore, the opportunities to use the interaction can be increased and the convenience for the user can be improved.
  • as a modification of the present embodiment, the information processing apparatus 100-1 may guide an operation that generates sound, based on guide information.
  • the control unit 106 causes the projection unit 108 to project guide information for guiding a motion that generates sound, and executes processing based on the target state when the target state is recognized based on the sound information related to the sound emitted by the operation subject.
  • FIG. 13 is a diagram for explaining a processing example of the information processing system according to the modification of the present embodiment.
  • the information processing apparatus 100-1 determines whether or not an instruction to project guide information related to a score has been acquired based on the user's voice input. For example, as shown at time t37 in FIG. 13, the control unit 106 obtains text information related to the text “Show the score” from the server 200-1 using the audio information input from the audio input unit 102. Then, the control unit 106 determines that an instruction to project the score has been acquired from the user's instruction “presentation of the score” interpreted based on the text information.
  • the information processing apparatus 100-1 projects the guide information onto the viewing area of the user.
  • the control unit 106 determines the viewing area, and causes the projection unit 108 to project the score G11A to the determined viewing area as shown at time t38 in FIG.
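  • A rough sketch of this flow (speech sent for recognition, the returned text checked for a projection instruction, and the score then projected into the user's viewing area) is shown below; the recognizer stub, the keyword rule, and the viewing-area estimate are assumptions made for illustration, not the actual implementation of the information processing apparatus 100-1.

```python
from typing import Optional, Tuple

def recognize_speech(audio_bytes: bytes) -> str:
    """Stand-in for the server-side speech recognition (server 200-1)."""
    # In the real system the audio would be sent to an external recognizer.
    return "Show the score"

def parse_projection_instruction(text: str) -> Optional[str]:
    """Very simple keyword rule: map recognized text to a projection instruction."""
    lowered = text.lower()
    if "score" in lowered and ("show" in lowered or "display" in lowered):
        return "project_score"
    return None

def estimate_viewing_area() -> Tuple[int, int, int, int]:
    """Placeholder for determining the user's viewing area (x, y, width, height)."""
    return (120, 80, 640, 480)

def handle_voice_input(audio_bytes: bytes) -> None:
    text = recognize_speech(audio_bytes)
    instruction = parse_projection_instruction(text)
    if instruction == "project_score":
        area = estimate_viewing_area()
        print(f"Projecting score G11A into viewing area {area}")

if __name__ == "__main__":
    handle_voice_input(b"")  # audio payload omitted
```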
  • the information processing apparatus 100-1 sets, as a target state, a state in which a sound corresponding to the projected guide information related to the score is emitted. Specifically, for example, the control unit 106 sets, as the target state, a state in which the melody indicated by the score G11A is played by the user.
  • the information processing apparatus 100-1 determines whether the operation state has reached the set target state. For example, the control unit 106 obtains from the server 200-1 the result of the voice recognition process using the sound information collected by the voice input unit 102. When the result of the speech recognition process indicates that the melody indicated by the score G11A has been recognized, as indicated at time t39 in FIG. 13, the control unit 106 determines that the state in which the melody indicated by the score G11A has been played has been reached. Note that the information processing apparatus 100-1 may present the recognized sound to the user. For example, as shown at time t39 in FIG. 13, the control unit 106 may emphasize (for example, underline) the recognized melody portion of the projected score G11A.
  • the information processing apparatus 100-1 then projects the next guide information. For example, when the melody indicated by the score G11A projected as shown at time t40 in FIG. 13 has been played, the control unit 106 causes the projection unit 108 to project the score G11B, which indicates the continuation of the melody of the score G11A, as shown at time t41 in FIG. 13.
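  • One way to realize this page-by-page advance is sketched below: the recognized note sequence is checked against the notes of the projected score page, and when the page has been covered in order, the next page is projected; the note names, the subsequence check, and the function names are assumptions for illustration only.

```python
def played_matches_page(recognized_notes, page_notes):
    """True when the recognized melody covers the notes of the projected score page, in order."""
    it = iter(recognized_notes)
    return all(note in it for note in page_notes)  # ordered subsequence check

def advance_score(recognized_notes, pages, current_page):
    """Return the index of the page to project next (highlighting the completed page)."""
    if played_matches_page(recognized_notes, pages[current_page]):
        print(f"Page {current_page} completed - underlining recognized melody, projecting next page")
        return min(current_page + 1, len(pages) - 1)
    return current_page

if __name__ == "__main__":
    pages = [["C4", "E4", "G4"], ["A4", "G4", "E4"]]        # score G11A, then its continuation G11B
    recognized = ["C4", "E4", "G4"]                          # notes recognized from the microphone
    print(advance_score(recognized, pages, current_page=0))  # -> 1
```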
  • the operation guided by the guide information includes an operation of generating sound.
  • for this reason, an operation subject, in particular a human, can cause the information processing apparatus 100-1 to perform processing corresponding to a desired operation (for example, the successive projection of guide information) merely by performing the desired operation, without directly operating on the guide information or on the information processing apparatus 100-1. Therefore, the user can concentrate on the desired action for generating sound.
  • the example in which the sound-generating operation is a musical instrument performance has been described, but the sound-generating operation may be another operation such as singing or tap dancing.
  • the operation state is recognized based on sound information related to the sound emitted by the operation subject. For this reason, the information processing apparatus 100-1 can recognize the operation state without the user consciously inputting the operation state to the information processing apparatus 100-1. Therefore, it is possible to reduce the labor of the user.
  • the above-described recognition processing based on the image information may be used in combination. In this case, the accuracy and precision of the recognized operation state can be improved.
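  • As an illustration only, a simple way to combine the two modalities is sketched below; the fusion rule (boost confidence on agreement, otherwise fall back to the more confident modality) is an assumption and is not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Recognition:
    state: str         # e.g. "target_reached", "in_progress"
    confidence: float   # 0.0 - 1.0

def fuse(image_result: Recognition, sound_result: Recognition) -> Recognition:
    """Combine image-based and sound-based recognition of the operation state."""
    if image_result.state == sound_result.state:
        # Agreement between modalities: keep the state and boost the confidence.
        confidence = min(1.0, 0.5 * (image_result.confidence + sound_result.confidence) + 0.2)
        return Recognition(image_result.state, confidence)
    # Disagreement: fall back to the more confident modality.
    return max(image_result, sound_result, key=lambda r: r.confidence)

if __name__ == "__main__":
    print(fuse(Recognition("target_reached", 0.7), Recognition("target_reached", 0.6)))
    print(fuse(Recognition("target_reached", 0.4), Recognition("in_progress", 0.8)))
```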
  • Second Embodiment (Example of Error Processing)
  • the first embodiment of the present disclosure has been described.
  • a second embodiment of the present disclosure will be described.
  • in the present embodiment, an error is notified when the operation of the operation subject differs from the operation guided by the guide information.
  • the control unit 106 performs a process different from the process based on the target state when the action of the action subject is different from the action guided by the guide information. Specifically, the control unit 106 performs error notification as a process different from the process based on the target state.
  • the control unit 106 performs visual notification as the error notification. For example, when a user operation different from the operation guided by the guide information is recognized, the control unit 106 additionally causes the projection unit 108 to project an image indicating that the user operation differs from the guided operation. In addition to projecting an image, a visual effect such as a change in color or brightness, or blinking, may be applied to the already projected guide information or background.
  • the control unit 106 may perform an audible notification as the error notification. For example, when a user operation different from the operation guided by the guide information is recognized, the control unit 106 causes the audio output unit 110 to output a sound indicating that the user operation differs from the guided operation.
  • the notification content may only indicate that the user's operation is different from the operation guided by the guide information, and may further include a notification of a correction method for the operation guided by the guide information.
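  • The dispatch of such a notification might look like the following sketch, where the projector and speaker objects are placeholders for the projection unit 108 and the audio output unit 110, and the message wording and function names are assumptions for illustration.

```python
def notify_error(projector, speaker, guided_action: str, observed_action: str,
                 visual: bool = True, audible: bool = True) -> None:
    """Notify that the observed operation differs from the operation guided by the guide information."""
    message = f"The operation '{observed_action}' differs from the guided operation '{guided_action}'."
    hint = f"Please try '{guided_action}' as shown by the projected guide."
    if visual:
        projector.project_text(message + " " + hint)  # could also blink or recolor the projected guide
    if audible:
        speaker.say(message)

class _Projector:
    def project_text(self, text): print("[projected]", text)

class _Speaker:
    def say(self, text): print("[voice]", text)

if __name__ == "__main__":
    notify_error(_Projector(), _Speaker(),
                 guided_action="place at G7A", observed_action="place at another position")
```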
  • FIG. 14 is a flowchart conceptually showing an example of processing of the information processing apparatus 100-2 according to the present embodiment. Note that description of processing that is substantially the same as processing according to the first embodiment will be omitted.
  • the information processing apparatus 100-2 determines whether the guide information projection instruction has been acquired (step S402). If it is determined that the guide information projection instruction has been acquired, the information processing apparatus 100-2 sets a target state according to the guide information (step S404).
  • the information processing apparatus 100-2 recognizes the operation state (step S406) and determines whether the recognized operation state is the target state (step S408). If it is determined that the recognized operation state is the target state, the information processing apparatus 100-2 executes processing based on the target state (step S410). Until the projection of the guide information is completed (step S412), the processes of steps S404 to S410 are repeated.
  • the information processing apparatus 100-2 determines whether the operation state is transitioning (step S414). Specifically, the control unit 106 determines whether or not the change in the user's action grasped from the image recognition result continues.
  • if the operation state is no longer transitioning, the information processing apparatus 100-2 executes an error notification process (step S416). Specifically, when it is determined that the change in the user's operation has stopped, the control unit 106 causes the audio output unit 110 to output a voice indicating that the user's operation differs from the operation guided by the guide information, or causes the projection unit 108 to project an indication to that effect.
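  • The flow of FIG. 14 could be organized roughly as in the following sketch, where the step comments parallel S404–S416; the data structures, the stop-detection rule (an unchanged state between two polls), and all names are assumptions for illustration rather than the disclosed implementation.

```python
import time

def run_guided_sequence(steps, recognize_state, project, notify_error, poll_interval=0.5):
    """Project guide steps one at a time; advance when the target state is reached,
    and raise an error notification when the user's motion stops in a wrong state."""
    for step in steps:
        project(step)                                   # project guide information (cf. S404)
        target = step["target_state"]
        previous = None
        while True:
            state = recognize_state()                   # recognize operation state (cf. S406)
            if state == target:                         # target state reached (cf. S408/S410)
                break
            if state == previous:                       # motion no longer transitioning (cf. S414)
                notify_error(step)                      # error notification (cf. S416)
            previous = state
            time.sleep(poll_interval)
    print("Projection of guide information finished")   # cf. S412

if __name__ == "__main__":
    frames = iter(["approaching", "wrong_position", "wrong_position", "placed_at_G7A"])
    run_guided_sequence(
        steps=[{"name": "place ingredient", "target_state": "placed_at_G7A"}],
        recognize_state=lambda: next(frames),
        project=lambda s: print("projecting:", s["name"]),
        notify_error=lambda s: print("error: operation differs from guided operation"),
        poll_interval=0.0,
    )
```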
  • the information processing apparatus 100-2 notifies an error using sound when the operation of the operation subject is different from the operation guided by the guide information.
  • the control unit 106 causes the projection unit 108 to project an image indicating the position where the ingredient is to be placed as guide information, and causes the audio output unit 110 to output an error notification when it is determined that the operation state has reached a state in which the ingredient is placed at a position different from the placement position indicated by the projected image.
  • FIG. 15 is a diagram for explaining a processing example of the information processing system according to the first application example of the present embodiment.
  • the information processing apparatus 100-2 determines whether an instruction to project guide information related to the arrangement of ingredients has been acquired based on the user's voice input (time t42 in FIG. 15), and projects the guide information related to the arrangement of ingredients (placement position G7A) onto the placement area (time t43 in FIG. 15).
  • the information processing apparatus 100-2 sets, as a target state, a state in which the user has arranged the ingredient on the guide information projected onto the placement area, and determines whether the user's operation state has reached the set target state.
  • the information processing apparatus 100-2 notifies the user of an error when it is determined that the user's operation state has not reached the set target state. For example, the control unit 106 determines whether an ingredient has been arranged at the projected placement position G7A. Here, as shown at time t44 in FIG. 15, when the ingredient 60A is arranged at a position different from the placement position G7A, the control unit 106 determines that the operation state differs from the target state. Note that the determination of the operation state is not finalized while the user is moving the ingredient 60A. Next, as shown at time t45 in FIG. 15, the control unit 106 causes the audio output unit 110 to output a voice with the content “the position is different”. The user who has received this voice notification notices that the arrangement of the ingredient 60A is wrong, and can rearrange the ingredient 60A at the projected placement position G7A as shown at time t46 in FIG. 15.
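  • A minimal sketch of this placement judgement is shown below, assuming normalized projection coordinates and a distance tolerance; deferring the judgement while the ingredient is still moving corresponds to the behaviour described for time t44, and the tolerance value and function name are assumptions.

```python
import math

def placement_state(ingredient_pos, target_pos, is_moving, tolerance=0.05):
    """Judge the placement of an ingredient against the projected placement position.

    The judgement is deferred while the ingredient is still being moved."""
    if is_moving:
        return "undetermined"
    distance = math.dist(ingredient_pos, target_pos)
    return "target_reached" if distance <= tolerance else "misplaced"

if __name__ == "__main__":
    g7a = (0.30, 0.55)                                            # projected placement position G7A
    print(placement_state((0.62, 0.40), g7a, is_moving=True))     # -> undetermined (still moving)
    print(placement_state((0.62, 0.40), g7a, is_moving=False))    # -> misplaced -> voice "the position is different"
    print(placement_state((0.31, 0.56), g7a, is_moving=False))    # -> target_reached
```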
  • the information processing apparatus 100-2 changes the projected guide information instead of or together with the error notification when the operation of the operation subject is different from the operation guided by the guide information.
  • the control unit 106 records the action of the action subject while the guide information is projected. Then, when it is determined that the operation state has reached the state where the ingredients are arranged at a position different from the arrangement position indicated by the projected image, the control unit 106 determines whether or not the guide information can be changed.
  • FIG. 16 is a diagram for explaining a processing example of the information processing system according to the second application example of the present embodiment.
  • the information processing apparatus 100-2 determines whether an instruction to project guide information relating to the arrangement of ingredients has been acquired based on the user's voice input (time t47 in FIG. 16), and projects the guide information relating to the arrangement of ingredients (placement position G7A) onto the placement area (time t48 in FIG. 16).
  • the information processing apparatus 100-2 sets, as a target state, a state in which the user has arranged the ingredient on the guide information projected onto the placement area, and determines whether the user's operation state has reached the set target state.
  • when it is determined that the user's operation state has not reached the set target state, the information processing apparatus 100-2 notifies the user of an error and confirms whether the guide information may be changed. For example, as shown at time t49 in FIG. 16, when the ingredient 60A is placed at a position different from the placement position G7A, the control unit 106 determines that the operation state differs from the target state. Then, as shown at time t49 in FIG. 16, the control unit 106 causes the audio output unit 110 to output a voice with the content “the position is different”. Further, the control unit 106 causes the audio output unit 110 to output a voice asking the user whether the guide information may be changed: “Do you want to change the arrangement?”
  • when the user agrees, the information processing apparatus 100-2 projects guide information corrected based on the past operation history. For example, when there is an affirmative response to the question at time t49 in FIG. 16 as to whether the guide information may be changed, the control unit 106 changes the placement position G7A to the position where the ingredient 60A has already been placed, based on the user's past placement positions. Further, the control unit 106 also changes the arrangement of the placement position G7B that follows the placement position G7A. Then, as shown at time t50 in FIG. 16, the control unit 106 causes the projection unit 108 to project the changed placement position G7B. As shown at time t51 in FIG. 16, the user places the ingredient 60B at the changed placement position G7B.
  • in the above example the guide information is corrected based on the past operation history.
  • however, the guide information may instead be corrected based on the current operation that differs from the operation guided by the guide information. In this case, by respecting the current operation, the processing of the information processing apparatus 100-2 can give the user an impression of flexibility.
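  • A possible shape for this correction step is sketched below: the first placement position is shifted toward where the user has actually placed items, the same offset is applied to the following positions (G7B and later), and nothing is changed unless the user answers the confirmation question affirmatively; the averaging rule and all names are assumptions for illustration.

```python
from statistics import mean

def corrected_position(guided_pos, past_positions):
    """Shift a guided placement position toward where the user has actually placed items before."""
    if not past_positions:
        return guided_pos
    return (mean(p[0] for p in past_positions), mean(p[1] for p in past_positions))

def maybe_change_guide(guide_positions, past_positions, ask_user):
    """Offer to change the remaining guide information, as in the exchange at time t49 in FIG. 16."""
    if not ask_user("The position is different. Do you want to change the arrangement?"):
        return guide_positions
    new_first = corrected_position(guide_positions[0], past_positions)
    dx = new_first[0] - guide_positions[0][0]
    dy = new_first[1] - guide_positions[0][1]
    # Shift the current and subsequent placement positions (G7A, G7B, ...) by the same offset.
    return [(x + dx, y + dy) for (x, y) in guide_positions]

if __name__ == "__main__":
    original = [(0.30, 0.55), (0.45, 0.55)]    # placement positions G7A and G7B
    history = [(0.60, 0.40), (0.62, 0.41)]     # where the user actually placed ingredients in the past
    print(maybe_change_guide(original, history, ask_user=lambda q: True))
```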
  • the information processing apparatus 100-2 performs a process different from the process based on the target state when the operation of the operation subject is different from the operation guided by the guide information. For this reason, by performing processing according to the fact that the target state has not been reached, it is possible to urge the operating subject to move toward the target state. Accordingly, it is possible to suppress the target state from being unreached and to shorten the time taken to reach the target state.
  • processing different from the processing based on the target state includes error notification. For this reason, by notifying the operating subject that the target state has not been reached, it is possible to directly prompt the operating subject to move toward the target state. Accordingly, it is possible to more surely suppress the target state from being unreached and reduce the time taken to reach the target state.
  • the processing different from the processing based on the target state includes a change of the projected guide information. For this reason, the projected guide information can be changed to guide information adapted to the subject. Therefore, usability can be improved.
  • the change of the guide information includes a change of the projection target to the guide information corrected based on the action performed by the action subject in the past. Therefore, the guide information can be changed to guide information that matches the nature of the operation subject. Therefore, the guide information can be more reliably adapted to the operation subject.
  • the guide information is changed based on whether the guide information can be changed.
  • if the guide information were changed automatically, the user might feel uncomfortable or confused when the operation subject is the user.
  • with the above confirmation before the guide information change process, the change of the guide information can be confirmed with the user. Therefore, the possibility of making the user uncomfortable or confused can be suppressed.
  • FIG. 17 is an explanatory diagram illustrating a hardware configuration of the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the information processing apparatus 100 includes a processor 132, a memory 134, a bridge 136, a bus 138, an observation device 140, an input device 142, an output device 144, a connection port 146, and a communication device 148.
  • the processor 132 functions as an arithmetic processing unit, and realizes the function of the control unit 106 in the information processing apparatus 100 in cooperation with various programs.
  • the processor 132 operates various logical functions of the information processing apparatus 100 by executing a program stored in the memory 134 or another storage medium using the control circuit.
  • the processor 132 may be a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or a SoC (System-on-a-Chip).
  • the memory 134 stores a program used by the processor 132 or an operation parameter.
  • the memory 134 includes a RAM (Random Access Memory), and temporarily stores a program used in the execution of the processor 132 or a parameter that changes as appropriate in the execution.
  • the memory 134 includes a ROM (Read Only Memory), and the function of the storage unit is realized by the RAM and the ROM. Note that an external storage device may be used as a part of the memory 134 via the connection port 146 or the communication device 148.
  • processor 132 and the memory 134 are connected to each other by an internal bus including a CPU bus or the like.
  • the bridge 136 connects the buses. Specifically, the bridge 136 connects the internal bus to which the processor 132 and the memory 134 are connected with the bus 138 to which the observation device 140, the input device 142, the output device 144, the connection port 146, and the communication device 148 are connected.
  • the observation device 140 observes the information processing apparatus 100 and its surroundings, and realizes the functions of the voice input unit 102 and the imaging unit 104.
  • the observation device 140 includes an audio sensor and a camera sensor.
  • a microphone may be included as the audio sensor
  • an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) may be included as the camera sensor.
  • the observation device 140 may further include another sensor such as an acceleration sensor, an angular velocity sensor, or a GPS sensor.
  • the input device 142 is used for a user to operate the information processing apparatus 100 or input information to the information processing apparatus 100.
  • the input device 142 includes input means for the user to input information and an input control circuit that generates an input signal based on the user's input and outputs the input signal to the processor 132.
  • the input means may be a mouse, a keyboard, a touch panel, a switch or a lever.
  • a user of the information processing apparatus 100 can input various data or instruct a processing operation to the information processing apparatus 100 by operating the input device 142.
  • the output device 144 is used to notify the user of information, and realizes the functions of the projection unit 108 and the audio output unit 110.
  • the output device 144 may include a projection device and an audio output device.
  • the output device 144 includes a projector and a device such as a speaker or headphones.
  • the output device 144 may further include a display device such as a liquid crystal display (LCD) device or an organic light emitting diode (OLED) device. Further, the output device 144 may be a module that performs output to various devices.
  • connection port 146 is a port for directly connecting a device to the information processing apparatus 100.
  • the connection port 146 may be a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 146 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • the communication device 148 mediates communication between the information processing device 100 and the external device, and realizes the function of the communication unit 112. Specifically, the communication device 148 performs communication according to a wired communication method.
  • the communication device 148 performs wired communication such as signal line communication, wired WAN (Wide Area Network) or wired LAN (Local Area Network) communication.
  • the communication device 148 may perform communication according to a wireless communication method.
  • the communication device 148 may execute wireless communication according to an arbitrary wireless communication method, for example near field communication such as Bluetooth (registered trademark), NFC (Near Field Communication), wireless USB, or TransferJet (registered trademark); a cellular communication method such as WCDMA (registered trademark) (Wideband Code Division Multiple Access), WiMAX (registered trademark), LTE (Long Term Evolution), or LTE-A; or a wireless LAN method such as Wi-Fi (registered trademark).
  • the information processing apparatus 100 may not have a part of the configuration described with reference to FIG. 17 or may have an additional configuration.
  • a one-chip information processing module in which all or part of the configuration described with reference to FIG. 17 is integrated may be provided.
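  • As a rough illustration of how the logical units described above could be composed on top of these hardware modules, the following sketch wires stand-ins for the observation device, output device, and communication device into a single controller object; the class names and the wiring are assumptions for illustration only.

```python
class ObservationDevice:
    """Stands in for the microphone and camera (voice input unit 102 / imaging unit 104)."""
    def capture_audio(self): return b""
    def capture_image(self): return b""

class OutputDevice:
    """Stands in for the projector and speaker (projection unit 108 / audio output unit 110)."""
    def project(self, content): print("[project]", content)
    def play(self, sound): print("[sound]", sound)

class CommunicationDevice:
    """Stands in for the communication unit 112 that talks to the external server."""
    def request_recognition(self, payload): return {"text": "", "state": "in_progress"}

class Controller:
    """Plays the role of the control unit 106 running on the processor 132."""
    def __init__(self, observation, output, communication):
        self.observation = observation
        self.output = output
        self.communication = communication

    def step(self):
        audio = self.observation.capture_audio()
        result = self.communication.request_recognition(audio)
        if result.get("state") == "target_reached":
            self.output.project("next guide information")

if __name__ == "__main__":
    Controller(ObservationDevice(), OutputDevice(), CommunicationDevice()).step()
```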
  • the process according to the fact that the target state has not been reached is performed, whereby the operation subject can be prompted to move toward the target state. Accordingly, it is possible to suppress the target state from being unreached and to shorten the time taken to reach the target state.
  • the information processing apparatus 100 is used by an individual such as a consumer, but the present technology is not limited to such an example.
  • the information processing apparatus 100 may be used in business such as an office or a factory.
  • the information processing apparatus 100 projects guide information indicating the installation location of parts.
  • the example in which the position of the motion is guided by the guide information has been described.
  • the timing of the motion may be guided.
  • guide information that guides the timing of adding seasonings in cooking may be projected.
  • (1) An information processing apparatus including: a projection control unit that causes a projection unit to project guide information for guiding an operation of an operation subject for which the operation subject requests guidance, in accordance with a projection instruction of the guide information; and a processing control unit that, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, performs processing based on the state.
  • (2) The information processing apparatus according to (1), in which the processing based on the state includes a change of projection content.
  • (3) The change of the projection content includes a change of the projection target to guide information that guides the operation following the operation guided by the projected guide information.
  • (4) The guide information includes only information for guiding a single operation to the operation subject.
  • (5) The information processing apparatus according to any one of (1) to (4), in which the guide information includes information indicating a position at which the operation subject operates.
  • (6) The information processing apparatus according to any one of (1) to (5), in which the guide information includes information indicating an order in which the operation subject operates.
  • (7) The information processing apparatus according to any one of (1) to (6), in which the operation includes an operation on the projected guide information.
  • (8) The information processing apparatus according to any one of (1) to (7), in which the operation includes an operation of generating sound.
  • (9) The information processing apparatus according to any one of (1) to (8), in which the state is recognized based on image information relating to an image in which the operation result of the operation subject is reflected.
  • (10) The information processing apparatus according to any one of (1) to (9), in which the state is recognized based on sound information relating to a sound emitted by the operation subject.
  • (11) The recognition of the state is performed by a device external to the information processing apparatus, and a recognition result is provided from the external device.
  • (12) Information for projecting the guide information is provided from a device external to the information processing apparatus.
  • (13) The projection instruction is input by the voice of the operation subject.
  • (14) The information processing apparatus according to any one of (1) to (13), in which the processing control unit performs processing different from the processing based on the state when the operation of the operation subject differs from the operation guided by the guide information.
  • (15) The processing different from the processing based on the state includes an error notification.
  • (16) The processing different from the processing based on the state includes a change of the projected guide information.
  • (17) The change of the guide information includes a change to guide information corrected based on an operation performed by the operation subject in the past.
  • (18) The information processing apparatus according to (16) or (17), in which the change of the guide information is performed based on whether or not the guide information may be changed.
  • (19) An information processing method including: causing a projection unit to project guide information for guiding an operation of an operation subject for which the operation subject requests guidance, in accordance with a projection instruction of the guide information; and performing, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, processing based on the state.
  • (20) A program for causing a computer to realize: a projection control function of causing a projection unit to project guide information for guiding an operation of an operation subject for which the operation subject requests guidance, in accordance with a projection instruction of the guide information; and a processing control function of performing, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, processing based on the state.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Manipulator (AREA)

Abstract

The problem to be solved by the invention is to provide a mechanism that makes the interaction between an operation subject and a device easier. The solution of the invention is an information processing apparatus comprising: a projection control unit that causes a projection unit to project guide information for guiding an operation of an operation subject for which the operation subject requests guidance, in accordance with a projection instruction for the guide information; and a processing control unit that, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, performs processing based on that state. The invention also relates to: an information processing method consisting of causing a projection unit to project guide information for guiding an operation of an operation subject for which the operation subject requests guidance, in accordance with a projection instruction for the guide information, and performing, when a state in which the operation guided by the guide information has been performed by the operation subject is recognized, processing based on that state; and a program.
PCT/JP2016/085032 2016-02-17 2016-11-25 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2017141518A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016028061A JP2017146782A (ja) 2016-02-17 2016-02-17 情報処理装置、情報処理方法およびプログラム
JP2016-028061 2016-12-26

Publications (1)

Publication Number Publication Date
WO2017141518A1 true WO2017141518A1 (fr) 2017-08-24

Family

ID=59624959

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/085032 WO2017141518A1 (fr) 2016-02-17 2016-11-25 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (2)

Country Link
JP (1) JP2017146782A (fr)
WO (1) WO2017141518A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6488039B1 (ja) * 2018-03-15 2019-03-20 株式会社コナミデジタルエンタテインメント ゲーム進行情報生成システム及びそのコンピュータプログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002107169A (ja) * 2000-10-03 2002-04-10 Hitachi Ltd 通信型ナビゲーション装置、および情報センター
JP2009089068A (ja) * 2007-09-28 2009-04-23 Victor Co Of Japan Ltd 電子機器の制御装置、制御方法及び制御プログラム
JP2010073192A (ja) * 2008-08-20 2010-04-02 Universal Entertainment Corp 会話シナリオ編集装置、ユーザ端末装置、並びに電話取り次ぎシステム
JP2015095002A (ja) * 2013-11-08 2015-05-18 株式会社ソニー・コンピュータエンタテインメント 表示制御装置、表示制御方法、プログラム及び情報記憶媒体
JP2015190988A (ja) * 2014-03-27 2015-11-02 ファインテック株式会社 光学エンジンおよびその製造方法、ならびにプロジェクタ


Also Published As

Publication number Publication date
JP2017146782A (ja) 2017-08-24

Similar Documents

Publication Publication Date Title
US11463611B2 (en) Interactive application adapted for use by multiple users via a distributed computer-based system
US10528311B2 (en) Display device
JP6653526B2 (ja) 測定システムおよびユーザインタフェース装置
JP6722786B1 (ja) 空間情報管理装置
BR102012002995B1 (pt) Dispositivo de entrada, dispositivo de processamento de informação, método de aquisição de valor de entrada, e, meio de gravação legível por computador não transitório
WO2017130486A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP2006285859A (ja) 情報処理方法及び装置
WO2016152200A1 (fr) Système de traitement d'informations et procédé de traitement d'informations
JP2022153509A (ja) 測定支援システム
US20120313968A1 (en) Image display system, information processing apparatus, display device, and image display method
WO2017141518A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
CN106937156A (zh) 一种实现多资源同步播放的方法及装置和媒体播放器
JP5867743B2 (ja) 情報処理装置及び情報処理装置用プログラム
JP5981617B1 (ja) ユーザ・インタフェース画像表示のためのコンピュータ・プログラムおよびコンピュータ実装方法
WO2017208628A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP5731461B2 (ja) ゲーム装置及びプログラム
JP6564484B1 (ja) 同室感コミュニケーションシステム
JP5213913B2 (ja) プログラム及び画像生成システム
JP2010246809A (ja) ゲーム装置、ゲームシステム、ゲーム装置の制御方法、ならびに、プログラム
Pham The challenge of hand gesture interaction in the Virtual Reality Environment: evaluation of in-air hand gesture using the Leap Motion Controller
JP6837109B2 (ja) 制御システム
CN109977866B (zh) 内容翻译方法及装置、计算机系统及计算机可读存储介质
JP5687662B2 (ja) ゲーム装置、ゲーム装置の制御方法、ゲームシステム、ゲームシステムの制御方法、及びプログラム
JP2015133637A (ja) 情報処理機器および操作システム
Dinh A new interaction framework for human and robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16890655

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16890655

Country of ref document: EP

Kind code of ref document: A1