WO2018123497A1 - Information processing apparatus, information processing method, and computer program
- Publication number
- WO2018123497A1 (PCT/JP2017/043930)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- information processing
- sensing
- request
- processing system
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a computer program.
- A technique for recognizing an object placed on a table and projecting an application related to the recognized object is disclosed in Patent Document 1, for example.
- According to the present disclosure, there is provided an information processing apparatus comprising: a detection unit that performs sensing of an object existing in a sensing area; and a processing unit that, in response to reception of a request transmitted from an external device via a network and specifying an identifier associated with the object, causes the detection unit to sense the object corresponding to the identifier and transmits a response to the request to the external device based on a result of the sensing.
- According to the present disclosure, there is also provided an information processing method including: a processor sensing an object existing in a sensing area; and, in response to reception of a request transmitted from an external device via a network and specifying an identifier associated with the object, performing sensing of the object corresponding to the identifier and transmitting a response to the request to the external device based on the sensing result.
- Further, according to the present disclosure, there is provided a computer program causing a computer to sense an object existing in a sensing area and, in response to reception of a request transmitted from an external device via a network and specifying an identifier associated with the object, to perform sensing of the object corresponding to the identifier and to transmit a response to the request to the external device based on the sensing result.
- FIG. 5 is a flowchart illustrating an example of operations of the information processing system 100 and the PC 800.
- FIG. 3 is an explanatory diagram illustrating sensing of an object by the information processing system 100.
- FIG. 6 is an explanatory diagram illustrating an example of information projection by the information processing system 100.
- A flowchart illustrating an example of operations of the information processing system 100 and the PC 800, and an explanatory diagram illustrating an example of an object sensed in that operation example.
- An explanatory diagram illustrating a state in which a predetermined pattern is projected around an object sensed by the information processing system 100, a flowchart illustrating an operation example of the information processing system 100 according to the present embodiment, explanatory diagrams illustrating examples of patterns projected by the information processing system 100, and an explanatory diagram illustrating a hardware configuration example.
- Embodiment of the present disclosure [1.1. System configuration example] First, a configuration example of the information processing system according to the embodiment of the present disclosure will be described.
- FIG. 1 is an explanatory diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- In this specification, a system may mean a configuration for executing a predetermined process; the system as a whole can be regarded as a single device, or the system can be regarded as being configured by a plurality of devices.
- The information processing system according to the present embodiment shown in FIG. 1 only needs to be configured to execute predetermined processing as a whole, and which part of the information processing system is regarded as a single device is arbitrary.
- an information processing system 100a includes an input unit 110a and an output unit 130a.
- the output unit 130a visually notifies the user of the information by displaying various types of information on the table 140a.
- a projector is used as the output unit 130a.
- the output unit 130a is disposed above the table 140a, for example, spaced from the table 140a by a predetermined distance while being suspended from the ceiling, and projects information on the top surface of the table 140a.
- a method for displaying information on the top surface of the table 140a from above is also referred to as a “projection type”.
- the entire area where information is displayed by the output unit 130a is also referred to as a display screen.
- the output unit 130a displays information presented to the user as the application is executed by the information processing system 100a on the display screen.
- the displayed information is, for example, an operation screen of each application.
- each display area in which the operation screen of such an application is displayed on the display screen is also referred to as a window.
- The output unit 130a also displays, on the display screen, so-called GUI (Graphical User Interface) components (widgets) that accept various user operations such as selection and input, for example buttons, sliders, check boxes, text boxes, and keyboards.
- Although a window can be regarded as one of the GUI components, in this specification the window is, for convenience, not included in the GUI components in order to distinguish it from the other GUI components, and display elements other than windows are referred to as GUI components.
- the output unit 130a may include a lighting device.
- When the output unit 130a includes a lighting device, the information processing system 100a may control the state of the lighting device, such as turning it on or off, based on the content of information input by the input unit 110a and/or the content of information displayed by the output unit 130a.
- the output unit 130a may include a speaker and may output various kinds of information as sound.
- the number of speakers may be one or plural.
- the information processing system 100a may limit the speakers that output sound or adjust the direction in which sound is output.
- the input unit 110a is a device that inputs operation details of a user who uses the information processing system 100a.
- the input unit 110a includes a sensor and the like, and is provided above the table 140a, for example, in a state suspended from the ceiling. As described above, the input unit 110a is provided apart from the table 140a on which information is displayed.
- the input unit 110a may be configured by an imaging device that can capture the top surface of the table 140a, that is, the display screen.
- As such an imaging device, a camera that images the table 140a with one lens, a stereo camera that can record depth information by imaging the table 140a with two lenses, or the like can be used.
- When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera can be used.
- When a camera that images the table 140a with one lens is used as the input unit 110a, the information processing system 100a can detect the position of a physical object present on the table 140a, such as a user's hand, by analyzing the image (captured image) captured by that camera.
- When a stereo camera is used as the input unit 110a, the information processing system 100a can acquire depth information of an object located on the table 140a, in addition to its position information, by analyzing the captured image from the stereo camera.
- the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction and separation of the hand from the table 140a based on the depth information.
- In the following description, the act of the user touching or bringing an operating body such as a hand into contact with or close to information on the display screen is also simply referred to as "contact".
- In the present embodiment, the position of the operating body, for example the user's hand, on the display screen (that is, the top surface of the table 140a) is detected, and various information is input based on the detected position of the operating body. That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a window or another GUI component is detected, an operation input for that window or GUI component is performed.
- Various operating members such as a stylus or a robot arm may also be used as the operating body.
- Further, the operating body is not limited to one that performs operation inputs on GUI components, and refers to anything that can move an object placed on the table 140a or the like.
- the input unit 110a may capture not only the top surface of the table 140a but also a user existing around the table 140a.
- the information processing system 100a can detect the position of the user around the table 140a based on the captured image.
- Further, the information processing system 100a may perform personal recognition of the user by extracting physical features that can identify the individual user, such as the size of the user's face or body, from the captured image.
- the present embodiment is not limited to such an example, and user operation input may be executed by other methods.
- the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by contact of the user's finger or the like with respect to the touch panel. Further, the user's operation input may be detected by a gesture with respect to the imaging device constituting the input unit 110a.
- the input unit 110a may include a voice input device such as a microphone that picks up sounds produced by the user and environmental sounds of the surrounding environment.
- a microphone array for collecting sound in a specific direction can be suitably used. Further, the microphone array can be configured such that the sound collection direction can be adjusted to an arbitrary direction.
- When a voice input device is used as the input unit 110a, an operation input may be performed using the collected voice. Further, the information processing system 100a may perform individual recognition based on the voice by analyzing the collected voice.
- the input unit 110a may be configured by a remote control device (so-called remote control).
- The remote controller may be one in which a predetermined instruction is input by operating a predetermined button arranged on the remote controller, or one in which the movement or posture of the remote controller is detected by a sensor such as an acceleration sensor or a gyro sensor mounted on the remote controller and a predetermined instruction is input by the user moving the remote controller.
- The information processing system 100a may also include other input devices (not shown), such as a mouse, keyboard, buttons, switches, and levers, as the input unit 110a, and user operations may be input through these input devices.
- the configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. Although not shown in FIG. 1, another device may be connected to the information processing system 100a.
- an illumination device for illuminating the table 140a may be connected to the information processing system 100a.
- the information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
- the configuration of the information processing system is not limited to that shown in FIG.
- The information processing system according to the present embodiment only needs to include an output unit that displays various types of information on the display screen and an input unit that can accept at least an operation input for the displayed information, and its specific configuration is not otherwise limited.
- Referring to FIG. 2 to FIG. 4, other configuration examples of the information processing system according to the present embodiment will be described. FIGS. 2 to 4 are diagrams showing other configuration examples of the information processing system according to the present embodiment.
- an output unit 130a is provided below the table 140b.
- the output unit 130a is a projector, for example, and projects information from below toward the top plate of the table 140b.
- the top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b.
- A method in which the output unit 130a projects information from below the table 140b so that the information is displayed on the top surface of the table 140b is also referred to as a "rear projection type".
- the input unit 110b is provided on the top surface (front surface) of the table 140b.
- the input unit 110b is configured by, for example, a touch panel, and the operation input by the user is performed when the touch of the operating body on the display screen on the top surface of the table 140b is detected by the touch panel.
- the configuration of the input unit 110b is not limited to this example, and the input unit 110b may be provided below the table 140b and separated from the table 140b, similarly to the information processing system 100a shown in FIG.
- the input unit 110b is configured by an imaging device, for example, and can detect the position of the operation body on the top surface of the table 140b through a top plate formed of a transparent material.
- the input unit 110b includes a sensor capable of sensing the length and weight of the shadow, and can sense the weight and size of an object placed on the top surface of the table 140b.
- a touch panel type display is installed on a table with its display surface facing upward.
- the input unit 110c and the output unit 130c can be integrally configured as the touch panel display. That is, various types of information are displayed on the display screen of the display, and the operation input by the user is performed by detecting the touch of the operating body on the display screen of the display by the touch panel.
- an imaging device may be provided above the touch panel display as the input unit 110c. The position of the user around the table can be detected by the imaging device.
- the input unit 110c includes a sensor (in-cell sensor) that can sense the length and weight of a shadow, and can sense the weight and size of an object placed on the output unit 130c.
- the information processing system 100d shown in FIG. 4 includes a flat panel display.
- the output unit 130d is configured as a flat panel display, and various types of information are displayed on the display screen of the display.
- the input unit is configured by an input device such as a mouse, a keyboard, or a touch pad (not shown), and an operation input by the user is performed by operating a pointer in the display screen by these input devices.
- Further, the input unit of the information processing system 100d may include a touch panel provided on the flat panel display, and an operation input by the user may be performed via the touch panel.
- the input unit may include an imaging device that can capture an area facing the display surface of a flat panel display. The position of the user who observes the flat panel display can be detected by the imaging device.
- the information processing system according to the present embodiment can be realized by various configurations.
- the present embodiment will be described by taking as an example the configuration of the information processing system 100a in which the input unit 110a and the output unit 130a are provided above the table 140a shown in FIG.
- the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130.
- FIG. 5 is an explanatory diagram illustrating a functional configuration example of the information processing system 100 according to the embodiment of the present disclosure.
- the functional configuration of the information processing system 100 according to the embodiment of the present disclosure will be described with reference to FIG.
- As shown in FIG. 5, the information processing system 100 according to the embodiment of the present disclosure includes an input unit 110, a graphics display processing unit 120, an output unit 130, and an object information control unit 200.
- the input unit 110 is an input interface for inputting various information to the information processing system 100.
- a user can input various types of information to the information processing system 100 via the input unit 110.
- the input unit 110 is configured to receive at least a user operation input on the display screen generated by the output unit 130.
- the input unit 110 is configured by an imaging device including an image sensor, and captures a captured image including an operation body such as a user's hand on the display screen.
- The input unit 110 may also include a depth sensor capable of acquiring three-dimensional information, such as a stereo camera or a sensor based on a time-of-flight or structured-light method.
- Information acquired by the input unit 110 (for example, information about the captured image) is used in the processing described later.
- The input unit 110 may also be configured by other input devices such as a touch panel or a mouse.
- the graphics display processing unit 120 performs processing of graphics displayed on the output unit 130 based on a user operation input received by the input unit 110. For example, the graphics display processing unit 120 controls drawing of various contents such as a window for displaying an application, and provides an event such as a user operation input to each content. In the present embodiment, the graphics display processing unit 120 provides the object information control unit 200 with the contents of the user operation input received from the input unit 110. Then, the graphics display processing unit 120 receives the content of the processing performed by the object information control unit 200, and executes the graphics processing based on the content.
- the graphics display processing unit 120 includes, for example, an image processing circuit.
- the output unit 130 is an output interface for notifying the user of various types of information processed by the information processing system 100.
- the output unit 130 is configured by a display device such as a display or a projector, for example, and displays various types of information on the display screen under the control of an object information control unit 200 described later. As described above, the output unit 130 displays the window and the GUI component on the display screen.
- the windows and GUI parts displayed on the output unit 130 are also referred to as “display objects”. Note that the present embodiment is not limited to this example, and the output unit 130 may further include an audio output device such as a speaker, and may output various types of information as audio.
- the object information control unit 200 executes various processes based on the user operation input received by the input unit 110.
- The object information control unit 200 recognizes the attribute information associated with an object and the possession state associated with a finger based on the information about fingers and objects acquired by the input unit 110, and controls the object information based on this information.
- the object information control unit 200 includes, for example, a control circuit such as a CPU and a memory that stores a program for operating the control circuit.
- the object information control unit 200 includes a finger detection unit 210, an attribute processing unit 220, and an information accumulation processing unit 230.
- the finger detection unit 210 performs a process of detecting which object the user has operated using the positional relationship between the object and the finger included in the information acquired by the input unit 110. Specific processing contents by the finger detection unit 210 will be described in detail later.
- The attribute processing unit 220 performs processing related to the assignment of attributes to objects and fingers that exist in the area where the input unit 110 can sense (the sensing-capable area). For example, the attribute processing unit 220 performs processing for estimating which attribute an object holds using information on whether the object and a finger are close to or in contact with each other. Further, for example, the attribute processing unit 220 performs processing for estimating which attribute overlapping objects hold using information on whether or not objects overlap. Specific processing contents of the attribute processing unit 220 will be described in detail later.
- the information accumulation processing unit 230 performs processing for accumulating information on objects and fingers. Information accumulated by the information accumulation processing unit 230 is used for processing by the attribute processing unit 220. An example of information stored by the information storage processing unit 230 will be described later.
- 6A to 6C are explanatory diagrams showing finger and object detection techniques.
- FIG. 6A is an example of a detection technique based on recognition using contour information.
- This detection technique is a method for discriminating a hand and an object by extracting a contour from image information and three-dimensional information.
- Detection using contour information can be used with visible light or invisible light. When invisible light is used, there is an advantage that a display object does not interfere with a hand or an object even when projected from above.
- the detection using the contour information has a drawback that when the object is held in the hand, the object and the contour of the hand are integrated, so that the hand and the object cannot be distinguished.
- FIG. 6B is an example of detection technology based on recognition using segment information.
- This detection technique is a technique for discriminating a hand and an object by segmenting using pixel values of image information. For example, information acquired from a visible light camera can be segmented using colors.
- This detection technique has an advantage that if it is not covered with a hand, it is possible to distinguish the hand and the object even if the object is held by the hand.
- However, this detection technique is based on the premise that the color of the hand and the color of the object differ by at least a certain amount; if the difference is small, the hand and the object are recognized as a single unit.
- In addition, this detection technique has the drawback that, when projection is performed from above, the possibility of erroneous determination increases when the projection overlaps a hand or an object. It also has the drawback that, regardless of which of the overall configurations of the information processing system shown in FIGS. 1 to 4 is used, it cannot determine whether something is a real object or a display object when a color close to that of a hand or an object is displayed on the screen.
- FIG. 6C is an example of a detection technique based on recognition using a marker.
- This detection technique is a method of recognizing a finger and an object by attaching markers (stars in the figure) in advance.
- This detection technique has an advantage that even if the object marker is not covered by the hand, it can be recognized even if the object is held by the hand.
- this detection technique has a drawback that only a registered (marked) object can be handled because a marker needs to be added to a finger or an object in advance.
- FIG. 7A to 7C are explanatory diagrams showing examples of situations where fingers and objects are placed at the time of detection.
- FIG. 7A shows a state where an object is held in the hand but the object is still visible. In this state, depending on the detection technique and state, it is possible to distinguish between a finger and an object.
- FIG. 7B shows a state where the object is held in such a manner that the hand is completely covered. In this state, it is impossible to determine using any of the three detection techniques described above.
- FIG. 7C shows a state in which an object to be detected is covered with an object held by a hand. An object held by the hand can also be an operating body of the present disclosure. This state cannot be determined using any of the three detection techniques described above as in the case of FIG. 7B.
- As described above, there are environments and states in which a finger and an object cannot be discriminated. The present embodiment therefore tracks objects in such a way that the attributes associated with an object can be retained even in situations where they could not be maintained simply by tracking the object.
- Tables 1 to 3 below show information stored in the information storage processing unit 230.
- Table 1 is an object management table
- Table 2 is an ownership state management table
- Table 3 is an attribute management table.
- Each table is realized in a format such as a relational database.
- The object ID is the ID of an object returned by the sensor of the input unit 110. This ID is maintained as long as the sensor continues to recognize the finger or object. Since object IDs are assigned in units that the sensor can recognize, overlapping objects are given a single object ID for the whole stack. If the object is covered by a hand as described above, a new object ID is assigned even if it is the same object.
- Finger ID is a finger ID returned by the sensor of the input unit 110. One ID is assigned to each hand.
- the recognition ID is an ID that is recognized and tracked by the recognition technique according to the present embodiment. With this recognition technology, it is possible to maintain a unique recognition ID even if an object is covered with a hand. There are as many unique recognition IDs as the number of properties of the object.
- the object management table in Table 1 is a table for managing the state of an object.
- The recognition ID array holds, for each object ID, which recognition IDs the object group is recognized as. When objects overlap, this is expressed as an array of recognition IDs, and the head element indicates the recognition ID of the lowest object in the stack.
- the feature amount is a parameter acquired from a sensor such as the height and weight of the object.
- The processing flag is a flag indicating whether or not the object ID has been processed in the current frame. For example, information indicating completed processing, such as true or "1", is stored when processing has been completed, and information indicating unprocessed, such as false or "0", is stored when processing has not been performed.
- The ownership (possession) state management table in Table 2 is a table that manages which hand holds which objects.
- The recognition ID array represents, for each finger ID, which recognition IDs that finger holds. When a plurality of objects is held, this becomes an array, and the head element is the recognition ID of the object closest to the hand (the one placed down later).
- the attribute management table in Table 3 is a table for managing which identification ID holds which attribute.
- An attribute is a characteristic determined depending on the application.
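- The three tables described above map naturally onto simple data structures. The following is a minimal Python sketch of the object management table, the possession state management table, and the attribute management table; the class and field names are illustrative assumptions rather than names taken from this disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ObjectEntry:
    """One row of the object management table (Table 1)."""
    object_id: int              # object ID returned by the sensor for the (possibly stacked) object
    recognition_ids: List[int]  # recognition ID array; head element = lowest object in the stack
    feature: float              # feature amount from the sensor, e.g. height
    processed: bool = False     # processing flag for the current frame

@dataclass
class Tables:
    """Container for the three management tables."""
    objects: Dict[int, ObjectEntry] = field(default_factory=dict)   # object_id -> row (Table 1)
    possession: Dict[int, List[int]] = field(default_factory=dict)  # finger_id -> recognition ID array (Table 2)
    attributes: Dict[int, dict] = field(default_factory=dict)       # recognition_id -> application-defined attribute (Table 3)
```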
- FIG. 8 is a flowchart showing an operation example of the finger detection unit 210 according to the present embodiment, and shows an example of a process shown as “nearest finger detection process” in the subsequent figure.
- The finger detection unit 210 acquires the finger position and the object position, calculates the distance between them, and determines whether the distance is equal to or less than a predetermined threshold (step S102).
- FIG. 9 is a schematic diagram when calculating the distance of each finger from the object.
- Of the hands a, b, c, and d shown in FIG. 9, hands c and d are far from the object (object 1); because the distance between the object and these hands exceeds the threshold, they are not considered to be hands operating the object.
- The finger detection unit 210 can calculate the distance based on, for example, the center or center of gravity of the object and the center of the back of the hand or the position of the fingertip.
- the finger detection unit 210 may make a determination using the shape of the hand, the surface of the hand, or the like. If there is no significant difference in distance, the finger detection unit 210 may consider the hand that is moving compared to the previous frame as the hand that is operating the object.
- If it is determined in step S102 that the distance is not equal to or less than the threshold (step S102, No), the finger detection unit 210 moves on to processing of another finger.
- If the distance is equal to or less than the threshold (step S102, Yes), the finger detection unit 210 compares it with the distances detected so far and determines whether the newly calculated distance is the minimum (step S103).
- If the result of the determination in step S103 is that the distance is not the minimum (step S103, No), the finger detection unit 210 moves on to processing of another finger.
- If the distance is the minimum (step S103, Yes), the finger detection unit 210 determines that the finger is the nearest finger of the object and records (updates) the finger ID (step S104).
- the finger detection unit 210 performs the above processing repeatedly (step S105), thereby acquiring the finger ID closest to the object.
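- As a concrete illustration of steps S102 to S105, the nearest finger detection can be sketched as follows. The distance threshold, the use of 2-D center positions, and the function name are assumptions made for this sketch.

```python
import math
from typing import Dict, Optional, Tuple

Point = Tuple[float, float]

def nearest_finger(object_pos: Point,
                   finger_positions: Dict[int, Point],
                   threshold: float = 100.0) -> Optional[int]:
    """Return the finger ID closest to the object, or None if no finger is within the threshold."""
    best_id, best_dist = None, float("inf")
    for finger_id, finger_pos in finger_positions.items():
        dist = math.dist(object_pos, finger_pos)   # e.g. object center vs. fingertip (step S102)
        if dist > threshold:                       # step S102, No: too far to be operating the object
            continue
        if dist < best_dist:                       # step S103: keep only the minimum distance
            best_id, best_dist = finger_id, dist   # step S104: record (update) the finger ID
    return best_id                                 # step S105: finger ID nearest to the object
```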
- FIG. 10 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure.
- an operation example of the object information control unit 200 according to the embodiment of the present disclosure will be described with reference to FIG.
- the object information control unit 200 first acquires finger information and object information acquired by the input unit 110 through sensing (step S111). The subsequent processing is repeated every frame until an end condition such as a predetermined operation by the user is satisfied.
- the frame in this case refers to a frame for rendering graphics or a frame having a cycle for acquiring sensor information from the input unit 110.
- Upon acquiring the finger information and the object information, the object information control unit 200 clears all the processing flags in the object management table (step S112). By clearing all the processing flags in the object management table, all the objects recognized in the previous frame are put into an unprocessed state.
- the object information control unit 200 executes a process when an object exists (step S113). Details of processing when an object exists will be described later.
- the object information control unit 200 executes a process when there is no object (step S114). Details of the processing when no object exists will be described later.
- the object information control unit 200 executes a process when the finger disappears outside the sensing area (step S115). Details of processing when the finger disappears outside the sensing area will be described later.
- Then, the object information control unit 200 determines whether an end condition such as a predetermined operation by the user is satisfied (step S116). If the end condition is not satisfied (step S116, No), the object information control unit 200 returns to the process of step S111. If the end condition is satisfied (step S116, Yes), the object information control unit 200 ends the series of processes.
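- The per-frame flow of FIG. 10 (steps S111 to S116) can be summarized in code as follows. `input_unit.sense()`, `should_stop()`, and the three handler functions are placeholders assumed for this sketch; the object handlers are sketched in the sections below, while the finger-disappearance handler of step S115 is application-dependent and not elaborated here.

```python
def run(tables, input_unit):
    """Frame loop of the object information control unit (FIG. 10), illustrative sketch."""
    while True:
        finger_positions, sensed_objects = input_unit.sense()   # step S111: finger and object info
        for entry in tables.objects.values():                   # step S112: clear processing flags
            entry.processed = False
        process_existing_objects(tables, finger_positions, sensed_objects)  # step S113
        process_missing_objects(tables, finger_positions)                   # step S114
        process_vanished_fingers(tables, finger_positions)                  # step S115
        if input_unit.should_stop():                             # step S116: end condition
            break
```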
- FIG. 11 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is an explanatory diagram illustrating an operation example when the object in step S113 is present. Note that the following series of processing can be executed by the attribute processing unit 220, for example.
- The object information control unit 200 repeats the following series of processes for each object acquired from the input unit 110 (step S121). If all the objects have not yet been processed (step S121, No), the object information control unit 200 determines whether the object ID of the object to be processed is registered in the object management table (step S122). If the object ID of the object to be processed is not registered in the object management table (step S122, No), the object did not exist in the previous frame, so the object information control unit 200 performs object estimation processing based on proximity information, which is described later (step S123).
- The object estimation process based on proximity information is a process that determines whether the object was generated (placed) from a finger or not, and estimates a recognition ID accordingly.
- If the object ID of the object to be processed is registered in the object management table (step S122, Yes), the object information control unit 200 determines that the object also existed in the previous frame, and determines whether there is any change in a feature quantity of the object, such as its height or weight (step S124). In the determination of step S124, whether a difference in feature quantity is an error due to sensor performance or a clearly significant difference may be added as a criterion.
- If there is no change in the feature quantity (step S124, No), the object information control unit 200 proceeds to processing of the next object.
- If there is a change in the feature quantity (step S124, Yes), the object information control unit 200 performs object estimation processing based on superimposition information, which is described later (step S125).
- The object estimation process based on superimposition information is a process that determines the recognition ID by determining how objects have been stacked and increased, or how the number of stacked objects has decreased.
- When executing the object estimation process based on proximity information or the object estimation process based on superimposition information, the object information control unit 200 updates the object ID, the recognition ID array, and the feature amount in the object management table using the estimated recognition ID (step S126). Then, the object information control unit 200 sets the processing flag for the object to be processed (step S127).
- the object information control unit 200 repeats the series of processes described above for the object acquired from the input unit 110 (step S128).
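- In code, the object-present handling of FIG. 11 (steps S121 to S128) reduces to the dispatch below: new object IDs go to proximity-based estimation, and known object IDs whose feature amount changed go to superimposition-based estimation. Here `obj` is assumed to expose `object_id`, `feature`, and `position` from the sensor, and the feature tolerance `tol` is an assumed parameter.

```python
def process_existing_objects(tables, finger_positions, sensed_objects, tol=0.5):
    """Handle every object reported by the sensor in the current frame (FIG. 11), sketch."""
    for obj in sensed_objects:                                    # step S121
        entry = tables.objects.get(obj.object_id)                 # step S122
        if entry is None:                                         # object absent in previous frame
            rec_ids = estimate_by_proximity(tables, finger_positions, obj)               # step S123
        elif abs(obj.feature - entry.feature) > tol:              # step S124: feature amount changed?
            rec_ids = estimate_by_superimposition(tables, finger_positions, obj, entry)  # step S125
        else:
            entry.processed = True                                # unchanged known object
            continue
        tables.objects[obj.object_id] = ObjectEntry(              # step S126: update the table
            obj.object_id, rec_ids, obj.feature, processed=True)  # step S127: set the processing flag
```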
- FIG. 12 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is an explanatory diagram illustrating an operation example when the object in step S114 does not exist.
- the following series of processes can be executed by the finger detection unit 210 and the attribute processing unit 220, for example.
- An object whose processing flag has not been set at this point is an object that was present in the previous frame but was not processed in the current frame, so it can be judged that the object has been removed or has disappeared for some reason.
- the object information control unit 200 repeatedly performs the following processing as long as there is an object for which the processing flag is not set (step S131).
- The object information control unit 200 acquires, by the nearest finger detection process, the finger ID that is estimated to have removed the object (step S132), and determines whether such a finger ID has been found (step S133). If a finger ID is found by the nearest finger detection process (step S133, Yes), the object is considered to be owned by that finger, so the object information control unit 200 adds the recognition ID of the object to the end of the recognition ID array of the corresponding finger ID in the possession state management table (step S134). If there is no entry for the corresponding finger ID, the object information control unit 200 adds the finger ID together with the recognition ID of the object.
- When sensing of an object is at least partially shielded by an operating body, the object information control unit 200 may consider that the object is owned by that operating body.
- In that case, the object information control unit 200 may add the recognition ID registered for the object whose sensing is shielded to the end of the recognition ID array of the finger ID corresponding to the operating body that at least partially shields the sensing of that object in the possession state management table.
- If a finger ID cannot be found by the nearest finger detection process (step S133, No), the object information control unit 200 deletes the attribute information of the object from the attribute management table (step S135).
- the object information control unit 200 deletes the corresponding object row from the object management table (step S136).
- the object information control unit 200 repeats the above-described series of processes as long as there is an object with no processing flag (step S137).
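- A sketch of steps S131 to S137 under the same assumed data model: objects that were not processed in this frame are either attributed to the nearest finger or dropped from all tables. `last_position()` is a placeholder for the object's last sensed position.

```python
def process_missing_objects(tables, finger_positions):
    """Handle objects present in the previous frame but not seen in this one (FIG. 12), sketch."""
    for object_id, entry in list(tables.objects.items()):
        if entry.processed:                                         # step S131: unprocessed rows only
            continue
        finger_id = nearest_finger(last_position(entry), finger_positions)             # step S132
        if finger_id is not None:                                   # step S133, Yes: taken by a hand
            tables.possession.setdefault(finger_id, []).extend(entry.recognition_ids)  # step S134
        else:                                                       # step S133, No: gone for unknown reasons
            for rec_id in entry.recognition_ids:
                tables.attributes.pop(rec_id, None)                 # step S135: drop the attribute
        del tables.objects[object_id]                               # step S136: drop the object row
```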
- FIG. 13 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is an explanatory diagram illustrating an operation example when the finger in Step S115 described above disappears out of the range. Note that the following series of processing can be executed by the attribute processing unit 220, for example.
- the object information control unit 200 repeatedly performs the following processing for the fingers registered in the possession state table (step S141).
- Based on the information from the input unit 110, the object information control unit 200 determines whether the corresponding finger exists within the sensing area (step S142). If the finger is present within the sensing area (step S142, Yes), the object information control unit 200 determines that the finger continues to hold the object, does nothing, and moves on to processing of the next finger.
- If the finger is not present within the sensing area (step S142, No), the object information control unit 200 determines that the finger has disappeared out of the range and performs processing for when the finger has disappeared out of the range (step S143). What is done when the finger disappears out of the range depends on the function that the application executed by the information processing system 100 is intended to realize.
- For example, the object information control unit 200 may delete the corresponding items from all of the object management table, the possession state management table, and the attribute management table.
- Alternatively, on the premise that the finger will reappear holding the same object even after disappearing out of the range, the object information control unit 200 may maintain all items and keep a flag stored together with the finger ID in the possession state management table.
- When a finger reappears, it may be given the same ID, or, if a finger enters from the same direction, the IDs may be regarded as the same even if they differ.
- the object information control unit 200 performs a series of processes to track the movement of the object and the finger, and can retain the attribute associated with the object even if the object is covered with the finger.
- FIG. 14 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is a flowchart illustrating the object estimation process based on proximity information illustrated in step S123 of FIG.
- a series of processing can be executed by the finger detection unit 210 and the attribute processing unit 220.
- When performing the object estimation process based on proximity information, the object information control unit 200 first performs the nearest finger detection process (step S151). Then, the object information control unit 200 determines whether the nearest finger ID has been found by the nearest finger detection process (step S152).
- If the nearest finger ID has been found (step S152, Yes), the object information control unit 200 determines whether the finger ID is registered in the possession state management table (step S153).
- If the finger ID is registered in the possession state management table (step S153, Yes), the object information control unit 200 determines whether a recognition ID array exists for that finger ID (step S154).
- If a recognition ID array exists for the finger ID (step S154, Yes), it is considered that the finger with that finger ID was holding the object corresponding to the last element of the recognition ID array.
- The object information control unit 200 therefore estimates that this object is the object with that existing recognition ID, and deletes the tail element of the recognition ID array used for the estimation (step S155).
- If the recognition ID array becomes empty after the tail element is deleted, the object information control unit 200 deletes the row for that finger ID.
- If the nearest finger ID cannot be found (step S152, No), the object information control unit 200 determines that this object is an unknown object generated for some reason, and allocates a new recognition ID (step S156).
- If the finger ID is not registered in the possession state management table (step S153, No), the object information control unit 200 determines that the object is an unknown object generated from a new finger, and allocates a new recognition ID (step S156).
- If no recognition ID array exists for the finger ID in the determination of step S154 (step S154, No), the object information control unit 200 determines that the finger is known but the placed object is unknown, and allocates a new recognition ID (step S156). This case corresponds to, for example, a case where an object has been held since before entering the range while another object was held within the range.
- By executing this series of processes, when an object appears in the sensing-capable area, the object information control unit 200 can judge the recognition ID of the object by detecting the finger nearest to it.
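- A sketch of the proximity-based estimation of FIG. 14 (steps S151 to S156): a newly appeared object reuses the tail of the nearest finger's recognition ID array when one exists, and otherwise receives a fresh recognition ID. `new_recognition_id()` is an assumed helper.

```python
def estimate_by_proximity(tables, finger_positions, obj):
    """Estimate recognition IDs for an object that did not exist in the previous frame (FIG. 14), sketch."""
    finger_id = nearest_finger(obj.position, finger_positions)   # step S151
    if finger_id is not None:                                     # step S152, Yes
        held = tables.possession.get(finger_id)                   # step S153
        if held:                                                  # step S154, Yes
            rec_id = held.pop()                                   # step S155: reuse the ID held by the hand
            if not held:
                del tables.possession[finger_id]                  # array emptied, so remove the row
            return [rec_id]
    rec_id = new_recognition_id()                                 # step S156: unknown object, fresh ID
    tables.attributes.setdefault(rec_id, {})
    return [rec_id]
```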
- FIG. 15 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is a flowchart illustrating the object estimation process based on superimposition information illustrated in step S125 of FIG. 11.
- a series of processing can be executed by the finger detection unit 210 and the attribute processing unit 220.
- In the following description, it is assumed that the sensing target object has a unit feature amount in a predetermined height direction; for example, the object to be sensed has a shape like a flat cylindrical coin.
- In this case, the depth direction of sensing is the height direction, and the feature amount in the depth direction of sensing is the feature amount in the height direction (the height of the top surface of the object with the surface on which the object is placed as the reference).
- However, the present disclosure is not limited to such an example. For example, when an object having magnetic force is attached to a wall surface and sensing or projection is performed on that wall surface, the depth direction of sensing is the direction perpendicular to the wall surface, and the feature amount in the depth direction of sensing is the feature amount in the direction perpendicular to the wall surface (the height of surfaces parallel to the wall surface, with the wall surface as the reference).
- Information other than the height from the reference plane may be used as the feature amount in the depth direction of sensing.
- the position in the depth direction (depth coordinate information) of the three-dimensional information of the object obtained by the depth sensor may be used as the feature amount in the depth direction of sensing.
- the object information control unit 200 first determines whether or not the feature amount of the object has decreased (step S161). If the feature amount has decreased (step S161, Yes), the object information control unit 200 executes processing when the feature amount has decreased (step S162). The processing when the feature amount decreases will be described in detail later. If the feature amount has not decreased (No in step S161), the object information control unit 200 determines whether or not the increase difference is one stage (one unit feature amount) (step S163). If the increase is one stage (step S163, Yes), the object information control unit 200 executes a process when the feature amount is increased by one stage (step S164). Processing when the feature amount is increased by one step will be described in detail later. If the increase is two steps or more (step S163, No), the object information control unit 200 executes processing when the feature amount is increased by two steps or more (step S165). Processing when the feature amount increases by two or more stages will be described in detail later.
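- The branch of FIG. 15 (steps S161 to S165) only compares the new and old feature amounts against the unit feature amount. `UNIT` below is the assumed height of one coin-like object; the three handlers are sketched after the corresponding flowcharts below.

```python
UNIT = 1.0   # assumed unit feature amount (e.g. the height of one coin-like object)

def estimate_by_superimposition(tables, finger_positions, obj, entry):
    """Dispatch of FIG. 15 based on how the feature amount changed, sketch."""
    diff_stages = round((obj.feature - entry.feature) / UNIT)
    if diff_stages < 0:                                                 # step S161, Yes: decreased
        return on_feature_decreased(tables, finger_positions, obj, entry, -diff_stages)       # step S162
    if diff_stages == 1:                                                # step S163, Yes: up by one stage
        return on_increase_by_one(tables, finger_positions, obj, entry)                        # step S164
    return on_increase_by_two_or_more(tables, finger_positions, obj, entry, diff_stages)       # step S165
```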
- FIG. 16 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is a flowchart illustrating processing when the feature amount illustrated in step S162 of FIG. 15 is reduced.
- a series of processing can be executed by the finger detection unit 210 and the attribute processing unit 220.
- the object information control unit 200 first performs the nearest finger detection process (step S171). Then, the object information control unit 200 determines whether the nearest finger ID has been found by the nearest finger detection process (step S172).
- If the nearest finger ID has been found (step S172, Yes), the object information control unit 200 determines that the feature amount decreased because the removed object is now owned by the finger, and adds the recognition ID of the owned object to the end of the recognition ID array of the corresponding finger ID in the possession state management table (step S173).
- When the feature amount has decreased by two or more stages, the object information control unit 200 adds the plural recognition IDs of the owned objects to the end of the recognition ID array of the finger ID. In this case, the higher objects are added so that they are closer to the head of the array (that is, closer to the hand).
- Next, the object information control unit 200 determines whether the recognition ID of the uppermost remaining object is known (step S174). If it is known (step S174, Yes), the object information control unit 200 does nothing. On the other hand, if it is unknown (step S174, No), the remaining object was placed at the same time as the owned object, so the object information control unit 200 estimates that it has the same property as the object carried away by the finger and assigns it the same recognition ID as that object (step S175).
- If the nearest finger ID cannot be found in the determination of step S172 (step S172, No), the object information control unit 200 determines that the object decreased without a finger being involved; although the cause is unknown, the remaining object is presumed to have the same property as the object that existed above it, and is assigned the same recognition ID as the lost object (step S175).
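- A sketch of steps S171 to S175: the recognition IDs of the removed objects move to the nearest finger's possession entry, and the exposed top object inherits a recognition ID when its own is unknown. Helper names follow the earlier sketches and remain assumptions.

```python
def on_feature_decreased(tables, finger_positions, obj, entry, stages):
    """FIG. 16: one or more objects were taken off the top of the stack, sketch."""
    removed = entry.recognition_ids[-stages:]        # IDs of the objects that left the stack
    remaining = entry.recognition_ids[:-stages]      # IDs still on the table
    finger_id = nearest_finger(obj.position, finger_positions)          # step S171
    if finger_id is not None:                                            # step S172, Yes
        tables.possession.setdefault(finger_id, []).extend(removed)      # step S173: the finger owns them
    if remaining and remaining[-1] not in tables.attributes:             # step S174, No: top ID unknown
        remaining[-1] = removed[0]                                        # step S175: share the removed object's ID
    return remaining
```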
- FIG. 17 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is a flowchart illustrating processing when the feature amount in step S164 of FIG. 15 is increased by one stage.
- a series of processing can be executed by the finger detection unit 210 and the attribute processing unit 220.
- the object information control unit 200 first performs the nearest finger detection process (step S181). Then, the object information control unit 200 determines whether or not the nearest finger ID has been found by the nearest finger detection process (step S182).
- If the nearest finger ID has been found (step S182, Yes), the object information control unit 200 determines whether the finger ID is registered in the possession state management table and holds a recognition ID array (step S183). If the finger ID is registered in the possession state management table and holds a recognition ID array (step S183, Yes), the object information control unit 200 uses the tail element of the recognition ID array to estimate the recognition ID of the object whose feature amount increased by one stage, and deletes the tail element of the recognition ID array used for the estimation (step S184). If the finger ID is not registered in the possession state management table or does not hold a recognition ID array (step S183, No), an unknown object generated from a new finger increased the stack by one stage; since it is stacked at the same place, the object information control unit 200 estimates that it has the same property and assigns it the same recognition ID as the object one stage below (step S185).
- If the nearest finger ID cannot be found in the determination of step S182 (step S182, No), an unknown object has likewise increased the stack by one stage; since it is stacked at the same place, the object information control unit 200 estimates that it has the same property and assigns it the same recognition ID as the object one stage below (step S185).
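- A sketch of steps S181 to S185: the new top object either takes the tail of the nearest finger's recognition ID array (a known object placed from the hand) or, when no held ID is available, inherits the ID of the object one stage below.

```python
def on_increase_by_one(tables, finger_positions, obj, entry):
    """FIG. 17: exactly one object was added on top of the stack, sketch."""
    ids = list(entry.recognition_ids)
    finger_id = nearest_finger(obj.position, finger_positions)           # step S181
    held = tables.possession.get(finger_id) if finger_id is not None else None
    if held:                                                              # steps S182/S183, Yes
        ids.append(held.pop())                                            # step S184: reuse the held ID
        if not held:
            del tables.possession[finger_id]
    else:                                                                 # steps S182/S183, No
        ids.append(ids[-1])                                               # step S185: same property as the object below
    return ids
```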
- FIG. 18 is a flowchart illustrating an operation example of the object information control unit 200 according to the embodiment of the present disclosure, and is a flowchart illustrating processing when the feature amount in step S165 of FIG. 15 is increased by two or more stages.
- a series of processing can be executed by the finger detection unit 210 and the attribute processing unit 220.
- the object information control unit 200 first performs the nearest finger detection process (step S191). Then, the object information control unit 200 determines whether or not the nearest finger ID has been found by the nearest finger detection process (step S192).
- If the nearest finger ID is found, the object information control unit 200 determines whether the finger ID is registered in the possession state management table and holds a recognition ID array (step S193). If the finger ID is registered in the possession state management table and holds a recognition ID array (step S193, Yes), the object information control unit 200 then determines whether the array contains recognition IDs corresponding to the increased number of stages (step S194). If there are recognition IDs corresponding to the increased number of stages (step S194, Yes), the increased objects are all known objects, so the object information control unit 200 assigns the existing recognition IDs to those objects and deletes the recognition IDs corresponding to the increased number of stages from the possession state management table (step S195).
- If it is determined in step S194 that there are not enough recognition IDs for the increased number of stages (step S194, No), the object information control unit 200 determines that an unknown object is included among the increased stages and uses the recognition IDs that are available. The object information control unit 200 estimates that the shortfall consists of objects having the same property as the lowest stage of the objects placed together, updates the attribute management table with the same recognition ID (step S196), and deletes the used recognition IDs from the possession state management table.
- If it is determined in step S192 that the finger ID cannot be detected (step S192, No), unknown objects generated from a new finger have increased the stack by several stages; since they are stacked at the same place, the object information control unit 200 estimates that they have the same property and assigns them the same recognition ID as the object one stage below (step S197).
- If it is determined in step S193 that the finger ID is not registered in the possession state management table or does not hold a recognition ID array (step S193, No), the objects are stacked at the same place, so the object information control unit 200 estimates that they have the same property and assigns them the same recognition ID as the object one stage below (step S197).
- By detecting whether the feature amount of an object has changed in this way, the object information control unit 200 can track the movement of objects, such as an object stacked on another object or an object placed by a finger. Further, through this series of processes, the object information control unit 200 can estimate that objects moving together have the same property and can assign a recognition ID to an unknown object.
- a series of processing of the object information control unit 200 described so far will be described with a specific example.
- the height of an object acquired from a three-dimensional sensor will be described as an example of the feature amount stored in the object management table.
- an example in which an unknown object appears in the sensing area will be described.
- FIGS. 19A to 19C are explanatory views for explaining the operation when an unknown object appears in the sensing possible region.
- FIGS. 19A to 19C also show the states of the tables when the table is the sensing possible area and an unknown object appears on the table.
- FIG. 19A is a diagram showing a state in which a hand (hand a) holding an object appears. In this case, it is assumed that no information is registered in the object management table, the ownership state management table, and the attribute management table.
- FIG. 19B is a diagram illustrating a state in which a hand holding an object places the object on the table. Even at this point, no information is registered in the object management table, the ownership state management table, and the attribute management table.
- FIG. 19C is a diagram illustrating a state in which the hand has released the object after placing the object on the table.
- the object information control unit 200 recognizes the object placed on the table and registers information as shown in FIG. 19C for the object.
- the object information control unit 200 registers the object ID as “1” for the object, and registers “A” as the recognition ID in the recognition ID array.
- the object information control unit 200 also registers the information of the recognition ID “A” in the attribute management table. At this point, since it is unknown what attribute this object has, the object information control unit 200 registers it as an unknown object in the attribute management table.
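- The registration described for FIG. 19C can be illustrated with the following Python sketch; the three tables are represented here as dictionaries with assumed field names, which are not part of the disclosed configuration.
    object_management_table = {}      # object_id -> {"recognition_ids": [...], "processing": bool}
    possession_state_table = {}       # finger_id -> list of recognition IDs held by that hand
    attribute_management_table = {}   # recognition_id -> attribute ("unknown" until identified)

    def register_new_object(object_id, recognition_id):
        # Register the newly recognized object in the object management table.
        object_management_table[object_id] = {
            "recognition_ids": [recognition_id],
            "processing": True,
        }
        # Its attribute is not yet known, so record it as an unknown object.
        attribute_management_table.setdefault(recognition_id, "unknown")

    register_new_object("1", "A")
    print(object_management_table)     # {'1': {'recognition_ids': ['A'], 'processing': True}}
    print(attribute_management_table)  # {'A': 'unknown'}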
- 20A to 20D are explanatory diagrams for explaining the operation when an object is placed in the sensing area and the object is moved in the sensing area.
- 20A to 20D also show the state of each table when the table is a sensing area and an object whose attribute is known is moved on the table.
- FIG. 20A is a diagram showing a state in which an object with the object ID “1” is placed on the table. In this state, it is assumed that information as illustrated in FIG. 20A is registered in the object management table, the ownership state management table, and the attribute management table.
- FIG. 20B is a diagram illustrating a state where the object with the object ID “1” is held in the hand (hand a).
- the object information control unit 200 moves information of the recognition ID array corresponding to the object to the possession state management table.
- the object information control unit 200 changes the processing flag of the object management table to false.
- FIG. 20C is a diagram illustrating a state where the hand moves on the table with the object having the recognition ID array “A”. At this time, no information is registered in the object management table, and it is registered in the possession state management table that the hand of the finger ID “a” owns the object of the recognition ID array “A”.
- FIG. 20D is a diagram illustrating a state in which the hand with finger ID “a” has released the hand from the object after placing the object with recognition ID array “A” on the table.
- the object information control unit 200 recognizes the object placed on the table and registers information as shown in FIG. 20D for the object.
- the object information control unit 200 registers the object ID as “2” for the object, and registers “A” as the recognition ID in the recognition ID array.
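- The transfer of the recognition ID array between the object management table and the possession state management table in FIGS. 20A to 20D can be sketched as follows in Python; the table layout is an assumption used only for illustration.
    def pick_up(object_id, finger_id, object_table, possession_table):
        entry = object_table.pop(object_id)                   # the object leaves the object management table
        possession_table[finger_id] = entry["recognition_ids"]

    def place(finger_id, new_object_id, object_table, possession_table):
        ids = possession_table.pop(finger_id)                 # the hand no longer owns the object
        object_table[new_object_id] = {"recognition_ids": ids, "processing": True}

    object_table = {"1": {"recognition_ids": ["A"], "processing": True}}
    possession_table = {}
    pick_up("1", "a", object_table, possession_table)         # FIG. 20B/20C
    place("a", "2", object_table, possession_table)           # FIG. 20D
    print(object_table)   # {'2': {'recognition_ids': ['A'], 'processing': True}}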
- FIGS. 21A to 21D are explanatory diagrams for explaining the operation when an object is placed in the sensing area and the object is moved in the sensing area.
- 21A to 21D also show the states of the tables when the table is a sensing-capable area and an object with a known attribute is moved on the table.
- FIG. 21A is a diagram showing a state where an object with an object ID “1” is placed on the table. In this state, it is assumed that information as illustrated in FIG. 21A is registered in the object management table, the ownership state management table, and the attribute management table.
- FIG. 21B is a diagram showing a state where the object with the object ID “1” is held in the hand (hand a).
- the object information control unit 200 moves information of the recognition ID array corresponding to the object to the possession state management table.
- the object information control unit 200 registers the detected finger ID of the hand as “a”.
- the object information control unit 200 changes the processing flag of the object management table to false.
- FIG. 21C is a diagram showing a state where the hand with the finger ID “a” is moving on the table with the object having the recognition ID array “A”. At this time, no information is registered in the object management table, and it is registered in the possession state management table that the hand of the finger ID “a” owns the object of the recognition ID array “A”.
- FIG. 21D is a diagram illustrating a state where the hand with the finger ID “a” has gone out of the sensing possible area.
- the object information control unit 200 clears the information in the possession state management table. Since there is no other object with the recognition ID “A” described in the recognition ID array in the sensing area, the object information control unit 200 also clears the information in the attribute management table.
- 22A to 22F are explanatory diagrams for explaining the operation when an object is placed in the sensing area and the object is moved within the sensing area.
- the table is a sensing possible region, and the state of each table when the object is moved on the table is also illustrated.
- FIG. 22A is a diagram showing a state in which an object with the object ID “1” is placed on the table. In this state, it is assumed that information as illustrated in FIG. 22A is registered in the object management table, the ownership state management table, and the attribute management table.
- FIG. 22B is a diagram illustrating a state where the object with the object ID “1” is held in the hand (hand a).
- the object information control unit 200 moves information of the recognition ID array corresponding to the object to the possession state management table.
- the object information control unit 200 registers the detected finger ID of the hand as “a”.
- the object information control unit 200 changes the processing flag of the object management table to false.
- FIG. 22C is a diagram showing a state where the hand with the finger ID “a” is moving on the table with the object having the recognition ID array “A”. At this time, no information is registered in the object management table, and it is registered in the possession state management table that the hand of the finger ID “a” owns the object of the recognition ID array “A”.
- FIG. 22D is a diagram illustrating a state where the hand with the finger ID “a” has moved out of the sensing possible area. Even when the hand with the finger ID “a” goes out of the sensing possible area and the input unit 110 can no longer recognize it, the object information control unit 200 maintains the information in the possession state management table and the attribute management table. At this time, the object information control unit 200 attaches information indicating that the hand with the finger ID “a” has gone out of the sensing possible area. Here, the object information control unit 200 indicates that the hand has moved out of the sensing possible area by enclosing the finger ID in parentheses.
- FIG. 22E is a diagram showing a state in which the hand with the finger ID “a” has returned to the sensing enabled area.
- the hand with the finger ID “a” still owns the object with the recognition ID array “A”, and the object information control unit 200 removes the parentheses attached to the finger ID in the possession state management table.
- FIG. 22F is a diagram illustrating a state in which the hand with the finger ID “a” has released the hand from the object after placing the object with the recognition ID array “A” on the table.
- the object information control unit 200 recognizes the object placed on the table and registers information as shown in FIG. 22F for the object.
- the object information control unit 200 registers the object ID as “2” for the object, and registers “A” as the recognition ID in the recognition ID array.
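- The handling of a hand that leaves and re-enters the sensing possible area in FIGS. 22A to 22F can be sketched as follows in Python; the in_area flag is an assumed representation of the parentheses shown in the figures.
    possession_table = {"a": {"recognition_ids": ["A"], "in_area": True}}

    def hand_left_area(finger_id):
        possession_table[finger_id]["in_area"] = False   # corresponds to "(a)" in the figure

    def hand_returned(finger_id):
        possession_table[finger_id]["in_area"] = True    # the parentheses are removed

    hand_left_area("a")    # FIG. 22D: the information is retained, not cleared
    hand_returned("a")     # FIG. 22E: the hand still owns the recognition ID "A"
    print(possession_table)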
- FIGS. 23A to 23D are explanatory diagrams for explaining the operation when an object is placed in the sensing area and the object is moved in the sensing area.
- the table is a sensing possible region, and the state of each table when the object is moved on the table is also illustrated.
- FIG. 23A is a diagram showing a state in which an object with the object ID “1” is placed on the table.
- the object with the object ID “1” is a stack of two objects, each having a height of 1 centimeter, with the height used as the feature amount.
- the object information control unit 200 recognizes that the upper object of the two stacked objects is the recognition ID “A”, but cannot recognize the lower object. Therefore, the object information control unit 200 registers the object with the object ID “1” as “?, A” in the recognition ID array of the object management table.
- FIG. 23B is a diagram illustrating a state where the object with the object ID “1” is held in the hand (hand a).
- the object information control unit 200 moves information of the recognition ID array corresponding to the object to the possession state management table.
- the object information control unit 200 registers the detected finger ID of the hand as “a”.
- FIG. 23C is a diagram illustrating a state where the hand with the finger ID “a” is moving on the table with the upper object among the objects of the recognition ID array “A”.
- the object information control unit 200 registers the remaining object in the object management table with the object ID “2”.
- the object information control unit 200 assigns “A” to the recognition ID array for the object with the object ID “2” that is unknown. This is based on the assumption that objects stacked in the same place have the same attributes.
- the object information control unit 200 registers in the possession state management table that the hand with the finger ID “a” owns the object with the recognition ID array “A”.
- FIG. 23D is a diagram illustrating a state in which the hand with the finger ID “a” places an object on the table.
- the object information control unit 200 registers an object ID “3” in the object management table for the object that has left the hand. At that time, since the hand with the finger ID “a” owned the object with the recognition ID array “A” until immediately before, the object information control unit 200 sets “A” in the recognition ID array for the object with the object ID “3”.
- 24A to 24F are explanatory diagrams for explaining the operation when an object is placed in the sensing area and the object is moved within the sensing area.
- the table is a sensing possible region, and the state of each table when an object is moved on the table is also illustrated.
- FIG. 24A is a diagram showing a state in which objects with object IDs “1” and “2” are placed on the table.
- the objects with the object IDs “1” and “2” are objects each having a height of 1 centimeter. In this state, it is assumed that information as illustrated in FIG. 24A is registered in the object management table, the ownership state management table, and the attribute management table.
- the recognition ID array of the object with the object ID “1” is “A”
- the recognition ID array of the object with the object ID “2” is “B”.
- FIG. 24B is a diagram illustrating a state where the object with the object ID “1” is held in the hand (hand a).
- the object information control unit 200 moves information of the recognition ID array corresponding to the object to the possession state management table.
- the object information control unit 200 registers the detected finger ID of the hand as “a”.
- FIG. 24C is a diagram illustrating a state in which an object with the object ID “1” is stacked on the object “2” by hand.
- the object information control unit 200 also moves the information of the recognition ID array corresponding to the object with the object ID “2” to the possession state management table.
- FIG. 24D is a diagram illustrating a state in which stacked objects are lifted by a hand.
- the object information control unit 200 registers the recognition ID array “A, B” in the row “a” of the finger ID in the ownership state management table.
- the object information control unit 200 grasps that the hand having the finger ID “a” has an object having the recognition ID array “A, B”.
- FIG. 24E is a diagram showing a state in which the lower object is placed on the table among the stacked objects.
- the object information control unit 200 registers an object ID “3” in the object management table for the object that has left the hand. At that time, since the hand with the finger ID “a” owned the objects with the recognition ID array “A, B” until immediately before, the object information control unit 200 assigns “B”, the recognition ID at the tail of that array, to the recognition ID array of the object with the object ID “3”.
- FIG. 24F is a diagram illustrating a state in which the hand with finger ID “a” places an object on the table.
- the object information control unit 200 registers an object ID “4” in the object management table for the object that has left the hand. At that time, since the hand with the finger ID “a” owned the object with the recognition ID array “A” until immediately before, the object information control unit 200 sets “A” in the recognition ID array of the object with the object ID “4”.
- 25A to 25D are explanatory diagrams for explaining the operation when another object is stacked on the object placed in the sensing area.
- the table is a sensing possible region, and the state of each table when the object is moved on the table is also illustrated.
- FIG. 25A is a diagram showing a state in which an object with the object ID “1” is placed on the table.
- the object with the object ID “1” is an object having a height of 1 centimeter. In this state, it is assumed that information as illustrated in FIG. 25A is registered in the object management table, the ownership state management table, and the attribute management table.
- the recognition ID array of the object with the object ID “1” is “A”.
- FIG. 25B is a diagram illustrating an example of a state in which another object is stacked on the object with the object ID “1”.
- FIG. 25C is a diagram illustrating an example of a state where the hand has been released from the object. Since no information is registered in the possession state management table, the origin of the object that appears (becomes recognizable) in FIG. 25C is unknown. Therefore, it is also unclear what kind of object it is. The object information control unit 200 therefore changes the object ID “1” to “2” and registers the recognition ID array as “?, A” in the object management table.
- FIG. 25D is a diagram illustrating an example of estimating recognition IDs of stacked objects.
- the object information control unit 200 estimates that the stacked object has the same recognition ID as the object placed underneath it, and registers the resulting recognition ID array for the object with the object ID “2” in the object management table.
- FIGS. 26A to 26D are explanatory diagrams for explaining the operation when another object is stacked in two stages on the object placed in the sensing area.
- the table is a sensing possible region, and the state of each table when an object is moved on the table is also illustrated.
- FIG. 26A is a diagram showing a state in which an object with the object ID “1” is placed on the table.
- the object with the object ID “1” is an object having a height of 1 centimeter.
- In this state, it is assumed that information as illustrated in FIG. 26A is registered in the object management table, the possession state management table, and the attribute management table.
- the recognition ID array of the object with the object ID “1” is “A”.
- In the possession state management table, the recognition ID array “B” is registered for the hand with the finger ID “a”. That is, the object information control unit 200 knows that, at this time, the hand with the finger ID “a” holds only one object, which has the recognition ID “B”.
- FIG. 26B is a diagram illustrating an example of a state in which another object is stacked on the object with the object ID “1”.
- FIG. 26C is a diagram illustrating an example of a state where the hand is released from the object.
- the object information control unit 200 changes the object ID “1” to “2”.
- the object information control unit 200 moves the information of the recognition ID array registered in the possession state management table to the object management table.
- Since the height has increased from 1 centimeter to 3 centimeters, two other objects have been stacked on the object with the object ID “1”.
- the object information control unit 200 recognizes that only one object with the recognition ID “B” is held in the hand with the finger ID “a”.
- the recognition ID of the bottom object is “A” and the recognition ID of the top object is “B”, but it is not known what the recognition ID of the middle object is. Therefore, the object information control unit 200 registers the recognition ID array of the object ID “2” as “A,?, B”.
- FIG. 26D is a diagram illustrating an example of estimating recognition IDs of stacked objects.
- the object information control unit 200 estimates that an object whose recognition ID is unknown has the same recognition ID as the object that was held together with it, and registers the recognition ID array of the object with the object ID “2” as “A, B, B” in the object management table.
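- The estimation described for FIG. 26D can be sketched as follows in Python; the propagation rule is an assumption chosen to match the “A, ?, B” to “A, B, B” example above.
    def estimate_unknown_ids(recognition_ids):
        ids = list(recognition_ids)
        for i, rid in enumerate(ids):
            if rid != "?":
                continue
            # Prefer the known object stacked immediately above (held together with it);
            # otherwise fall back to the object below.
            if i + 1 < len(ids) and ids[i + 1] != "?":
                ids[i] = ids[i + 1]
            elif i > 0 and ids[i - 1] != "?":
                ids[i] = ids[i - 1]
        return ids

    print(estimate_unknown_ids(["A", "?", "B"]))   # ['A', 'B', 'B']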
- By assigning recognition IDs to objects as described above and recognizing the hand near an object, the recognition ID can continue to be assigned to the object even when the user covers the object placed on the table with a hand and then places the object on the table again.
- FIG. 27 is an explanatory diagram illustrating a configuration example of a communication system using the information processing system 100 according to the embodiment of the present disclosure.
- FIG. 27 illustrates the information processing system 100 and a router 700 and a PC 800 connected to the information processing system 100 via a network.
- the network may be a WAN (Wide Area Network) such as the Internet, or a LAN (Local Area Network).
- the network connection among the information processing system 100, the router 700, and the PC 800 may be wired or wireless.
- TCP / IP may be used as the network communication protocol, and other protocols such as a uniquely defined protocol may be used.
- FIG. 27 also shows objects 601 and 602 placed on a table in an area that can be sensed by the information processing system 100.
- the objects 601 and 602 are objects that do not have a communication function, such as tableware. Therefore, in this embodiment, an identifier such as an IP address that can uniquely identify the object is assigned to these objects 601 and 602. This identifier is an identifier for virtually accessing the objects 601 and 602.
- While the information processing system 100, the router 700, and the PC 800 have IP addresses used for actual communication, the objects 601 and 602 are assigned virtual IP addresses that the information processing system 100 can uniquely identify, as illustrated in FIG. 27.
- the virtual IP addresses assigned to the objects 601 and 602 are not used by the information processing system 100 for communication; they are used by the information processing system 100 to identify the objects.
- an IP address is used as an identifier that can uniquely identify an object.
- However, the present disclosure is not limited to such an example, and the information processing system 100 may use another identifier, such as the object ID used for sensing an object, or an object ID with a predetermined index (objectIndex) added.
- It is assumed that the attribute processing unit 220 assigns these identifiers to the objects 601 and 602 shown in FIG. 27 as follows:
- the object whose object ID is 1 is the object 601
- the object whose object ID is 2 is the object 602.
- In other words, the information processing system 100 virtually acts as a gateway of a virtual local area network and behaves like a DHCP (Dynamic Host Configuration Protocol) server that allocates identifiers to the objects.
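- The DHCP-like assignment of virtual identifiers to objects that have no communication function can be sketched as follows in Python; the subnet, class name, and field names are assumptions used only for illustration, and the addresses are never used for actual communication.
    import ipaddress

    class VirtualAddressPool:
        def __init__(self, subnet="192.168.10.0/28"):      # assumed example subnet
            self._free = list(ipaddress.ip_network(subnet).hosts())
            self.by_object_id = {}

        def assign(self, object_id):
            addr = str(self._free.pop(0))                   # DHCP-like: hand out the next free address
            self.by_object_id[object_id] = addr
            return addr

    pool = VirtualAddressPool()
    print(pool.assign("1"))   # e.g. 192.168.10.1 for the object 601
    print(pool.assign("2"))   # e.g. 192.168.10.2 for the object 602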
- Since the information processing system 100 can include various sensors as described above, it can sense, for example, information such as the weight, size, number, color, and temperature of an object placed in the sensing possible area. Based on a request from the PC 800, the information processing system 100 can determine which information of which object should be sensed. Note that the information processing system 100 may sense two or more states simultaneously based on a request from an external device. For example, when a request for sensing the number and weight of designated objects is transmitted from the PC 800, the information processing system 100 senses the number and weight of the designated objects based on the request.
- FIG. 28 is a flowchart illustrating an operation example of the information processing system 100 according to the embodiment of the present disclosure.
- FIG. 28 illustrates an operation example of the information processing system 100 in the case of sensing the state of an object placed in the area that can be sensed by the information processing system 100 based on an instruction from the outside, or of projecting information based on an instruction from the outside.
- the information processing system 100 determines whether or not a predetermined end condition, such as a predetermined operation by the user, is satisfied (step S201). If the end condition is satisfied (step S201, Yes), the information processing system 100 ends the series of processing; if the end condition is not satisfied (step S201, No), it subsequently determines whether a request has been received from an external device (step S202).
- an external apparatus transmits, as a request, either an information acquisition request for acquiring the state of an object or the like or an information output request for outputting information to the information processing system 100 via a network.
- If a request has not been received from an external device such as the PC 800 (step S202, No), the information processing system 100 returns to the determination in step S201 of whether or not the predetermined end condition is satisfied.
- If a request has been received, the information processing system 100 analyzes the request (step S203) and determines whether the received request is an information acquisition request for acquiring the state of an object (step S204). This determination can be performed by, for example, the attribute processing unit 220. If the request is an information acquisition request for acquiring the state of the object (step S204, Yes), the information processing system 100 performs information acquisition request processing for acquiring the state of the object (step S205).
- the information acquisition request process for acquiring the state of the object is a process for sensing an object placed in the sensing area using the input unit 110. In this case, the information acquisition request is a sensing request.
- The object to be sensed can be designated from an external device such as the PC 800. If the content of the information acquisition request is information acquisition that does not require sensing, the information processing system 100 does not perform object sensing processing, but reads the designated information from a storage medium such as a memory and transmits the read information as a response.
- Information acquisition that does not require sensing is, for example, acquisition of the state of the information processing system 100 itself, specifically, acquisition of information such as the operation time since power-on of the information processing system 100, the total operation time, the operating time zone, and the firmware version. If the state (normal or abnormal) of devices of the information processing system 100 such as a sensor or a projector can be acquired, the states of these devices can also be included in the state of the information processing system 100.
- If the received request is not an information acquisition request (step S204, No), the information processing system 100 performs information output request processing for outputting information, such as projecting information (step S206). These processes can be executed by, for example, the attribute processing unit 220.
- the information projection target can be instructed from an external device such as the PC 800.
- In the information output request processing, the information processing system 100 senses the position, height, and the like of the object designated by the identifier included in the request, as well as the state of the projection target and the like, and projects the information at an appropriate place. For example, if visibility on the object would be poor because the object is black, the top surface of the object is undulating, or the area of the top surface of the object is small, the information processing system 100 projects the information around the object. If the top surface of the object provides sufficient visibility, the information processing system 100 projects the information onto the top surface of the object.
- the information processing system 100 may project information such as a character string or an image in a direction facing the user. That is, when an external device, for example the PC 800, transmits an information output request, the information can be projected at an appropriate location even if the PC 800 does not grasp the object onto which the information is to be projected or the current state of its surroundings, and without the PC 800 specifying the projection location in detail with coordinates or the like.
- the request for projecting information may include a command for designating a projection position relative to the object (for example, the top surface of the object, the right direction, the left direction, the upward direction, or the downward direction).
- When such a command is included, the object information control unit 200 senses the current position of the object corresponding to the object identifier designated in the received request, and projects the designated information at the relative position designated by the command, with the current position as a reference.
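- The resolution of such a relative projection command into an actual projection position can be sketched as follows in Python; the coordinate system, offset values, and command names are assumptions for illustration.
    OFFSETS = {                      # assumed offsets in the table coordinate system
        "top":   (0, 0),             # onto the top surface of the object
        "right": (+100, 0),
        "left":  (-100, 0),
        "up":    (0, -100),
        "down":  (0, +100),
    }

    def resolve_projection_position(sensed_center, command):
        dx, dy = OFFSETS[command]
        return (sensed_center[0] + dx, sensed_center[1] + dy)

    # The object identified in the request was sensed at (320, 240); project to its right.
    print(resolve_projection_position((320, 240), "right"))   # (420, 240)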
- the information output request process is not limited to information projection.
- the information output request may include a command for instructing voice output.
- the information processing system 100 may output voice through a directional speaker or the like as the information output request process.
- When the information acquisition request processing or the information output request processing is completed, the information processing system 100 performs reply processing for the apparatus that transmitted the request (step S207).
- By executing the series of operations described above, the information processing system 100 can sense an object or project information in response to a request from an external device.
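- The dispatch in FIG. 28 can be sketched as follows in Python, where the request format and callback names are assumptions for illustration.
    def handle_request(request, sense, project, read_stored):
        if request["kind"] == "acquire":                  # step S204, Yes -> step S205
            if request.get("needs_sensing", True):
                result = sense(request["identifier"], request["property"])
            else:
                result = read_stored(request["property"])  # e.g. firmware version, uptime
        else:                                             # information output request -> step S206
            result = project(request["identifier"], request["payload"])
        return {"status": 200, "result": result}          # step S207: reply to the requester

    reply = handle_request(
        {"kind": "acquire", "identifier": "B", "property": "temperature"},
        sense=lambda ident, prop: {"objectIndex 1": 20, "objectIndex 2": 10},
        project=lambda ident, payload: "projected",
        read_stored=lambda prop: None,
    )
    print(reply)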
- FIG. 29 is a flowchart illustrating an operation example of the information processing system 100 and the PC 800.
- information obtained by adding a predetermined index (objectIndex) to the object ID is used as information for identifying the object.
- the PC 800 transmits a request to the information processing system 100 as to whether or not an object exists in the sensing area of the information processing system 100 (step S211).
- the address of the information processing system 100 is designated from the PC 800, and the recognition ID of the target object (here, the recognition ID is “B”, the same applies hereinafter) is designated as an argument.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S212).
- the information processing system 100 returns the code number 200 to the PC 800 if the object exists, and returns the code number 404 to the PC 800 if it does not exist.
- These codes can use status codes defined in HTTP (Hyper Text Transfer Protocol), and can be used as, for example, WEB API (Application Programming Interface).
- the code used between the information processing system 100 and the PC 800 is not limited to the status code defined by HTTP.
- the PC 800 transmits a request for acquiring the temperature of the object to the information processing system 100 (step S213).
- the information processing system 100 senses the temperature of the designated object, and returns the sensing result to the PC 800 (step S214).
- there are two objects having a recognition ID “B” and the information processing system 100 senses the temperatures of the two objects and returns them to the PC 800.
- the information processing system 100 returns to the PC 800 a response indicating that the temperature of the object whose “objectIndex” is “1” is 20 degrees Celsius and the temperature of the object whose “objectIndex” is “2” is 10 degrees Celsius.
- the PC 800 transmits a request for projecting information to the information processing system 100 (step S215).
- a request to project a message “How are you?” is sent from the PC 800 to the information processing system 100 for the object whose temperature is 20 degrees Celsius.
- the information processing system 100 projects the information based on the request to project information, and returns the result of the projection processing to the PC 800 (step S216).
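- Seen from the PC 800 side, the exchange in FIG. 29 could be sketched as follows in Python; the URL paths, parameter names, and address are assumptions, since the embodiment only specifies that the target is designated in the argument portion of the URL and that HTTP status codes such as 200 and 404 are returned.
    import requests

    BASE = "http://192.168.0.10"          # assumed address of the information processing system 100

    # Step S211: does an object with the recognition ID "B" exist in the sensing area?
    r = requests.get(f"{BASE}/objects", params={"id": "B"})
    if r.status_code == 200:
        # Step S213: acquire the temperatures of the designated objects.
        temps = requests.get(f"{BASE}/objects/temperature", params={"id": "B"}).json()
        # Step S215: project a message for the object whose temperature is 20 degrees Celsius.
        requests.post(f"{BASE}/objects/projection",
                      params={"id": "B", "objectIndex": 1},
                      json={"text": "How are you?"})
    elif r.status_code == 404:
        print("No object with recognition ID B in the sensing possible area")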
- FIGS. 30A and 30B are explanatory diagrams illustrating an example of object sensing and information projection by the information processing system 100.
- glasses filled with beer are illustrated as objects 601 and 602, respectively.
- the objects 601 and 602 are both registered with the recognition ID “B”.
- the object 601 is registered with “objectIndex” being “1”, and the object 602 is registered with “objectIndex” being “2”.
- the information processing system 100 senses the temperatures of the objects 601 and 602 in response to the request for acquiring the temperature of the object sent from the PC 800. As a result, the temperature of the object 601 was measured as 20 degrees Celsius and the temperature of the object 602 as 10 degrees Celsius.
- the operator of the PC 800 transmits a request to the information processing system 100 to project a message “How are you?” To the object 601.
- the information processing system 100 can project a message “How are you?” around the object 601 as shown in FIG. 30B.
- information with a predetermined index (objectIndex) added to the object ID is used as information for identifying the object.
- the object ID itself may be used as the information for identifying the object.
- an object ID with an IP address added may be used.
- the argument portion (the parameters after “?”) of the URL changes depending on what information is used as the information for identifying the object.
- the attribute associated with the recognition ID may be other than the IP address.
- information on the type (type) of the object may be associated as an attribute associated with the recognition ID.
- Examples of the types of objects include dishes, coffee cups, and wine glasses for tableware.
- the information on the type (type) of the object is a type identifier for virtually accessing an object belonging to the type indicated by the recognition ID.
- By designating the recognition ID, the information processing system 100 may perform information acquisition processing or information output processing for a plurality of objects of the same type in the sensing environment with a single request.
- the information processing system 100 may set an action ID corresponding to each recognition ID and include the recognition ID and the action ID in the information output request.
- This action ID indicates information output processing corresponding to the type of object.
- the information processing system 100 can execute an appropriate action according to the type of the object by including the recognition ID and the action ID in the information output request.
- a database in which the action ID and the information output process are associated may be prepared, and the information processing system 100 may determine the information output process with reference to this database.
- When the request from the PC 800 is an information output request with the type identifier and the action identifier, the information processing system 100 causes the output unit 130 to perform the information output corresponding to the action ID for the objects belonging to the type indicated by the type identifier.
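- A database associating action IDs with information output processing, as described above, could be sketched as follows in Python; the action IDs and handlers are assumptions used only for illustration.
    ACTION_TABLE = {
        "warm_greeting": lambda obj: f"project 'How are you?' near {obj}",
        "refill_alert":  lambda obj: f"project a refill mark on {obj}",
    }

    def perform_output(type_id, action_id, objects_of_type):
        # Apply the output processing registered for the action ID to every
        # sensed object belonging to the designated type.
        handler = ACTION_TABLE[action_id]
        return [handler(obj) for obj in objects_of_type]

    # All objects of the type "wine glass" found by sensing receive the same output.
    print(perform_output("wine glass", "refill_alert", ["object 601", "object 602"]))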
- an operation example of the information processing system 100 that has received a request from the PC 800 will be described.
- FIG. 31 is a flowchart showing an operation example of the information processing system 100 and the PC 800.
- information obtained by adding a predetermined index (objectIndex) to the object ID is used as information for identifying the object.
- FIG. 32 is an explanatory diagram showing an example of an object sensed in the operation example of FIG. FIG. 32 shows a coffee cup placed on a table as an object to be sensed and coffee being poured into the coffee cup.
- the PC 800 transmits a request to the information processing system 100 as to whether or not an object exists in the sensing possible area of the information processing system 100 (step S221).
- the address of the information processing system 100 is designated from the PC 800, and the recognition ID of the target object (here, the recognition ID is “C”, the same applies hereinafter) is designated as an argument.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S222).
- the information processing system 100 returns the code number 200 to the PC 800 if the object exists, and returns the code number 404 to the PC 800 if it does not exist.
- These codes can use status codes defined by HTTP, and can be used as, for example, WEB API.
- the code used between the information processing system 100 and the PC 800 may be any predetermined code that can be processed by the information processing system 100 and the PC 800, and is not necessarily limited to the status codes defined by HTTP.
- the PC 800 transmits a request for acquiring the internal height of the object to the information processing system 100 (step S223).
- the information processing system 100 performs sensing of the internal height of the specified object, and returns the sensing result to the PC 800 (step S224).
- the information processing system 100 senses the internal height of the object and returns it to the PC 800.
- the PC 800 transmits a request for projecting characters to the object to the information processing system 100 (step S225).
- the information processing system 100 returns a response related to the result of projecting the designated character to the PC 800 (step S226).
- the information processing system 100 that has received the request from the PC 800 can project the letters “good morning” on the surface of the coffee poured into the coffee cup, as shown in FIG.
- the PC 800 transmits a request for acquiring the internal height of the sensed object to the information processing system 100 (step S227).
- the information processing system 100 performs sensing of the internal height of the designated object, and returns the sensing result to the PC 800 (step S228).
- the information processing system 100 senses the internal height of the object and returns it to the PC 800.
- the PC 800 transmits a request for projecting characters onto the object to the information processing system 100 (step S229).
- the information processing system 100 returns a response related to the result of projecting the designated character to the PC 800 (step S230).
- the information processing system 100 that has received a request from the PC 800 can display the current time around the coffee cup and project the characters “It's time to go”.
- FIG. 33 is a flowchart showing an operation example of the information processing system 100 and the PC 800.
- information obtained by adding a predetermined index (objectIndex) to the object ID is used as information for identifying the object.
- FIG. 34 is an explanatory diagram showing an example of an object sensed in the operation example of FIG. FIG. 34 shows a plate and a wine glass placed on a table as objects to be sensed.
- the PC 800 first transmits a request to the information processing system 100 as to whether or not an object exists in the sensing possible area of the information processing system 100 (step S231). At this time, the PC 800 designates the address of the information processing system 100, designates the recognition ID of the target object (here, the recognition ID is “D”) as an argument, and also designates an argument for returning the position of the object in coordinates.
- the information processing system 100 performs sensing and returns a response indicating whether or not the designated object exists to the PC 800 (step S232).
- the information processing system 100 returns the code number 200 to the PC 800 if the object exists, and returns the code number 404 to the PC 800 if it does not exist.
- These codes can use status codes defined by HTTP, and can be used as, for example, WEB API.
- the code used between the information processing system 100 and the PC 800 is not limited to the status code defined by HTTP. Further, when returning the response to the PC 800, the information processing system 100 returns the coordinates of the center position of the object.
- the PC 800 transmits a request to the information processing system 100 as to whether or not an object exists in the sensing possible area of the information processing system 100 (step S233).
- the PC 800 designates the address of the information processing system 100, designates the recognition ID of the target object (here, the recognition ID is “E”) as an argument, and also designates an argument for returning the position of the object in coordinates.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S234).
- the information processing system 100 returns the code 200 to the PC 800 if the object exists. Further, when returning the response to the PC 800, the information processing system 100 returns the coordinates of the center position of the object.
- the PC 800 transmits a request for projecting a predetermined pattern onto the object whose “objectIndex” is “1” (step S235). The information processing system 100 returns a response regarding the result of projecting the predetermined pattern to the PC 800 (step S236).
- the PC 800 can send an instruction to the information processing system 100 to project a pattern or the like onto various objects.
- FIG. 35 is a diagram illustrating a state in which a predetermined pattern is projected onto a wine glass placed near a plate.
- FIG. 36 is a diagram showing a state in which a predetermined pattern is projected onto a beer mug placed near a plate.
- the information processing system 100 can project a pattern according to an object placed near the dish.
- FIG. 37 is a flowchart showing an operation example of the information processing system 100 and the PC 800.
- information obtained by adding a predetermined index (objectIndex) to the object ID is used as information for identifying the object.
- FIG. 38 is an explanatory diagram showing an example of an object sensed in the operation example of FIG. FIG. 38 shows a plate and a wine glass placed on a table as objects to be sensed.
- the PC 800 transmits a request to the information processing system 100 as to whether or not an object exists in the sensing possible area of the information processing system 100 (step S241).
- the PC 800 designates the address of the information processing system 100, designates the recognition ID of the target object (here, the recognition ID is “D”) as an argument, and also designates an argument for returning the position of the object in coordinates.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S242).
- the information processing system 100 returns the code number 200 to the PC 800 if the object exists, and returns the code number 404 to the PC 800 if it does not exist.
- These codes can use status codes defined by HTTP, and can be used as, for example, WEB API. However, the code used between the information processing system 100 and the PC 800 is not limited to the status code defined by HTTP. Further, when returning a response to the PC 800 in step S242, the information processing system 100 returns the coordinates of the recognized center position of the object.
- the PC 800 transmits a request as to whether or not an object exists in the sensing area of the information processing system 100 (step S243).
- the PC 800 designates the address of the information processing system 100, designates the recognition ID of the target object (here, the recognition ID is “G”) as an argument, and also designates an argument for returning the position of the object in coordinates.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S244). Further, when returning a response to the PC 800 in step S244, the information processing system 100 returns the coordinates of the center position of the recognized object.
- the PC 800 transmits a request as to whether or not an object exists in the sensing area of the information processing system 100 (step S245).
- the PC 800 designates the address of the information processing system 100, designates the recognition ID of the target object (here, the recognition ID is “H”) as an argument, and also designates an argument for returning the position of the object in coordinates.
- the information processing system 100 performs sensing and returns a response indicating whether the designated object exists to the PC 800 (step S246).
- FIG. 39 is an explanatory diagram illustrating a state in which a predetermined pattern is projected around the object (dish) sensed by the information processing system 100.
- FIG. 40 is a flowchart showing an operation example of the information processing system 100 according to the present embodiment.
- FIG. 40 corresponds to the operation example shown in FIG. 37, and is a flowchart illustrating the operation of the information processing system 100.
- the information processing system 100 first determines whether or not an object having a recognition ID of D (dish) is found in accordance with a sensing instruction from the PC 800 (step S251). If an object having the recognition ID D (dish) is not found (step S251, No), the information processing system 100 ends the process without doing anything thereafter (step S252).
- If an object with a recognition ID of D (dish) is found in the determination of step S251 (step S251, Yes), the information processing system 100 subsequently determines whether an object with a recognition ID of G (soup plate) is found in accordance with a sensing instruction from the PC 800 (step S253). If an object with the recognition ID G (soup plate) is not found (step S253, No), the information processing system 100 executes processing for displaying the pattern used when an object with the recognition ID D (dish) is found (step S254).
- FIG. 41 is an explanatory diagram illustrating an example of a pattern projected by the information processing system 100 when an object having a recognition ID D (dish) is found.
- FIG. 41 shows a state in which a predetermined pattern is projected around the plate placed on the table and on the surface of the plate.
- If an object with a recognition ID of G (soup plate) is found in the determination of step S253 (step S253, Yes), the information processing system 100 subsequently determines whether an object with a recognition ID of H (napkin) is found in accordance with a sensing instruction from the PC 800 (step S255). If an object with the recognition ID H (napkin) is not found (step S255, No), the information processing system 100 executes processing for displaying the pattern used when an object with the recognition ID G (soup plate) is found (step S256).
- FIG. 42 is an explanatory diagram illustrating an example of a pattern projected by the information processing system 100 when an object with a recognition ID of G (soup plate) is found. FIG. 42 shows a state in which a predetermined pattern is projected around the soup plate.
- FIG. 43 is an explanatory diagram illustrating an example of a pattern projected by the information processing system 100 when an object having a recognition ID of H (napkin) is found.
- FIG. 43 shows a state in which a predetermined pattern is projected around the dish and on the surface of the napkin.
- the information processing system 100 can execute control for projecting different patterns according to the recognized object by an instruction from the PC 800.
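- The selection of the projected pattern in FIG. 40 can be sketched as follows in Python, with the pattern names standing in for the patterns of FIGS. 41 to 43; the function name is an assumption for illustration.
    def choose_pattern(found_ids):
        # The recognition IDs D (dish), G (soup plate), and H (napkin) are checked in order.
        if "D" not in found_ids:
            return None                          # step S252: do nothing
        if "G" not in found_ids:
            return "pattern for dish"            # step S254 (FIG. 41)
        if "H" not in found_ids:
            return "pattern for soup plate"      # step S256 (FIG. 42)
        return "pattern for napkin"              # FIG. 43

    print(choose_pattern({"D", "G"}))            # 'pattern for soup plate'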
- FIG. 44 is a block diagram illustrating a hardware configuration example of the information processing system 100 according to the embodiment of the present disclosure.
- the information processing system 100 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing system 100 may also include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing system 100 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing system 100 may include a processing circuit called DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls all or a part of the operation in the information processing system 100 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may include a microphone that detects the user's voice.
- the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone corresponding to the operation of the information processing system 100.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901.
- the user operates the input device 915 to input various data to the information processing system 100 and instruct processing operations.
- An imaging device 933, which will be described later, can also function as an input device by imaging the movement of the user's hand and the like.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 includes, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), or an organic EL (Electro-Luminescence) display, a projector, a hologram display device, a sound output device such as a speaker and headphones, as well as a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing system 100 as a video such as text or an image, or outputs it as a voice such as voice or sound.
- the output device 917 may include a light or the like to brighten the surroundings.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing system 100.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing system 100.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 writes a record in the attached removable recording medium 927.
- the connection port 923 is a port for directly connecting a device to the information processing system 100.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
- the imaging device 933 is an apparatus that images real space and generates a captured image using various members such as an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, and a sound sensor.
- the sensor 935 acquires information about the state of the information processing system 100 itself, such as the posture of the housing of the information processing system 100, and information about the surrounding environment of the information processing system 100, such as brightness and noise around the information processing system 100.
- the sensor 935 may include a GPS sensor that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the apparatus.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- As described above, according to the embodiment of the present disclosure, there is provided the information processing system 100 that can individually identify an object that does not have a communication function and can sense the state of the object.
- the information processing system 100 identifies an object in response to a request from an external device, senses the object, and identifies an object in response to a request from an external device. Can be projected.
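To make this request handling concrete, the following is a minimal, non-authoritative sketch of how a dispatcher distinguishing a sensing request from an information output request might look. The class and function names (TrackedObject, RequestDispatcher, handle_request) and the response format are assumptions made only for illustration and are not defined in this disclosure.

```python
# Illustrative sketch only: dispatch of external requests either to the
# detection unit (sensing) or to the output unit (projection). All names assumed.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class TrackedObject:
    identifier: str                    # identifier associated with the sensed object
    position: tuple = (0, 0)           # last sensed position in the sensing possible area
    state: Dict[str, object] = field(default_factory=dict)  # e.g. temperature, color, number

class RequestDispatcher:
    """Routes a request to the detection unit (sensing) or the output unit (projection)."""

    def __init__(self, detect: Callable[[TrackedObject, str], object],
                 project: Callable[[tuple, str], None]):
        self.objects: Dict[str, TrackedObject] = {}
        self.detect = detect           # detection-unit callback: (object, attribute) -> value
        self.project = project         # output-unit callback: (position, text) -> None

    def handle_request(self, identifier: str, kind: str, attribute: str = "", text: str = ""):
        obj = self.objects[identifier]
        if kind == "sensing":
            # sensing request: sense the requested attribute and return it as the response
            return {"identifier": identifier, attribute: self.detect(obj, attribute)}
        if kind == "information_output":
            # information output request: re-sense the position, then project at that place
            obj.position = self.detect(obj, "position")
            self.project(obj.position, text)
            return {"identifier": identifier, "status": "projected"}
        raise ValueError(f"unknown request kind: {kind}")
```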
- each step in the processing executed by each device in this specification does not necessarily have to be processed chronologically in the order described in the sequence diagrams or flowcharts.
- each step in the processing executed by each device may be processed in an order different from the order described in the flowcharts, or may be processed in parallel.
- (1) An information processing apparatus including: a detection unit that senses an object present in a sensing possible area; and a processing unit that, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causes the detection unit to sense the object corresponding to the identifier, and transmits a response to the request to the external device on the basis of a result of the sensing.
- (2) The information processing apparatus according to (1), in which the request is at least a sensing request that requests sensing of an object or an information output request that requests information output for an object, and the processing unit causes the detection unit to sense the object when the request is a sensing request, and causes an output unit to output information regarding the object when the request is an information output request.
- (3) The information processing apparatus according to (2), in which the object is a physical object having no network connection function, and the processing unit performs a process of associating the identifier with the object sensed in the sensing possible area.
- (4) The information processing apparatus according to (3), in which the identifier is an identifier for virtually accessing the object.
- (5) The information processing apparatus according to (4), in which the identifier includes an address for virtually accessing the object.
- (6) The information processing apparatus according to (4) or (5), in which the identifier is information indicating a type of object, and is a type identifier for virtually accessing the object belonging to the type indicated by the identifier.
- (7) The information processing apparatus according to (6), in which the type identifier is associated with an action identifier indicating information output processing corresponding to the type of object, and the processing unit, when the request is an information output request accompanied by the type identifier and the action identifier, causes the output unit to perform the information output corresponding to the action identifier for the object belonging to the type indicated by the type identifier.
- (8) The information processing apparatus according to any one of (2) to (7), in which, when the information output request is received, the processing unit senses a position of the object corresponding to the identifier specified in the information output request after the information output request is received, and causes the output unit to output information to a place corresponding to the sensed position.
- (9) The information processing apparatus according to any one of (2) to (8), in which, when the request is the information output request, the output unit performs projection output or sound output of characters or images related to the object.
- (10) The information processing apparatus according to any one of (1) to (9), in which the detection unit senses a temperature of the object and transmits a response to the request on the basis of the sensed temperature.
- (11) The information processing apparatus according to any one of (1) to (10), in which the detection unit senses a color of the object and transmits a response to the request on the basis of the sensed color.
- (12) The information processing apparatus according to any one of (1) to (11), in which the detection unit senses a weight of the object and transmits a response to the request on the basis of the sensed weight.
- (13) The information processing apparatus according to any one of (1) to (12), in which the detection unit senses the number of the objects and transmits a response to the request on the basis of the sensed number.
- (14) The information processing apparatus according to any one of (1) to (13), in which the detection unit senses a size of the object as a state of the object and transmits a response to the request on the basis of the sensed size.
- (15) The information processing apparatus according to any one of (1) to (14), further including an output unit that outputs information on the basis of a result of processing by the processing unit.
- (16) An information processing method including, performed by a processor: sensing an object present in a sensing possible area; and, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, sensing the object corresponding to the identifier and transmitting a response to the request to the external device on the basis of a result of the sensing.
- (17) A computer program that causes a computer to execute: sensing an object present in a sensing possible area; and, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, sensing the object corresponding to the identifier and transmitting a response to the request to the external device on the basis of a result of the sensing.
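The type-identifier and action-identifier mechanism of items (6) and (7) above can be pictured with the small sketch below. The mapping structure and the names used (TYPE_ACTIONS, run_action, the example type "cup" and its actions) are assumptions made for illustration and are not part of the disclosure.

```python
# Illustrative sketch only: a type identifier groups objects of one kind, and an
# action identifier selects the information output performed for that kind of object.
from typing import Callable, Dict, List

# type identifier -> {action identifier -> text-generating output processing} (assumed examples)
TYPE_ACTIONS: Dict[str, Dict[str, Callable[[dict], str]]] = {
    "cup": {
        "show_temperature": lambda obj: f"{obj.get('temperature', '?')} deg C",
        "label": lambda obj: str(obj.get("name", "object")),
    },
}

def run_action(objects: List[dict], type_id: str, action_id: str,
               project: Callable[[dict, str], None]) -> None:
    """For every sensed object of the given type, perform the output tied to the action identifier."""
    action = TYPE_ACTIONS[type_id][action_id]
    for obj in objects:
        if obj.get("type") == type_id:
            project(obj, action(obj))   # the output unit projects the generated text at the object

# Example usage (assumed): run_action(sensed_objects, "cup", "show_temperature", projector.render)
```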
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Embodiment of the present disclosure
1.1. System configuration example
1.2. Functional configuration example
1.3. Operation example
2. Hardware configuration example
3. Conclusion
[1.1. System configuration example]
First, a configuration example of an information processing system according to an embodiment of the present disclosure will be described.
FIG. 5 is an explanatory diagram illustrating a functional configuration example of the information processing system 100 according to the embodiment of the present disclosure. The functional configuration of the information processing system 100 according to the embodiment of the present disclosure will be described below with reference to FIG. 5.
http://11.22.33.2/index.html?recognitionID=A&query=number
http://11.22.33.2/index.html?recognitionID=A&objectIndex=1&query=color
http://11.22.33.2/index.html?recognitionID=B&objectIndex=2&query=size&unit=mm
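For illustration, a minimal sketch of how an external device might issue such a query over HTTP is shown below. The host address and the parameter names (recognitionID, objectIndex, query, unit) are taken from the example URLs above, while the JSON response structure is an assumption made for this sketch and is not defined in the disclosure.

```python
# Assumed client sketch: query the information processing system 100 for attributes
# of a recognized object using the query parameters shown in the example URLs above.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "http://11.22.33.2/index.html"  # host taken from the example URLs above

def query_object(recognition_id, query, object_index=None, unit=None):
    """Send a sensing query such as 'number', 'color', or 'size' and return the parsed response."""
    params = {"recognitionID": recognition_id, "query": query}
    if object_index is not None:
        params["objectIndex"] = object_index
    if unit is not None:
        params["unit"] = unit
    with urlopen(f"{BASE_URL}?{urlencode(params)}") as resp:
        # the JSON body and its field names are assumptions for illustration only
        return json.loads(resp.read().decode("utf-8"))

# Calls mirroring the example URLs above (not executed here):
# query_object("A", "number")
# query_object("A", "color", object_index=1)
# query_object("B", "size", object_index=2, unit="mm")
```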
Next, the hardware configuration of the information processing system 100 according to an embodiment of the present disclosure will be described with reference to FIG. 44. FIG. 44 is a block diagram illustrating a hardware configuration example of the information processing system 100 according to the embodiment of the present disclosure.
As described above, according to the embodiment of the present disclosure, the information processing system 100 that can individually identify an object having no communication function and can sense the state of the object is provided. The information processing system 100 according to the embodiment of the present disclosure can identify an object and sense it in response to a request from an external device, and can identify an object and project information in response to a request from an external device.
(1)
An information processing apparatus including:
a detection unit that senses an object present in a sensing possible area; and
a processing unit that, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causes the detection unit to sense the object corresponding to the identifier, and transmits a response to the request to the external device on the basis of a result of the sensing.
(2)
The information processing apparatus according to (1), in which
the request is at least a sensing request that requests sensing of an object or an information output request that requests information output for an object, and
the processing unit causes the detection unit to sense the object when the request is a sensing request, and causes an output unit to output information regarding the object when the request is an information output request.
(3)
The information processing apparatus according to (2), in which
the object is a physical object having no network connection function, and
the processing unit performs a process of associating the identifier with the object sensed in the sensing possible area.
(4)
The information processing apparatus according to (3), in which the identifier is an identifier for virtually accessing the object.
(5)
The information processing apparatus according to (4), in which the identifier includes an address for virtually accessing the object.
(6)
The information processing apparatus according to (4) or (5), in which the identifier is information indicating a type of object, and is a type identifier for virtually accessing the object belonging to the type indicated by the identifier.
(7)
The information processing apparatus according to (6), in which
the type identifier is associated with an action identifier indicating information output processing corresponding to the type of object, and
the processing unit, when the request is an information output request accompanied by the type identifier and the action identifier, causes the output unit to perform the information output corresponding to the action identifier for the object belonging to the type indicated by the type identifier.
(8)
The information processing apparatus according to any one of (2) to (7), in which, when the information output request is received, the processing unit senses a position of the object corresponding to the identifier specified in the information output request after the information output request is received, and causes the output unit to output information to a place corresponding to the sensed position.
(9)
The information processing apparatus according to any one of (2) to (8), in which, when the request is the information output request, the output unit performs projection output or sound output of characters or images related to the object.
(10)
The information processing apparatus according to any one of (1) to (9), in which the detection unit senses a temperature of the object and transmits a response to the request on the basis of the sensed temperature.
(11)
The information processing apparatus according to any one of (1) to (10), in which the detection unit senses a color of the object and transmits a response to the request on the basis of the sensed color.
(12)
The information processing apparatus according to any one of (1) to (11), in which the detection unit senses a weight of the object and transmits a response to the request on the basis of the sensed weight.
(13)
The information processing apparatus according to any one of (1) to (12), in which the detection unit senses the number of the objects and transmits a response to the request on the basis of the sensed number.
(14)
The information processing apparatus according to any one of (1) to (13), in which the detection unit senses a size of the object as a state of the object and transmits a response to the request on the basis of the sensed size.
(15)
The information processing apparatus according to any one of (1) to (14), further including an output unit that outputs information on the basis of a result of processing by the processing unit.
(16)
An information processing method including, performed by a processor:
sensing an object present in a sensing possible area; and
in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causing the object corresponding to the identifier to be sensed, and transmitting a response to the request to the external device on the basis of a result of the sensing.
(17)
A computer program that causes a computer to execute:
sensing an object present in a sensing possible area; and
in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causing the object corresponding to the identifier to be sensed, and transmitting a response to the request to the external device on the basis of a result of the sensing.
200 Object information control unit
Claims (17)
- An information processing apparatus including: a detection unit that senses an object present in a sensing possible area; and a processing unit that, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causes the detection unit to sense the object corresponding to the identifier, and transmits a response to the request to the external device on the basis of a result of the sensing.
- The information processing apparatus according to claim 1, wherein the request is at least a sensing request that requests sensing of an object or an information output request that requests information output for an object, and the processing unit causes the detection unit to sense the object when the request is a sensing request, and causes an output unit to output information regarding the object when the request is an information output request.
- The information processing apparatus according to claim 2, wherein the object is a physical object having no network connection function, and the processing unit performs a process of associating the identifier with the object sensed in the sensing possible area.
- The information processing apparatus according to claim 3, wherein the identifier is an identifier for virtually accessing the object.
- The information processing apparatus according to claim 4, wherein the identifier includes an address for virtually accessing the object.
- The information processing apparatus according to claim 4, wherein the identifier is information indicating a type of object, and is a type identifier for virtually accessing the object belonging to the type indicated by the identifier.
- The information processing apparatus according to claim 6, wherein the type identifier is associated with an action identifier indicating information output processing corresponding to the type of object, and the processing unit, when the request is an information output request accompanied by the type identifier and the action identifier, causes the output unit to perform the information output corresponding to the action identifier for the object belonging to the type indicated by the type identifier.
- The information processing apparatus according to claim 2, wherein, when the information output request is received, the processing unit senses a position of the object corresponding to the identifier specified in the information output request after the information output request is received, and causes the output unit to output information to a place corresponding to the sensed position.
- The information processing apparatus according to claim 2, wherein, when the request is the information output request, the output unit performs projection output or sound output of characters or images related to the object.
- The information processing apparatus according to claim 1, wherein the detection unit senses a temperature of the object and transmits a response to the request on the basis of the sensed temperature.
- The information processing apparatus according to claim 1, wherein the detection unit senses a color of the object and transmits a response to the request on the basis of the sensed color.
- The information processing apparatus according to claim 1, wherein the detection unit senses a weight of the object and transmits a response to the request on the basis of the sensed weight.
- The information processing apparatus according to claim 1, wherein the detection unit senses the number of the objects and transmits a response to the request on the basis of the sensed number.
- The information processing apparatus according to claim 1, wherein the detection unit senses a size of the object as a state of the object and transmits a response to the request on the basis of the sensed size.
- The information processing apparatus according to claim 1, further including an output unit that outputs information on the basis of a result of processing by the processing unit.
- An information processing method including, performed by a processor: sensing an object present in a sensing possible area; and, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causing the object corresponding to the identifier to be sensed, and transmitting a response to the request to the external device on the basis of a result of the sensing.
- A computer program that causes a computer to execute: sensing an object present in a sensing possible area; and, in response to receiving a request in which an identifier associated with the object transmitted from an external device via a network is specified, causing the object corresponding to the identifier to be sensed, and transmitting a response to the request to the external device on the basis of a result of the sensing.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112017006606.9T DE112017006606T8 (de) | 2016-12-27 | 2017-12-07 | Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren und computerprogramm |
JP2018558971A JP7006619B2 (ja) | 2016-12-27 | 2017-12-07 | 情報処理装置、情報処理方法及びコンピュータプログラム |
KR1020197017110A KR20190099207A (ko) | 2016-12-27 | 2017-12-07 | 정보 처리 장치, 정보 처리 방법 및 컴퓨터 프로그램 |
US16/470,727 US11086393B2 (en) | 2016-12-27 | 2017-12-07 | Information processing device, information processing method, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016253842 | 2016-12-27 | ||
JP2016-253842 | 2016-12-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018123497A1 true WO2018123497A1 (ja) | 2018-07-05 |
Family
ID=62708086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/043930 WO2018123497A1 (ja) | 2016-12-27 | 2017-12-07 | 情報処理装置、情報処理方法及びコンピュータプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US11086393B2 (ja) |
JP (1) | JP7006619B2 (ja) |
KR (1) | KR20190099207A (ja) |
DE (1) | DE112017006606T8 (ja) |
WO (1) | WO2018123497A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102539579B1 (ko) * | 2018-12-18 | 2023-06-05 | 삼성전자주식회사 | 정보의 표시 영역을 적응적으로 변경하기 위한 전자 장치 및 그의 동작 방법 |
WO2023240166A1 (en) * | 2022-06-08 | 2023-12-14 | Voyetra Turtle Beach, Inc. | Input device, indicator device, and parameter adjustment method thereof |
WO2023240167A1 (en) * | 2022-06-08 | 2023-12-14 | Voyetra Turtle Beach, Inc. | Input device and indicator device |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4423302B2 (ja) | 2002-11-25 | 2010-03-03 | 日本電信電話株式会社 | 実世界オブジェクト認識方法および実世界オブジェクト認識システム |
WO2012040827A2 (en) * | 2010-10-01 | 2012-04-05 | Smart Technologies Ulc | Interactive input system having a 3d input space |
JP5844288B2 (ja) | 2011-02-01 | 2016-01-13 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 機能拡張装置、機能拡張方法、機能拡張プログラム、及び集積回路 |
KR101620777B1 (ko) * | 2012-03-26 | 2016-05-12 | 애플 인크. | 증강된 가상 터치패드 및 터치스크린 |
US20140108979A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Controlling Virtual Objects |
JP6070512B2 (ja) | 2013-11-05 | 2017-02-01 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
WO2015120913A1 (en) * | 2014-02-17 | 2015-08-20 | Metaio Gmbh | Method and device for detecting a touch between a first object and a second object |
WO2015139002A1 (en) * | 2014-03-14 | 2015-09-17 | Sony Computer Entertainment Inc. | Gaming device with volumetric sensing |
JP2015177383A (ja) | 2014-03-17 | 2015-10-05 | カシオ計算機株式会社 | 投影装置及びプログラム |
US10429923B1 (en) * | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
US11127306B2 (en) * | 2017-08-21 | 2021-09-21 | Precisionos Technology Inc. | Medical virtual reality surgical system |
US10540941B2 (en) * | 2018-01-30 | 2020-01-21 | Magic Leap, Inc. | Eclipse cursor for mixed reality displays |
US10523921B2 (en) * | 2018-04-06 | 2019-12-31 | Zspace, Inc. | Replacing 2D images with 3D images |
2017
- 2017-12-07 US US16/470,727 patent/US11086393B2/en active Active
- 2017-12-07 WO PCT/JP2017/043930 patent/WO2018123497A1/ja active Application Filing
- 2017-12-07 KR KR1020197017110A patent/KR20190099207A/ko active IP Right Grant
- 2017-12-07 JP JP2018558971A patent/JP7006619B2/ja active Active
- 2017-12-07 DE DE112017006606.9T patent/DE112017006606T8/de active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002023064A (ja) * | 2000-07-11 | 2002-01-23 | Goto Optical Mfg Co | 天体観測施設の制御システム |
WO2015098190A1 (ja) * | 2013-12-27 | 2015-07-02 | ソニー株式会社 | 制御装置、制御方法及びコンピュータプログラム |
JP2016194762A (ja) * | 2015-03-31 | 2016-11-17 | ソニー株式会社 | 情報処理システム、情報処理方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
DE112017006606T5 (de) | 2019-09-12 |
US20200089312A1 (en) | 2020-03-19 |
JPWO2018123497A1 (ja) | 2019-10-31 |
JP7006619B2 (ja) | 2022-01-24 |
US11086393B2 (en) | 2021-08-10 |
KR20190099207A (ko) | 2019-08-26 |
DE112017006606T8 (de) | 2019-10-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10019074B2 (en) | Touchless input | |
CN104423881B (zh) | 信息处理装置及其控制方法 | |
JP5560794B2 (ja) | 制御装置、制御方法およびプログラム | |
TWI531929B (zh) | 基於影像來識別觸控表面的目標接觸區域之技術 | |
CA2790491C (en) | Building footprint extraction apparatus, method and computer program product | |
JP7006619B2 (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
WO2017047182A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
TW201346640A (zh) | 影像處理裝置及電腦程式產品 | |
WO2019174398A1 (zh) | 一种利用手势模拟鼠标操作的方法、装置及终端 | |
US20130057515A1 (en) | Depth camera as a touch sensor | |
WO2012054060A1 (en) | Evaluating an input relative to a display | |
CN113168221B (zh) | 信息处理设备、信息处理方法和程序 | |
JP2013164697A (ja) | 画像処理装置、画像処理方法、プログラム及び画像処理システム | |
JP6911870B2 (ja) | 表示制御装置、表示制御方法及びコンピュータプログラム | |
CN103713755A (zh) | 一种触摸识别装置及识别方法 | |
US20180373392A1 (en) | Information processing device and information processing method | |
KR101281461B1 (ko) | 영상분석을 이용한 멀티 터치 입력 방법 및 시스템 | |
CN115081643B (zh) | 对抗样本生成方法、相关装置及存储介质 | |
WO2018123485A1 (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
CN106339089B (zh) | 一种人机交互动作识别系统及方法 | |
JP6384376B2 (ja) | 情報処理装置、プログラム、携帯端末、及び情報処理システム | |
WO2018123475A1 (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
US10289203B1 (en) | Detection of an input object on or near a surface | |
KR20190049349A (ko) | 프로젝션 영상에서 사용자의 터치를 인식하는 방법 및 이러한 방법을 수행하는 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17886332; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2018558971; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 20197017110; Country of ref document: KR; Kind code of ref document: A |
122 | Ep: pct application non-entry in european phase | Ref document number: 17886332; Country of ref document: EP; Kind code of ref document: A1 |