US20180060661A1 - Information processing apparatus, information processing method, and program - Google Patents
- Publication number
- US20180060661A1 (application US15/559,890)
- Authority
- US
- United States
- Prior art keywords
- information
- terminal
- appearance
- image
- information processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G06K9/00671—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/14—Session management
- H04L67/141—Setup of application sessions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G06K2209/03—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/02—Recognising information on displays, dials, clocks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Literature 1 describes technology in which a program itself functions as an authentication key.
- Patent Literature 1 JP 2002-344444A
- an information processing apparatus including: an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of an own device; and a sending unit configured to send the appearance information to communicate with another device that has imaged the appearance of the own device.
- the information processing apparatus may include a display screen.
- the appearance information acquisition unit may include an image information generation unit configured to generate image information indicating a feature of a screen displayed on the display screen as the appearance information.
- the sending unit may send, together with the image information, communication information for communicating with the other device.
- a plurality of applications may be displayed on the display screen, the image information generation unit may generate the image information for each of the plurality of applications, and the sending unit may send the image information generated for each application.
- communication may be performed with the other device for which it has been determined that a captured image of the display screen and the image information match.
- the information processing apparatus may include an identification information acquisition unit configured to acquire identification information for identifying the other device.
- the sending unit may send the identification information together with the image information.
- the identification information may include at least a portion of an IP address of the other device.
- the identification information acquisition unit may acquire the identification information sent by beacon, sound, or light.
- the information processing apparatus may include a position information acquisition unit configured to acquire position information.
- the sending unit may send the position information together with the image information.
- an information processing method including: acquiring appearance information indicating a feature of appearance of an own device; and sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- FIG. 1A is a schematic view of an outline of a system according to an embodiment of the present disclosure.
- FIG. 1B is a schematic view of an outline of a system according to an embodiment of the present disclosure.
- FIG. 2 is a schematic view of the configuration of the system according to the embodiment.
- FIG. 3 is a flowchart illustrating the processes of generating and recording image information by a terminal to be recognized.
- FIG. 4 is a flowchart for explaining the process of image recognition by a recognizing terminal.
- FIG. 5 is a flowchart for explaining the process of a dictionary data storage function of a server.
- FIG. 6 is a schematic view of examples of communication information.
- FIG. 7 is a schematic view of a system in which a tabletop interactive system and a terminal such as a smartphone are linked.
- FIG. 8 is an explanatory view illustrating a functional configuration example of an information processing system in FIG. 7 .
- FIG. 9 is a schematic view of an example of linking a stand-alone display to a wearable device.
- FIG. 10 is a schematic view illustrating a case in which applications on a large screen display installed on a wall are recognized.
- FIG. 11 is a schematic view of objects such as home electric appliances that are connected to a network at home.
- FIG. 1A and FIG. 1B are schematic views of an outline of a system according to an embodiment of the present disclosure.
- When linking a plurality of terminals, this system establishes communication between a terminal to be recognized and a recognizing terminal, the two being linked by one terminal recognizing, via a camera image, an application running on the other terminal.
- a mobile device that serves as a recognizing terminal 200 images an application screen 110 of a tablet terminal that serves as a terminal 100 to be recognized.
- the terminal 100 to be recognized sequentially records the application screen 110 as dictionary data on a server 300 .
- the recognizing terminal 200 acquires the dictionary data from the server 300 and compares the dictionary data with an image obtained through the imaging. Then, if the result of the comparison is such that the dictionary data and the image match, the terminal 100 and the terminal 200 start to communicate.
- FIG. 2 is a schematic view of the configuration of a system 1000 according to the embodiment.
- the system 1000 includes the terminal 100 to be recognized, the recognizing terminal 200 , and the server 300 .
- the terminal 100 to be recognized displays an application or the like on a display screen.
- the terminal 100 to be recognized also includes an image information generation unit 102 , which generates image information (dictionary data) representing the application screen on the terminal 100 .
- the terminal 100 records the image information on the server 300 by sending the image information from a communication unit 104 to a communication unit 304 of the server 300 .
- the server 300 has a storage unit 302 for image information. Note that the constituent elements of the terminals 100 and 200 , and the server 300 , illustrated in FIG. 2 may be formed by hardware (circuits), or by a central processing unit such as a CPU and a program (software) that makes the central processing unit function.
- the program can be stored on a recording medium such as memory provided inside a device, or memory connected from the outside.
- the image information generation unit 102 of the terminal 100 generates image information comprising features for image recognition and a snapshot of an application being displayed by the terminal 100 , and records the data on the storage unit 302 of the server 300 . If a plurality of application screens are being displayed on the terminal 100 , image information is sent for each application screen.
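The per-screen dictionary-data generation described above can be sketched as follows. This is a minimal illustration only: the 4×4 block-average feature, the record layout, and all names are hypothetical stand-ins for whatever features the image information generation unit 102 actually extracts.

```python
import hashlib

def generate_image_info(app_id, snapshot_pixels):
    """Build image information (dictionary data) for one application screen.

    snapshot_pixels: 2-D list of grayscale values. The "features for image
    recognition" are modeled here as a coarse 4x4 grid of block averages;
    the snapshot itself is kept alongside them.
    """
    h, w = len(snapshot_pixels), len(snapshot_pixels[0])
    features = []
    for by in range(4):
        for bx in range(4):
            block = [snapshot_pixels[y][x]
                     for y in range(by * h // 4, (by + 1) * h // 4)
                     for x in range(bx * w // 4, (bx + 1) * w // 4)]
            features.append(sum(block) / len(block))
    return {
        "app_id": app_id,           # one record per application screen
        "snapshot": snapshot_pixels,
        "features": features,
        "digest": hashlib.sha256(repr(features).encode()).hexdigest(),
    }

# a synthetic 32x32 "application screen"
screen = [[(x + y) % 256 for x in range(32)] for y in range(32)]
info = generate_image_info("app-1", screen)
```

In this sketch, each such record would then be sent to the server 300 and recorded per application screen.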
- the terminal 100 simultaneously records, on the server 300 , both communication information for the terminal 100 and the terminal 200 to communicate with each other, and communication information for the terminal 200 to communicate with an application being displayed on the terminal 100 .
- the image information can also be generated on the server 300 using snapshot data, instead of being generated on the terminal 100 to be recognized.
- the application screen may be a still image or a moving image.
- the recognizing terminal 200 has a camera 202 , an image recognition unit 204 , and a communication unit 206 .
- the camera 202 images the application screen displayed on the terminal 100
- the image recognition unit 204 recognizes an image obtained through the imaging.
- the communication unit 206 communicates with the communication unit 304 of the server 300 , and acquires the image information stored on the storage unit 302 .
- the image recognition unit 204 recognizes the application being displayed on the terminal 100 , by comparing the image information with the image data input from the camera 202 .
- the storage unit 302 of the server 300 stores image information recorded from the terminal 100 to be recognized, information for the terminal 100 and the terminal 200 to communicate with each other, and information for the terminal 200 to communicate with the application of the terminal 100 , and provides the stored data in response to a request from the image recognition unit 204 of the recognizing terminal 200 .
- a dictionary data storage function of the server 300 may be configured on the terminal 100 having a dictionary data generating function, or on the terminal 200 having an image recognizing function.
- the recognizing terminal 200 and terminal 100 to be recognized can be linked by the recognizing terminal 200 recognizing, via a camera image, an application running on the terminal 100 to be recognized, when linking a plurality of terminals.
- the terminal 100 that is running the application screen to be recognized sends the image information such as the features for image recognition and the snapshot of the application screen to the server 300 in real time in accordance with a change of the screen.
- An unknown application or an application with a dynamically changing state can then be identified by image recognition, without generating and recording dictionary data beforehand, by the recognizing terminal 200 comparing this image information with the image from the camera 202 .
- the terminal 100 to be recognized, the recognizing terminal 200 , and the server 300 are connected beforehand by a network or P2P so as to be able to communicate with one another.
- the method of connection is not particularly limited, however.
- step S 10 when an application on the terminal 100 to be recognized is launched, it is determined in step S 10 whether the display screen has changed. If the display screen has changed, the process proceeds on to step S 12 , where image information regarding the features for image recognition and the snapshot of the display screen is generated. On the other hand, if there is no change in the display screen, the process waits for a certain period of time in step S 19 , and then returns to step S 10 , where it is again determined whether the display screen has changed.
- step S 12 the process proceeds on to step S 14 , where the server 300 records the snapshot and the features for image recognition generated in step S 12 .
- Communication information is also recorded at this time.
- the communication information is information for the terminal 200 to communicate with the terminal 100 , and information for the terminal 200 to communicate with the application of the terminal 100 .
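As a sketch, the two kinds of communication information might be recorded together like this. The field names, addresses, and ports are hypothetical, chosen only to illustrate the terminal-level versus application-level split.

```python
# Hypothetical structure; field names and values are illustrative only.
communication_info = {
    # information for the terminal 200 to communicate with the terminal 100
    "terminal": {"interface": "WiFi", "ip": "192.0.2.10"},
    # information for the terminal 200 to communicate with an application
    # being displayed on the terminal 100: one entry per application screen,
    # keyed consistently with the corresponding image information
    "applications": {
        "app-1": {"port": 8080},
        "app-2": {"port": 8081},
    },
}
```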
- step S 16 it is determined whether the function of the application has ended. If the function of the application has ended, the server 300 is notified of this, and the process proceeds on to step S 18 . In step S 18 , the data recorded on the server 300 is erased. After step S 18 , the process ends.
- If the function of the application has not ended in step S 16 , the process waits a certain period of time in step S 19 and then returns to step S 10 , and the processes thereafter are performed again.
- step S 20 when an application on the recognizing terminal 200 is launched, it is determined in step S 20 whether it is necessary to recognize the terminal 100 to be recognized, to communicate with the terminal 100 to be recognized. If it is necessary to recognize the terminal 100 to be recognized, the process proceeds on to step S 22 , where the image information regarding the image features and the snapshot are acquired from the server 300 . On the other hand, if it is not necessary to recognize the terminal 100 to be recognized, the process waits a certain period of time in step S 32 and then returns to step S 20 , and the processes thereafter are performed again.
- step S 22 the process proceeds on to step S 24 , where the image input from the camera 202 is compared with the image information acquired in step S 22 . If the result of the comparison is such that the image input from the camera 202 matches the image information acquired in step S 22 , the process proceeds on to step S 26 , and the communication information stored on the server 300 is acquired. On the other hand, if the image input from the camera 202 does not match the image information acquired in step S 22 , the process waits a certain period of time in step S 32 and then returns to step S 20 , and the processes thereafter are performed again.
- the captured image is analyzed, the features are extracted, and these features are compared with the features in the image information, by a method similar to a well-known face detection algorithm or the like, for example. Then it is determined whether the images match, on the basis of the degree of correlation by a template matching process or the like.
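The match determination of step S 24 could be sketched as a normalized cross-correlation over feature vectors with a match threshold. The feature-vector representation and the 0.9 threshold are assumptions, not values from the publication.

```python
def correlation(a, b):
    """Degree of correlation between two equal-length feature vectors
    (normalized cross-correlation, in [-1, 1])."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    den_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    if den_a == 0 or den_b == 0:
        return 0.0
    return num / (den_a * den_b)

def images_match(camera_features, dictionary_features, threshold=0.9):
    """Decide the match of step S 24 by thresholding the correlation."""
    return correlation(camera_features, dictionary_features) >= threshold
```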
- step S 28 the terminal 200 communicates with the terminal 100 on the basis of the communication information. As a result, the terminal 200 is able to communicate with the application displayed on the terminal 100 .
- step S 30 it is determined whether the function of the application has ended. If the function of the application has ended, the process ends.
- On the other hand, if the function of the application has not ended, the process waits a certain period of time in step S 32 and then returns to step S 20 , and the processes thereafter are performed again.
- In step S 40 , the process waits for a communication request from an application on the terminal 100 to be recognized or an application on the recognizing terminal 200 . If there is a request to record image information from the terminal 100 to be recognized in the next step, step S 42 , the process proceeds on to step S 44 and the image information is recorded.
- If there is a request in step S 46 to record communication information from the terminal 100 , the process proceeds on to step S 48 . In step S 48 , the communication information is recorded.
- If there is a request in step S 50 to acquire image information from the terminal 200 , the process proceeds on to step S 52 . In step S 52 , the image information is provided to the terminal 200 .
- In step S 54 , it is determined whether there is a request from the terminal 200 to acquire communication information. If there is such a request, the process proceeds on to step S 56 .
- step S 56 the communication information is provided to the terminal 200 .
- step S 58 it is determined whether the function of the application has ended. If the function of the application has ended, the process ends. On the other hand, if the function of the application has not ended, the process returns to step S 40 , and the processes thereafter are performed again.
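The request-dispatch loop of steps S 40 through S 58 can be sketched as a small in-memory store. The request encoding below is a hypothetical stand-in for whatever protocol the server 300 actually speaks.

```python
class DictionaryDataStore:
    """Sketch of the dictionary data storage function of the server 300."""

    def __init__(self):
        self.image_info = {}  # app_id -> image information (S44)
        self.comm_info = {}   # app_id -> communication information (S48)

    def handle(self, request):
        kind, payload = request["kind"], request.get("payload")
        if kind == "record_image_info":      # S42 -> S44
            self.image_info[payload["app_id"]] = payload
        elif kind == "record_comm_info":     # S46 -> S48
            self.comm_info[payload["app_id"]] = payload
        elif kind == "get_image_info":       # S50 -> S52
            return list(self.image_info.values())
        elif kind == "get_comm_info":        # S54 -> S56
            return self.comm_info.get(payload["app_id"])
        elif kind == "erase":                # recorded data erased (S18)
            self.image_info.pop(payload["app_id"], None)
            self.comm_info.pop(payload["app_id"], None)
```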
- FIG. 6 is a schematic view of examples of communication information for a device A, a device B, and a device C, respectively.
- the communication information is defined for each of a network interface, an internet layer, a transport layer, an application layer, and a communication format.
- Information relating to the network interface includes WiFi, Bluetooth (registered trademark), Ethernet (registered trademark), and WiFi Direct, and the like.
- Information relating to the internet layer includes an IP address and a port number (IPv4 or IPv6).
- Information relating to the transport layer is TCP or UDP information.
- Information relating to the application layer includes HTTP, HTTPS, WebSocket (ws) and secure WebSocket (wss) and the like.
- Information relating to the communication format includes JSON-RPC, SOAP, and REST, and the like.
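Put together, one record per device covering the five layers of FIG. 6 might look like the following. Every concrete value here is an invented example, chosen only from the options the text lists.

```python
# Illustrative only; addresses and layer choices are hypothetical examples.
communication_info_examples = {
    "device_A": {
        "network_interface": "WiFi",
        "internet_layer": {"ip": "192.0.2.10", "port": 8080},
        "transport_layer": "TCP",
        "application_layer": "wss",
        "communication_format": "JSON-RPC",
    },
    "device_B": {
        "network_interface": "Ethernet",
        "internet_layer": {"ip": "198.51.100.5", "port": 80},
        "transport_layer": "TCP",
        "application_layer": "HTTP",
        "communication_format": "REST",
    },
    "device_C": {
        "network_interface": "WiFi Direct",
        "internet_layer": {"ip": "2001:db8::1", "port": 9000},
        "transport_layer": "UDP",
        "application_layer": "HTTP",
        "communication_format": "SOAP",
    },
}
```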
- the terminal 100 to be recognized and the recognizing terminal 200 are able to communicate with each other by sharing communication information via the server 300 .
- the terminal 200 recognizes the terminal 100 by the IP address included in the communication information.
- the terminal 200 also recognizes the application of the terminal 100 by the port number included in the communication information.
- the communication information is linked to the image information and sent from the terminal 100 to the terminal 200 for each application screen, and is stored, together with the image information, on the storage unit 302 .
- the image information is sent, together with the linked communication information, to the terminal 200 in response to a request from the terminal 200 to acquire the image information. Therefore, even if there are a plurality of application screens on the terminal 100 , the terminal 200 is able to communicate with the application imaged by the camera 202 , among the plurality of applications, by acquiring the port number corresponding to the image information.
- the application screen may be a moving image. If the application screen is a moving image, a mechanism for absorbing a time lag in the communication can be introduced.
- a frame number may be sent from the terminal 100 to be recognized to the server 300 before the image information. There is no time lag in the transmission of the frame number. Time information is linked to the frame number, so the server 300 is able to recognize in advance that image information will be received.
- the server 300 receives the image information from the terminal 100 after the frame number. Then, when the server 300 receives the image information from the terminal 100 , the image information for the frame corresponding to the requested time is extracted and sent to the terminal 200 in response to the request already received from the terminal 200 . As a result, the terminal 200 is able to determine whether the image from the camera at the requested time matches the image information sent from the server 300 .
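This frame-number mechanism can be sketched as an index that is announced ahead of the heavier, slower image information, so that a request for time t can be matched to the nearest announced frame once its image information arrives. All names below are hypothetical.

```python
import bisect

class FrameIndex:
    """Sketch of the server-side time-lag absorption for moving images."""

    def __init__(self):
        self.times = []       # sorted announcement times
        self.frames = {}      # announcement time -> frame number
        self.image_info = {}  # frame number -> image information

    def announce(self, frame_no, t):
        # The lightweight frame number (with its linked time) arrives first,
        # so the server knows image information for this frame will follow.
        bisect.insort(self.times, t)
        self.frames[t] = frame_no

    def store(self, frame_no, info):
        # The heavier image information arrives after the frame number.
        self.image_info[frame_no] = info

    def lookup(self, t):
        """Image information for the frame nearest the requested time t."""
        i = bisect.bisect_left(self.times, t)
        candidates = self.times[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda ft: abs(ft - t))
        return self.image_info.get(self.frames[best])
```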
- a moving image captured by the camera 202 may also be stored (cached) for just a certain period of time in the recognizing terminal 200 .
- the server 300 that has received a request to acquire the image information and communication information from the recognizing terminal 200 narrows down the information from among the large amount of image information and communication information recorded, and sends the information to the terminal 200 that sent out the request to acquire the information.
- a search on the server 300 side can be made easier by using supplementary information for narrowing down the information.
- Position information is an example of such supplementary information.
- the terminal 100 to be recognized sends, together with the dictionary data, position information for the terminal 100 acquired by a position information acquisition unit (GPS) 106 , to the server 300 .
- the server 300 records the position information together with the image information and the communication information.
- the recognizing terminal 200 When the recognizing terminal 200 requests image information from the server 300 , the recognizing terminal 200 sends the position information for the terminal 200 acquired by a position information acquisition unit (GPS) 208 to the server 300 .
- An information extraction unit 306 of the server 300 narrows down the image information and the communication information on the basis of the position information acquired from the terminal 200 .
- the server 300 extracts image information and communication information for a terminal 100 positioned within a 10-meter radius of the position of the terminal 200 on the basis of the position information acquired from the terminal 200 , and sends this image information and communication information to the terminal 200 .
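The 10-meter narrowing performed by the information extraction unit 306 could be sketched with a haversine distance filter over the recorded GPS positions; the record layout with "lat" and "lon" keys is an assumption.

```python
from math import asin, cos, radians, sin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * r * asin(sqrt(a))

def narrow_down(records, lat, lon, radius_m=10.0):
    """Keep only records whose terminal 100 lies within radius_m of the
    requesting terminal 200 (the role of the information extraction
    unit 306); each record is assumed to carry "lat" and "lon" keys."""
    return [rec for rec in records
            if distance_m(rec["lat"], rec["lon"], lat, lon) <= radius_m]
```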
- a comparison between the image information and the imaging information can be easily performed on the terminal 200 side, which enables the processing load to be significantly reduced.
- an identification information output unit 209 of the recognizing terminal 200 sends identification information toward the terminal 100 to be recognized, using a beacon (e.g., Wi-Fi), sound, light, or the like.
- An identification information acquisition unit 108 of the terminal 100 to be recognized acquires the identification information.
- the terminal 100 sends the identification information, together with the image information and the communication information, to the server 300 , and the server 300 then records this identification information, together with the image information and the communication information.
- the recognizing terminal 200 When the recognizing terminal 200 requests image information from the server 300 , the recognizing terminal 200 sends the identification information to the server 300 .
- the server 300 narrows down the image information on the basis of the identification information acquired from the terminal 200 , and then sends the image information and communication information linked to identification information that matches the identification information sent from the terminal 200 , from among the image information and communication information recorded, to the terminal 200 .
- the terminal 200 is able to extract only the image information for the imaged terminal 100 , from the large amount of image information recorded.
- the IP address, or a portion of the IP address, of the terminal 200 can be used as the identification information.
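Using a portion of the IP address as identification information might look like the following sketch; taking the last two octets is an arbitrary illustrative choice, since the text only says the whole address or a portion of it may be used.

```python
def identification_from_ip(ip, octets=2):
    """Identification information derived from the last `octets` octets
    of an IPv4 address (illustrative choice of portion)."""
    return ".".join(ip.split(".")[-octets:])

def narrow_by_identification(records, ident):
    """Server-side narrowing: keep records whose stored identification
    matches the one sent by the requesting terminal 200."""
    return [rec for rec in records if rec["identification"] == ident]
```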
- the dictionary data may be searched in order from the most recently recorded, on the basis of the order in which the dictionary data was recorded on the server 300 .
- FIG. 7 is a schematic view of a system in which a tabletop interactive system and a terminal such as a smartphone are linked.
- this system 1100 a includes an input unit 1110 a and an output unit 1130 a .
- the information processing system 1100 a according to an embodiment of the present disclosure illustrated in FIG. 7 displays information on a top surface of a table 1140 a , and allows a user using the information processing system 1100 a to manipulate the information displayed on the table 1140 a .
- the method for displaying the information on the top surface of the table 1140 a is also referred to as a “projection type”.
- the input unit 1110 a is a device that inputs content of an operation by the user using the information processing system 1100 a , and the shape and pattern and the like of an object placed on the table 1140 a .
- the input unit 1110 a is provided in a state suspended from a ceiling, for example, above the table 1140 a . That is, the input unit 1110 a is provided away from the table 1140 a on which the information is to be displayed.
- a camera that images the table 1140 a with a single lens, a stereo camera capable of imaging the table 1140 a with two lenses and recording information in the depth direction, or a microphone for recording sounds spoken by a user using the information processing system 1100 a or ambient sounds of the environment where the information processing system 1100 a is placed, or the like may be used as the input unit 1110 a.
- the information processing system 1100 a is able to detect an object placed on the table 1140 a , by analyzing the image captured by the camera. Also, if a stereo camera is used as the input unit 1110 a , a visible light camera or an infrared camera or the like, for example, can be used as the stereo camera. By using a stereo camera as the input unit 1110 a , the input unit 1110 a can acquire depth information. By acquiring depth information with the input unit 1110 a , the information processing system 1100 a is able to detect a hand or an object placed on the table 1140 a , for example.
- the information processing system 1100 a is able to detect when a hand of the user contacts or is close to the table 1140 a , and detect when the hand leaves the table 1140 a .
- movements in which the user brings an operating body such as a hand into contact with, or close to, an information display surface will also collectively be referred to simply as a “touch”.
- a microphone array for picking up sounds in a specific direction can be used as the microphone. If a microphone array is used as the input unit 1110 a , the information processing system 1100 a may adjust the pickup direction of the microphone array to a suitable direction.
- an operation by the user is detected from an image captured by the input unit 1110 a
- the operation by the user may also be detected by a touch panel that detects the touch of a finger or the like of the user.
- a user operation that can be acquired by the input unit 1110 a can include a stylus operation with respect to an information display surface, or a gesture with respect to a camera or the like, for example.
- the output unit 1130 a is a device that displays information on the table 1140 a and outputs audio, in accordance with information input by the input unit 1110 a , such as the content of an operation by the user using the information processing system 1100 a , the content of information being output by the output unit 1130 a , and the shape and pattern and the like of an object placed on the table 1140 a .
- a projector or a speaker or the like, for example, is used as the output unit 1130 a .
- the output unit 1130 a is provided in a state suspended from a ceiling, for example, above the table 1140 a .
- the output unit 1130 a projects information onto the top surface of the table 1140 a . If the output unit 1130 a is configured by a speaker, the output unit 1130 a outputs audio on the basis of an audio signal. If the output unit 1130 a is configured by a speaker, the number of speakers may be one or a plurality. If the output unit 1130 a is configured by a plurality of speakers, the information processing system 1100 a may limit the speakers from which audio is output, or may adjust the direction in which the audio is output.
- the output unit 1130 a may also include lighting equipment. If the output unit 1130 a includes lighting equipment, the information processing system 1100 a may control the on/off state and the like of the lighting equipment on the basis of information input by the input unit 1110 a.
- the user using the information processing system 1100 a is able to manipulate the information displayed on the table 1140 a by the output unit 1130 a , by placing a finger or the like on the table 1140 a . Also, by placing an object on the table 1140 a and having the input unit 1110 a recognize the object, the user using the information processing system 1100 a is able to execute various operations relating to the recognized object.
- another device may be connected to the information processing system 1100 a .
- lighting equipment for illuminating the table 1140 a may be connected to the information processing system 1100 a .
- the information processing system 1100 a is able to control the lighting state of the lighting equipment in accordance with the state of the information display surface.
- FIG. 8 is an explanatory view illustrating a functional configuration example of an information processing system 1100 in FIG. 7 .
- a functional configuration example of an image processing system according to an embodiment of the present disclosure will be described with reference to FIG. 8 .
- the information processing system 1100 includes an input unit 1110 , a control unit 1120 , and an output unit 1130 .
- the input unit 1110 inputs content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100 , and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7 ) onto which information is output by the output unit 1130 .
- the content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100 includes the content of an operation with respect to a GUI that the information processing system 1100 outputs onto the information display surface.
- Information input by the input unit 1110 such as the content of an operation with respect to the information processing system 1100 , and the shape and pattern and the like of the object, is sent to the control unit 1120 .
- the input unit 1110 may be configured by a camera with a single lens, a stereo camera with two lenses, or a microphone, or the like.
- the control unit 1120 controls the various units of the information processing system 1100 .
- the control unit 1120 generates information to be output from the output unit 1130 , using information input by the input unit 1110 .
- the control unit 1120 includes a detection unit 1121 and an output control unit 1122 .
- the detection unit 1121 executes a process for detecting the content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100 , the content of information being output by the output unit 1130 , and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7 ) onto which information is output by the output unit 1130 .
- the content detected by the detection unit 1121 is sent to the output control unit 1122 .
- the output control unit 1122 executes control to generate information to be output from the output unit 1130 , on the basis of the content detected by the detection unit 1121 .
- the information generated by the output control unit 1122 is sent to the output unit 1130 .
- the detection unit 1121 is able to detect what portion of the GUI an operating body such as a hand of the user touched, by a correction being made beforehand such that the coordinates on the information display surface match the coordinates where the operating body such as the hand of the user touched the display surface.
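The coordinate correction the detection unit 1121 relies on can be sketched, under the simplifying assumption of axis-aligned surfaces, as a per-axis scale-and-offset calibration from two point pairs; a real system would more likely use a full homography, and the calibration values below are hypothetical.

```python
def make_corrector(display_points, touched_points):
    """Build a per-axis scale-and-offset mapping from two calibration
    pairs: where a point appears on the display surface vs. where the
    operating body was observed touching it. (A full homography would
    also handle rotation and perspective; this sketch does not.)"""
    (dx0, dy0), (dx1, dy1) = display_points
    (tx0, ty0), (tx1, ty1) = touched_points
    sx = (dx1 - dx0) / (tx1 - tx0)
    sy = (dy1 - dy0) / (ty1 - ty0)

    def correct(tx, ty):
        """Map an observed touch coordinate onto GUI coordinates."""
        return (dx0 + (tx - tx0) * sx, dy0 + (ty - ty0) * sy)

    return correct
```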
- the control unit 1120 may also be configured by a central processing unit (CPU) or the like, for example. If the control unit 1120 is configured by a device such as a CPU, the device may be configured by an electronic circuit.
- the control unit 1120 may include a communication function for performing wireless communication with another device, and a function for controlling the operation of another device, e.g., lighting equipment, connected to the information processing system 1100.
- the output unit 1130 outputs information input by the input unit 1110 , in accordance with information such as the content of an operation by the user using the information processing system 1100 , the content of information being output by the output unit 1130 , and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7 ) onto which the output unit 1130 outputs information.
- the output unit 1130 outputs the information on the basis of the information generated by the output control unit 1122 .
- the information output by the output unit 1130 includes information to be displayed on the information display surface, and audio to be output from a speaker (not shown) or the like, and so on.
- the information processing system 1100 illustrated in FIG. 8 may be configured as a single device, or a portion of the information processing system 1100 or the entire information processing system 1100 illustrated in FIG. 8 may be configured by separate devices.
- the control unit 1120 may be provided in a device such as a server that is connected to the input unit 1110 and the output unit 1130 by a network or the like.
- if the control unit 1120 is provided in a device such as a server, information from the input unit 1110 is sent to the device such as the server over the network or the like.
- the control unit 1120 then processes the information from the input unit 1110 , and information to be output by the output unit 1130 is sent from the device such as the server to the output unit 1130 over the network or the like.
- the information processing system 1100 can be linked to a mobile terminal such as a smartphone on the table.
- the information processing system 1100 is able to identify a mobile terminal such as a smartphone, and link to the identified mobile terminal, by the user placing the mobile terminal on the table and having the input unit 1110 recognize the mobile terminal.
- if a plurality of mobile terminals are placed on the table, however, the information processing system 1100 will be unable to determine which of the mobile terminals to link to.
- the terminal 100 to be recognized corresponds to the mobile terminal
- the recognizing terminal 200 corresponds to the information processing system 1100. Therefore, the information processing system 1100 can be linked to each of the mobile terminals.
- FIG. 9 is a schematic view of an example in which a stand-alone display 400 and a wearable device 450 are linked.
- the stand-alone display 400 corresponds to the terminal 100 to be recognized
- the wearable device 450 corresponds to the recognizing terminal 200 .
- the wearable device 450 images one application screen 410 , 420 , or 430 displayed on the stand-alone display 400 using the camera 202 , and compares the image information recorded on the server 300 beforehand with the imaging information. If, upon this comparison, the image information recorded on the server 300 beforehand and the imaging information match, the wearable device 450 is able to communicate with the application.
- FIG. 10 is a schematic view illustrating a case in which applications on a large screen display 500 installed on a wall are recognized. As illustrated in FIG. 10 , the large screen display 500 is installed with a screen 502 vertical to the ground. A plurality of applications 510 , 520 , and 530 are running on the screen 502 .
- Image information for each application, or an arbitrary one or a plurality of applications, displayed on the screen 502 of the large screen display 500 is sent, together with communication information, to the server 300 and recorded on the server 300 .
- the user uses an application on his or her smartphone 600 and images the application screen displayed on the screen 502 .
- the smartphone 600 recognizes the screens of the applications 510 , 520 , and 530 .
- the smartphone 600 corresponds to the recognizing terminal 200 described above.
- the smartphone 600 compares the image information for the applications 510 , 520 , and 530 recorded on the server 300 with the captured image. If, upon this comparison, the image information for the applications 510 , 520 , and 530 recorded on the server 300 and the captured image match, communication between the smartphone 600 and the application 510 is realized.
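The compare-then-communicate flow just described might be sketched as follows. This is only a toy illustration: the feature vectors, distance threshold, record layout, and addresses are hypothetical stand-ins for the image information and communication information that the server 300 would actually hold:

```python
import math

# Hypothetical server-side records: per-application image features
# (toy vectors here) linked to communication information.
RECORDS = [
    {"app": "app_510", "features": [0.9, 0.1, 0.3],
     "comm": {"ip": "192.0.2.10", "port": 8510}},
    {"app": "app_520", "features": [0.2, 0.8, 0.5],
     "comm": {"ip": "192.0.2.10", "port": 8520}},
]

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return math.dist(a, b)

def match_application(captured_features, records, threshold=0.2):
    """Return the communication info of the record whose image features are
    closest to the captured image, or None if nothing is close enough."""
    best = min(records, key=lambda r: distance(r["features"], captured_features))
    if distance(best["features"], captured_features) <= threshold:
        return best["comm"]
    return None

# A capture that closely resembles the screen of application 510.
print(match_application([0.88, 0.12, 0.31], RECORDS))
```

If the closest record is within the threshold, the terminal obtains the linked IP address and port and can open a connection to the matched application; otherwise no link is established.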
- Various linked applications can be executed by using communication obtained by the smartphone 600 recognizing the application screen. For example, image, video, and music data on the smartphone 600 can be played on the application 510 of the large screen display 500 . Also, a plurality of users can also play card games and the like by smartphones owned by the plurality of users recognizing one application 510 on the large screen display 500 and communicating with each other.
- the applications 510 , 520 , and 530 on the large screen display 500 are recognized, but an application on a screen of the smartphone 600 of the user can also be recognized by a camera placed on the large screen display 500 .
- the large screen display 500 corresponds to the recognizing terminal 200
- the smartphone 600 corresponds to the terminal 100 to be recognized.
- FIG. 11 is a schematic view of objects 700 such as home electric appliances that are connected to a network at home. These objects 700 that are connected to the network correspond to the terminal 100 to be recognized.
- the objects 700 such as home electric appliances record pictures of the appearance and 3D model data of themselves in the dictionary data storage function of the server 300 .
- the objects 700 corresponding to the terminal 100 to be recognized acquire appearance information relating to appearance features of themselves, and record this appearance information on the server 300 .
- the user wears a wearable device 450 similar to the wearable device in FIG. 9 .
- This wearable device 450 corresponds to the recognizing terminal 200.
- the wearable device 450 acquires images of these objects 700 by imaging the objects 700 with the camera 202, and determines whether the images match the appearance information provided by the server 300. If the images match the appearance information provided by the server 300, the wearable device 450 communicates with the objects 700.
- an application for setting an air conditioner can be executed by an operation from the wearable device 450 , as a result of recognizing the air conditioner.
- an application for unlocking a lock in a door knob can be executed by an operation from the wearable device 450 , as a result of recognizing the lock.
- recognition is performed by the wearable device 450 , but recognition may also be performed by a mobile device such as a smartphone.
- the devices connected to the network in FIG. 11 are examples, and the present technology is not limited to these devices and objects.
- an unknown application or a dynamically changing application can be recognized, via image recognition, by sending, in real time, the features and a snapshot of the application that is to be recognized, and using the features and the snapshot as dictionary data in the terminal 200 that performs the recognition.
- a linking application using a plurality of devices can also be executed by being recognized by the plurality of devices. Also, when a device or an object is connected to a network, the device or object can be recognized by, and linked to, another device without performing the recording operation beforehand, by dynamically recording an image of the appearance, and 3D model data, of the device or object as dictionary data.
- present technology may also be configured as below.
- An information processing apparatus including:
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of an own device
- a sending unit configured to send the appearance information to communicate with another device that has imaged the appearance of the own device.
- the information processing apparatus including:
- the appearance information acquisition unit includes an image information generation unit configured to generate image information indicating a feature of a screen displayed on the display screen as the appearance information.
- the information processing apparatus in which the sending unit sends, together with the image information, communication information for communicating with the other device.
- the image information generation unit generates the image information for each of the plurality of applications, and
- the sending unit sends the image information generated for each application.
- the information processing apparatus in which communication is performed with the other device for which it has been determined that a captured image of the display screen and the image information match.
- the information processing apparatus including:
- an identification information acquisition unit configured to acquire identification information for identifying the other device
- the sending unit sends the identification information together with the image information.
- the information processing apparatus in which the identification information includes at least a portion of an IP address of the other device.
- the information processing apparatus in which the identification information acquisition unit acquires the identification information sent by beacon, sound, or light.
- the information processing apparatus including:
- a position information acquisition unit configured to acquire position information
- the sending unit sends the position information together with the image information.
- An information processing method including: acquiring appearance information indicating a feature of appearance of an own device; and sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- An information processing apparatus including:
- an imaging unit configured to image another device
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of the other device from a server
- an image recognition unit configured to compare the captured image obtained through the imaging performed by the imaging unit with the appearance information
- a communication unit configured to communicate with the other device if the result of the comparison by the image recognition unit is such that the captured image obtained through the imaging performed by the imaging unit and the appearance information match.
- An information processing apparatus including:
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of a first terminal from the first terminal
- a storage unit configured to store the appearance information
- a sending unit configured to send, in response to a request from a second terminal, the appearance information to the second terminal to cause the second terminal to compare imaging information obtained by imaging appearance of the first terminal with the appearance information.
Abstract
Description
- The present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Technology that augments the real environment obtained through a camera or the like using a computer is being studied as augmented reality (AR). In particular, many AR applications that recognize an object and display appropriate information in a superimposed manner, by holding a camera of a mobile terminal over the object, are being developed, due to mobile terminals equipped with cameras being easier to use as a result of the popularization of smartphones in recent years. Thus,
Patent Literature 1 below describes technology that assumes a program itself functions as an authentication key. - Patent Literature 1: JP 2002-344444A
- In an AR application, it was necessary to add a special tag image or marker image to an object that serves as the subject, to perform object recognition with an image obtained from a camera. Therefore, a markerless AR method that recognizes an object by analyzing features obtained from the image, without using a marker image, is also conceivable.
- However, with either approach, it was necessary to record the features to be recognized as dictionary data beforehand, to recognize the object. Therefore, objects for which it is difficult to acquire features beforehand, such as unknown applications running on another terminal, and applications in which the state of a screen dynamically changes, were difficult to use as objects to be recognized.
- Thus, there has been a desire to link devices by recognizing the appearance of a device, such as an unknown application or a dynamically changing application.
- According to the present disclosure, there is provided an information processing apparatus including: an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of an own device; and a sending unit configured to send the appearance information to communicate with another device that has imaged the appearance of the own device.
- The information processing apparatus may include a display screen. The appearance information acquisition unit may include an image information generation unit configured to generate image information indicating a feature of a screen displayed on the display screen as the appearance information.
- In addition, the sending unit may send, together with the image information, communication information for communicating with the other device.
- In addition, a plurality of applications may be displayed on the display screen, the image information generation unit may generate the image information for each of the plurality of applications, and the sending unit may send the image information generated for each application.
- In addition, communication may be performed with the other device for which it has been determined that a captured image of the display screen and the image information match.
- In addition, the information processing apparatus may include an identification information acquisition unit configured to acquire identification information for identifying the other device. The sending unit may send the identification information together with the image information.
- In addition, the identification information may include at least a portion of an IP address of the other device.
- In addition, the identification information acquisition unit may acquire the identification information sent by beacon, sound, or light.
- In addition, the information processing apparatus may include a position information acquisition unit configured to acquire position information. The sending unit may send the position information together with the image information.
- In addition, according to the present disclosure, there is provided an information processing method including: acquiring appearance information indicating a feature of appearance of an own device; and sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- In addition, according to the present disclosure, there is provided a program for causing a computer to function as means for acquiring appearance information indicating a feature of appearance of an own device, and means for sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- As described above, according to the present disclosure, it is possible to link devices by recognizing the appearance of a device, such as an unknown application or a dynamically changing application. Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
- FIG. 1A is a schematic view of an outline of a system according to an embodiment of the present disclosure.
- FIG. 1B is a schematic view of an outline of a system according to an embodiment of the present disclosure.
- FIG. 2 is a schematic view of the configuration of the system according to the embodiment.
- FIG. 3 is a flowchart illustrating the processes of generating and recording image information by a terminal to be recognized.
- FIG. 4 is a flowchart for explaining the process of image recognition by a recognizing terminal.
- FIG. 5 is a flowchart for explaining the process of a dictionary data storage function of a server.
- FIG. 6 is a schematic view of examples of communication information.
- FIG. 7 is a schematic view of a system in which a tabletop interactive system and a terminal such as a smartphone are linked.
- FIG. 8 is an explanatory view illustrating a functional configuration example of an information processing system in FIG. 7.
- FIG. 9 is a schematic view of an example of linking a stand-alone display to a wearable device.
- FIG. 10 is a schematic view illustrating a case in which applications on a large screen display installed on a wall are recognized.
- FIG. 11 is a schematic view of objects such as home electric appliances that are connected to a network at home.
- Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. In this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- Note that the description will be given in the following order.
- 1. Configuration example of the system
- 2. Terminal and server processes
- 3. Examples of communication information
- 4. Case in which application screen is a moving image
- 5. Narrowing down recorded information
- 6. Examples of application of the embodiment
- 6.1. Application to a tabletop interactive system
- 6.2. Wearable devices and other display devices
- 6.3. Recognition of applications on a large screen display installed on a wall
- 6.4. Wearable cameras and home electric appliances
- FIG. 1A and FIG. 1B are schematic views of an outline of a system according to an embodiment of the present disclosure. This system performs communication between a terminal to be recognized and a recognizing terminal that are linked by one terminal recognizing, via a camera image, an application running on another terminal, when linking a plurality of terminals. As illustrated in FIG. 1A, in this system, a mobile device that serves as a recognizing terminal 200 images an application screen 110 of a tablet terminal that serves as a terminal 100 to be recognized.
- As illustrated in FIG. 1B, the terminal 100 to be recognized sequentially records the application screen 110 as dictionary data on a server 300. The recognizing terminal 200 acquires the dictionary data from the server 300 and compares the dictionary data with an image obtained through the imaging. Then, if the result of the comparison is such that the dictionary data and the image match, the terminal 100 and the terminal 200 start to communicate.
- FIG. 2 is a schematic view of the configuration of a system 1000 according to the embodiment. As illustrated in FIG. 2, the system 1000 includes the terminal 100 to be recognized, the recognizing terminal 200, and the server 300. The terminal 100 to be recognized displays an application or the like on a display screen. The terminal 100 to be recognized also includes an image information generation unit 102, which generates the application screen on the terminal 100 as image information (dictionary data). The terminal 100 records the image information on the server 300 by sending the image information from a communication unit 104 to a communication unit 304 of the server 300. The server 300 has a storage unit 302 for image information. Note that constituent elements of the terminals 100 and 200, and the server 300, illustrated in FIG. 2 may be formed by hardware (circuits), or by a central processing unit such as a CPU, and a program (software) that makes the central processing unit function. In this case, the program can be stored on a recording medium such as memory provided inside a device, or memory connected from the outside.
- The image information generation unit 102 of the terminal 100 generates image information regarding features for image recognition and a snapshot of an application being displayed by the terminal 100, and records the data on the storage unit 302 of the server 300. If a plurality of application screens are being displayed on the terminal 100, image information is sent for each application screen. The terminal 100 simultaneously records, on the server 300, both communication information for the terminal 100 and the terminal 200 to communicate with each other, and communication information for the terminal 200 to communicate with an application being displayed on the terminal 100. The image information can also be generated on the server 300 using snapshot data, instead of being generated on the terminal 100 to be recognized. The application screen may be a still image or a moving image.
- The recognizing terminal 200 has a camera 202, an image recognition unit 204, and a communication unit 206. The camera 202 images the application screen displayed on the terminal 100, and the image recognition unit 204 recognizes an image obtained through the imaging. The communication unit 206 communicates with the communication unit 304 of the server 300, and acquires the image information stored on the storage unit 302. The image recognition unit 204 recognizes the application being displayed on the terminal 100, by comparing the image information with the image data input from the camera 202. Then, if the application screen imaged by the camera 202 matches the image information acquired from the server 300, information for communicating with the terminal 100 and information for communicating with the application being displayed on the terminal 100 are acquired from the storage unit 302 of the server 300, and communication with the terminal 100 to be recognized starts.
- The storage unit 302 of the server 300 stores image information recorded from the terminal 100 to be recognized, information for the terminal 100 and the terminal 200 to communicate with each other, and information for the terminal 200 to communicate with the application of the terminal 100, and provides the stored data in response to a request from the image recognition unit 204 of the recognizing terminal 200.
- Note that the dictionary data storage function of the server 300 may be configured on the terminal 100 having a dictionary data generating function, or on the terminal 200 having an image recognizing function.
- Therefore, according to the system of the embodiment, the recognizing terminal 200 and the terminal 100 to be recognized can be linked by the recognizing terminal 200 recognizing, via a camera image, an application running on the terminal 100 to be recognized, when linking a plurality of terminals.
- The terminal 100 that is running the application screen to be recognized sends the image information, such as the features for image recognition and the snapshot of the application screen, to the server 300 in real time in accordance with a change of the screen. An unknown application or an application with a dynamically changing state can then be identified by image recognition, without generating and recording dictionary data beforehand, by the recognizing terminal 200 comparing this image information with the image from the camera 202.
terminal 200, and theserver 300 are connected beforehand by a network or P2P so as to be able to communicate with one another. The method of connection is not particularly limited, however. - Next, the processes of generating and recording the image information by the terminal 100 to be recognized will be described with reference to
FIG. 3 . First, when an application on the terminal 100 to be recognized is launched, it is determined in step S10 whether the display screen has changed. If the display screen has changed, the process proceeds on to step S12, where image information regarding the features for image recognition and the snapshot of the display screen is generated. On the other hand, if there is no change in the display screen, the process waits for a certain period of time in step S19, and then returns to step S10, where it is again determined whether the display screen has changed. - After step S12, the process proceeds on to step S14, where the
server 300 records the snapshot and the features for image recognition generated in step S12. Communication information is also recorded at this time. The communication information is information for the terminal 200 to communicate with the terminal 100, and information for the terminal 200 to communicate with the application of the terminal 100. - In the next step, step S16, it is determined whether the function of the application has ended. If the function of the application has ended, the
server 300 is notified of this, and the process proceeds on to step S18. In step S18, the data recorded on theserver 300 is erased. After step S18, the process ends. - Also, if in step S16 the function of the application has not ended, the process waits a certain period of time in step S19 and then returns to step S10, and the processes thereafter are performed again.
- Next, the process of image recognition by the recognizing
terminal 200 will be described with reference toFIG. 4 . First, when an application on the recognizingterminal 200 is launched, it is determined in step S20 whether it is necessary to recognize the terminal 100 to be recognized, to communicate with the terminal 100 to be recognized. If it is necessary to recognize the terminal 100 to be recognized, the process proceeds on to step S22, where the image information regarding the image features and the snapshot are acquired from theserver 300. On the other hand, if it is not necessary to recognize the terminal 100 to be recognized, the process waits a certain period of time in step S32 and then returns to step S20, and the processes thereafter are performed again. - After step S22, the process proceeds on to step S24, where the image input from the
camera 202 is compared with the image information acquired in step S22. If the result of the comparison is such that the image input from thecamera 202 matches the image information acquired in step S22, the process proceeds on to step S26, and the communication information stored on theserver 300 is acquired. On the other hand, if the image input from thecamera 202 does not match the image information acquired in step S22, the process waits a certain period of time in step S32 and then returns to step S20, and the processes thereafter are performed again. In this matching determination, the captured image is analyzed, the features are extracted, and these features are compared with the features in the image information, by a method similar to a well-known face detection algorithm or the like, for example. Then it is determined whether the images match, on the basis of the degree of correlation by a template matching process or the like. - After step S26, the process proceeds on to step S28. In step S28, the terminal 200 communicates with the terminal 100 on the basis of the communication information. As a result, the terminal 200 is able to communicate with the application displayed on the
terminal 100. In the next step, step S30, it is determined whether the function of the application has ended. If the function of the application has ended, the process ends. - On the other hand, if the function of the application has not ended, the process waits a certain period of time in step S32 and then returns to step S20, and the processes thereafter are performed again.
- Next, the process of the dictionary data storage function of the
server 300 will be described with reference toFIG. 5 . First, when the dictionary data storage function of theserver 300 starts, the process waits for a communication request from an application on the terminal 100 to be recognized or an application on the recognizingterminal 200 in step S40. If there is a request to record image information from the terminal 100 to be recognized in the next step, step S42, the process proceeds on to step S44 and the image information is recorded. - Also, if there is a request in step S46 to record communication information from the terminal 100, the process proceeds on to step S48. In step S48, the communication information is recorded.
- Also, if there is a request from the terminal 200 in step S50 to acquire image information, the process proceeds on to step S52. In step S52, the image information is provided to the terminal 200.
- Also, if there is a request in step S54 to acquire communication information from the terminal 200, the process proceeds on to step S56. In step S56, the communication information is provided to the terminal 200.
- In step S58, it is determined whether the function of the application has ended. If the function of the application has ended, the process ends. On the other hand, if the function of the application has not ended, the process returns to step S40, and the processes thereafter are performed again.
- Information such as protocol name, port number, and IP address are examples of communication information.
FIG. 6 is a schematic view of examples of communication information.FIG. 6 is a schematic view of communication information for a device A, a device B, and a device C, respectively. The communication information is defined for each of a network interface, an internet layer, a transport layer, an application layer, and a communication format. Information relating to the network interface includes WiFi, Bluetooth (registered trademark), Ethernet (registered trademark), and WiFi Direct, and the like. Information relating to the internet layer includes an IP address and a port number (IPv4 and IPV6). Information relating to the transport layer is TCP or UDP information. Information relating to the application layer includes HTTP, HTTPS, WebSocket (ws) and secure WebSocket (wss) and the like. Information relating to the communication format includes JSON PRC, SOAP, and REST, and the like. - The terminal 100 to be recognized and the recognizing
terminal 200 are able to communicate with each other by sharing communication information via theserver 300. The terminal 200 recognizes the terminal 100 by the IP address included in the communication information. The terminal 200 also recognizes the application of the terminal 100 by the port number included in the communication information. The communication information is linked to the image information and sent from the terminal 100 to the terminal 200 for each application screen, and is stored, together with the image information, on thestorage unit 302. The image information is sent, together with the linked communication information, to the terminal 200 in response to a request from the terminal 200 to acquire the image information. Therefore, even if there are a plurality of application screens on the terminal 100, the terminal 200 is able to communicate with the application imaged by thecamera 202, among the plurality of applications, by acquiring the port number corresponding to the image information. - As described above, the application screen may be a moving image. If the application screen is a moving image, a mechanism for absorbing a time lag in the communication can be introduced. For example, a frame number may be sent from the terminal 100 to be recognized to the
server 300 before the image information. There is no time lag in the transmission of the frame number. Time information is linked to the frame number, so theserver 300 is able to recognize in advance that image information will be received. Theserver 300 receives the image information from the terminal 100 after the frame number. Then, when theserver 300 receives the image information from the terminal 100, the image information for the frame corresponding to the requested time is extracted and sent to the terminal 200 in response to the request already received from the terminal 200. As a result, the terminal 200 is able to determine whether the image from the camera at the requested time matches the image information sent from theserver 300. - Alternatively, a moving image captured by the
camera 202 may also be stored (cached) for just a certain period of time in the recognizingterminal 200. As a result, even if there is a time lag when the terminal 200 receives the image information from theserver 300, it is possible to determine whether the image information matches the cached moving image by comparing the received image information with the stored moving image, on the basis of the time information for the frame, on the terminal 200 side. - In a case where
multiple terminals 100 to be recognized have recorded image information and communication information on theserver 300, theserver 300 that has received a request to acquire the image information and communication information from the recognizingterminal 200 narrows down the information from among the large amount of image information and communication information recorded, and sends the information to the terminal 200 that sent out the request to acquire the information. - In the embodiment, a search on the
server 300 side can be made easier by using supplementary information for narrowing down the information. Position information is an example of such supplementary information. The terminal 100 to be recognized sends, together with the dictionary data, position information for the terminal 100 acquired by a position information acquisition unit (GPS) 106, to theserver 300. Theserver 300 records the position information together with the image information and the communication information. - When the recognizing terminal 200 requests image information from the
server 300, the recognizingterminal 200 sends the position information for the terminal 200 acquired by a position information acquisition unit (GPS) 208 to theserver 300. An information extraction unit 306 of theserver 300 narrows down the image information and the communication information on the basis of the position information acquired from the terminal 200. For example, theserver 300 extracts image information and communication information for a terminal 100 positioned within a 10-meter radius of the position of the terminal 200 on the basis of the position information acquired from the terminal 200, and sends this image information and communication information to the terminal 200. By narrowing down the image information and communication information formultiple terminals 100 on the basis of position information in this way, a comparison between the image information and the imaging information can be easily performed on the terminal 200 side, which enables the processing load to be significantly reduced. - Various types of information aside from position information can be used as the supplementary information. For example, an identification
information output unit 209 of the terminal 200 sends identification information to the terminal 100 using Beacon Wi-Fi, sound, or light or the like, from the recognizingterminal 200 toward the terminal to be recognized. An identificationinformation acquisition unit 108 of the terminal 100 to be recognized acquires the identification information. The terminal 100 sends the identification information, together with the image information and the communication information, to theserver 300, and theserver 300 then records this identification information, together with the image information and the communication information. - When the recognizing terminal 200 requests image information from the
server 300, the recognizingterminal 200 sends the identification information to theserver 300. Theserver 300 narrows down the image information on the basis of the identification information acquired from the terminal 200, and then sends the image information and communication information linked to identification information that matches the identification information sent from the terminal 200, from among the image information and communication information recorded, to the terminal 200. As a result, the terminal 200 is able to extract only the image information for the imagedterminal 100, from the large amount of image information recorded. The IP address, or a portion of the IP address, of the terminal 200 can be used as the identification information. By narrowing down the image information formultiple terminals 100 on the basis of identification information in this way, a comparison between the image information and the imaging information can be easily performed on the terminal 200 side, which enables the processing load to be significantly reduced. - Also, the dictionary data may be searched in order from the most recently recorded, on the basis of the order in which the dictionary data was recorded on the
server 300. - 6.1. Application to a Tabletop Interactive System
- Several examples in which the embodiment has been applied are described below.
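As a bridge from the general mechanism to the application examples, the recognize-and-link flow of the preceding sections can be sketched in code: per-application image information is linked on the server to communication information, and the recognizing terminal opens a session to whichever endpoint its captured screen matches. This is a hypothetical illustration; the record fields, the toy similarity test, and all addresses and values are assumptions, not details from the disclosure.

```python
# Hypothetical server-side records: per-application image information linked
# to communication information (the IP address identifies the terminal and
# the port number identifies the application on it).
records = [
    {"image_info": [1, 2, 3, 4], "ip": "192.168.0.10", "port": 5001,
     "format": "JSON-RPC"},
    {"image_info": [9, 9, 9, 9], "ip": "192.168.0.11", "port": 5002,
     "format": "JSON-RPC"},
]

def similarity(a, b):
    """Toy similarity: fraction of matching feature values."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def link(captured, threshold=0.9):
    """Recognizing terminal: match the captured screen, return the endpoint."""
    for rec in records:
        if similarity(captured, rec["image_info"]) >= threshold:
            return rec["ip"], rec["port"]  # open a session to this application
    return None

print(link([1, 2, 3, 4]))  # ('192.168.0.10', 5001)
```

A real system would use robust image features in place of the toy vectors, but the shape of the lookup — match image information, then reuse the linked communication information — is the same.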
FIG. 7 is a schematic view of a system in which a tabletop interactive system and a terminal such as a smartphone are linked. As illustrated in FIG. 7, this system 1100 a includes an input unit 1110 a and an output unit 1130 a. The information processing system 1100 a according to an embodiment of the present disclosure illustrated in FIG. 7 displays information on a top surface of a table 1140 a, and allows a user using the information processing system 1100 a to manipulate the information displayed on the table 1140 a. As illustrated in FIG. 7, the method for displaying the information on the top surface of the table 1140 a is also referred to as a "projection type".

- The input unit 1110 a is a device that inputs content of an operation by the user using the information processing system 1100 a, and the shape and pattern and the like of an object placed on the table 1140 a. In the example illustrated in FIG. 7, the input unit 1110 a is provided in a state suspended from a ceiling, for example, above the table 1140 a. That is, the input unit 1110 a is provided away from the table 1140 a on which the information is to be displayed. A camera that images the table 1140 a with a single lens, a stereo camera capable of imaging the table 1140 a with two lenses and recording information in the depth direction, or a microphone for recording sounds spoken by a user using the information processing system 1100 a or ambient sounds of the environment where the information processing system 1100 a is placed, or the like may be used as the input unit 1110 a.

- If a camera that images the table 1140 a with a single lens is used as the input unit 1110 a, the information processing system 1100 a is able to detect an object placed on the table 1140 a by analyzing the image captured by the camera. Also, if a stereo camera is used as the input unit 1110 a, a visible light camera or an infrared camera or the like, for example, can be used as the stereo camera. By using a stereo camera as the input unit 1110 a, the input unit 1110 a can acquire depth information. By acquiring depth information with the input unit 1110 a, the information processing system 1100 a is able to detect a hand or an object placed on the table 1140 a, for example. Also, by acquiring depth information with the input unit 1110 a, the information processing system 1100 a is able to detect when a hand of the user contacts or is close to the table 1140 a, and detect when the hand leaves the table 1140 a. Note that in the description below, movements in which the user brings an operating body such as a hand into contact with, or close to, an information display surface will also collectively be referred to simply as a "touch".

- Also, if a microphone is used as the input unit 1110 a, a microphone array for picking up sounds in a specific direction can be used as the microphone. If a microphone array is used as the input unit 1110 a, the information processing system 1100 a may adjust the pickup direction of the microphone array to a suitable direction.

- Hereinafter, mainly a case in which an operation by the user is detected from an image captured by the input unit 1110 a will be described, but the present disclosure is not limited to this example. The operation by the user may also be detected by a touch panel that detects the touch of a finger or the like of the user. Aside from this, a user operation that can be acquired by the input unit 1110 a can include a stylus operation with respect to an information display surface, or a gesture with respect to a camera or the like, for example.

- The output unit 1130 a is a device that displays information on the table 1140 a and outputs audio, in accordance with information input by the input unit 1110 a, such as the content of an operation by the user using the information processing system 1100 a, the content of information being output by the output unit 1130 a, and the shape and pattern and the like of an object placed on the table 1140 a. A projector or a speaker or the like, for example, is used as the output unit 1130 a. In the example illustrated in FIG. 7, the output unit 1130 a is provided in a state suspended from a ceiling, for example, above the table 1140 a. If the output unit 1130 a is configured by a projector, the output unit 1130 a projects information onto the top surface of the table 1140 a. If the output unit 1130 a is configured by a speaker, the output unit 1130 a outputs audio on the basis of an audio signal, and the number of speakers may be one or a plurality. If the output unit 1130 a is configured by a plurality of speakers, the information processing system 1100 a may limit the speakers from which audio is output, or may adjust the direction in which the audio is output.

- Also, if the information processing system 1100 a is a projection type system as illustrated in FIG. 7, the output unit 1130 a may also include lighting equipment. If the output unit 1130 a includes lighting equipment, the information processing system 1100 a may control the on/off state and the like of the lighting equipment on the basis of information input by the input unit 1110 a.

- The user using the information processing system 1100 a is able to manipulate the information displayed on the table 1140 a by the output unit 1130 a, by placing a finger or the like on the table 1140 a. Also, by placing an object on the table 1140 a and having the input unit 1110 a recognize the object, the user using the information processing system 1100 a is able to execute various operations relating to the recognized object.

- Note that, although not illustrated in FIG. 7, another device may be connected to the information processing system 1100 a. For example, lighting equipment for illuminating the table 1140 a may be connected to the information processing system 1100 a. By connecting such lighting equipment, the information processing system 1100 a is able to control the lighting state of the lighting equipment in accordance with the state of the information display surface.

- FIG. 8 is an explanatory view illustrating a functional configuration example of an information processing system 1100 in FIG. 7. Below, a functional configuration example of the information processing system according to an embodiment of the present disclosure will be described with reference to FIG. 8.

- As illustrated in FIG. 8, the information processing system 1100 according to an embodiment of the present disclosure includes an input unit 1110, a control unit 1120, and an output unit 1130.

- The input unit 1110 inputs content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100, and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7) onto which information is output by the output unit 1130. The content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100 includes the content of an operation with respect to a GUI that the information processing system 1100 outputs onto the information display surface. Information input by the input unit 1110, such as the content of an operation with respect to the information processing system 1100, and the shape and pattern and the like of the object, is sent to the control unit 1120.

- If the information processing system 1100 is a projection type system, the input unit 1110 may be configured by a camera with a single lens, a stereo camera with two lenses, or a microphone, or the like.

- The control unit 1120 controls the various units of the information processing system 1100. For example, the control unit 1120 generates information to be output from the output unit 1130, using information input by the input unit 1110. As illustrated in FIG. 8, the control unit 1120 includes a detection unit 1121 and an output control unit 1122. The detection unit 1121 executes a process for detecting the content of an operation with respect to the information processing system 1100 by a user using the information processing system 1100, the content of information being output by the output unit 1130, and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7) onto which information is output by the output unit 1130. The content detected by the detection unit 1121 is sent to the output control unit 1122. The output control unit 1122 executes control to generate information to be output from the output unit 1130, on the basis of the content detected by the detection unit 1121. The information generated by the output control unit 1122 is sent to the output unit 1130.

- For example, if the information processing system 1100 is the projection type system illustrated in FIG. 7, the detection unit 1121 is able to detect what portion of the GUI an operating body such as a hand of the user touched, by a correction being made beforehand such that the coordinates on the information display surface match the coordinates where the operating body such as the hand of the user touched the display surface.

- The control unit 1120 may be configured by a central processing unit (CPU) or the like, for example. If the control unit 1120 is configured by a device such as a CPU, the device may be configured by an electronic circuit.

- Also, although not illustrated in FIG. 8, the control unit 1120 may include a communication function for performing wireless communication with another device, and a function for controlling the operation of another device, e.g., lighting equipment, connected to the information processing system 1100.

- The output unit 1130 outputs information in accordance with information input by the input unit 1110, such as the content of an operation by the user using the information processing system 1100, the content of information being output by the output unit 1130, and the shape and pattern and the like of an object placed on a surface (e.g., the table 1140 a illustrated in FIG. 7) onto which the output unit 1130 outputs information. The output unit 1130 outputs the information on the basis of the information generated by the output control unit 1122. The information output by the output unit 1130 includes information to be displayed on the information display surface, and audio to be output from a speaker (not shown) or the like, and so on.

- The information processing system 1100 illustrated in FIG. 8 may be configured as a single device, or a portion of the information processing system 1100 or the entire information processing system 1100 may be configured by separate devices. For example, the control unit 1120 may be provided in a device such as a server that is connected to the input unit 1110 and the output unit 1130 by a network or the like. In that case, information from the input unit 1110 is sent to the device such as the server over the network or the like, the control unit 1120 processes the information, and information to be output by the output unit 1130 is sent from the device such as the server to the output unit 1130 over the network or the like.

- If the information processing system 1100 according to an embodiment of the present disclosure is configured to project information onto a table and enable a user to manipulate the information, as illustrated in FIG. 7, for example, the information processing system 1100 can be linked to a mobile terminal such as a smartphone on the table. For example, the information processing system 1100 according to an example of the present disclosure is able to identify a mobile terminal such as a smartphone, and link to the identified mobile terminal, by the user placing the mobile terminal on the table and having the input unit 1110 recognize the mobile terminal. - However, if a plurality of users owning the exact same mobile terminal place these same mobile terminals separately on the table at the same time, and try to have the
information processing system 1100 recognize these mobile terminals, the information processing system 1100 will be unable to determine which of the mobile terminals to link to.

- Therefore, with an embodiment of the present disclosure, even if a plurality of users owning the exact same mobile terminal place these same mobile terminals separately on the table at the same time, it is possible to easily determine which mobile terminal to link to by making the determination using the image information described above. In this case, the terminal 100 to be recognized corresponds to the mobile terminal, and the recognizing terminal 200 corresponds to the information processing system 1100. Therefore, the information processing system 1100 can be linked to each of the mobile terminals. - 6.2. Wearable Devices and Other Display Devices
-
FIG. 9 is a schematic view of an example in which a stand-alone display 400 and a wearable device 450 are linked. Here, the stand-alone display 400 corresponds to the terminal 100 to be recognized, and the wearable device 450 corresponds to the recognizing terminal 200. The wearable device 450 images one application screen displayed on the stand-alone display 400 using the camera 202, and compares the image information recorded on the server 300 beforehand with the imaging information. If, upon this comparison, the image information recorded on the server 300 beforehand and the imaging information match, the wearable device 450 is able to communicate with the application. - 6.3. Recognition of Applications on a Large Screen Display Installed on a Wall
-
FIG. 10 is a schematic view illustrating a case in which applications on a large screen display 500 installed on a wall are recognized. As illustrated in FIG. 10, the large screen display 500 is installed with a screen 502 vertical to the ground. A plurality of application screens are displayed on the screen 502.

- Image information for each application, or for an arbitrary one or a plurality of applications, displayed on the screen 502 of the large screen display 500 is sent, together with communication information, to the server 300 and recorded on the server 300.

- On the other hand, the user uses an application on his or her smartphone 600 and images the application screen displayed on the screen 502. As a result, the smartphone 600 recognizes the screens of the applications.

- The smartphone 600 corresponds to the recognizing terminal 200 described above. The smartphone 600 compares the image information for the applications recorded on the server 300 with the captured image. If, upon this comparison, the image information for the applications recorded on the server 300 and the captured image match, communication between the smartphone 600 and the application 510 is realized.

- Various linked applications can be executed by using communication obtained by the smartphone 600 recognizing the application screen. For example, image, video, and music data on the smartphone 600 can be played on the application 510 of the large screen display 500. Also, a plurality of users can play card games and the like by smartphones owned by the plurality of users recognizing one application 510 on the large screen display 500 and communicating with each other.

- Note that in FIG. 10, the applications on the large screen display 500 are recognized, but an application on a screen of the smartphone 600 of the user can also be recognized by a camera placed on the large screen display 500. In this case, the large screen display 500 corresponds to the recognizing terminal 200, and the smartphone 600 corresponds to the terminal 100 to be recognized. - 6.4. Wearable Cameras and Home Electric Appliances
-
FIG. 11 is a schematic view of objects 700 such as home electric appliances that are connected to a network at home. These objects 700 that are connected to the network correspond to the terminal 100 to be recognized. The objects 700 such as home electric appliances record pictures of their appearance and 3D model data of themselves in the dictionary data storage function of the server 300.

- Therefore, with the system illustrated in FIG. 11, the objects 700 corresponding to the terminal 100 to be recognized acquire appearance information relating to their own appearance features, and record this appearance information on the server 300. The user wears a wearable device 450 similar to the wearable device in FIG. 9. This wearable device 450 corresponds to the recognizing terminal 200. The wearable device 450 acquires images of these objects 700 by imaging the objects 700 with the camera 202, and determines whether the images match the appearance information provided by the server 300. If the images match the appearance information provided by the server 300, the wearable device 450 communicates with the objects 700.

- Various applications can be executed using communication obtained by recognition. For example, an application for setting an air conditioner can be executed by an operation from the wearable device 450, as a result of recognizing the air conditioner. Also, an application for unlocking a lock in a door knob can be executed by an operation from the wearable device 450, as a result of recognizing the lock. Note that in FIG. 11, recognition is performed by the wearable device 450, but recognition may also be performed by a mobile device such as a smartphone. Also, the devices connected to the network in FIG. 11 are examples; the technique is not limited to these devices and objects.

- As described above, according to the embodiment, an unknown application or a dynamically changing application can be recognized, via image recognition, by sending, in real time, features and a snapshot of the application that is to be recognized, and using the features and the snapshot as dictionary data in the terminal 200 that performs the recognition.
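The appearance-based recognition just described for networked objects can be sketched as a nearest-match lookup: each object registers a feature vector standing in for its appearance pictures or 3D model data, and the recognizing device picks the closest registered entry to what its camera observed. The feature vectors, names, and distance threshold below are made-up illustrations, not values from the disclosure.

```python
# Hypothetical appearance registry: each networked object (terminal 100 to be
# recognized) has recorded a small feature vector as its appearance information.
def l2(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def recognize(observed, registry, threshold=1.0):
    """Return the nearest registered object, or None if nothing is close."""
    best = min(registry, key=lambda rec: l2(observed, rec["features"]))
    return best["name"] if l2(observed, best["features"]) <= threshold else None

registry = [
    {"name": "air-conditioner", "features": [0.9, 0.1, 0.4]},
    {"name": "door-lock",       "features": [0.1, 0.8, 0.2]},
]
print(recognize([0.85, 0.15, 0.45], registry))  # air-conditioner
```

Once an object is recognized this way, the recognizing device would use the communication information linked to that entry to operate it, as in the air conditioner and door lock examples above.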
- Also, a linking application using a plurality of devices can be executed by having the application recognized by the plurality of devices. In addition, when a device or an object is connected to a network, the device or object can be recognized by, and linked to, another device without performing a recording operation beforehand, by dynamically recording an image of the appearance, and 3D model data, of the device or object as dictionary data.
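The dynamic dictionary recording summarized above, combined with the newest-first search order mentioned earlier in the embodiment, can be sketched as follows. The registry API is an illustrative assumption; the disclosure does not specify this interface.

```python
# Hypothetical server-side dictionary: devices register appearance data when
# they join the network, and entries are searched newest-first, matching the
# "search from most recently recorded" order described in the embodiment.
dictionary = []  # newest entries kept at the front

def register(device_id, appearance):
    """Called when a device joins the network; no prior setup required."""
    dictionary.insert(0, {"device": device_id, "appearance": appearance})

def lookup(appearance):
    """Searched in order from the most recently recorded entry."""
    for entry in dictionary:
        if entry["appearance"] == appearance:
            return entry["device"]
    return None

register("tv", "appearance-tv")
register("aircon", "appearance-ac")
print(lookup("appearance-tv"))  # tv
```

Because recently connected devices are the most likely to be imaged next, searching newest-first tends to shorten the lookup in practice.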
- The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
- Further, the effects described in this specification are merely illustrative or exemplified effects, and are not limitative. That is, with or in the place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
- Additionally, the present technology may also be configured as below.
- (1)
- An information processing apparatus including:
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of an own device; and
- a sending unit configured to send the appearance information to communicate with another device that has imaged the appearance of the own device.
- (2)
- The information processing apparatus according to (1), including:
- a display screen,
- in which the appearance information acquisition unit includes an image information generation unit configured to generate image information indicating a feature of a screen displayed on the display screen as the appearance information.
- (3)
- The information processing apparatus according to (2), in which the sending unit sends, together with the image information, communication information for communicating with the other device.
- (4)
- The information processing apparatus according to (2), in which
- a plurality of applications are displayed on the display screen,
- the image information generation unit generates the image information for each of the plurality of applications, and
- the sending unit sends the image information generated for each application.
- (5)
- The information processing apparatus according to (2), in which communication is performed with the other device for which it has been determined that a captured image of the display screen and the image information match.
- (6)
- The information processing apparatus according to (2), including:
- an identification information acquisition unit configured to acquire identification information for identifying the other device,
- in which the sending unit sends the identification information together with the image information.
- (7)
- The information processing apparatus according to (6), in which the identification information includes at least a portion of an IP address of the other device.
- (8)
- The information processing apparatus according to (6), in which the identification information acquisition unit acquires the identification information sent by beacon, sound, or light.
- (9)
- The information processing apparatus according to (2), including:
- a position information acquisition unit configured to acquire position information,
- in which the sending unit sends the position information together with the image information.
- (10)
- An information processing method including:
- acquiring appearance information indicating a feature of appearance of an own device; and
- sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- (11)
- A program for causing a computer to function as
- means for acquiring appearance information indicating a feature of appearance of an own device, and
- means for sending the appearance information to communicate with another device that has imaged the appearance of the own device.
- (12)
- An information processing apparatus including:
- an imaging unit configured to image another device;
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of the other device from a server;
- an image recognition unit configured to compare the captured image obtained through the imaging performed by the imaging unit with the appearance information; and
- a communication unit configured to communicate with the other device if the result of the comparison by the image recognition unit is such that the captured image obtained through the imaging performed by the imaging unit and the appearance information match.
- (13)
- An information processing apparatus including:
- an appearance information acquisition unit configured to acquire appearance information indicating a feature of appearance of a first terminal from the first terminal;
- a storage unit configured to store the appearance information; and
- a sending unit configured to send, in response to a request from a second terminal, the appearance information to the second terminal to cause the second terminal to compare imaging information obtained by imaging appearance of the first terminal with the appearance information.
-
- 100 terminal to be recognized
- 102 image information generation unit
- 104 communication unit
- 106 GPS
- 108 identification information acquisition unit
- 200 recognizing terminal
Claims (11)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-073745 | 2015-03-31 | ||
JP2015073745A JP2016194756A (en) | 2015-03-31 | 2015-03-31 | Information processing device, information processing method, and program |
PCT/JP2016/056874 WO2016158206A1 (en) | 2015-03-31 | 2016-03-04 | Information processing device, information processing method, and program |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/056874 A-371-Of-International WO2016158206A1 (en) | 2015-03-31 | 2016-03-04 | Information processing device, information processing method, and program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/444,454 Continuation US10789476B2 (en) | 2015-03-31 | 2019-06-18 | Information processing apparatus and information processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180060661A1 true US20180060661A1 (en) | 2018-03-01 |
US10360453B2 US10360453B2 (en) | 2019-07-23 |
Family
ID=57004508
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/559,890 Active US10360453B2 (en) | 2015-03-31 | 2016-03-04 | Information processing apparatus and information processing method to link devices by recognizing the appearance of a device |
US16/444,454 Active US10789476B2 (en) | 2015-03-31 | 2019-06-18 | Information processing apparatus and information processing method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/444,454 Active US10789476B2 (en) | 2015-03-31 | 2019-06-18 | Information processing apparatus and information processing method |
Country Status (4)
Country | Link |
---|---|
US (2) | US10360453B2 (en) |
JP (1) | JP2016194756A (en) |
DE (1) | DE112016001499T5 (en) |
WO (1) | WO2016158206A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11197156B2 (en) | 2017-12-06 | 2021-12-07 | Samsung Electronics Co., Ltd. | Electronic device, user terminal apparatus, and control method thereof |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7200187B2 (en) | 2020-09-28 | 2023-01-06 | 株式会社日立製作所 | Rail vehicle air conditioner |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009255600A (en) | 2006-06-30 | 2009-11-05 | Nec Corp | Communication party identifying apparatus, communication party identifying method and communication party identifying program |
JP5318734B2 (en) * | 2009-11-26 | 2013-10-16 | 日本電信電話株式会社 | Image collecting method and portable device |
JP5683998B2 (en) * | 2011-02-28 | 2015-03-11 | オリンパス株式会社 | Server system and client device control method |
WO2012135563A1 (en) * | 2011-03-31 | 2012-10-04 | Sony Mobile Communications Ab | System and method for establishing a communication session |
KR101788060B1 (en) * | 2011-04-13 | 2017-11-15 | 엘지전자 주식회사 | Image display device and method of managing contents using the same |
JP5753009B2 (en) * | 2011-06-24 | 2015-07-22 | オリンパス株式会社 | Imaging device, wireless system |
EP2584800B1 (en) * | 2011-09-20 | 2014-11-05 | LG Electronics Inc. | Digital system and method of processing service data thereof |
KR20130123506A (en) | 2012-05-03 | 2013-11-13 | 삼성전자주식회사 | Operation method and system for module determination information, and electronic device supporting the same |
KR102009928B1 (en) * | 2012-08-20 | 2019-08-12 | 삼성전자 주식회사 | Cooperation method and apparatus |
KR101957943B1 (en) * | 2012-08-31 | 2019-07-04 | 삼성전자주식회사 | Method and vehicle for providing information |
US9530232B2 (en) * | 2012-09-04 | 2016-12-27 | Qualcomm Incorporated | Augmented reality surface segmentation |
KR102047494B1 (en) * | 2012-09-10 | 2019-11-21 | 삼성전자주식회사 | Transparent Display Apparatus and Object Selection Method Thereof |
JP5999582B2 (en) * | 2012-10-11 | 2016-09-28 | カシオ計算機株式会社 | Information output device and program |
EP2919103B1 (en) * | 2012-11-09 | 2020-04-01 | Sony Corporation | Information processing device, information processing method and computer-readable recording medium |
- 2015
  - 2015-03-31 JP JP2015073745A patent/JP2016194756A/en active Pending
- 2016
  - 2016-03-04 DE DE112016001499.6T patent/DE112016001499T5/en not_active Ceased
  - 2016-03-04 US US15/559,890 patent/US10360453B2/en active Active
  - 2016-03-04 WO PCT/JP2016/056874 patent/WO2016158206A1/en active Application Filing
- 2019
  - 2019-06-18 US US16/444,454 patent/US10789476B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP2016194756A (en) | 2016-11-17 |
WO2016158206A1 (en) | 2016-10-06 |
US10789476B2 (en) | 2020-09-29 |
DE112016001499T5 (en) | 2018-02-22 |
US10360453B2 (en) | 2019-07-23 |
US20190303675A1 (en) | 2019-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10572073B2 (en) | Information processing device, information processing method, and program | |
US9293118B2 (en) | Client device | |
US9628843B2 (en) | Methods for controlling electronic devices using gestures | |
US9727298B2 (en) | Device and method for allocating data based on an arrangement of elements in an image | |
US9204131B2 (en) | Remote control system | |
US20150187137A1 (en) | Physical object discovery | |
JP6054527B2 (en) | User recognition by skin | |
US11373650B2 (en) | Information processing device and information processing method | |
CN109246360B (en) | Prompting method and mobile terminal | |
CN111897507B (en) | Screen projection method and device, second terminal and storage medium | |
US11350264B2 (en) | Method and apparatus for establishing device connection | |
CN110059652B (en) | Face image processing method, device and storage medium | |
US10564712B2 (en) | Information processing device, information processing method, and program | |
US20150139483A1 (en) | Interactive Controls For Operating Devices and Systems | |
US10789476B2 (en) | Information processing apparatus and information processing method | |
US11589222B2 (en) | Electronic apparatus, user terminal, and method for controlling the electronic apparatus and the user terminal | |
US20130321404A1 (en) | Operating area determination method and system | |
US11475664B2 (en) | Determining a control mechanism based on a surrounding of a remote controllable device | |
JP2022548804A (en) | Image processing method, electronic device, storage medium and computer program | |
JP2016524397A (en) | Remote control programming using images | |
US10929703B2 (en) | Method, apparatus and program product for enabling two or more electronic devices to perform operations based on a common subject | |
US10545716B2 (en) | Information processing device, information processing method, and program | |
US20210333863A1 (en) | Extended Reality Localization | |
US20210225381A1 (en) | Information processing device, information processing method, and program | |
CN111367492B (en) | Webpage display method and device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SATO, NAOYUKI;IZUMI, AKIHIKO;TORII, KUNIAKI;AND OTHERS;SIGNING DATES FROM 20170628 TO 20170705;REEL/FRAME:043919/0479 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |