WO2015198672A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2015198672A1
WO2015198672A1 (PCT/JP2015/059759)
Authority
WO
WIPO (PCT)
Prior art keywords
recognition result
action recognition
behavior
terminal device
function
Prior art date
Application number
PCT/JP2015/059759
Other languages
English (en)
Japanese (ja)
Inventor
倉田 雅友
呂尚 高岡
由幸 小林
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2015198672A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Behavior recognition technologies have been developed that recognize a user's behavior using detection values from an acceleration sensor or the like mounted on a mobile device or wearable device carried or worn by the user.
  • An example of such behavior recognition technology, and of the information provided to the user based on its results, is described in Patent Document 1, for example.
  • In some cases, action recognition is performed using the user's position information acquired via GPS (Global Positioning System) together with detection values from an acceleration sensor or the like.
  • The location information can be used to identify where the user's action occurred, the user's moving speed, and the like, thereby improving the reliability of action recognition.
  • However, when location information is used to identify where the user's action occurred, highly reliable location information is not sufficient by itself; detailed map information about the user's surroundings, including the floor layout of buildings, is also needed.
  • Otherwise, large-scale equipment such as a camera-based object recognition system is required. It is currently difficult to prepare such information or equipment for the user's entire surrounding environment.
  • Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of improving the reliability of action recognition by using the proximity relationship between terminal devices.
  • According to the present disclosure, there is provided an information processing apparatus including a processing circuit that realizes a function of acquiring a first action recognition result indicating the action of the user of a first terminal device, a function of acquiring information indicating a second terminal device in proximity to the first terminal device, and an adjustment function of generating or correcting a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • According to the present disclosure, there is also provided an information processing method including: acquiring a first action recognition result indicating the action of the user of a first terminal device; acquiring information indicating a second terminal device in proximity to the first terminal device; and generating or correcting, by a processing circuit, a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • According to the present disclosure, there is further provided a program causing a processing circuit to realize a function of acquiring a first action recognition result indicating the action of the user of a first terminal device, a function of acquiring information indicating a second terminal device in proximity to the first terminal device, and an adjustment function of generating or correcting a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • As described above, according to the present disclosure, the reliability of action recognition can be improved by using the proximity relationship between terminal devices.
  • FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure.
  • FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure.
  • FIG. 3 is a diagram for explaining an example in which an action recognition result is adjusted based on the number of common action recognition results in an embodiment of the present disclosure.
  • FIG. 4 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure.
  • Referring to FIG. 1, terminal devices 100a to 100c (hereinafter collectively referred to as the terminal devices 100) are carried or worn by users Ua to Uc (hereinafter collectively referred to as the users U), respectively.
  • Each terminal device 100 is connected to the action recognition server 300 via the network 200.
  • The action recognition server 300 performs action recognition for the user U of each terminal device 100.
  • For example, the action recognition server 300 performs action recognition based on sensor data acquired by sensors included in the terminal device 100.
  • In doing so, the action recognition server 300 may refer to an action estimation model.
  • The action recognition server 300 may also perform action recognition based on information indicating the current action input by the user U, a schedule input in advance by the user U, and the like.
  • Each of the terminal devices 100a to 100c performs inter-device communication using, for example, Bluetooth (registered trademark), Wi-Fi, infrared, or ultrasonic waves, and detects, based on the result of that communication, that the devices are close to each other. Further, in the illustrated example, the terminal devices 100a to 100c are not only close to each other but are also in the same area R.
  • The area R may be, for example, a geographical region including a building or the like, or may be a vehicle such as a train or a bus that moves together with the users. Whether the terminal devices 100a to 100c are in the same area R is determined based on, for example, position information acquired by each terminal device 100 using GPS or through communication with a Wi-Fi base station.
  • When the terminal devices 100a to 100c are close to each other in the same area R, the action recognition server 300 adjusts the action recognition results indicating the actions of the users U of these terminal devices 100. More specifically, for example, the action recognition server 300 replaces an action recognition result with low reliability with an action recognition result with higher reliability.
  • In the illustrated example, the reliability of the action recognition result indicating the action of the user Ua is higher than the reliability of the action recognition results indicating the actions of the other users Ub and Uc.
  • In this case, the action recognition server 300 estimates the actions of the users Ub and Uc based on the action recognition result of the user Ua. That is, the action recognition server 300 adopts the action recognition result of the user Ua as an action recognition result common to the users Ub and Uc.
  • The action recognition server 300 can adjust the action recognition results of the users Ua to Uc in various other ways. For example, when an action recognition result exists for the user Ua but no action recognition results exist for the users Ub and Uc (for example, because the terminal devices 100b and 100c do not have a function of providing sensor data for action recognition), the action recognition server 300 may newly generate action recognition results for the users Ub and Uc based on the action recognition result of the user Ua. Further, when action recognition results exist for each of the users Ua to Uc but do not match, the action recognition server 300 may replace the minority action recognition result with the majority action recognition result.
  • For example, when the action recognition results of the users Ua and Ub match but the action recognition result of the user Uc differs, the action recognition server 300 may replace the action recognition result of the user Uc with the action recognition result of the users Ua and Ub.
  • FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure.
  • Referring to FIG. 2, the terminal device 100 includes a gyro sensor 101, an acceleration sensor 102, an atmospheric pressure sensor 103, and a proximity sensor 104. The terminal device 100 further includes a proximity terminal detection function 105.
  • The terminal device 100 may be a mobile terminal device such as a smartphone or tablet carried by the user U, or may be a wearable terminal device such as a glasses-type, bracelet-type, or ring-type device worn by the user U.
  • The proximity terminal detection function 105 is realized by a processing circuit such as a CPU (Central Processing Unit) included in the terminal device 100, for example, and detects other terminal devices close to the terminal device 100 based on the detection result of the proximity sensor 104.
  • The proximity sensor 104 includes, for example, a communication device that performs inter-device communication using Bluetooth (registered trademark), Wi-Fi, infrared rays, ultrasonic waves, or the like.
  • Alternatively, when position information is available, the proximity terminal detection function 105 may detect other terminal devices that are close to the terminal device 100 based on that position information.
  • Note that the proximity terminal detection function 105 may instead be realized in the action recognition server 300, for example.
  • In that case, the action recognition server 300 receives the position information transmitted from each of the plurality of terminal devices 100, and may detect terminal devices 100 whose coordinates, as indicated by the position information, are separated by less than a threshold distance as being close to each other.
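  • As a non-limiting illustration of this server-side proximity detection, the following Python sketch compares reported coordinates against a distance threshold; the function names, the 50 m threshold, and the flat-earth distance approximation are illustrative assumptions rather than part of the disclosure.

```python
import math
from itertools import combinations

def distance_m(a, b):
    """Approximate distance in meters between two (lat, lon) points using an
    equirectangular projection; adequate for the short ranges considered here."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6_371_000  # mean Earth radius in meters

def detect_proximity(positions, threshold_m=50.0):
    """positions: dict mapping terminal ID -> (lat, lon).
    Returns pairs of terminal IDs whose distance is below the threshold."""
    return [
        (t1, t2)
        for (t1, p1), (t2, p2) in combinations(positions.items(), 2)
        if distance_m(p1, p2) < threshold_m
    ]

# Example: terminals 100a and 100b are within the threshold, 100c is not.
positions = {"100a": (35.6595, 139.7005), "100b": (35.6596, 139.7006), "100c": (35.6800, 139.7700)}
print(detect_proximity(positions))  # [('100a', '100b')]
```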
  • In the action recognition server 300, functions 311 to 316 are realized with reference to data 301 to 303 stored in a memory or storage.
  • The functions 311 to 316 are realized by, for example, a processing circuit such as a CPU included in one or more information processing apparatuses constituting the action recognition server 300.
  • Each function will be further described below.
  • The behavior recognition function 311 acquires a behavior recognition result indicating the behavior of the user U of the terminal device 100 by analyzing sensor data provided by the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103 included in the terminal device 100.
  • The behavior recognition function 311 may analyze sensor data provided from a plurality of terminal devices 100 and acquire a plurality of behavior recognition results, each indicating the behavior of the user U of one of the terminal devices 100. In analyzing the sensor data, the behavior recognition function 311 refers to the behavior recognition model 301.
  • The action recognition result acquired by the action recognition function 311 is stored as the action recognition data 302.
  • For the action recognition process itself, a well-known action recognition technique described in many documents, such as Japanese Patent Application Laid-Open No. 2012-8771, can be used as appropriate.
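  • The disclosure does not prescribe a particular recognition algorithm; the following minimal Python sketch only illustrates the data flow of the behavior recognition function 311 (windowed sensor features classified against a model). The label set, the feature choices, and the nearest-centroid model standing in for the action recognition model 301 are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical label set and per-label feature centroids standing in for the
# trained action recognition model 301; real values would come from training data.
LABELS = ["still", "walking", "on_vehicle"]
CENTROIDS = np.array([
    [9.8, 0.1, 0.00],   # still: gravity only, little variance
    [10.4, 3.0, 0.00],  # walking: strong acceleration variance
    [9.9, 0.8, 0.02],   # on_vehicle: mild variance, slow pressure drift
])

def extract_features(accel_window, pressure_window):
    """Statistical features over one window of raw sensor samples."""
    mag = np.linalg.norm(accel_window, axis=1)          # acceleration magnitude per sample
    return np.array([mag.mean(), mag.std(),
                     np.abs(np.diff(pressure_window)).mean()])

def recognize(accel_window, pressure_window):
    """Nearest-centroid classification; returns (label, pseudo-confidence)."""
    f = extract_features(accel_window, pressure_window)
    d = np.linalg.norm(CENTROIDS - f, axis=1)
    best = int(np.argmin(d))
    return LABELS[best], 1.0 / (1.0 + d[best])

# Synthetic "walking" window: large per-sample acceleration variance, flat pressure.
rng = np.random.default_rng(0)
accel = np.array([0.0, 0.0, 9.8]) + rng.normal(0.0, 2.5, size=(100, 3))
pressure = np.full(100, 1013.25)
print(recognize(accel, pressure))
```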
  • The action recognition model 301 can be a learning model generated by machine learning. The action recognition model 301 may be prepared, for example, for each individual user or in common for multiple users. For example, if more action logs have been accumulated for the user Ua of the terminal device 100a than for the users Ub and Uc of the terminal devices 100b and 100c, a personalized action recognition model 301 may be available for the user Ua, while only the common action recognition model 301 can be referred to for the users Ub and Uc. In such a case, it can be determined that the accuracy of the action recognition model 301 referred to for the user Ua is higher than the accuracy of the action recognition model 301 referred to for the users Ub and Uc.
  • Alternatively, the behavior recognition function 311 may be realized by the processing circuit of the terminal device 100, and the behavior recognition result transmitted from the terminal device 100 to the behavior recognition server 300 may be stored as the behavior recognition data 302.
  • In that case, the function of acquiring the action recognition result is realized in the action recognition server 300 by the communication device that receives the action recognition result.
  • The same area determination function 312 determines whether the terminal devices 100 that have been determined to be close to each other are in the same area, based on the information provided by the proximity terminal detection function 105. More specifically, for example, the same area determination function 312 acquires, from the terminal device 100a including the proximity terminal detection function 105, the position information acquired by the terminal device 100a itself using GPS or the like, together with the terminal ID (or user ID) of the other terminal device 100b detected to be in proximity to the terminal device 100a. The same area determination function 312 then determines whether the terminal devices 100a and 100b are in the same area by also referring to the position information of the terminal device 100b transmitted from the terminal device 100b. Alternatively, the terminal device 100a may receive position information from the terminal device 100b through inter-device communication, and the position information of both terminal devices 100a and 100b may be transmitted from the terminal device 100a to the action recognition server 300.
  • For example, the same area determination function 312 may determine that the terminal devices 100a and 100b are close together in the same area when their location information indicates that they are in a common geographical region. Further, the same area determination function 312 may determine that the terminal devices 100a and 100b are close together in the same area when their location information indicates that they are moving along similar trajectories. In the latter case, the terminal devices 100a and 100b may be moving on the same vehicle, such as the same train or the same bus.
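  • A minimal sketch of how the same area determination function 312 could combine the two criteria above (a common geographical region, or similar movement trajectories) is given below; the region table, the thresholds, and the trajectory-similarity measure are illustrative assumptions, not part of the disclosure.

```python
import math

# Hypothetical named regions: (name, center lat, center lon, radius in meters).
REGIONS = [("office_building", 35.6595, 139.7005, 120.0),
           ("central_station", 35.6812, 139.7671, 300.0)]

def distance_m(a, b):
    """Equirectangular approximation of the distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return math.hypot(x, lat2 - lat1) * 6_371_000

def region_of(point):
    """Return the name of the region containing the point, or None."""
    for name, lat, lon, radius in REGIONS:
        if distance_m(point, (lat, lon)) <= radius:
            return name
    return None

def similar_trajectory(track_a, track_b, threshold_m=30.0):
    """True if two equally sampled position tracks stay close to each other,
    e.g. two terminals riding the same train or bus."""
    return all(distance_m(p, q) <= threshold_m for p, q in zip(track_a, track_b))

def in_same_area(track_a, track_b):
    same_region = region_of(track_a[-1]) is not None and \
                  region_of(track_a[-1]) == region_of(track_b[-1])
    return same_region or similar_trajectory(track_a, track_b)

# Two terminals riding the same train: parallel tracks a few meters apart.
track_a = [(35.6595, 139.7005), (35.6620, 139.7040), (35.6650, 139.7080)]
track_b = [(35.6596, 139.7006), (35.6621, 139.7041), (35.6651, 139.7081)]
print(in_same_area(track_a, track_b))  # True (similar trajectory)
```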
  • The same area determination function 312 is executed, for example, to extract terminal devices 100 that are determined to be close to each other and that are likely to remain close thereafter. Therefore, when the process of determining proximity between the terminal devices 100 and adjusting the action recognition results (described later) is periodically executed at short time intervals, the same area determination function 312 may be omitted.
  • In that case, the action recognition results may be adjusted between the terminal devices 100 detected to be close by the proximity terminal detection function 105.
  • The behavior sharing function 313 shares information related to behavior among the terminal devices 100 determined to be close together in the same area by the same area determination function 312.
  • For example, the behavior sharing function 313 refers to the behavior recognition data 302 and the schedule information 303 based on the terminal IDs or user IDs of the terminal devices 100 determined to be close together in the same area.
  • The behavior sharing function 313 then provides the information acquired from the behavior recognition data 302 and/or the schedule information 303 to the reliability determination function 314.
  • The reliability determination function 314 determines the reliability of the behavior-related information shared between users by the behavior sharing function 313.
  • For example, the reliability determination function 314 can determine that the reliability of an action recognition result based on information input by the user is higher than the reliability of an action recognition result generated by analyzing sensor data.
  • For example, the user may be able to manually correct an action recognition result generated based on sensor data.
  • In this case, the information input by the user can be regarded as a correct action recognition result.
  • Similarly, schedule information input in advance by the user can be regarded as a correct action recognition result if the schedule is carried out as planned. For example, when the action recognition result generated based on the sensor data is consistent with the schedule information (for example, the action recognition result “getting on the train” is obtained at the time for which the schedule information “move by train” was entered), this action recognition result can be treated as more reliable than an action recognition result generated based only on the sensor data.
  • Note that the information input by the user regarding the action recognition result is not limited to corrections of the action recognition result or schedule information, and may also be extracted from, for example, the user's recent posts on social media.
  • For an action recognition result generated by analyzing sensor data, the reliability determination function 314 may determine that the higher the accuracy of the action recognition model 301 referred to, the higher the reliability of the result. As described above, for example, when the amount of accumulated action logs differs between users, the accuracy of the action recognition model 301 referred to for each user may also differ.
  • Further, the reliability determination function 314 may determine that an action recognition result generated by analyzing sensor data is more reliable the more types of sensor data were used.
  • In the illustrated example, the terminal device 100 includes the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103, but depending on the type of terminal device 100, other types of sensors may be available, or conversely only a few sensor types may be usable. The more types of sensor data are available, the more sophisticated the action recognition processing can be, and the more the reliability of the action recognition result can be improved.
  • The reliability determination function 314 may also determine that an action recognition result generated by analyzing sensor data is more reliable the higher the processing capability of the apparatus that performs the analysis.
  • As described above, the behavior recognition function 311 may be realized by either the behavior recognition server 300 or the terminal device 100.
  • When the behavior recognition function 311 is realized by the terminal device 100, that is, when the behavior recognition process is executed by a different device for each user, differences can arise in the processing capability of the device that performs the analysis. Since a device with higher processing capability can run more advanced algorithms, the reliability of the action recognition result can be correspondingly higher.
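  • One possible way to fold the reliability cues described above (user input, schedule consistency, model accuracy, number of sensor data types, processing capability) into a single comparable score is sketched below; the weights and the score scale are arbitrary assumptions, and the disclosure does not specify any particular formula.

```python
from dataclasses import dataclass

@dataclass
class RecognitionContext:
    """Metadata about how an action recognition result was produced."""
    user_input: bool = False          # result entered or corrected by the user
    matches_schedule: bool = False    # consistent with schedule information
    model_accuracy: float = 0.5       # estimated accuracy of model 301 (0..1)
    sensor_type_count: int = 1        # number of distinct sensor data types used
    device_capability: float = 0.5    # relative processing capability (0..1)

def reliability(ctx: RecognitionContext) -> float:
    """Higher is more reliable; user input dominates, then the weaker cues."""
    if ctx.user_input:
        return 1.0
    score = 0.4 * ctx.model_accuracy
    score += 0.2 * min(ctx.sensor_type_count, 5) / 5
    score += 0.2 * ctx.device_capability
    if ctx.matches_schedule:
        score += 0.2
    return score

# The result for user Ua (personalized model, three sensors, schedule match)
# scores higher than the result for user Ub (common model, one sensor).
ua = RecognitionContext(model_accuracy=0.9, sensor_type_count=3, matches_schedule=True, device_capability=0.8)
ub = RecognitionContext(model_accuracy=0.6, sensor_type_count=1, device_capability=0.4)
print(reliability(ua) > reliability(ub))  # True
```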
  • The action recognition result adjustment function 315 adjusts the action recognition results among the terminal devices 100 (or their users U) determined to be close together in the same area, based on the reliability determined by the reliability determination function 314. Described using the example of FIG. 1, the action recognition result adjustment function 315 generates or corrects, based on a first action recognition result indicating the action of the user Ua of the first terminal device 100a, a second action recognition result indicating the action of the user Ub of the second terminal device 100b in proximity to the first terminal device 100a. For example, the action recognition result adjustment function 315 acquires the reliability of each of the action recognition results of the users Ua and Ub based on the determination of the reliability determination function 314, and replaces the second action recognition result with the first action recognition result when the reliability of the first action recognition result is higher than that of the second action recognition result.
  • Alternatively, the action recognition result adjustment function 315 may adjust the action recognition result regardless of the reliability determined by the reliability determination function 314. More specifically, for example, when a first action recognition result indicating the action of the user Ua of a certain terminal device 100a exists but no second action recognition result indicating the action of the user Ub of another terminal device 100b exists (for example, because the terminal device 100b does not include a sensor or has its sensors disabled), the action recognition result adjustment function 315 may newly generate the second action recognition result based on the first action recognition result.
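  • The replace-or-generate behavior of the action recognition result adjustment function 315 can be summarized as in the following sketch; the data shapes and the reliability values are assumptions for illustration only.

```python
from typing import Optional

def adjust(first_result: str, first_reliability: float,
           second_result: Optional[str], second_reliability: float) -> str:
    """Return the (possibly adjusted) second action recognition result.

    - If the second result is missing (e.g. the second terminal has no sensors),
      generate it from the first result.
    - If the first result is more reliable, replace the second result with it.
    - Otherwise keep the second result as is.
    """
    if second_result is None:
        return first_result                      # newly generated
    if first_reliability > second_reliability:
        return first_result                      # replaced
    return second_result                         # unchanged

print(adjust("on_train", 0.84, "on_bus", 0.36))  # 'on_train'
print(adjust("on_train", 0.84, None, 0.0))       # 'on_train' (generated)
```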
  • When action recognition results are acquired for a plurality of users who are close together in the same area, the action recognition result adjustment function 315 counts, among the plurality of acquired action recognition results, the number of action recognition results that are common to the first action recognition result indicating the action of the user Ua and the number that are common to the second action recognition result indicating the action of the user Ub. The action recognition result adjustment function 315 may then replace the second action recognition result with the first action recognition result when the number of action recognition results common to the first action recognition result is larger than the number common to the second action recognition result.
  • FIG. 3 is a diagram for describing an example in which an action recognition result is adjusted based on the number of common action recognition results in an embodiment of the present disclosure.
  • In the illustrated example, the users Ua, Ub, Uc, and Ud are riding a train together. The action recognition result “moving on the train” has been obtained for the users Ua, Ub, and Ud, while the action recognition result “moving on the bus” has been obtained for the user Uc.
  • That is, the action recognition result “moving on the bus” for the user Uc is actually a misrecognition.
  • In this case, the number of action recognition results common to the first action recognition result (“moving on the train”, also indicated for the users Ub and Ud) is larger than the number of action recognition results common to the second action recognition result of the user Uc (“moving on the bus”).
  • Therefore, the action recognition result adjustment function 315 replaces the second action recognition result indicating the action of the user Uc with the first action recognition result.
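  • The counting logic described with FIG. 3 amounts to a majority decision among co-located users, as in the following sketch; the input format is an assumption made for illustration.

```python
from collections import Counter

def adjust_by_majority(results: dict) -> dict:
    """results: user ID -> action recognition result for users detected to be
    close together in the same area. Each result that disagrees with the most
    common result is replaced by it (Ua, Ub, Ud outvote Uc in the FIG. 3 example)."""
    majority, count = Counter(results.values()).most_common(1)[0]
    if count <= len(results) // 2:
        return results                      # no clear majority; leave unchanged
    return {user: majority for user in results}

results = {"Ua": "on_train", "Ub": "on_train", "Uc": "on_bus", "Ud": "on_train"}
print(adjust_by_majority(results))  # Uc's result is replaced by "on_train"
```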
  • In some embodiments, a feedback control function 316 is additionally provided. Based on the result of the adjustment process performed by the action recognition result adjustment function 315, the feedback control function 316 executes control to shift at least part of the processing system, including the sensors of the terminal device 100, to a power saving mode. More specifically, when the second action recognition result is replaced by the first action recognition result, the feedback control function 316 causes at least part of the processing system for generating the second action recognition result to transition to the power saving mode.
  • For example, the feedback control function 316 shifts the sensors of the terminal device 100b (for example, the gyro sensor 101, the acceleration sensor 102, or the atmospheric pressure sensor 103), or the entire terminal device 100b including those sensors, to the power saving mode.
  • Alternatively, the feedback control function 316 may cause the device that performs the analysis for behavior recognition (for example, the terminal device 100) to transition to the power saving mode.
  • The fact that the action recognition result of the user Ub based on the sensor data provided by the terminal device 100b has been replaced means that a more reliable action recognition result has been acquired by other means. Therefore, in such a case, power consumption can be reduced in the terminal device 100b by, for example, stopping the sensors or switching the entire terminal device 100b to the power saving mode.
  • Note that the terminal device 100 may keep the proximity sensor 104 operating normally even while in the power saving mode. This is because, when the other terminal device that provided the more reliable action recognition result is no longer close to the terminal device 100, the terminal device 100 needs to return from the power saving mode and resume providing sensor data for generating the action recognition result of the user U.
  • To that end, the proximity sensor 104 detects that the terminal device 100 is no longer close to the other terminal device.
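  • A minimal sketch of the feedback control function 316 follows; the TerminalClient interface (enter_power_save, resume_sensing) is hypothetical and stands in for whatever command channel the action recognition server 300 has to a terminal device 100.

```python
class TerminalClient:
    """Hypothetical command channel from the server to one terminal device."""
    def __init__(self, terminal_id: str):
        self.terminal_id = terminal_id
        self.power_save = False

    def enter_power_save(self):
        # In a real system this would message the terminal to stop its
        # gyro/acceleration/pressure sensing while keeping the proximity sensor on.
        self.power_save = True

    def resume_sensing(self):
        self.power_save = False

def on_result_adjusted(second_terminal: TerminalClient, replaced: bool):
    """Called after the adjustment function runs for the second terminal."""
    if replaced:
        second_terminal.enter_power_save()

def on_proximity_lost(second_terminal: TerminalClient):
    """Called when the proximity sensor reports the first terminal is no longer near."""
    if second_terminal.power_save:
        second_terminal.resume_sensing()
```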
  • FIG. 4 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
  • The illustrated information processing apparatus 900 can realize, for example, the mobile terminal device, the wearable terminal device, and/or the action recognition server in the above-described embodiment.
  • The information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • The information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • The information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
  • The information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of, or in addition to, the CPU 901.
  • The CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operation of the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927.
  • The ROM 903 stores programs and calculation parameters used by the CPU 901.
  • The RAM 905 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during that execution, and the like.
  • The CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. The host bus 907 is further connected to an external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus via a bridge 909.
  • The input device 915 is a device operated by the user, such as a mouse, keyboard, touch panel, button, switch, or lever.
  • The input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an externally connected device 929 such as a mobile phone that supports the operation of the information processing apparatus 900.
  • The input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data to the information processing apparatus 900 and to instruct it to perform processing operations.
  • The output device 917 is configured by a device capable of notifying the user of acquired information using a sense such as vision, hearing, or touch.
  • The output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
  • The output device 917 outputs the result obtained by the processing of the information processing apparatus 900 as video such as text or an image, as sound such as voice or audio, or as vibration.
  • The storage device 919 is a data storage device configured as an example of a storage unit of the information processing apparatus 900.
  • The storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • The storage device 919 stores, for example, programs executed by the CPU 901, various data, and various data acquired from the outside.
  • The drive 921 is a reader/writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built into or externally attached to the information processing apparatus 900.
  • The drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • The drive 921 also writes records to the attached removable recording medium 927.
  • The connection port 923 is a port for connecting a device to the information processing apparatus 900.
  • The connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, or a SCSI (Small Computer System Interface) port.
  • The connection port 923 may also be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • The communication device 925 is a communication interface configured with, for example, a communication device for connecting to a communication network 931.
  • The communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
  • The communication device 925 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various types of communication.
  • The communication device 925 transmits and receives signals and the like to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP, for example.
  • The communication network 931 connected to the communication device 925 is a network connected by wire or wirelessly, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
  • The imaging device 933 is a device that images real space and generates a captured image using various members such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
  • The imaging device 933 may capture still images or moving images.
  • The sensor 935 is, for example, one of various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
  • The sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as the posture of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it.
  • The sensor 935 may also include a GPS receiver that receives GPS (Global Positioning System) signals and measures the latitude, longitude, and altitude of the apparatus.
  • Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
  • Embodiments of the present disclosure may include, for example, an information processing device (behavior recognition server), a system, an information processing method executed by the information processing device or system, a program for causing the information processing device to function, and a non-transitory tangible medium on which the program is recorded.
  • (1) An information processing apparatus including a processing circuit that realizes: a function of acquiring a first action recognition result indicating the action of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function of generating or correcting a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • The processing circuit further realizes a function of acquiring the second action recognition result and a function of acquiring the reliability of the first action recognition result and the reliability of the second action recognition result, and the adjustment function replaces the second action recognition result with the first action recognition result when the reliability of the first action recognition result is higher than the reliability of the second action recognition result.
  • The information processing apparatus according to (1) or (2).
  • The adjustment function determines that the reliability of an action recognition result based on information input by the user is higher than the reliability of an action recognition result generated by analyzing sensor data.
  • The adjustment function determines that the higher the accuracy of the learning model, the higher the reliability of the action recognition result.
  • The information processing apparatus according to (3) or (4).
  • The adjustment function determines that an action recognition result generated by analyzing sensor data is more reliable the more types of sensor data are used.
  • The information processing apparatus according to any one of the above items up to (5).
  • The adjustment function determines that the reliability of the action recognition result is higher the higher the processing capability of the device that performs the analysis.
  • The information processing apparatus according to any one of the above items up to (6).
  • The processing circuit further realizes: a function of acquiring a plurality of action recognition results, each indicating the action of the user of one of a plurality of terminal devices that are close to each other and include the first terminal device and the second terminal device; and a function of counting, among the plurality of action recognition results, the number of action recognition results common to the first action recognition result and the number of action recognition results common to the second action recognition result. The adjustment function replaces the second action recognition result with the first action recognition result when the number of action recognition results common to the first action recognition result is larger than the number of action recognition results common to the second action recognition result.
  • (9) The adjustment function replaces the second action recognition result with the first action recognition result according to a predetermined condition.
  • The processing circuit further realizes a control function of causing at least part of a processing system for generating the second action recognition result to transition to a power saving mode when the second action recognition result is replaced by the first action recognition result.
  • The information processing apparatus according to any one of (1) to (8).
  • The control function causes the sensor that provides the sensor data, or a device including the sensor, to transition to the power saving mode.
  • The information processing apparatus according to (9).
  • The control function causes the device that performs the analysis to transition to the power saving mode.
  • The information processing apparatus according to (10).
  • An information processing method including: acquiring a first action recognition result indicating the action of the user of a first terminal device; acquiring information indicating a second terminal device in proximity to the first terminal device; and generating or correcting, by a processing circuit, a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • (13) A program for causing a processing circuit to realize: a function of acquiring a first action recognition result indicating the action of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function of generating or correcting a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The purpose of the invention is to improve reliability when recognizing behavior by using the proximity relationship between terminal devices. To achieve this, the invention provides an information processing device equipped with a processing circuit that executes: a function for obtaining a first behavior recognition result indicating the behavior of a user of a first terminal device; a function for obtaining information indicating a second terminal device in proximity to the first terminal device; and an adjustment function for generating or revising a second behavior recognition result indicating the behavior of a user of the second terminal device, on the basis of the first behavior recognition result.
PCT/JP2015/059759 2014-06-24 2015-03-27 Information processing device, information processing method, and program WO2015198672A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-129170 2014-06-24
JP2014129170 2014-06-24

Publications (1)

Publication Number Publication Date
WO2015198672A1 true WO2015198672A1 (fr) 2015-12-30

Family

ID=54937773

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/059759 WO2015198672A1 (fr) 2014-06-24 2015-03-27 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2015198672A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012014357A (ja) * 2010-06-30 2012-01-19 Nippon Telegr & Teleph Corp <Ntt> Mobile object feature description device, mobile object feature description method, mobile object feature description program, and recording medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012014357A (ja) * 2010-06-30 2012-01-19 Nippon Telegr & Teleph Corp <Ntt> Mobile object feature description device, mobile object feature description method, mobile object feature description program, and recording medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KENJI TOMIOKA: "Design for Support System of Watching Over Children based on Symbiotic Computing", IEICE TECHNICAL REPORT, vol. 107, no. 353, 19 November 2007 (2007-11-19), pages 35 - 40 *
TAKASHI NOMURA ET AL.: "A Proposal of Trajectory Estimation Algorithm for Mobile Nodes and Its Evaluation with Realistic Scenarios", SYMPOSIUM ON MULTIMEDIA, DISTRIBUTED, COOPERATIVE AND MOBILE SYSTEMS (DICOM02008) RONBUNSHU, IPSJ SYMPOSIUM SERIES, vol. 2008, no. 1, 2 July 2008 (2008-07-02), pages 1066 - 1074 *
XIAOPENG LIU: "Spring Model based Collaborative Indoor Position Estimation with Neighbor Mobile Terminals", KENKYU HOKOKU UBIQUITOUS COMPUTING SYSTEM, vol. 2013 -UB, no. 10, 7 March 2013 (2013-03-07), pages 1 - 8, XP055247163 *
YUKA KUME: "Android ni Okeru Sensor Joho no Kiroku Saisei Kino no Sekkei", INFORMATION PROCESSING SOCIETY OF JAPAN, KENKYU HOKOKU CONSUMER DEVICES & SYSTEMS (CDS), 15 May 2014 (2014-05-15), pages 1 - 6 *

Similar Documents

Publication Publication Date Title
  • JP6756328B2 (ja) Information processing device, information processing method, and program
  • JP6897728B2 (ja) Image processing device, image processing method, and program
US10190869B2 (en) Information processing device and information processing method
  • JP6459972B2 (ja) Display control device, display control method, and program
US20210072398A1 (en) Information processing apparatus, information processing method, and ranging system
US10713525B2 (en) Image processing device and method to obtain a 360° image without remapping
US11720814B2 (en) Method and system for classifying time-series data
  • WO2020108041A1 (fr) Method and device for detecting key points of an ear region, and storage medium
KR20160017933A (ko) 로봇 청소기, 단말장치의 제어방법 및 이를 포함하는 로봇 청소기 제어 시스템
US10962738B2 (en) Information processing apparatus and information processing method to calibrate line-of-sight of a user
CN108846817B (zh) 图像处理方法、装置以及移动终端
  • WO2015198672A1 (fr) Information processing device, information processing method, and program
CN113432620B (zh) 误差估计方法、装置、车载终端及存储介质
  • WO2015194215A1 (fr) Information processing device, information processing method, and program
  • WO2020031795A1 (fr) Information processing device, information processing method, and program
US11372473B2 (en) Information processing apparatus and information processing method
  • JPWO2015151548A1 (ja) Electronic device and recording medium
US20200342229A1 (en) Information processing device, information processing method, and program
  • JP2015111371A (ja) Information processing device, information processing method, and program
US11232581B2 (en) Information processing apparatus, information processing method, and recording medium
  • WO2020183602A1 (fr) Information processing device and information processing method
US10855639B2 (en) Information processing apparatus and information processing method for selection of a target user
Matsumoto et al. Image processing device and method to obtain a 360 image without remapping
KR20150066350A (ko) 블랙박스 기능을 가지는 휴대용 단말기
CN111488898A (zh) 对抗数据获取方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15812485

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15812485

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP