WO2015198672A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2015198672A1
Authority
WO
WIPO (PCT)
Prior art keywords
recognition result
action recognition
behavior
terminal device
function
Prior art date
Application number
PCT/JP2015/059759
Other languages
French (fr)
Japanese (ja)
Inventor
倉田 雅友
呂尚 高岡
由幸 小林
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2015198672A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Behavior recognition technologies that recognize a user's behavior using detection values of an acceleration sensor or the like mounted on a mobile or wearable device carried or worn by the user have been developed.
  • Examples of such behavior recognition technology, and of information provided to the user using information obtained by it, can be found in Patent Document 1, for example.
  • In the technique described in Patent Document 1, action recognition is performed using the user's position information acquired via GPS (Global Positioning System) together with detection values of an acceleration sensor or the like.
  • For example, the location information can be used to identify where the user's action occurred, the user's moving speed, and so on, thereby improving the reliability of action recognition.
  • However, when location information is used to identify where the user's action occurred, detailed map information of the user's surroundings, including building floor plans, is needed in addition to highly reliable location information.
  • Alternatively, when no detailed map information is available, large-scale equipment such as a camera-based object recognition system is required. It is currently difficult to prepare such information or equipment for the user's entire surrounding environment.
  • Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of improving the reliability of action recognition by using the proximity relationship between terminal devices.
  • According to the present disclosure, there is provided an information processing apparatus including a processing circuit that realizes a function of acquiring a first action recognition result indicating the action of the user of a first terminal device, a function of acquiring information indicating a second terminal device in proximity to the first terminal device, and an adjustment function of generating or correcting, based on the first action recognition result, a second action recognition result indicating the action of the user of the second terminal device.
  • According to the present disclosure, there is also provided an information processing method including: acquiring a first action recognition result indicating the action of the user of a first terminal device; acquiring information indicating a second terminal device in proximity to the first terminal device; and generating or correcting, by a processing circuit, a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.
  • Further, according to the present disclosure, there is provided a program for causing a processing circuit to realize a function of acquiring a first action recognition result indicating the action of the user of a first terminal device, a function of acquiring information indicating a second terminal device in proximity to the first terminal device, and an adjustment function of generating or correcting, based on the first action recognition result, a second action recognition result indicating the action of the user of the second terminal device.
  • As described above, according to the present disclosure, the reliability of action recognition can be improved by using the proximity relationship between terminal devices.
  • FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure.
  • FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure.
  • FIG. 3 is a diagram for explaining an example in which action recognition results are adjusted based on the number of common action recognition results in an embodiment of the present disclosure.
  • FIG. 4 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure.
  • Referring to FIG. 1, terminal devices 100a to 100c (hereinafter collectively referred to as terminal devices 100) are carried or worn by users Ua to Uc (hereinafter collectively referred to as users U), respectively.
  • Each terminal device 100 is connected to the action recognition server 300 via the network 200.
  • the action recognition server 300 performs action recognition of the user U of each terminal device 100.
  • the behavior recognition server 300 performs behavior recognition based on sensor data acquired by a sensor included in the terminal device 100.
  • the behavior recognition server 300 may refer to a behavior estimation model.
  • the action recognition server 300 may implement action recognition based on information indicating the current action arbitrarily input by the user U, a schedule input in advance by the user U, and the like.
  • Meanwhile, each of the terminal devices 100a to 100c performs device-to-device communication using, for example, Bluetooth (registered trademark), Wi-Fi, infrared, or ultrasonic waves, and detects, based on the result of that communication, that the devices are close to each other. Further, the terminal devices 100a to 100c are not only close to each other but also located in the same area R.
  • The area R may be, for example, a geographical region including a building or the like, or a vehicle such as a train or bus that moves carrying the users. Whether the terminal devices 100a to 100c are in the same area R is determined based on, for example, position information acquired by each terminal device 100 via GPS or through communication with Wi-Fi base stations.
  • When the terminal devices 100a to 100c are determined to be close to each other and within the same area R, the behavior recognition server 300 adjusts the behavior recognition results indicating the behavior of the users U of these terminal devices 100. More specifically, for example, the behavior recognition server 300 replaces a behavior recognition result having low reliability with a behavior recognition result having higher reliability.
  • In the illustrated example, a schedule entered by the user Ua (or a behavior estimation model learned for the user Ua) 350 exists, so the reliability of the behavior recognition result indicating the behavior of the user Ua is higher than the reliability of the behavior recognition results indicating the behavior of the other users Ub and Uc.
  • Therefore, the action recognition server 300 estimates the actions of the users Ub and Uc based on the action recognition result of the user Ua. That is, the behavior recognition server 300 adopts the behavior recognition result of the user Ua as a behavior recognition result common to the users Ub and Uc.
  • In the present embodiment, the action recognition server 300 adjusts the action recognition results of the users Ua to Uc in various ways. For example, in the above example, when an action recognition result exists for the user Ua but not for the users Ub and Uc (because the terminal devices 100b and 100c either do not have a function of providing sensor data for action recognition or have such a function disabled), the action recognition server 300 may newly generate action recognition results for the users Ub and Uc based on the action recognition result of the user Ua.
  • Further, when action recognition results exist for each of the users Ua to Uc but do not match, the action recognition server 300 may replace the minority action recognition result with the majority action recognition result. For example, when the action recognition results of the users Ua and Ub match but the action recognition result of the user Uc differs, the action recognition server 300 may replace the action recognition result of the user Uc with that of the users Ua and Ub.
  • FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure.
  • the terminal device 100 includes a gyro sensor 101, an acceleration sensor 102, an atmospheric pressure sensor 103, and a proximity sensor 104. Further, the terminal device 100 includes a proximity terminal detection function 105.
  • The terminal device 100 may be a mobile terminal device such as a smartphone or tablet carried by the user U, or a wearable terminal device such as an eyeglass-type, bracelet-type, or ring-type device worn by the user U.
  • the proximity terminal detection function 105 is realized by a processing circuit such as a CPU (Central Processing Unit) included in the terminal device 100, for example, and detects other terminal devices close to the terminal device 100 based on the detection result of the proximity sensor 104.
  • For example, the proximity sensor 104 includes a communication device that performs device-to-device communication using Bluetooth (registered trademark), Wi-Fi, infrared rays, ultrasonic waves, or the like, and the proximity terminal detection function 105 detects that another terminal device has come close to the terminal device 100 when a signal is received from the other terminal device or when the strength of that signal exceeds a predetermined threshold. The proximity terminal detection function 105 may also extract the terminal ID or user ID of the other terminal device from the received signal.
  • As another example, when the terminal device 100 can acquire high-accuracy position information using GPS or the like, the proximity terminal detection function 105 may detect other terminal devices close to the terminal device 100 based on that position information.
  • In this case, the proximity terminal detection function 105 may be realized in the action recognition server 300, for example.
  • For example, the action recognition server 300 may receive position information transmitted from a plurality of terminal devices 100 and detect that terminal devices 100 whose coordinates, as indicated by the position information, are within a threshold distance of each other are in proximity.
  • Meanwhile, in the action recognition server 300, functions 311 to 316 are realized based on data 301 to 303 stored in memory or storage.
  • The functions 311 to 316 are realized by, for example, a processing circuit such as a CPU included in one or more information processing apparatuses constituting the action recognition server 300.
  • Each function is described further below.
  • The behavior recognition function 311 acquires a behavior recognition result indicating the behavior of the user U of the terminal device 100 by analyzing sensor data provided by the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103 included in the terminal device 100.
  • the behavior recognition function 311 may analyze sensor data provided from a plurality of terminal devices 100 and acquire a plurality of behavior recognition results indicating the behavior of the user U of each terminal device 100. In analyzing sensor data, the behavior recognition function 311 refers to the behavior recognition model 301.
  • the action recognition result acquired by the action recognition function 311 is stored as action recognition data 302.
  • For the action recognition process itself, well-known action recognition techniques described in many documents, such as Japanese Patent Application Laid-Open No. 2012-8771, can be used as appropriate, so a detailed description is omitted here.
  • The behavior recognition model 301 can be a learning model generated by machine learning, and may be prepared for each user individually or in common across users. For example, if more action logs have been accumulated for the user Ua of the terminal device 100a than for the users Ub and Uc of the terminal devices 100b and 100c, it may be possible to refer to a personalized behavior recognition model 301 for the user Ua while only a common behavior recognition model 301 can be referred to for the users Ub and Uc. In such a case, the accuracy of the behavior recognition model 301 referred to for the user Ua can be judged to be higher than the accuracy of the behavior recognition model 301 referred to for the users Ub and Uc.
  • In another example of the present embodiment, the behavior recognition function 311 may be realized by the processing circuit of the terminal device 100, and the behavior recognition result transmitted from the terminal device 100 to the behavior recognition server 300 may be stored as the behavior recognition data 302.
  • In this case, it can be said that the function of acquiring the behavior recognition result is realized by the communication device that receives the behavior recognition result in the behavior recognition server 300.
  • The same area determination function 312 determines, based on the information provided by the proximity terminal detection function 105, whether terminal devices 100 determined to be close to each other are in the same area. More specifically, for example, the same area determination function 312 acquires, from the terminal device 100a including the proximity terminal detection function 105, the position information acquired by the terminal device 100a itself using GPS or the like, together with the terminal ID or user ID of the other terminal device 100b detected to be in proximity to the terminal device 100a. The same area determination function 312 further determines whether the terminal devices 100a and 100b are in the same area by referring to the position information of the terminal device 100b transmitted from that other terminal device 100b. Alternatively, the terminal device 100a may receive position information from the terminal device 100b through communication, and the position information of both terminal devices 100a and 100b may be transmitted to the action recognition server 300.
  • Here, the same area determination function 312 may determine that the terminal devices 100a and 100b are close to each other within the same area when their position information indicates that they are located in a common geographical area. The same area determination function 312 may also determine that the terminal devices 100a and 100b are close to each other within the same area when their position information indicates that they are moving along similar trajectories. In the latter case, the terminal devices 100a and 100b may be moving on the same vehicle, such as the same train or bus.
  • Note that the same area determination function 312 can be executed, for example, to extract, from among the terminal devices 100 determined to be close to each other, those that are likely to remain close thereafter. Therefore, when the determination of proximity between terminal devices 100 and the adjustment of action recognition results described later are executed periodically at short time intervals, the same area determination function 312 may be omitted, and the action recognition results may be adjusted between the terminal devices 100 detected to be in proximity by the proximity terminal detection function 105.
  • the behavior sharing function 313 shares information related to behavior among the terminal devices 100 determined to be close in the same area by the same area determination function 312.
  • the behavior sharing function 313 refers to the behavior recognition data 302 and the schedule information 303 based on the terminal ID or user ID of the terminal device 100 determined to be close in the same area.
  • the behavior sharing function 313 provides information acquired from the behavior recognition data 302 and / or the schedule information 303 to the reliability determination function 314.
  • the reliability determination function 314 determines the reliability of information related to actions shared between users by the action sharing function 313.
  • the reliability determination function 314 can determine that the reliability of the action recognition result based on the information input by the user is higher than the reliability of the action recognition result generated by analyzing the sensor data.
  • the user may be able to manually correct the action recognition result generated based on the sensor data.
  • the information input by the user can be regarded as a correct action recognition result.
  • Also, schedule information entered in advance by the user can be regarded as a correct action recognition result if the schedule is carried out as planned. For example, when the action recognition result generated based on the sensor data is consistent with the schedule information (e.g., the action recognition result "on a train" is obtained during the time for which the schedule entry "move by train" was made), this action recognition result is more reliable than an action recognition result generated from the sensor data alone.
  • the information input by the user regarding the action recognition result is not limited to information input as correction or schedule information of the action recognition result, and may be extracted from, for example, recent postings on social media.
  • Further, the reliability determination function 314 may determine that the higher the accuracy of the behavior recognition model 301 referred to, the higher the reliability of the behavior recognition result. As described above, for example, when the amount of action logs accumulated differs between users, the accuracy of the behavior recognition model 301 referred to for each user may differ as well.
  • The reliability determination function 314 may also determine that, for behavior recognition results generated by analyzing sensor data, the reliability is higher the more types of sensor data were used.
  • In the illustrated example, the terminal device 100 includes the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103, but depending on the type of terminal device 100, other types of sensors may also be available, or conversely fewer types of sensors may be usable. The more types of sensor data are available, the more sophisticated the action recognition processing can be, which improves the reliability of the action recognition result.
  • Further, the reliability determination function 314 may determine that, for behavior recognition results generated by analyzing sensor data, the reliability is higher the greater the processing capability of the apparatus that performs the analysis.
  • As described above, the behavior recognition function 311 may be realized by the behavior recognition server 300 or by the terminal device 100, for example.
  • When the behavior recognition function 311 is realized by the terminal device 100, that is, when the behavior recognition process is executed by a different device for each user, differences may arise in the processing capability of the devices performing the analysis. Since a device with higher processing capability can run more advanced algorithms, it can produce a more reliable action recognition result.
  • The action recognition result adjustment function 315 adjusts the action recognition results between terminal devices 100 (or users U) determined to be close within the same area, based on the reliability determined by the reliability determination function 314. Using the example of FIG. 1, the action recognition result adjustment function 315 generates or corrects, based on a first action recognition result indicating the action of the user Ua of the first terminal device 100a, a second action recognition result indicating the action of the user Ub of the second terminal device 100b in proximity to the first terminal device 100a. For example, the action recognition result adjustment function 315 acquires the reliability of the action recognition result of the user Ua and of the action recognition result of the user Ub based on the analysis results of the reliability determination function 314 and the like, and replaces the second action recognition result with the first action recognition result when the reliability of the first action recognition result is higher than that of the second.
  • Note that the action recognition result adjustment function 315 may also adjust action recognition results regardless of the reliability determined by the reliability determination function 314. More specifically, for example, when a first action recognition result indicating the action of the user Ua of a terminal device 100a exists but no second action recognition result indicating the action of the user Ub of another terminal device 100b exists (for example, because the terminal device 100b has no sensors or its sensors are disabled), the action recognition result adjustment function 315 may newly generate the second action recognition result based on the first action recognition result.
  • Further, the action recognition result adjustment function 315 may count, among a plurality of acquired action recognition results, the number of action recognition results common to the first action recognition result indicating the action of the user Ua and the number of action recognition results common to the second action recognition result indicating the action of the user Ub, and may replace the second action recognition result with the first action recognition result when the number of action recognition results common to the first action recognition result is larger than the number common to the second action recognition result (an illustrative sketch of this adjustment flow is given after this list).
  • FIG. 3 is a diagram for describing an example in which an action recognition result is adjusted based on the number of common action recognition results in an embodiment of the present disclosure.
  • In the illustrated example, the action recognition result "moving by train" has been obtained for the users Ua, Ub, and Ud, while the action recognition result "moving by bus" has been obtained for the user Uc, even though all of them are riding the train together.
  • That is, the action recognition result "moving by bus" is actually a misrecognition.
  • In this case, the number of action recognition results common to the first action recognition result indicating the action of the user Ua (including the action recognition results indicating the actions of the users Ub and Ud) is larger than the number common to the second action recognition result indicating the action of the user Uc, so the action recognition result adjustment function 315 replaces the second action recognition result indicating the action of the user Uc with the first action recognition result.
  • In the present embodiment, a feedback control function 316 is additionally provided, which, based on the result of the adjustment processing in the action recognition result adjustment function 315, executes control to shift at least part of the processing system including the sensors of the terminal device 100 to a power saving mode. More specifically, when the second action recognition result has been replaced by the first action recognition result, the feedback control function 316 shifts at least part of the processing system used to generate the second action recognition result to the power saving mode (this feedback is also reflected in the sketch given after this list).
  • For example, the feedback control function 316 shifts a sensor of the terminal device 100b (for example, the gyro sensor 101, the acceleration sensor 102, or the atmospheric pressure sensor 103) or the entire terminal device 100b including those sensors to the power saving mode.
  • Alternatively, the feedback control function 316 may shift the device that performs the analysis for behavior recognition (for example, the terminal device 100) to the power saving mode.
  • The fact that the action recognition result of the user Ub based on the sensor data provided by the terminal device 100b has been replaced means that a more reliable action recognition result has been obtained by other means. In such a case, therefore, power consumption can be reduced in the terminal device 100b by, for example, stopping the sensors or switching the entire terminal device 100b to the power saving mode.
  • Note that the terminal device 100 may keep the proximity sensor 104 operating normally even while in the power saving mode. This is because, when the other terminal device that provided the more reliable action recognition result is no longer in proximity, the terminal device 100 needs to return from the power saving mode and resume providing the sensor data used to generate the action recognition result of the user U.
  • The proximity sensor 104 is used to detect that the terminal device 100 is no longer close to the other terminal device.
  • FIG. 4 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
  • the illustrated information processing apparatus 900 can realize, for example, the mobile terminal apparatus, wearable terminal apparatus, and / or action recognition server in the above-described embodiment.
  • the information processing apparatus 900 includes a CPU (Central Processing unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
  • the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
  • the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
  • the ROM 903 stores programs and calculation parameters used by the CPU 901.
  • the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
  • the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
  • the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
  • the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
  • the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
  • The output device 917 outputs results obtained by the processing of the information processing device 900 as video such as text or images, as audio such as voice or sounds, or as vibration.
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
  • the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
  • the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
  • the drive 921 writes a record in the attached removable recording medium 927.
  • the connection port 923 is a port for connecting a device to the information processing apparatus 900.
  • the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
  • the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
  • the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
  • the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
  • the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
  • The imaging device 933 is a device that captures a real space and generates a captured image using various members, such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
  • the imaging device 933 may capture a still image or may capture a moving image.
  • the sensor 935 is various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
  • The sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as its posture, and information about its surrounding environment, such as the brightness and noise around the information processing apparatus 900.
  • the sensor 935 may include a GPS receiver that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the device.
  • Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
  • Embodiments of the present disclosure include, for example, an information processing device (behavior recognition server), a system, an information processing method executed by the information processing device or system, a program for causing the information processing device to function, and It may include a non-transitory tangible medium on which the program is recorded.
  • (1) An information processing apparatus including a processing circuit that realizes: a function of acquiring a first behavior recognition result indicating the behavior of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function that generates or corrects, based on the first behavior recognition result, a second behavior recognition result indicating the behavior of the user of the second terminal device.
  • (2) The information processing apparatus according to (1), wherein the processing circuit further realizes a function of acquiring the reliability of the first behavior recognition result and a function of acquiring the reliability of the second behavior recognition result, and the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the reliability of the first behavior recognition result is higher than the reliability of the second behavior recognition result.
  • (3) The information processing apparatus according to (1) or (2), wherein the adjustment function determines that the reliability of a behavior recognition result based on information input by the user is higher than the reliability of a behavior recognition result generated by analyzing sensor data.
  • (4) The information processing apparatus described above, wherein the adjustment function determines that the higher the accuracy of the learning model, the higher the reliability of the behavior recognition result.
  • (5) The information processing apparatus according to (3) or (4), wherein the adjustment function determines that, for a behavior recognition result generated by analyzing sensor data, the reliability of the behavior recognition result is higher the more types of sensor data are used.
  • (6) The information processing apparatus according to any one of the preceding items up to (5), wherein the adjustment function determines that the reliability of the behavior recognition result is higher the greater the processing capability of the device that performs the analysis.
  • (7) The information processing apparatus according to any one of the preceding items up to (6), wherein the processing circuit further realizes a function of acquiring a plurality of behavior recognition results, each indicating the behavior of the user of one of a plurality of terminal devices that are in proximity to each other and include the first terminal device and the second terminal device, and a function of counting, among the plurality of behavior recognition results, the number of behavior recognition results common to the first behavior recognition result and the number of behavior recognition results common to the second behavior recognition result, and wherein the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the number of behavior recognition results common to the first behavior recognition result is larger than the number of behavior recognition results common to the second behavior recognition result.
  • (8) The information processing apparatus described above, wherein the adjustment function replaces the second behavior recognition result with the first behavior recognition result according to a predetermined condition.
  • (9) The information processing apparatus according to any one of (1) to (8), wherein the processing circuit further realizes a control function that causes at least part of a processing system for generating the second behavior recognition result to transition to a power saving mode when the second behavior recognition result is replaced by the first behavior recognition result.
  • (10) The information processing apparatus according to (9), wherein the control function causes the sensor that provides the sensor data, or a device including the sensor, to transition to the power saving mode.
  • (11) The information processing apparatus according to (10), wherein the control function causes the device that performs the analysis to transition to the power saving mode.
  • (12) An information processing method including: acquiring a first behavior recognition result indicating the behavior of the user of a first terminal device; acquiring information indicating a second terminal device in proximity to the first terminal device; and generating or correcting, by a processing circuit, a second behavior recognition result indicating the behavior of the user of the second terminal device based on the first behavior recognition result.
  • (13) A program for causing a processing circuit to realize: a function of acquiring a first behavior recognition result indicating the behavior of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function that generates or corrects, based on the first behavior recognition result, a second behavior recognition result indicating the behavior of the user of the second terminal device.
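The reliability-based replacement, the generation of missing results, the majority-vote adjustment, and the power-saving feedback described in the items above can be tied together in one illustrative sketch. The following Python fragment is an interpretation under assumed data structures (none of the names or the 0-to-1 reliability score come from the patent): the most reliable result among proximate users fills in or overrides less reliable, conflicting results, minority results are overwritten by the majority result, and any user whose result was replaced is flagged for a possible transition to a power saving mode.

    from collections import Counter
    from dataclasses import dataclass
    from typing import List, Optional, Set, Tuple

    @dataclass
    class RecognizedAction:
        user_id: str
        label: Optional[str]   # e.g. "moving by train"; None if no result exists
        reliability: float     # assumed 0.0-1.0 score from the reliability determination

    def adjust_results(results: List[RecognizedAction]) -> Tuple[List[RecognizedAction], Set[str]]:
        """Adjust the results of users whose terminals are proximate and in the same area.
        Returns the adjusted results and the user IDs whose terminals may be shifted
        to a power saving mode."""
        to_power_save: Set[str] = set()

        with_label = [r for r in results if r.label is not None]
        if not with_label:
            return results, to_power_save

        # Most reliable available result, and the majority label, among proximate users.
        best = max(with_label, key=lambda r: r.reliability)
        majority_label, majority_count = Counter(r.label for r in with_label).most_common(1)[0]

        for r in results:
            if r.label is None:
                # Generate a missing result from the more reliable neighbour.
                r.label, r.reliability = best.label, best.reliability
                to_power_save.add(r.user_id)
            elif r.label != best.label and r.reliability < best.reliability:
                # Replace a conflicting result of lower reliability.
                r.label, r.reliability = best.label, best.reliability
                to_power_save.add(r.user_id)
            elif r.label != majority_label and majority_count > 1:
                # Minority result overwritten by the majority result.
                r.label = majority_label
                to_power_save.add(r.user_id)
        return results, to_power_save

With the FIG. 3 situation as input (three users recognized as "moving by train" and one as "moving by bus"), the minority result would be overwritten by "moving by train", and the corresponding terminal could then be a candidate for the power saving mode handled by the feedback control function 316.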

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

[Problem] To improve reliability in behavior recognition using the proximity relationship between terminal devices. [Solution] An information processing device equipped with a processing circuit that executes: a function for obtaining a first behavior recognition result indicating the behavior of a user of a first terminal device; a function for obtaining information indicating a second terminal device in proximity to the first terminal device; and an adjustment function for generating or revising a second behavior recognition result indicating the behavior of a user of the second terminal device, on the basis of the first behavior recognition result.

Description

Information processing apparatus, information processing method, and program

The present disclosure relates to an information processing apparatus, an information processing method, and a program.

Behavior recognition technologies that recognize a user's behavior using detection values of an acceleration sensor or the like mounted on a mobile or wearable device carried or worn by the user have been developed. Examples of such behavior recognition technology, and of information provided to the user using information obtained by it, can be found in Patent Document 1, for example.

JP 2013-003643 A
In the technique described in Patent Document 1, action recognition is performed using the user's position information acquired via GPS (Global Positioning System) together with detection values of an acceleration sensor or the like. Using the position information, for example, the place where the user's action occurred and the user's moving speed can be identified, thereby improving the reliability of action recognition.

However, when position information is used, for example, to identify the place where the user's action occurred, detailed map information of the user's surroundings, including building floor plans, is needed in addition to highly reliable position information. Alternatively, when no detailed map information is available, large-scale equipment such as a camera-based object recognition system is required. It is currently difficult to prepare such information or equipment for the user's entire surrounding environment.

Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of improving the reliability of action recognition by using the proximity relationship between terminal devices.

According to the present disclosure, there is provided an information processing apparatus including a processing circuit that realizes: a function of acquiring a first action recognition result indicating the action of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function of generating or correcting, based on the first action recognition result, a second action recognition result indicating the action of the user of the second terminal device.

According to the present disclosure, there is also provided an information processing method including: acquiring a first action recognition result indicating the action of the user of a first terminal device; acquiring information indicating a second terminal device in proximity to the first terminal device; and generating or correcting, by a processing circuit, a second action recognition result indicating the action of the user of the second terminal device based on the first action recognition result.

Further, according to the present disclosure, there is provided a program for causing a processing circuit to realize: a function of acquiring a first action recognition result indicating the action of the user of a first terminal device; a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and an adjustment function of generating or correcting, based on the first action recognition result, a second action recognition result indicating the action of the user of the second terminal device.

As described above, according to the present disclosure, the reliability of action recognition can be improved by using the proximity relationship between terminal devices.

Note that the above effects are not necessarily limiting; together with or in place of the above effects, any of the effects shown in this specification, or other effects that can be understood from this specification, may be achieved.
FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure. FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure. FIG. 3 is a diagram for explaining an example in which action recognition results are adjusted based on the number of common action recognition results in an embodiment of the present disclosure. FIG. 4 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be given in the following order.
1. Overview
2. Functional configuration example
3. Hardware configuration
4. Supplement
(1. Overview)
FIG. 1 is a diagram for conceptually explaining an embodiment of the present disclosure. Referring to FIG. 1, terminal devices 100a to 100c (hereinafter collectively referred to as terminal devices 100) are carried or worn by users Ua to Uc (hereinafter collectively referred to as users U), respectively. Each terminal device 100 is connected to the action recognition server 300 via the network 200. The action recognition server 300 performs action recognition for the user U of each terminal device 100. For example, the action recognition server 300 performs action recognition based on sensor data acquired by sensors included in the terminal devices 100. In performing action recognition based on sensor data, the action recognition server 300 may refer to a behavior estimation model. The action recognition server 300 may also perform action recognition based on information indicating the current action arbitrarily entered by the user U, a schedule entered in advance by the user U, and the like.
Meanwhile, each of the terminal devices 100a to 100c performs device-to-device communication using, for example, Bluetooth (registered trademark), Wi-Fi, infrared, or ultrasonic waves, and detects, based on the result of that communication, that the devices are close to each other. Furthermore, the terminal devices 100a to 100c are not only close to each other but also located in the same area R. The area R may be, for example, a geographical region including a building or the like, or a vehicle such as a train or bus that moves carrying the users. Whether the terminal devices 100a to 100c are in the same area R is determined based on, for example, position information acquired by each terminal device 100 via GPS or through communication with Wi-Fi base stations.

Here, when it is determined that the terminal devices 100a to 100c are close to each other and within the same area R, the action recognition server 300 adjusts the action recognition results indicating the actions of the users U of these terminal devices 100. More specifically, for example, the action recognition server 300 replaces an action recognition result having low reliability with an action recognition result having higher reliability. In the illustrated example, since a schedule entered by the user Ua (or a behavior estimation model learned for the user Ua) 350 exists, the reliability of the action recognition result indicating the action of the user Ua is higher than the reliability of the action recognition results indicating the actions of the other users Ub and Uc. Therefore, the action recognition server 300 estimates the actions of the users Ub and Uc based on the action recognition result of the user Ua. That is, the action recognition server 300 adopts the action recognition result of the user Ua as an action recognition result common to the users Ub and Uc.

Note that, in the present embodiment, the action recognition server 300 adjusts the action recognition results of the users Ua to Uc in various ways. For example, in the above example, when an action recognition result exists for the user Ua but not for the users Ub and Uc (because the terminal devices 100b and 100c either do not have a function of providing sensor data for action recognition or have such a function disabled), the action recognition server 300 may newly generate action recognition results for the users Ub and Uc based on the action recognition result of the user Ua. Further, when action recognition results exist for each of the users Ua to Uc but do not match, the action recognition server 300 may replace the minority action recognition result with the majority action recognition result. For example, when the action recognition results of the users Ua and Ub match but the action recognition result of the user Uc differs, the action recognition server 300 may replace the action recognition result of the user Uc with that of the users Ua and Ub.
(2. Functional configuration example)
FIG. 2 is a block diagram illustrating a functional configuration example of an embodiment of the present disclosure. Referring to FIG. 2, the terminal device 100 includes a gyro sensor 101, an acceleration sensor 102, an atmospheric pressure sensor 103, and a proximity sensor 104. The terminal device 100 also includes a proximity terminal detection function 105. The terminal device 100 may be a mobile terminal device such as a smartphone or tablet carried by the user U, or a wearable terminal device such as an eyeglass-type, bracelet-type, or ring-type device worn by the user U.
The proximity terminal detection function 105 is realized by a processing circuit such as a CPU (Central Processing Unit) included in the terminal device 100, for example, and detects other terminal devices close to the terminal device 100 based on the detection results of the proximity sensor 104. For example, the proximity sensor 104 includes a communication device that performs device-to-device communication using Bluetooth (registered trademark), Wi-Fi, infrared rays, ultrasonic waves, or the like, and the proximity terminal detection function 105 detects that another terminal device has come close to the terminal device 100 when a signal is received from the other terminal device, or when the strength of that signal exceeds a predetermined threshold. The proximity terminal detection function 105 may also extract the terminal ID or user ID of the other terminal device from the signal received by the proximity sensor 104.
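As an illustration of the signal-strength criterion above, the following Python sketch is a hypothetical, minimal rendering of the proximity terminal detection function 105 (the class and field names, and the -70 dBm threshold, are assumptions, not values from the patent): a device is treated as proximate when a device-to-device signal is received and its strength exceeds a predetermined threshold, and the sender's terminal ID is extracted from the received payload.

    from dataclasses import dataclass
    from typing import Optional

    RSSI_THRESHOLD_DBM = -70.0  # assumed value; the patent only speaks of "a predetermined threshold"

    @dataclass
    class ReceivedSignal:
        terminal_id: str            # sender's terminal ID carried in the payload
        user_id: Optional[str]      # sender's user ID, if present
        rssi_dbm: float             # received signal strength

    def detect_proximate_terminal(signal: Optional[ReceivedSignal]) -> Optional[str]:
        """Return the terminal ID of a nearby device, or None if nothing is close."""
        if signal is None:
            return None
        if signal.rssi_dbm >= RSSI_THRESHOLD_DBM:
            return signal.terminal_id
        return None

    # Example: a Bluetooth beacon from terminal "100b" received at -55 dBm counts as proximate.
    assert detect_proximate_terminal(ReceivedSignal("100b", "Ub", -55.0)) == "100b"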
As another example, when the terminal device 100 can acquire high-accuracy position information using GPS or the like, the proximity terminal detection function 105 may detect other terminal devices close to the terminal device 100 based on that position information. In this case, the proximity terminal detection function 105 may be realized in the action recognition server 300, for example. For example, the action recognition server 300 may receive the position information transmitted from a plurality of terminal devices 100 and detect that terminal devices 100 whose coordinates, as indicated by the position information, are within a threshold distance of each other are in proximity.
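The server-side variant can be sketched in the same spirit. The fragment below is a hypothetical illustration only: it compares the coordinates reported by each terminal and flags pairs whose distance falls below a threshold; both the 10 m threshold and the use of a haversine distance are assumptions, since the patent does not specify them.

    import math
    from itertools import combinations

    PROXIMITY_THRESHOLD_M = 10.0  # assumed value

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two (lat, lon) points in metres."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def find_proximate_pairs(positions):
        """positions: dict mapping terminal_id -> (lat, lon). Returns proximate ID pairs."""
        pairs = []
        for (id_a, pos_a), (id_b, pos_b) in combinations(positions.items(), 2):
            if haversine_m(*pos_a, *pos_b) < PROXIMITY_THRESHOLD_M:
                pairs.append((id_a, id_b))
        return pairs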
Meanwhile, in the action recognition server 300, functions 311 to 316 are realized based on data 301 to 303 stored in memory or storage. The functions 311 to 316 are realized by, for example, a processing circuit such as a CPU included in one or more information processing apparatuses constituting the action recognition server 300. Each function is described further below.

The action recognition function 311 acquires an action recognition result indicating the action of the user U of the terminal device 100 by analyzing sensor data provided by the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103 included in the terminal device 100. The action recognition function 311 may analyze sensor data provided from a plurality of terminal devices 100 and acquire a plurality of action recognition results indicating the actions of the users U of the respective terminal devices 100. In analyzing the sensor data, the action recognition function 311 refers to the action recognition model 301. The action recognition results acquired by the action recognition function 311 are stored as the action recognition data 302. For the action recognition process itself, well-known action recognition techniques described in many documents, such as Japanese Patent Application Laid-Open No. 2012-8771, can be used as appropriate, so a detailed description is omitted.
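The patent does not specify how the action recognition function 311 analyzes the sensor data beyond its reference to the action recognition model 301, so the following Python sketch is only a rough, hypothetical stand-in: it extracts simple statistics from a window of accelerometer magnitudes and passes them to a toy model object playing the role of the action recognition model 301.

    import statistics
    from typing import List, Sequence

    def extract_features(acc_magnitudes: Sequence[float]) -> List[float]:
        """Small feature vector over one time window of |acceleration| samples."""
        return [
            statistics.mean(acc_magnitudes),
            statistics.pstdev(acc_magnitudes),
            max(acc_magnitudes) - min(acc_magnitudes),
        ]

    class ToyActionModel:
        """Stand-in for the action recognition model 301 (the thresholds are invented)."""
        def predict(self, features: List[float]) -> str:
            _, stdev, _ = features
            if stdev < 0.05:
                return "still"
            if stdev < 0.5:
                return "walking"
            return "running"

    def recognize_action(acc_window: Sequence[float], model: ToyActionModel) -> str:
        return model.predict(extract_features(acc_window))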
 行動認識モデル301は、機械学習によって生成される学習モデルでありうる。行動認識モデル301では、例えばユーザごとに、またはユーザに共通して用意される。例えば、端末装置100aのユーザUaについて蓄積された行動ログが、端末装置100b,100cのユーザUb,Ucについて蓄積された行動ログよりも多いような場合、ユーザUaについては個人化された行動認識モデル301を参照することが可能である一方で、ユーザUb,Ucについては共通の行動認識モデル301しか参照可能ではないといったようなケースが生じうる。このような場合、ユーザUaについて参照される行動認識モデル301の精度の方が、ユーザUb,Ucについて参照される行動認識モデル301の精度よりも高いと判断されうる The behavior recognition model 301 can be a learning model generated by machine learning. In the action recognition model 301, for example, it is prepared for each user or in common with the user. For example, if there are more action logs stored for the user Ua of the terminal device 100a than the action logs stored for the users Ub and Uc of the terminal devices 100b and 100c, a personalized action recognition model for the user Ua. While it is possible to refer to 301, there may be a case where only the common action recognition model 301 can be referred to for the users Ub and Uc. In such a case, it can be determined that the accuracy of the behavior recognition model 301 referred to for the user Ua is higher than the accuracy of the behavior recognition model 301 referred to for the users Ub and Uc.
 In another example of the present embodiment, the behavior recognition function 311 may be realized by the processing circuit of the terminal device 100, and the behavior recognition result transmitted from the terminal device 100 to the action recognition server 300 may be stored as the behavior recognition data 302. In this case, it can also be said that the function of acquiring the behavior recognition result is realized by the communication device of the action recognition server 300 that receives the behavior recognition result.
 The same area determination function 312 determines, on the basis of the information provided by the proximity terminal detection function 105, whether the terminal devices 100 determined to be in proximity to each other are in the same area. More specifically, for example, the same area determination function 312 acquires, from a terminal device 100a including the proximity terminal detection function 105, the position information acquired by the terminal device 100a itself using GPS or the like, together with the terminal ID or user ID of another terminal device 100b detected to be in proximity to the terminal device 100a. The same area determination function 312 then determines whether the terminal devices 100a and 100b are in the same area by further referring to the position information of the terminal device 100b transmitted from that terminal device. Alternatively, the terminal device 100a may receive the position information from the terminal device 100b by communication, and the position information of both terminal devices 100a and 100b may be transmitted to the action recognition server 300 together.
 Here, the same area determination function 312 may determine that the terminal devices 100a and 100b are in proximity within the same area when, for example, the position information of the terminal devices 100a and 100b indicates that they are located in a common geographical area. The same area determination function 312 may also determine that the terminal devices 100a and 100b are in proximity within the same area when their position information indicates that they are moving along similar trajectories. In this case, the terminal devices 100a and 100b may be moving in the same vehicle, such as the same train or the same bus.
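 One possible way to realize the two checks described above, namely a shared-region test on the latest coordinates and a trajectory-similarity test on recent position histories, is sketched below. The grid size, the distance measure, and the thresholds are illustrative assumptions.

```python
def same_grid_cell(pos_a, pos_b, cell_deg=0.005):
    """Shared-region test: both coordinates fall in the same lat/lon grid cell
    (roughly 500 m per cell at mid latitudes for cell_deg=0.005)."""
    def cell(p):
        return (int(p[0] / cell_deg), int(p[1] / cell_deg))
    return cell(pos_a) == cell(pos_b)

def similar_trajectory(track_a, track_b, max_mean_offset_deg=0.001):
    """Trajectory test: the mean point-wise offset between two equally sampled
    position histories stays small, as when both devices ride the same train."""
    if len(track_a) != len(track_b) or not track_a:
        return False
    offsets = [abs(a[0] - b[0]) + abs(a[1] - b[1]) for a, b in zip(track_a, track_b)]
    return sum(offsets) / len(offsets) < max_mean_offset_deg

def in_same_area(track_a, track_b):
    return same_grid_cell(track_a[-1], track_b[-1]) or similar_trajectory(track_a, track_b)

track_100a = [(35.6595, 139.7005), (35.6620, 139.7030), (35.6650, 139.7060)]
track_100b = [(35.6596, 139.7006), (35.6621, 139.7031), (35.6651, 139.7061)]
print(in_same_area(track_100a, track_100b))  # True: same cell and similar movement
```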
 Note that the same area determination function 312 can be executed, for example, to extract, from among the terminal devices 100 determined to be in proximity to each other, those that are likely to remain in proximity thereafter. Accordingly, when the determination of proximity between terminal devices 100 and the adjustment of behavior recognition results described later are executed periodically at short intervals, for example, the same area determination function 312 need not be implemented, and the behavior recognition results may be adjusted between the terminal devices 100 whose proximity has been detected by the proximity terminal detection function 105.
 The behavior sharing function 313 shares information related to behavior among the terminal devices 100 determined by the same area determination function 312 to be in proximity within the same area. For example, the behavior sharing function 313 refers to the behavior recognition data 302 and the schedule information 303 on the basis of the terminal IDs or user IDs of the terminal devices 100 determined to be in proximity within the same area. The behavior sharing function 313 provides the information acquired from the behavior recognition data 302 and/or the schedule information 303 to the reliability determination function 314.
 The reliability determination function 314 determines the reliability of each piece of information related to behavior shared between users by the behavior sharing function 313.
 For example, the reliability determination function 314 can determine that the reliability of a behavior recognition result based on information input by a user is higher than the reliability of a behavior recognition result generated by analyzing sensor data. For example, a user may be able to manually correct a behavior recognition result generated on the basis of sensor data; in this case, the information input by the user can be regarded as the correct behavior recognition result. Schedule information input in advance by the user can likewise be regarded as a correct behavior recognition result, provided the schedule is carried out as planned. For example, when a behavior recognition result generated on the basis of sensor data is consistent with the schedule information (for instance, when the result "riding a train" is obtained in a time slot for which the schedule entry "traveling by train" was input), that result can be considered more reliable than a result generated from sensor data alone. Note that the information input by the user regarding behavior recognition results is not limited to corrections of recognition results or schedule entries; it may also be extracted, for example, from status posts on social media.
 Further, for example, when the behavior recognition function 311 generates a behavior recognition result by analyzing sensor data while referring to the behavior recognition model 301, the reliability determination function 314 may determine that the reliability of the behavior recognition result is higher as the accuracy of the behavior recognition model 301 is higher. As described above, differences in the amount of action logs accumulated for each user, for example, can lead to differences in the accuracy of the behavior recognition model 301 referred to for each user.
 Further, for example, the reliability determination function 314 may determine that, for a behavior recognition result generated by analyzing sensor data, the reliability of the result is higher as more types of sensor data are available. In the illustrated example, the terminal device 100 includes the gyro sensor 101, the acceleration sensor 102, and the atmospheric pressure sensor 103, but depending on the type of the terminal device 100, other types of sensors may also be available; conversely, fewer types of sensors may be usable. The more types of sensor data there are, the more sophisticated the behavior recognition processing that can be performed, and the higher the reliability of the behavior recognition result can become.
 Further, for example, the reliability determination function 314 may determine that, for a behavior recognition result generated by analyzing sensor data, the reliability of the result is higher as the processing capability of the device executing the analysis is higher. As described above, the behavior recognition function 311 may be realized by the action recognition server 300 or by the terminal device 100, for example. When the behavior recognition function 311 is realized by the terminal device 100, that is, when the behavior recognition processing is executed by a different device for each user, differences can arise in the processing capability of the device that executes the analysis for behavior recognition. Processing on a device with higher processing capability allows analysis using more sophisticated algorithms, so the reliability of the behavior recognition result can be improved.
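 The preceding paragraphs name several factors that can raise or lower reliability: user-input information, model accuracy, the number of sensor types, and the processing capability of the analyzing device. A toy scoring function that combines them might look as follows; the weights and field names are assumptions made only for illustration.

```python
def reliability(result):
    """result: dict describing how a behavior recognition result was produced.
    Returns a heuristic reliability score; larger means more trustworthy."""
    if result.get("from_user_input"):        # manual correction, schedule, SNS post
        return 1.0                           # treated as ground truth
    score = 0.2
    score += 0.3 * result.get("model_accuracy", 0.0)       # 0.0 .. 1.0
    score += 0.05 * min(result.get("sensor_types", 0), 4)  # more sensor types, more reliable
    score += 0.1 if result.get("analyzed_on_server") else 0.0  # higher processing power
    return min(score, 0.99)                  # sensor-based results never beat user input

ua = {"model_accuracy": 0.9, "sensor_types": 3, "analyzed_on_server": True}
ub = {"model_accuracy": 0.5, "sensor_types": 1, "analyzed_on_server": False}
print(reliability(ua), reliability(ub))  # approximately 0.72 vs 0.40
```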
 The behavior recognition result adjustment function 315 adjusts the behavior recognition results among the terminal devices 100, or their users U, determined to be in proximity within the same area, on the basis of the reliability determined by the reliability determination function 314. To describe this using the example of FIG. 1, more specifically, the behavior recognition result adjustment function 315 generates or corrects a second behavior recognition result indicating the behavior of the user Ub of a second terminal device 100b in proximity to a first terminal device 100a, on the basis of a first behavior recognition result indicating the behavior of the user Ua of the first terminal device 100a. For example, the behavior recognition result adjustment function 315 acquires the reliability of the behavior recognition result of the user Ua and of the behavior recognition result of the user Ub, on the basis of the analysis result of the reliability determination function 314 or the like, and replaces the second behavior recognition result with the first behavior recognition result when the reliability of the first behavior recognition result is higher than that of the second behavior recognition result.
 Alternatively, the behavior recognition result adjustment function 315 may adjust behavior recognition results without relying on the reliability determined by the reliability determination function 314. More specifically, for example, when a first behavior recognition result indicating the behavior of the user Ua of one terminal device 100a exists but a second behavior recognition result indicating the behavior of the user Ub of another terminal device 100b does not exist (for example, when the terminal device 100b has no sensor or its sensors are disabled), the behavior recognition result adjustment function 315 may newly generate the second behavior recognition result on the basis of the first behavior recognition result.
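 A compact sketch of the adjustment step described in the last two paragraphs follows, covering both replacement by a more reliable neighboring result and generation when no local result exists. The data layout and the reuse of a numeric reliability score are assumptions.

```python
def adjust(first, second):
    """first/second: dicts with keys 'action' and 'reliability' (second may be None).
    Returns the (possibly generated or replaced) second behavior recognition result."""
    if second is None:
        # No result for the second user (e.g. sensors absent or disabled):
        # generate one from the nearby user's result.
        return {"action": first["action"], "reliability": first["reliability"],
                "derived_from_neighbor": True}
    if first["reliability"] > second["reliability"]:
        # Replace the less reliable result with the more reliable neighbor's result.
        return {"action": first["action"], "reliability": first["reliability"],
                "derived_from_neighbor": True}
    return second

ua_result = {"action": "on_train", "reliability": 0.72}
ub_result = {"action": "on_bus", "reliability": 0.40}
print(adjust(ua_result, ub_result))   # Ub's result is replaced by Ua's
print(adjust(ua_result, None))        # Ub's result is generated from Ua's
```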
 Further, for example, when the behavior recognition function 311 acquires a plurality of behavior recognition results each indicating the behavior of a user of one of a plurality of terminal devices including the first terminal device 100a and the second terminal device 100b, the behavior recognition result adjustment function 315 counts, among the acquired behavior recognition results, the number of results in common with the first behavior recognition result indicating the behavior of the user Ua, and the number of results in common with the second behavior recognition result indicating the behavior of the user Ub. The behavior recognition result adjustment function 315 then replaces the second behavior recognition result with the first behavior recognition result when the number of results in common with the first behavior recognition result is larger than the number of results in common with the second behavior recognition result.
 FIG. 3 is a diagram for describing an example in which behavior recognition results are adjusted on the basis of the number of common behavior recognition results in an embodiment of the present disclosure. In the illustrated example, among the users Ua to Ud riding a train, the behavior recognition result "moving by train" has been obtained for the users Ua, Ub, and Ud, whereas the result "moving by bus" has been obtained for the user Uc, who is on the same train. The result "moving by bus" is in fact a misrecognition. In this case, the number of behavior recognition results in common with the first behavior recognition result indicating the behavior of the user Ua (including the results indicating the behaviors of the users Ub and Ud) is 3, while the number of results in common with the second behavior recognition result indicating the behavior of the user Uc is 1. On this basis, the behavior recognition result adjustment function 315 replaces the second behavior recognition result indicating the behavior of the user Uc with the first behavior recognition result.
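 A sketch of this counting-based adjustment, corresponding to the FIG. 3 scenario, is shown below. The counting convention (each result also counts itself) follows the numbers in the paragraph above, and the function name is an assumption.

```python
from collections import Counter

def adjust_by_majority(results):
    """results: dict mapping user ID -> recognized action for users in the same area.
    Each user's result is replaced by the action shared by the largest number of
    users when that count exceeds the count of the user's own action."""
    counts = Counter(results.values())           # each result counts itself
    majority_action, majority_count = counts.most_common(1)[0]
    adjusted = {}
    for user, action in results.items():
        adjusted[user] = majority_action if majority_count > counts[action] else action
    return adjusted

# FIG. 3 scenario: Ua, Ub, Ud -> "moving by train" (count 3), Uc -> "moving by bus" (count 1)
print(adjust_by_majority({
    "Ua": "moving by train", "Ub": "moving by train",
    "Uc": "moving by bus", "Ud": "moving by train",
}))  # Uc's result is replaced by "moving by train"
```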
 Referring again to FIG. 2, the feedback control function 316 is additionally provided, and executes control to shift at least part of a processing system, including a sensor of the terminal device 100, to a power saving mode on the basis of the result of the adjustment processing by the behavior recognition result adjustment function 315. More specifically, when the second behavior recognition result has been replaced by the first behavior recognition result, the feedback control function 316 shifts at least part of the processing system for generating the second behavior recognition result to the power saving mode.
 For example, when the behavior recognition result of the user Ub based on the sensor data provided by the terminal device 100b is replaced by the behavior recognition result of the user Ua based on the sensor data provided by the terminal device 100a, the feedback control function 316 shifts a sensor of the terminal device 100b (for example, the gyro sensor 101, the acceleration sensor 102, or the atmospheric pressure sensor 103), or the entire terminal device 100b including the sensors, to the power saving mode.
 Further, for example, when the behavior recognition function 311 is realized by a device that differs for each user, such as the terminal device 100, the feedback control function 316 may shift the device that executes the analysis for behavior recognition (for example, the terminal device 100) to the power saving mode.
 In the above example, the fact that the behavior recognition result of the user Ub based on the sensor data provided by the terminal device 100b has been replaced means that a more reliable behavior recognition result has been obtained by other means. In such a case, the terminal device 100b can therefore reduce its power consumption by, for example, stopping its sensors or shifting the entire terminal device 100b to the power saving mode.
 Note that, even when the terminal device 100 has shifted to the power saving mode, the proximity sensor 104 can continue to operate as usual. For example, when another terminal device that had been providing a more reliable behavior recognition result is no longer in proximity to the terminal device 100, the terminal device 100 needs to return from the power saving mode and resume providing sensor data for generating the behavior recognition result of the user U. The fact that the terminal device 100 is no longer in proximity to the other terminal device is detected by the proximity sensor 104.
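 The following sketch models this feedback loop on a terminal device: the motion sensors are stopped when the local result has been replaced, the proximity sensor keeps running, and the motion sensors are woken again when proximity is lost. The class and method names are illustrative assumptions.

```python
class SensorController:
    """Toy model of the power-saving feedback on a terminal device."""

    def __init__(self):
        self.motion_sensors_on = True    # gyro, acceleration, and pressure sensors
        self.proximity_sensor_on = True  # kept running even in power saving mode

    def on_result_replaced_by_neighbor(self):
        # A more reliable result is being supplied via a nearby terminal:
        # stop the local motion sensors to save power.
        self.motion_sensors_on = False

    def on_proximity_lost(self):
        # The neighbor that supplied the result has gone out of range:
        # return from power saving mode and resume providing sensor data.
        self.motion_sensors_on = True

ctrl = SensorController()
ctrl.on_result_replaced_by_neighbor()
print(ctrl.motion_sensors_on, ctrl.proximity_sensor_on)  # False True
ctrl.on_proximity_lost()
print(ctrl.motion_sensors_on)  # True
```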
 (3. Hardware configuration)
 Next, a hardware configuration of the information processing apparatus according to the embodiment of the present disclosure will be described with reference to FIG. 4. FIG. 4 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, the mobile terminal device, the wearable terminal device, and/or the action recognition server in the embodiment described above.
 The information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905. The information processing apparatus 900 may also include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. Furthermore, the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary. The information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of, or in addition to, the CPU 901.
 The CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operation in the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs, operation parameters, and the like used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901 and parameters that change as appropriate during that execution. The CPU 901, the ROM 903, and the RAM 905 are connected to one another by a host bus 907 constituted by an internal bus such as a CPU bus. The host bus 907 is further connected via a bridge 909 to an external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus.
 The input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever. The input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an externally connected device 929 such as a mobile phone that supports the operation of the information processing apparatus 900. The input device 915 includes an input control circuit that generates an input signal on the basis of information input by the user and outputs the input signal to the CPU 901. By operating the input device 915, the user inputs various data to the information processing apparatus 900 and instructs it to perform processing operations.
 The output device 917 is constituted by a device capable of notifying the user of acquired information using senses such as sight, hearing, or touch. The output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator. The output device 917 outputs the result obtained by the processing of the information processing apparatus 900 as video such as text or an image, as sound such as voice or audio, or as vibration.
 The storage device 919 is a data storage device configured as an example of a storage unit of the information processing apparatus 900. The storage device 919 is constituted by, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores, for example, programs executed by the CPU 901, various data, and various data acquired from outside.
 The drive 921 is a reader/writer for a removable recording medium 927 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, and is built into or externally attached to the information processing apparatus 900. The drive 921 reads information recorded on the attached removable recording medium 927 and outputs it to the RAM 905. The drive 921 also writes records onto the attached removable recording medium 927.
 The connection port 923 is a port for connecting a device to the information processing apparatus 900. The connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, or a SCSI (Small Computer System Interface) port. The connection port 923 may also be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like. By connecting the externally connected device 929 to the connection port 923, various data can be exchanged between the information processing apparatus 900 and the externally connected device 929.
 The communication device 925 is a communication interface constituted by, for example, a communication device for connecting to a communication network 931. The communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB). The communication device 925 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication. The communication device 925 transmits and receives signals and the like to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP. The communication network 931 connected to the communication device 925 is a network connected by wire or wirelessly, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
 The imaging device 933 is a device that images real space and generates a captured image using various members such as an image sensor, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the image sensor. The imaging device 933 may capture still images or moving images.
 The sensor 935 is any of various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone). The sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as the attitude of its housing, and information about the surrounding environment of the information processing apparatus 900, such as the brightness or noise around it. The sensor 935 may also include a GPS (Global Positioning System) receiver that receives GPS signals and measures the latitude, longitude, and altitude of the apparatus.
 An example of the hardware configuration of the information processing apparatus 900 has been shown above. Each of the components described above may be configured using general-purpose members, or may be configured by hardware specialized for the function of the component. Such a configuration can be changed as appropriate according to the technical level at the time of implementation.
 (4. Supplement)
 The embodiments of the present disclosure may include, for example, the information processing apparatus (action recognition server) described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations or modifications within the scope of the technical idea described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure.
 The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
 The following configurations also belong to the technical scope of the present disclosure.
 (1) An information processing apparatus including a processing circuit that realizes:
 a function of acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and
 an adjustment function of generating or correcting, on the basis of the first behavior recognition result, a second behavior recognition result indicating a behavior of a user of the second terminal device.
 (2) The information processing apparatus according to (1), wherein the adjustment function generates the second behavior recognition result on the basis of the first behavior recognition result when the second behavior recognition result does not exist.
 (3) The information processing apparatus according to (1) or (2), wherein the processing circuit further realizes:
 a function of acquiring a reliability of the first behavior recognition result; and
 a function of acquiring the second behavior recognition result and a reliability of the second behavior recognition result,
 and the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the reliability of the first behavior recognition result is higher than the reliability of the second behavior recognition result.
 (4) The information processing apparatus according to (3), wherein the adjustment function determines that the reliability of a behavior recognition result based on information input by a user is higher than the reliability of a behavior recognition result generated by analyzing sensor data.
 (5) The information processing apparatus according to (3) or (4), wherein, for a behavior recognition result generated by analyzing sensor data while referring to a learning model, the adjustment function determines that the reliability of the behavior recognition result is higher as the accuracy of the learning model is higher.
 (6) The information processing apparatus according to any one of (3) to (5), wherein, for a behavior recognition result generated by analyzing sensor data, the adjustment function determines that the reliability of the behavior recognition result is higher as there are more types of the sensor data.
 (7) The information processing apparatus according to any one of (3) to (6), wherein, for a behavior recognition result generated by analyzing sensor data, the adjustment function determines that the reliability of the behavior recognition result is higher as the processing capability of the device executing the analysis is higher.
 (8) The information processing apparatus according to (1) or (2), wherein the processing circuit further realizes:
 a function of acquiring a plurality of behavior recognition results each indicating a behavior of a user of one of a plurality of terminal devices that are in proximity to one another and include the first terminal device and the second terminal device; and
 a function of counting, among the plurality of behavior recognition results, the number of behavior recognition results in common with the first behavior recognition result and the number of behavior recognition results in common with the second behavior recognition result,
 and the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the number of behavior recognition results in common with the first behavior recognition result is larger than the number of behavior recognition results in common with the second behavior recognition result.
 (9) The information processing apparatus according to any one of (1) to (8), wherein the adjustment function replaces the second behavior recognition result with the first behavior recognition result according to a predetermined condition, and the processing circuit further realizes a control function of shifting at least part of a processing system for generating the second behavior recognition result to a power saving mode when the second behavior recognition result has been replaced by the first behavior recognition result.
 (10) The information processing apparatus according to (9), wherein, when the second behavior recognition result is generated by analyzing sensor data, the control function shifts the sensor that provides the sensor data, or a device including the sensor, to the power saving mode.
 (11) The information processing apparatus according to (9) or (10), wherein, when the second behavior recognition result is generated by analyzing sensor data, the control function shifts the device that executes the analysis to the power saving mode.
 (12) An information processing method including:
 acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 acquiring information indicating a second terminal device in proximity to the first terminal device; and
 generating or correcting, by a processing circuit, a second behavior recognition result indicating a behavior of a user of the second terminal device on the basis of the first behavior recognition result.
 (13) A program for causing a processing circuit to realize:
 a function of acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and
 an adjustment function of generating or correcting, on the basis of the first behavior recognition result, a second behavior recognition result indicating a behavior of a user of the second terminal device.
 100 Terminal device
 101 Gyro sensor
 102 Acceleration sensor
 103 Atmospheric pressure sensor
 104 Proximity sensor
 105 Proximity terminal detection function
 300 Action recognition server
 311 Behavior recognition function
 312 Same area determination function
 313 Behavior sharing function
 314 Reliability determination function
 315 Behavior recognition result adjustment function
 316 Feedback control function

Claims (13)

 1. An information processing apparatus comprising a processing circuit that realizes:
 a function of acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and
 an adjustment function of generating or correcting, on the basis of the first behavior recognition result, a second behavior recognition result indicating a behavior of a user of the second terminal device.
 2. The information processing apparatus according to claim 1, wherein the adjustment function generates the second behavior recognition result on the basis of the first behavior recognition result when the second behavior recognition result does not exist.
 3. The information processing apparatus according to claim 1, wherein the processing circuit further realizes:
 a function of acquiring a reliability of the first behavior recognition result; and
 a function of acquiring the second behavior recognition result and a reliability of the second behavior recognition result,
 and the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the reliability of the first behavior recognition result is higher than the reliability of the second behavior recognition result.
 4. The information processing apparatus according to claim 3, wherein the adjustment function determines that the reliability of a behavior recognition result based on information input by a user is higher than the reliability of a behavior recognition result generated by analyzing sensor data.
 5. The information processing apparatus according to claim 3, wherein, for a behavior recognition result generated by analyzing sensor data while referring to a learning model, the adjustment function determines that the reliability of the behavior recognition result is higher as the accuracy of the learning model is higher.
 6. The information processing apparatus according to claim 3, wherein, for a behavior recognition result generated by analyzing sensor data, the adjustment function determines that the reliability of the behavior recognition result is higher as there are more types of the sensor data.
 7. The information processing apparatus according to claim 3, wherein, for a behavior recognition result generated by analyzing sensor data, the adjustment function determines that the reliability of the behavior recognition result is higher as the processing capability of the device executing the analysis is higher.
 8. The information processing apparatus according to claim 1, wherein the processing circuit further realizes:
 a function of acquiring a plurality of behavior recognition results each indicating a behavior of a user of one of a plurality of terminal devices that are in proximity to one another and include the first terminal device and the second terminal device; and
 a function of counting, among the plurality of behavior recognition results, the number of behavior recognition results in common with the first behavior recognition result and the number of behavior recognition results in common with the second behavior recognition result,
 and the adjustment function replaces the second behavior recognition result with the first behavior recognition result when the number of behavior recognition results in common with the first behavior recognition result is larger than the number of behavior recognition results in common with the second behavior recognition result.
 9. The information processing apparatus according to claim 1, wherein the adjustment function replaces the second behavior recognition result with the first behavior recognition result according to a predetermined condition, and the processing circuit further realizes a control function of shifting at least part of a processing system for generating the second behavior recognition result to a power saving mode when the second behavior recognition result has been replaced by the first behavior recognition result.
 10. The information processing apparatus according to claim 9, wherein, when the second behavior recognition result is generated by analyzing sensor data, the control function shifts the sensor that provides the sensor data, or a device including the sensor, to the power saving mode.
 11. The information processing apparatus according to claim 9, wherein, when the second behavior recognition result is generated by analyzing sensor data, the control function shifts the device that executes the analysis to the power saving mode.
 12. An information processing method including:
 acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 acquiring information indicating a second terminal device in proximity to the first terminal device; and
 generating or correcting, by a processing circuit, a second behavior recognition result indicating a behavior of a user of the second terminal device on the basis of the first behavior recognition result.
 13. A program for causing a processing circuit to realize:
 a function of acquiring a first behavior recognition result indicating a behavior of a user of a first terminal device;
 a function of acquiring information indicating a second terminal device in proximity to the first terminal device; and
 an adjustment function of generating or correcting, on the basis of the first behavior recognition result, a second behavior recognition result indicating a behavior of a user of the second terminal device.
PCT/JP2015/059759 2014-06-24 2015-03-27 Information processing device, information processing method, and program WO2015198672A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-129170 2014-06-24
JP2014129170 2014-06-24

Publications (1)

Publication Number Publication Date
WO2015198672A1 true WO2015198672A1 (en) 2015-12-30

Family

ID=54937773

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/059759 WO2015198672A1 (en) 2014-06-24 2015-03-27 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2015198672A1 (en)



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15812485

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15812485

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP