WO2017179258A1 - Information processing apparatus and information processing method
- Publication number
- WO2017179258A1
- Authority
- WO
- WIPO (PCT)
Classifications
- G06F16/5866—Information retrieval of still image data; retrieval characterised by using metadata generated manually, e.g. tags, keywords, comments, manually generated location and time information
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/904—Details of database functions; Browsing; Visualisation therefor
- G06F18/2411—Classification techniques based on the proximity to a decision surface, e.g. support vector machines
- G06F18/2431—Classification techniques relating to the number of classes; Multiple classes
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object
- G06N99/00—Computing arrangements based on specific computational models; Subject matter not provided for in other groups of this subclass
- G06T11/206—2D image generation; Drawing of charts or graphs
- G06T3/40—Geometric image transformations in the plane of the image; Scaling of whole images or parts thereof
Definitions
- the present disclosure relates to an information processing apparatus and an information processing method.
- supervised learning performs learning based on teacher data (a combination of data and a label assigned to the data).
- when the amount of data to be labeled is large, the cost associated with labeling may increase.
- Patent Document 1 does not label all input image data, but performs age estimation using labels given only to some representative image data.
- a technique for reducing the cost associated with labeling is disclosed.
- According to the present disclosure, there is provided an information processing apparatus including: a display size determination unit that determines a display size of data based on a distance between the position of the data, arranged based on a likelihood vector obtained by recognition of the data through learning of labels corresponding to classes, and the position of the class to which the data belongs; and a communication control unit that transmits the display size.
- Also provided is an information processing apparatus including: an input unit that accepts a change of a display target; and a display size determination unit that, in response to the change of the display target, determines a display size of data based on a distance between the position of the data, arranged based on a likelihood vector obtained by recognition of the data through learning of labels corresponding to classes, and the position of the class to which the data belongs.
- Further provided is an information processing method in which a processor determines a display size of data based on a distance between the position of the data and the position of the class to which the data belongs.
- FIG. 1 is an explanatory diagram for describing a configuration example of an information processing system 1000 according to an embodiment of the present disclosure.
- the information processing system 1000 according to the present embodiment is an information processing system for labeling data learned in machine learning.
- the data according to the present disclosure is not particularly limited, and may be, for example, image data, audio data, text data, and the like.
- the data that is learned and recognized in the present embodiment is image data.
- in the present embodiment, a case will be described in which the user labels learning image data with the class to which the image data belongs (for example, the type of object included in the image data).
- an information processing system 1000 includes a client terminal 1, a server 2, and a communication network 5.
- FIG. 1 illustrates the client terminal 1 and the server 2 included in the information processing system 1000 according to the present embodiment.
- the client terminal 1 is an information processing apparatus used by a user who performs labeling.
- the user performs labeling by giving a label to the learning image data displayed on the client terminal 1.
- the label information given by the user is provided to the server 2 via the communication network 5.
- the image data for learning displayed by the client terminal 1 may be provided from the server 2 to the client terminal 1 via the communication network 5.
- the client terminal 1 receives from the server 2 the graph information for visualizing the recognition result and the display size of the image data, and performs display processing based on the graph information and the display size.
- the server 2 is an information processing apparatus that learns labeled image data and recognizes image data.
- the labeled image data is so-called teacher data, and is obtained by associating the label information received from the client terminal 1 via the communication network 5 with the image data.
- the server 2 generates a graph for visualizing the recognition result to be displayed by the client terminal 1, determines the display size of each image data based on the recognition result, and transmits the graph information and the display size to the client terminal 1.
- a configuration example of the server 2 will be described later with reference to FIG.
- the communication network 5 is a wired or wireless transmission path for information transmitted from a device or system connected to the communication network 5.
- the communication network 5 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various LANs including Ethernet (Registered Trademark), a WAN (Wide Area Network), and the like.
- the communication network 5 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- the control unit 10 controls each configuration of the client terminal 1.
- the control unit 10 also functions as a communication control unit 101, a display control unit 103, and a label control unit 105, as shown in FIG.
- the communication control unit 101 controls communication by the communication unit 12.
- the communication control unit 101 controls the communication unit 12 to receive image data, graph information, a display size, and the like from the server 2.
- the communication control unit 101 controls the communication unit 12 to transmit label information to the server 2.
- the label information may be generated by a label control unit 105, which will be described later, based on a user input, for example.
- the display control unit 103 controls display by the display unit 14 (performs processing related to display). For example, the display control unit 103 causes the display unit 14 to display image data received from the server 2. Further, the display control unit 103 may cause the display unit 14 to display the image data based on the graph information received from the server 2 and the display size.
- the label control unit 105 generates label information based on user input.
- the label information generated by the label control unit 105 may include a label and information that associates the label with the image data (for example, identification information of the image data).
- the label control unit 105 may generate the label information by associating the image data displayed on the display unit 14 with the user input obtained via the operation unit 16.
- the label information generated by the label control unit 105 is transmitted to the server 2 by communication control of the communication control unit 101.
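As a rough illustration of the flow above, the label information generated by the label control unit 105 can be thought of as a label plus the identification information of the labeled image data. The helper and field names below are hypothetical assumptions for illustration, not taken from the present disclosure.

```python
# Hypothetical sketch: package a user-entered label together with the
# identification information of the image data it was given to.
# The field names ("image_id", "label") are illustrative assumptions.

def make_label_info(image_id: str, label: str) -> dict:
    """Associate a label with the image data identified by image_id."""
    return {"image_id": image_id, "label": label}

# e.g. the user labeled image data D120 as "Car"; a dict like this would
# then be transmitted to the server 2 under communication control.
label_info = make_label_info("D120", "Car")
```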
- the communication unit 12 (reception unit) is a communication interface that mediates communication with other devices.
- the communication unit 12 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device via, for example, the communication network 5 shown in FIG. Thereby, for example, the client terminal 1 can transmit label information to the server 2 connected to the communication network 5 and can receive graph information and display size from the server 2.
- the display unit 14 is a display that displays various screens under the control of the display control unit 103.
- the display unit 14 may display a plurality of image data or a label input screen.
- the display unit 14 may be realized in various forms according to the form of the client terminal 1.
- the operation unit 16 receives a user input and provides it to the control unit 10. For example, the user may select the image data to be labeled by operating the operation unit 16. Note that the user may select one or more image data to be labeled. Further, the user may operate the operation unit 16 and input a label to be given to the image data. In addition, the user may operate the operation unit 16 to input a change in display target (for example, enlargement, movement, rotation, etc. of the display target range).
- the operation unit 16 may be realized by, for example, a mouse, a keyboard, a touch panel, a button, a switch, a line-of-sight input device, a gesture input device, a voice input device, or the like.
- FIG. 2 is an explanatory diagram for explaining a configuration example of the server 2 according to the present embodiment.
- the server 2 is an information processing apparatus that includes a control unit 20, a communication unit 22, and a storage unit 24.
- the control unit 20 controls each component of the server 2. Further, as illustrated in FIG. 2, the control unit 20 also functions as a communication control unit 201, a learning unit 202, a recognition unit 203, an arrangement control unit 204, and a display size determination unit 205.
- the communication control unit 201 controls communication by the communication unit 22.
- the communication control unit 201 controls the communication unit 22 to receive label information from the client terminal 1.
- the communication control unit 201 controls the communication unit 22 to transmit image data, graph information, display size, and the like.
- the learning unit 202 performs learning based on the labeled image data in which the label information received from the client terminal 1 is associated with the image data.
- the learning method by the learning unit 202 is not particularly limited, but may be a method based on a machine learning algorithm such as a neural network, a decision tree, or SVM (Support Vector Machine).
- through the learning by the learning unit 202, a classifier that classifies image data into one of a plurality of classes is constructed.
- the recognition unit 203 specifies a likelihood vector including a likelihood indicating how likely each image data is to belong to each class, and may classify each image data into a class based on the likelihood vector. For example, the recognition unit 203 may classify the image data into the class corresponding to the highest likelihood in the likelihood vector related to the image data. Further, the recognition unit 203 may classify the image data into the class corresponding to the highest likelihood only when that highest likelihood value is equal to or greater than a predetermined threshold.
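The classification rule described above can be sketched as follows. The class names, likelihood values, and threshold are illustrative assumptions, and a `None` return stands in for "not classified":

```python
# Sketch of the recognition rule: pick the class with the highest
# likelihood in the likelihood vector, optionally only when that highest
# likelihood is at or above a predetermined threshold.

def classify(likelihood_vector: dict, threshold: float = 0.0):
    """Return the most likely class, or None when the highest
    likelihood falls below the threshold (not confident enough)."""
    best = max(likelihood_vector, key=likelihood_vector.get)
    if likelihood_vector[best] >= threshold:
        return best
    return None

print(classify({"Car": 0.7, "Plane": 0.2, "Ship": 0.1}))         # Car
print(classify({"Car": 0.4, "Plane": 0.35, "Ship": 0.25}, 0.5))  # None
```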
- image data classified into a class may be referred to as image data belonging to the class.
- the arrangement control unit 204 arranges the image data based on the likelihood vector in the recognition of the image data by the recognition unit 203. For example, the arrangement control unit 204 may arrange each image data by generating a graph that visualizes the recognition result based on the recognition result by the recognition unit 203.
- the method of generating a graph by the arrangement control unit 204 is not particularly limited; for example, the generated graph may be a graph based on a dynamic model (a force-based graph). In such a case, the arrangement control unit 204 generates the graph by assigning forces to the vertices and edges of the graph and finding a stable state with low mechanical energy.
- in the following, graph generation by the arrangement control unit 204 based on the dynamic model will be described as an example.
- the arrangement control unit 204 may set edges between classes and assign to each inter-class edge a force based on the similarity between the classes. For example, the force assigned to an inter-class edge may be set to increase as the similarity between the classes increases. With such a configuration, similar classes are easily arranged at close positions, and image data belonging to those classes are also easily arranged at close positions.
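A force-based placement of this kind might be sketched as below: classes attract in proportion to their similarity and repel uniformly, and the positions are iterated toward a low-energy state. The update rule, constants, and similarity values are invented for illustration and are not taken from the disclosure.

```python
import math

# Rough sketch of a dynamic-model (force-based) layout: attraction
# proportional to inter-class similarity, repulsion ~ 1/d^2, iterated
# with small explicit steps. All constants are illustrative assumptions.

def layout(positions, similarity, steps=200, dt=0.05):
    """positions: {name: [x, y]} (mutated in place);
    similarity: {(a, b): value in [0, 1]}."""
    names = list(positions)
    for _ in range(steps):
        for a in names:
            fx = fy = 0.0
            for b in names:
                if a == b:
                    continue
                dx = positions[b][0] - positions[a][0]
                dy = positions[b][1] - positions[a][1]
                dist = math.hypot(dx, dy) or 1e-9
                s = similarity.get((a, b), similarity.get((b, a), 0.0))
                # attractive spring scaled by similarity, minus repulsion
                f = s * dist - 1.0 / dist ** 2
                fx += f * dx / dist
                fy += f * dy / dist
            positions[a][0] += dt * fx
            positions[a][1] += dt * fy
    return positions

pos = layout({"Car": [0.0, 0.0], "Plane": [1.0, 0.0], "Ship": [0.0, 2.0]},
             {("Car", "Plane"): 0.8, ("Car", "Ship"): 0.1})
```

With these example values, the highly similar Car and Plane classes settle closer together than the weakly similar Car and Ship classes, which is the behavior the edge forces are meant to produce.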
- the similarity between classes may be specified based on the likelihood vectors of the image data belonging to each class, for example. For instance, the similarity between a first class and a second class may be specified based on the average value of the likelihood corresponding to the second class in the likelihood vectors of the image data classified into the first class, or the average value of the likelihood corresponding to the first class in the likelihood vectors of the image data classified into the second class.
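The averaging just described might be computed as in the sketch below; the sample likelihood vectors are invented for illustration.

```python
# Sketch: inter-class similarity as the average likelihood that images
# classified into one class assign to the other class.

def class_similarity(vectors_in_first: list, second_class: str) -> float:
    """Average of the likelihood for `second_class` over the likelihood
    vectors of image data classified into the first class."""
    return sum(v[second_class] for v in vectors_in_first) / len(vectors_in_first)

# Two images classified as Car, with their (assumed) likelihood vectors:
car_vectors = [{"Car": 0.9, "Plane": 0.1}, {"Car": 0.6, "Plane": 0.4}]
print(class_similarity(car_vectors, "Plane"))  # 0.25
```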
- the display size determination unit 205 determines the display size of the image data based on the distance between the position of the image data arranged by the arrangement control unit 204 and the position of the class to which the image data belongs. For example, the display size determination unit 205 may determine the display size such that the display size of the image data increases as the distance increases. With such a configuration, image data with a low likelihood and a high possibility of being misrecognized has a larger display size, and the user can more easily find data for which a correct label is desired.
- the display size determination unit 205 may determine the display size based further on the class radius described above. For example, the display size may be determined such that the display size increases as the ratio of the distance to the class radius increases. According to such a configuration, the display size is normalized by the class radius. For example, even if the image data belongs to a class having a small class radius, image data that is likely to be erroneously recognized is displayed large.
- the display size determination unit 205 may determine the display size so that the display size of image data is larger than that of other image data when the distance related to the image data, or the ratio of the distance to the class radius, is larger than a predetermined threshold.
- as a result, the image data at the outer edge of the cloud (the circle indicating the class range) is displayed large, and the other image data is displayed small.
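Combining the distance, class radius, and threshold described above, the display-size rule could be sketched as follows. The base size, growth factor, and threshold value are illustrative assumptions, not values from the disclosure.

```python
import math

# Sketch: display size grows with the ratio of the data's distance from
# its class position to the class radius; data beyond a threshold ratio
# (the outer edge of the cloud) is shown large.

def display_size(data_pos, class_pos, class_radius,
                 base=32.0, factor=2.0, threshold=0.8):
    dist = math.hypot(data_pos[0] - class_pos[0], data_pos[1] - class_pos[1])
    ratio = dist / class_radius          # normalized by the class radius
    if ratio > threshold:                # likely misrecognized: display large
        return base * factor
    return base * (1.0 + ratio)          # otherwise grow gradually with distance

print(display_size((0.9, 0.0), (0.0, 0.0), 1.0))  # 64.0 (outer edge)
print(display_size((0.3, 0.0), (0.0, 0.0), 1.0))  # 41.6 (near the center)
```

Normalizing by the class radius means that even in a class with a small radius, data far from the class position relative to that radius is still displayed large, matching the behavior described above.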
- the communication unit 22 is a communication interface that mediates communication with other devices.
- the communication unit 22 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device via, for example, the communication network 5 illustrated in FIG.
- the server 2 can receive the label information from the client terminal 1 connected to the communication network 5, and can transmit the graph information and the display size to the client terminal 1.
- the storage unit 24 stores a program and parameters for each component of the server 2 to function.
- the storage unit 24 stores learning image data and label information received from the client terminal 1.
Example of operation
- The configuration example of the information processing system 1000 according to the present embodiment has been described above. Subsequently, an operation example of the information processing system 1000 according to the present embodiment will be described. In the following, first, the processing flow of the information processing system 1000 will be described with reference to FIG. 3, and then examples of screen transitions displayed on the client terminal 1 in the present embodiment will be described with reference to FIGS. 4 to 10.
- FIG. 3 is a sequence diagram showing a processing flow of the information processing system 1000 according to the present embodiment.
- first, image data for learning is transmitted from the server 2 to the client terminal 1 (S100). Subsequently, the client terminal 1 displays the image data (S102). In step S102, since no label has been given to the image data and the server 2 has not yet recognized the image data, the image data may all be displayed at random positions with the same display size, for example.
- subsequently, image data to be labeled is selected (S104).
- subsequently, a label is given to the image data selected in step S104 (S106).
- the labeling may be performed by the user inputting a label, or by the user selecting a label from among labels prepared in advance or labels already input by the user.
- in steps S104 and S106, a plurality of image data may be selected and labeled, or steps S104 and S106 may be repeated to give a plurality of labels.
- the label information obtained by labeling in step S106 is transmitted from the client terminal 1 to the server 2 (S108).
- the learning unit 202 of the server 2 that has received the label information performs learning using the labeled image data (S110).
- the recognition unit 203 of the server 2 identifies a likelihood vector related to the image data based on the learning result of step S110, and recognizes (classifies) the image data based on the likelihood vector (S112).
- the arrangement control unit 204 of the server 2 arranges the class and the image data by generating a graph based on the likelihood vector (S114). Based on the arranged class and the position of the image data, the display size determining unit 205 of the server 2 determines the display size (S116).
- the graph information (graph information) generated in step S114 and the display size determined in step S116 are transmitted from the server 2 to the client terminal 1 (S118).
- if the user selects that labeling has ended (YES in S122), the process is finished.
- in that case, the server 2 is notified that the labeling is completed, and the label corresponding to the class to which each image data currently belongs may be given as the label of that image data.
- if it is selected in step S122 that labeling has not ended (NO in S122), the user operates the client terminal 1 to select a cloud that includes image data that the user wants to label (S124).
- the display control unit 103 of the client terminal 1 changes the display size of the image data included in the selected cloud to the display size received from the server 2 in step S118 (S126). Subsequently, the process returns to step S104 and is repeated until it is selected by the user that labeling has been completed.
- FIGS. 4 to 10 are explanatory diagrams illustrating screen examples displayed on the display unit 14 by the display control unit 103 of the client terminal 1 according to the present embodiment. In the following description, the processing steps shown in FIG. 3 are referred to as appropriate.
- the image data D120 is selected in step S104 of FIG. 3, the image data D120 is enlarged and can be labeled as shown in the screen G200 shown in FIG.
- the user can easily perform labeling while checking details of the image data D120.
- the display is updated as shown in a screen G300 shown in FIG. 6 in step S120 of FIG.
- the cloud C10 corresponding to the class Car is displayed, and the image data D102, D104, D112, D114, D120, D122, D124, and D126 belonging to the class Car are displayed in the cloud C10.
- image data that has a low likelihood for the class Car corresponding to the cloud C10, and is therefore likely to have been erroneously recognized, tends to be displayed at a position far from the center position of the cloud C10 (the position of the class Car).
- the display size of the image data is changed as shown in a screen G400 shown in FIG. 7 in step S126 of FIG.
- the image data D104, D114, D122, and D126 arranged at positions far from the center position C12 of the cloud C10 (the position of the class Car) are displayed with a larger display size than the other image data.
- the image data D126 is enlarged and can be labeled as shown in the screen G500 in FIG.
- step S120 in FIG. 3 the display is updated as shown in a screen G600 shown in FIG.
- a cloud C10 corresponding to the class Car and a cloud C20 corresponding to the class Plane are displayed.
- image data D102, D112, D120, and D124 belonging to the class Car are displayed in the cloud C10, and image data D104, D114, D122, and D126 belonging to the class Plane are displayed in the cloud C20, respectively.
- image data that has a low likelihood related to the class corresponding to each cloud and is likely to be misrecognized is likely to be displayed at a position far from the center position of each cloud.
- in step S122 of FIG. 3, the user confirms the screen G600 and selects that labeling has been completed.
- the display size of each image data may be changed to the display size received from the server 2.
- a range in the screen may be selected instead of the cloud, and the display size of the image data included in the selected range may be changed to the display size received from the server 2.
- image data may be selected instead of the cloud, and the display size of the selected image data may be changed to the display size received from the server 2.
- image data is clustered and displayed; therefore, when a plurality of image data are selected and given the same label, the user can more easily select the image data.
- the above clustering may be performed by the recognition unit 203 of the server 2, for example.
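One simple realization of the clustering mentioned above, assuming the recognition unit groups image data by the class with the highest likelihood so that similar images can be selected and labeled together (the image identifiers and likelihood values are invented for illustration):

```python
# Sketch: group image data by the top class in each likelihood vector.

def cluster_by_top_class(likelihoods: dict) -> dict:
    """likelihoods: {image_id: {class: likelihood}}
    returns: {class: [image_id, ...]}"""
    clusters = {}
    for image_id, vector in likelihoods.items():
        top = max(vector, key=vector.get)
        clusters.setdefault(top, []).append(image_id)
    return clusters

clusters = cluster_by_top_class({
    "D102": {"Car": 0.9, "Plane": 0.1},
    "D104": {"Car": 0.3, "Plane": 0.7},
    "D112": {"Car": 0.8, "Plane": 0.2},
})
print(clusters)  # {'Car': ['D102', 'D112'], 'Plane': ['D104']}
```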
- in the above, an example has been described in which the arrangement control unit 204 arranges classes and image data by generating a graph based on a dynamic model (a force-based graph); however, the present technology is not limited to such an example.
- FIG. 11 is an explanatory diagram for explaining a configuration example of the client terminal 1-2 according to the present modification.
- the client terminal 1-2 is an information processing apparatus including a control unit 11, a communication unit 12, a display unit 14, and an operation unit 16. Note that, in the configuration illustrated in FIG. 11, configurations substantially the same as those illustrated in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted.
- the communication control unit 102 controls communication by the communication unit 12 in the same manner as the communication control unit 101 described with reference to FIG.
- for example, the communication control unit 102 controls the communication unit 12 to receive image data, likelihood vectors, and the like from the server 2.
- further, the communication control unit 102 controls the communication unit 12 to transmit label information to the server 2.
- the arrangement control unit 107 arranges image data and classes based on the likelihood vector received from the server 2.
- the arrangement method of the image data and the class by the arrangement control unit 107 is the same as the arrangement method by the arrangement control unit 204 described with reference to FIG.
- FIG. 12 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
- the information processing apparatus 900 illustrated in FIG. 12 can implement, for example, the client terminal 1, the server 2, or the client terminal 1-2 illustrated in FIGS.
- Information processing by the client terminal 1, the server 2, or the client terminal 1-2 according to the present embodiment is realized by cooperation of software and hardware described below.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
- the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901 can form, for example, the control unit 10 illustrated in FIG. 1, the control unit 20 illustrated in FIG. 2, and the control unit 11 illustrated in FIG. 11.
- the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
- the host bus 904a is connected to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus via a bridge 904.
- the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and these functions may be mounted on one bus.
- the input device 906 is realized by a device with which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever.
- the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
- the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
- by operating the input device 906, a user of the information processing apparatus 900 can input various data and instruct the information processing apparatus 900 to perform processing operations.
- the input device 906 can form, for example, the operation unit 16 shown in FIG.
- the output device 907 is formed of a device that can notify the user of the acquired information visually or audibly. Examples of such devices include CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, display devices such as lamps, audio output devices such as speakers and headphones, printer devices, and the like.
- the output device 907 outputs results obtained by various processes performed by the information processing device 900.
- the display device visually displays results obtained by various processes performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly.
- the display device can form, for example, the display unit 14 shown in FIG.
- the storage device 908 is a data storage device formed as an example of a storage unit of the information processing device 900.
- the storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the storage device 908 can form, for example, the storage unit 24 shown in FIG.
- the drive 909 is a reader/writer for storage media and is built into or externally attached to the information processing apparatus 900.
- the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903.
- the drive 909 can also write information to a removable storage medium.
- the connection port 911 is an interface for connecting to external devices, for example a port capable of data transmission via USB (Universal Serial Bus).
- the sensor 915 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measuring sensor, and a force sensor.
- the sensor 915 acquires information on the state of the information processing apparatus 900 itself, such as the posture and movement speed of the information processing apparatus 900, and information on the surrounding environment of the information processing apparatus 900, such as brightness and noise around the information processing apparatus 900.
- Sensor 915 may also include a GPS sensor that receives GPS signals and measures the latitude, longitude, and altitude of the device.
- each of the above components may be realized using general-purpose members or by hardware specialized for the function of each component. The hardware configuration to be used may therefore be changed as appropriate according to the technical level at the time of implementing this embodiment.
- a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be created and implemented on a PC or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- although the above embodiment describes an example in which the data to be labeled is image data, the present technology is not limited to such an example.
- the data to be labeled may be text data or voice data.
- the display size determination unit may determine the display size (character size) of the text data.
- each step in the above embodiment does not necessarily have to be processed chronologically in the order described in the sequence diagrams.
- each step in the processing of the above embodiment may be processed in an order different from that described in the sequence diagrams, or may be processed in parallel.
- (1) An information processing apparatus including: a display size determination unit that determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and a communication control unit that causes the display size to be transmitted.
- (2) The information processing apparatus according to (1), wherein the display size determination unit determines the display size such that the display size increases as the distance from the position of the class to which the data belongs increases.
- (3) The information processing apparatus according to (1) or (2), wherein the display size determination unit determines the display size further based on a radius of a circle indicating the range of the class.
- (4) The information processing apparatus according to (3), wherein the display size determination unit determines the display size such that the display size increases as the ratio of the distance to the radius increases.
- (5) The information processing apparatus according to (3), wherein the display size determination unit determines the display size such that the display size is increased when the ratio of the distance to the radius is larger than a predetermined threshold.
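Taken together, items (2) through (5) amount to a simple size rule: the farther a data item sits from its class centre, relative to the circle marking the class range, the larger it is drawn. The following is a minimal sketch of that rule; the base size, scale factor, and threshold are invented for the example and do not come from the publication.

```python
import math

def display_size(data_pos, class_pos, class_radius,
                 base_size=32, scale=24, threshold=1.0):
    """Sketch of the size rule in items (1)-(5): data whose layout position
    is far from its class centre, relative to the circle that marks the
    class range, is shown larger. All constants here are illustrative
    assumptions, not values from the publication."""
    distance = math.hypot(data_pos[0] - class_pos[0],
                          data_pos[1] - class_pos[1])
    ratio = distance / class_radius
    if ratio > threshold:                 # item (5): enlarge past a threshold
        return base_size + scale * ratio  # item (4): grows with the ratio
    return base_size

print(display_size((3.0, 4.0), (0.0, 0.0), 2.0))  # distance 5, ratio 2.5 -> 92.0
```

Enlarging only the doubtful items this way is what lets the user's eye go straight to the data the recognizer is least sure about when labeling.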
- (6) The information processing apparatus according to any one of (1) to (5), further including an arrangement control unit that arranges the data, wherein the arrangement control unit arranges the data by generating a graph having the class and the data as vertices.
- (7) The information processing apparatus according to (6), wherein the arrangement control unit generates the graph based on a dynamic model.
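Items (6) and (7) describe arranging classes and data items as vertices of a graph laid out by a dynamic (force-directed) model, with the edge between a data item and a class pulling harder the larger that class's likelihood in the item's likelihood vector. The toy sketch below illustrates the idea; the update rule and all constants are deliberately simplified assumptions, not the publication's algorithm.

```python
import random

def force_layout(classes, likelihoods, iterations=200,
                 attract=0.05, repel=0.001, seed=0):
    """Toy force-directed layout: classes and data items are graph
    vertices; data-class edges attract in proportion to the likelihood
    (items (8)-(9)), and a weak softened repulsion between all vertex
    pairs keeps the graph spread out. Illustrative sketch only."""
    rng = random.Random(seed)
    pos = {v: [rng.random(), rng.random()]
           for v in list(classes) + list(likelihoods)}
    for _ in range(iterations):
        # attraction along data-class edges, weighted by likelihood
        for d, likes in likelihoods.items():
            for c, w in likes.items():
                for axis in (0, 1):
                    pos[d][axis] += attract * w * (pos[c][axis] - pos[d][axis])
        # softened pairwise repulsion so vertices do not collapse together
        verts = list(pos)
        for i, a in enumerate(verts):
            for b in verts[i + 1:]:
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                soft = dx * dx + dy * dy + 0.05
                pos[a][0] += repel * dx / soft
                pos[a][1] += repel * dy / soft
                pos[b][0] -= repel * dx / soft
                pos[b][1] -= repel * dy / soft
    return pos

layout = force_layout(["cat", "dog"],
                      {"img1": {"cat": 0.9, "dog": 0.1},
                       "img2": {"cat": 0.2, "dog": 0.8}})
```

At equilibrium each data item settles near the likelihood-weighted blend of its class positions, so confidently recognized images cluster tightly around their class while ambiguous ones drift between classes.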
- (12) The information processing apparatus according to any one of (1) to (11), further including a display control unit that displays the data at the display size.
- (13) The information processing apparatus according to any one of (1) to (12), wherein the data includes image data.
- (14) An information processing apparatus including: a reception unit that receives a display size of data determined based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and a processing unit that performs processing based on the display size.
- (15) An information processing method including: determining, by a processor, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and causing the display size to be transmitted.
- (16) An information processing apparatus including: an input unit that accepts a change of a display target; and a display size determination unit that, in response to the change of the display target, determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs.
- (17) An information processing method including: accepting a change of a display target; and determining, by a processor, in response to the change of the display target, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs.
Abstract
Description
<<1. Configuration example>>
<1-1. Overall configuration>
<1-2. Client terminal>
<1-3. Server>
<<2. Operation example>>
<2-1. Processing flow>
<2-2. Screen transition example>
<<3. Modifications>>
<3-1. Modification 1>
<3-2. Modification 2>
<3-3. Modification 3>
<3-4. Modification 4>
<<4. Hardware configuration example>>
<<5. Conclusion>>
First, a configuration example of an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram for describing a configuration example of an information processing system 1000 according to an embodiment of the present disclosure. The information processing system 1000 according to the present embodiment is an information processing system for labeling the data to be learned in machine learning.
The client terminal 1 is an information processing apparatus used by a user who performs labeling. The user performs labeling by giving labels to the learning image data displayed on the client terminal 1. The label information given by the user is provided to the server 2 via the communication network 5. Note that the learning image data displayed by the client terminal 1 may be provided from the server 2 to the client terminal 1 via the communication network 5.
The overall configuration example of the information processing system 1000 according to the present embodiment has been described above. Next, a configuration example of the client terminal 1 according to the present embodiment will be described. As illustrated in FIG. 1, the client terminal 1 according to the present embodiment is an information processing apparatus including a control unit 10, a communication unit 12, a display unit 14, and an operation unit 16. The client terminal 1 may be, for example, a PC (Personal Computer), a mobile phone, a smartphone, a tablet PC, or a wearable device such as an HMD (Head Mounted Display).
The configuration example of the client terminal 1 according to the present embodiment has been described above. Next, a configuration example of the server 2 according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is an explanatory diagram for describing a configuration example of the server 2 according to the present embodiment. As illustrated in FIG. 2, the server 2 is an information processing apparatus including a control unit 20, a communication unit 22, and a storage unit 24.
The configuration example of the information processing system 1000 according to the present embodiment has been described above. Next, an operation example of the information processing system 1000 according to the present embodiment will be described with reference to FIGS. 3 to 10. In the following, the processing flow of the information processing system 1000 is first described with reference to FIG. 3, and then examples of the transition of the screens displayed on the client terminal 1 in the present embodiment are described with reference to FIGS. 4 to 10.
FIG. 3 is a sequence diagram illustrating the processing flow of the information processing system 1000 according to the present embodiment.
The processing flow of the information processing system 1000 according to the present embodiment has been described above. Next, specific examples of the transition of the screens displayed on the client terminal 1 when image data is labeled will be described with reference to FIGS. 4 to 10. FIGS. 4 to 10 are explanatory diagrams showing examples of screens that the display control unit 103 of the client terminal 1 according to the present embodiment causes the display unit 14 to display. In the following, the description refers to the processing steps shown in FIG. 3 as appropriate.
An embodiment of the present disclosure has been described above. In the following, several modifications of the embodiment of the present disclosure will be described. Each of the modifications described below may be applied to the embodiment of the present disclosure alone or in combination. Each modification may also be applied in place of, or in addition to, the configuration described in the embodiment of the present disclosure.
In the above embodiment, an example has been described in which, after a cloud is selected in step S124, the display size of the image data included in the selected cloud is changed to the display size received from the server 2 (S126); however, the present technology is not limited to such an example.
In the above embodiment, an example has also been described in which, in the initial state before any labeling has been performed (step S102, screen G100), the image data are displayed at random positions, all at the same display size; however, the present technology is not limited to such an example. For example, the result of clustering the image data by an unsupervised learning method such as the k-means method may be displayed in step S102.
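The modification above suggests showing an unsupervised clustering result on the initial, label-free screen. The following is a minimal k-means sketch of that idea, with naive seeding and invented 2-D feature vectors purely for illustration (a real implementation would use k-means++ seeding or random restarts):

```python
def kmeans(points, k, iterations=50):
    """Minimal k-means sketch of the unsupervised pre-clustering suggested
    for the initial screen (step S102), before any labels exist. Feature
    vectors, k, and the seeding strategy are illustrative assumptions."""
    centers = [list(p) for p in points[:k]]  # naive seeding for the sketch
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centers[i])))
            clusters[nearest].append(p)
        centers = [[sum(dim) / len(cl) for dim in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two obvious groups of 2-D "image features"; the initial screen could then
# place each cluster's images together even before any label exists.
points = [(0.0, 0.1), (0.9, 1.0), (0.1, 0.0), (1.0, 0.9)]
centers, clusters = kmeans(points, 2)
```

Displaying such clusters instead of random positions would let the user label visually similar images in batches from the very first screen.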
In the above embodiment, an example has also been described in which the arrangement control unit 204 arranges the classes and the image data by generating a graph based on a dynamic model (a force-based graph); however, the present technology is not limited to such an example.
In the above embodiment, the functions of the client terminal 1 and the server 2 have been described with reference to FIGS. 1 and 2; however, the present technology is not limited to such an example. For example, the server 2 may have the functions of the client terminal 1 described in the above embodiment, and the client terminal 1 may have the functions of the server 2 described in the above embodiment. In the following, as a modification, an example will be described in which a client terminal has the functions of the arrangement control unit and the display size determination unit described in the above embodiment.
The embodiments of the present disclosure have been described above. Finally, the hardware configuration of the information processing apparatus according to the present embodiment will be described with reference to FIG. 12. FIG. 12 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to the present embodiment. Note that the information processing apparatus 900 shown in FIG. 12 can realize, for example, the client terminal 1, the server 2, or the client terminal 1-2 shown in FIGS. 1, 2, and 11, respectively. Information processing by the client terminal 1, the server 2, or the client terminal 1-2 according to the present embodiment is realized by cooperation between software and the hardware described below.
As described above, according to the embodiments of the present disclosure, labeling of data can be made more efficient.
(1)
A display size determination unit that determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and
a communication control unit that causes the display size to be transmitted;
An information processing apparatus comprising the above.
(2)
The information processing apparatus according to (1), wherein the display size determination unit determines the display size such that the display size increases as the distance from the position of the class to which the data belongs increases.
(3)
The information processing apparatus according to (1) or (2), wherein the display size determination unit determines the display size further based on a radius of a circle indicating the range of the class.
(4)
The information processing apparatus according to (3), wherein the display size determination unit determines the display size such that the display size increases as the ratio of the distance to the radius increases.
(5)
The information processing apparatus according to (3), wherein the display size determination unit determines the display size such that the display size is increased when the ratio of the distance to the radius is larger than a predetermined threshold.
(6)
The information processing apparatus according to any one of (1) to (5), further including an arrangement control unit that arranges the data,
wherein the arrangement control unit arranges the data by generating a graph having the class and the data as vertices.
(7)
The information processing apparatus according to (6), wherein the arrangement control unit generates the graph based on a dynamic model.
(8)
The information processing apparatus according to (7), wherein the arrangement control unit assigns a force based on the likelihood vector to an edge between the class and the data.
(9)
The information processing apparatus according to (8), wherein the arrangement control unit assigns the force such that the larger the likelihood of the class in the likelihood vector, the larger the force assigned to the edge between the class and the data.
(10)
The information processing apparatus according to any one of (6) to (9), wherein the arrangement control unit assigns a force based on similarity between classes to an edge between the classes.
(11)
The information processing apparatus according to (10), wherein the arrangement control unit assigns the force such that the larger the similarity between the classes, the larger the force assigned to the edge between the classes.
(12)
The information processing apparatus according to any one of (1) to (11), further including a display control unit that displays the data at the display size.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the data includes image data.
(14)
A reception unit that receives a display size of data determined based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and
a processing unit that performs processing based on the display size;
An information processing apparatus comprising the above.
(15)
Determining, by a processor, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and
causing the display size to be transmitted;
An information processing method including the above.
(16)
An input unit that accepts a change of a display target; and
a display size determination unit that, in response to the change of the display target, determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs;
An information processing apparatus comprising the above.
(17)
Accepting a change of a display target; and
determining, by a processor, in response to the change of the display target, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs;
An information processing method including the above.
2 Server
5 Communication network
10, 11 Control unit
12 Communication unit
14 Display unit
16 Operation unit
20 Control unit
22 Communication unit
24 Storage unit
101, 102 Communication control unit
103 Display control unit
105 Label control unit
107 Arrangement control unit
109 Display size determination unit
201 Communication control unit
202 Learning unit
203 Recognition unit
204 Arrangement control unit
205 Display size determination unit
Claims (17)
- An information processing apparatus comprising: a display size determination unit that determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and a communication control unit that causes the display size to be transmitted.
- The information processing apparatus according to claim 1, wherein the display size determination unit determines the display size such that the display size increases as the distance from the position of the class to which the data belongs increases.
- The information processing apparatus according to claim 1, wherein the display size determination unit determines the display size further based on a radius of a circle indicating the range of the class.
- The information processing apparatus according to claim 3, wherein the display size determination unit determines the display size such that the display size increases as the ratio of the distance to the radius increases.
- The information processing apparatus according to claim 3, wherein the display size determination unit determines the display size such that the display size is increased when the ratio of the distance to the radius is larger than a predetermined threshold.
- The information processing apparatus according to claim 1, further comprising an arrangement control unit that arranges the data, wherein the arrangement control unit arranges the data by generating a graph having the class and the data as vertices.
- The information processing apparatus according to claim 6, wherein the arrangement control unit generates the graph based on a dynamic model.
- The information processing apparatus according to claim 7, wherein the arrangement control unit assigns a force based on the likelihood vector to an edge between the class and the data.
- The information processing apparatus according to claim 8, wherein the arrangement control unit assigns the force such that the larger the likelihood of the class in the likelihood vector, the larger the force assigned to the edge between the class and the data.
- The information processing apparatus according to claim 6, wherein the arrangement control unit assigns a force based on similarity between classes to an edge between the classes.
- The information processing apparatus according to claim 10, wherein the arrangement control unit assigns the force such that the larger the similarity between the classes, the larger the force assigned to the edge between the classes.
- The information processing apparatus according to claim 1, further comprising a display control unit that displays the data at the display size.
- The information processing apparatus according to claim 1, wherein the data includes image data.
- An information processing apparatus comprising: a reception unit that receives a display size of data determined based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and a processing unit that performs processing based on the display size.
- An information processing method comprising: determining, by a processor, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs; and causing the display size to be transmitted.
- An information processing apparatus comprising: an input unit that accepts a change of a display target; and a display size determination unit that, in response to the change of the display target, determines a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs.
- An information processing method comprising: accepting a change of a display target; and determining, by a processor, in response to the change of the display target, a display size of data based on a distance between a position of the data, arranged based on a likelihood vector obtained by recognition of the data based on learning of labels corresponding to classes, and a position of the class to which the data belongs.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17782077.6A EP3444731A4 (en) | 2016-04-11 | 2017-01-20 | INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD |
JP2018511887A JP6977715B2 (ja) | 2016-04-11 | 2017-01-20 | 情報処理装置、及び情報処理方法 |
US16/081,235 US10891765B2 (en) | 2016-04-11 | 2017-01-20 | Information processing apparatus and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-079006 | 2016-04-11 | ||
JP2016079006 | 2016-04-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017179258A1 true WO2017179258A1 (ja) | 2017-10-19 |
Family
ID=60042516
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/001903 WO2017179258A1 (ja) | 2016-04-11 | 2017-01-20 | 情報処理装置、及び情報処理方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10891765B2 (ja) |
EP (1) | EP3444731A4 (ja) |
JP (1) | JP6977715B2 (ja) |
WO (1) | WO2017179258A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020013274A (ja) * | 2018-07-17 | 2020-01-23 | 富士通株式会社 | 表示プログラム、表示装置及び表示方法 |
WO2020099986A1 (ja) * | 2018-11-15 | 2020-05-22 | 株式会社半導体エネルギー研究所 | コンテンツの分類方法 |
JPWO2021181654A1 (ja) * | 2020-03-13 | 2021-09-16 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190207889A1 (en) * | 2018-01-03 | 2019-07-04 | International Business Machines Corporation | Filtering graphic content in a message to determine whether to render the graphic content or a descriptive classification of the graphic content |
US11494616B2 (en) * | 2019-05-09 | 2022-11-08 | Shenzhen Malong Technologies Co., Ltd. | Decoupling category-wise independence and relevance with self-attention for multi-label image classification |
US11347565B1 (en) * | 2021-06-30 | 2022-05-31 | United Services Automobile Association (Usaa) | System and method for app-to-app content reconfiguration |
CN115190515B (zh) * | 2022-09-14 | 2022-12-23 | 良业科技集团股份有限公司 | 适用于文旅物联控制的通讯数据处理方法、系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001175380A (ja) * | 1999-12-20 | 2001-06-29 | Wellstone Inc | 情報インデックス表示方法とその装置 |
JP2001351127A (ja) * | 2000-06-07 | 2001-12-21 | Sharp Corp | データ表示システム |
JP2009110151A (ja) * | 2007-10-29 | 2009-05-21 | Ricoh Co Ltd | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP2012069097A (ja) * | 2010-08-26 | 2012-04-05 | Canon Inc | データ検索結果の表示方法およびデータ検索結果の表示装置、プログラム |
JP2013025645A (ja) * | 2011-07-22 | 2013-02-04 | Canon Inc | 情報処理装置、情報処理方法およびプログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4465534B2 (ja) * | 2004-03-31 | 2010-05-19 | パイオニア株式会社 | 画像検索方法、装置及びプログラムを記録した記録媒体 |
JP4456617B2 (ja) * | 2007-04-16 | 2010-04-28 | 富士通株式会社 | 類似分析装置、画像表示装置、および画像表示プログラム |
US8774526B2 (en) * | 2010-02-08 | 2014-07-08 | Microsoft Corporation | Intelligent image search results summarization and browsing |
US20130125069A1 (en) * | 2011-09-06 | 2013-05-16 | Lubomir D. Bourdev | System and Method for Interactive Labeling of a Collection of Images |
JP5610655B2 (ja) * | 2011-12-29 | 2014-10-22 | 楽天株式会社 | 情報処理システム、情報処理システムの制御方法、プログラム、及び情報記憶媒体 |
US10198590B2 (en) * | 2015-11-11 | 2019-02-05 | Adobe Inc. | Content sharing collections and navigation |
-
2017
- 2017-01-20 EP EP17782077.6A patent/EP3444731A4/en not_active Ceased
- 2017-01-20 JP JP2018511887A patent/JP6977715B2/ja active Active
- 2017-01-20 US US16/081,235 patent/US10891765B2/en active Active
- 2017-01-20 WO PCT/JP2017/001903 patent/WO2017179258A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001175380A (ja) * | 1999-12-20 | 2001-06-29 | Wellstone Inc | 情報インデックス表示方法とその装置 |
JP2001351127A (ja) * | 2000-06-07 | 2001-12-21 | Sharp Corp | データ表示システム |
JP2009110151A (ja) * | 2007-10-29 | 2009-05-21 | Ricoh Co Ltd | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP2012069097A (ja) * | 2010-08-26 | 2012-04-05 | Canon Inc | データ検索結果の表示方法およびデータ検索結果の表示装置、プログラム |
JP2013025645A (ja) * | 2011-07-22 | 2013-02-04 | Canon Inc | 情報処理装置、情報処理方法およびプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3444731A4 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020013274A (ja) * | 2018-07-17 | 2020-01-23 | 富士通株式会社 | 表示プログラム、表示装置及び表示方法 |
WO2020099986A1 (ja) * | 2018-11-15 | 2020-05-22 | 株式会社半導体エネルギー研究所 | コンテンツの分類方法 |
JPWO2020099986A1 (ja) * | 2018-11-15 | 2021-11-04 | 株式会社半導体エネルギー研究所 | コンテンツの分類方法 |
JPWO2021181654A1 (ja) * | 2020-03-13 | 2021-09-16 | ||
WO2021181654A1 (ja) * | 2020-03-13 | 2021-09-16 | 三菱電機株式会社 | 情報処理装置、プログラム及び情報処理方法 |
JP7130153B2 (ja) | 2020-03-13 | 2022-09-02 | 三菱電機株式会社 | 情報処理装置、プログラム及び情報処理方法 |
KR20220127347A (ko) * | 2020-03-13 | 2022-09-19 | 미쓰비시덴키 가부시키가이샤 | 정보 처리 장치, 프로그램을 기록한 컴퓨터 판독 가능한 기록 매체 및 정보 처리 방법 |
KR102552786B1 (ko) | 2020-03-13 | 2023-07-06 | 미쓰비시덴키 가부시키가이샤 | 정보 처리 장치, 프로그램을 기록한 컴퓨터 판독 가능한 기록 매체 및 정보 처리 방법 |
Also Published As
Publication number | Publication date |
---|---|
US10891765B2 (en) | 2021-01-12 |
US20190035122A1 (en) | 2019-01-31 |
EP3444731A4 (en) | 2019-05-08 |
EP3444731A1 (en) | 2019-02-20 |
JPWO2017179258A1 (ja) | 2019-02-14 |
JP6977715B2 (ja) | 2021-12-08 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018511887 Country of ref document: JP |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 2017782077 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2017782077 Country of ref document: EP Effective date: 20181112 |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17782077 Country of ref document: EP Kind code of ref document: A1 |