WO2018205230A1 - Item search method, apparatus and robot - Google Patents
Item search method, apparatus and robot
- Publication number
- WO2018205230A1 (PCT/CN2017/083965)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- searched
- item
- search
- model
- searching
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40577—Multisensor object recognition
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40594—Two range sensors for recognizing 3-D objects
Definitions
- the present application relates to the field of Internet of Things technologies, and in particular, to an item search method, apparatus, and robot.
- with the development of robot technology, robots such as robot butlers, sweeping robots, item-sorting robots, and nursing robots have emerged. If a robot can be used to find an item, the efficiency with which the user finds the item will be effectively improved.
- the embodiments of the present application mainly address the problem that users are inefficient when searching for items.
- a technical solution adopted by the embodiments of the present application is to provide an item search method, the method comprising: receiving a search task for searching for an item to be searched; acquiring, according to the search task, a 3D model corresponding to the item to be searched; determining a search task group for searching for the item to be searched; and searching for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares the search results in the process of searching for the item to be searched.
- another technical solution adopted by the embodiments of the present application is to provide an item search device comprising: a first receiving module configured to receive a search task for searching for an item to be searched; an obtaining module configured to obtain, according to the search task, a 3D model corresponding to the item to be searched; a determining module configured to determine a search task group for searching for the item to be searched; and a searching module configured to search for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares the search results in the process of searching for the item to be searched.
- a robot comprising: at least one processor; and a memory communicably connected to the at least one processor; wherein the memory stores a program of instructions executable by the at least one processor, the program of instructions being executed by the at least one processor to cause the at least one processor to perform the method described above.
- another technical solution adopted by the embodiment of the present application is to provide a non-transitory computer readable storage medium storing computer executable instructions, the computer executable The instructions are for causing a computer to perform the method as described above.
- another technical solution adopted by the embodiments of the present application is to provide a computer program product, the computer program product comprising: a non-transitory computer readable storage medium and computer program instructions embedded in the non-transitory computer readable storage medium; the computer program instructions comprising instructions for causing a processor to perform the method described above.
- in the embodiments of the present application, the search task group jointly searches for the item to be searched that corresponds to the 3D model.
- this embodiment provides convenience to the user on the one hand, and on the other hand improves both the efficiency with which the user finds the item and the probability of finding it.
- FIG. 1 is a schematic diagram of an application environment of an item search method provided by an embodiment of the present application
- FIG. 2 is a schematic flow chart of an item search method according to an embodiment of the present application.
- FIG. 3 is a schematic flowchart of a method for acquiring a 3D model corresponding to an item to be searched in an item search method according to an embodiment of the present application;
- FIG. 4 is a schematic flowchart of a method for searching for the item to be searched in association with the search task group according to the acquired 3D model in an item search method according to an embodiment of the present application;
- FIG. 5 is another schematic flowchart of a method for searching for the item to be searched in association with the search task group according to the acquired 3D model in an item search method according to an embodiment of the present application;
- FIG. 6 is still another schematic flowchart of a method for searching for the to-be-searched item in combination with the search task group according to the acquired 3D model in an item search method according to an embodiment of the present application;
- FIG. 7 is a schematic flowchart of an item search method according to another embodiment of the present application.
- FIG. 8 is a schematic flowchart diagram of an item search method according to another embodiment of the present application.
- FIG. 9 is a schematic structural diagram of an item search device according to an embodiment of the present application.
- FIG. 10 is a schematic structural diagram of an item search device according to another embodiment of the present application.
- FIG. 11 is a schematic structural diagram of an item search device according to another embodiment of the present application.
- FIG. 12 is a schematic structural diagram of a robot according to an embodiment of the present application.
- FIG. 1 is a schematic diagram of an application environment of an item search method according to an embodiment of the present application.
- the application environment includes: a user 10, a smart terminal 20, and a cloud server 30.
- the user 10 can be any group of people having the same or similar operational behavior, such as a family, a work group, or an individual.
- the user 10 can perform data interaction with the smart terminal 20 by means of voice, text, physical actions, and the like.
- the intelligent terminal 20 can be any suitable type of electronic device that has some logical computing power and provides one or more functions that satisfy the user's intent. For example, a sweeping robot, an item sorting robot, a care robot, and the like.
- the smart terminal 20 has functions of visual search, sound collection and sound recognition, as well as image acquisition and image recognition. After receiving the information such as voice, text, and body motion input by the user 10, the smart terminal 20 can acquire the task issued by the user 10 by functions such as voice recognition or image recognition.
- the smart terminal 20 can access the local area network and the Internet, and after receiving the task, can assist in completing the task by accessing the local area network or the Internet. In this process, the smart terminal 20 can perform data interaction with the cloud server 30, and assist the smart terminal 20 to complete the tasks issued by the user 10 under the powerful computing capabilities of the cloud server 30.
- the tasks issued by the user 10 can be performed by one or more smart terminals 20.
- the plurality of smart terminals 20 can communicate with each other and share data information. Further, the plurality of smart terminals 20 can access the local area network, the Internet, and the cloud server, and are not limited to one smart terminal shown in FIG. 1.
- the cloud server 30 is used to provide cloud services for the smart terminal 20, specifically to meet cloud computing and cloud storage requirements. After receiving a cloud computing request message sent by the smart terminal 20, the cloud server 30 acquires an application or invokes a stored application to perform cloud computing processing, for example, building a 3D model of a certain item. After receiving a cloud storage request message sent by the smart terminal 20, the cloud server 30 can cache the data information. The cloud server 30 can also hold pre-stored data information.
- the application environment of the item search method provided by the embodiment of the present application may be further extended to other suitable application environments, and is not limited to the application environment shown in FIG. 1. Although only one user 10, three smart terminals 20, and one cloud server 30 are shown in FIG. 1, those skilled in the art will understand that, in practice, the application environment may include more or fewer users, smart terminals, and cloud servers.
- FIG. 2 is a schematic flowchart diagram of an item search method according to an embodiment of the present application. As shown in Figure 2, the method includes:
- Step 21 Receive a search task for searching for an item to be searched.
- the smart terminal receives a search task sent by the user; the search task is a request to find an item within a certain space.
- the user can issue a search task to the smart terminal by means of voice input or text input or gesture input.
- the search task includes key information of the item to be searched, and the smart terminal can understand the search task according to the key information.
- the smart terminal can extract keywords of the search task based on voice recognition and semantic understanding.
- the smart terminal may extract a keyword of the search task based on the motion recognition or the like.
- for example, a user can call one or several robots in the household by saying something like "Tom, find the book How the Steel Was Tempered for me". The robot named Tom receives the user's voice input, and the key information is identified as "book" and "How the Steel Was Tempered". Therefore, the current search task is to search for the book How the Steel Was Tempered.
- Step 22 Acquire a 3D model corresponding to the item to be searched according to the search task.
- after receiving the search task, the smart terminal acquires a 3D model corresponding to the item to be searched according to the key information included in the search task, so that the item can be searched for according to the 3D model in the subsequent search process.
- acquiring a 3D model corresponding to the item to be searched includes:
- Step 221 Searching, in the local 3D model set, a 3D model corresponding to the item to be searched;
- Step 222 Determine whether a 3D model corresponding to the item to be searched is found in the local 3D model set.
- Step 223 If the 3D model corresponding to the item to be searched is found in the local 3D model set, the 3D model found is used as a 3D model corresponding to the item to be searched.
- the smart terminal searches for a 3D model of the item to be searched in the local 3D model set by accessing the local area network.
- specifically, the keywords included in the search task may be matched against the text labels corresponding to each 3D model in the local 3D model set; if a label matches, the corresponding 3D model is found.
- the text label corresponding to each 3D model is used to explain the 3D model stored in the local 3D model set.
- the local 3D model set is a pre-established data set containing a 3D model of an item in the current area.
- the items in the current area may be all items in the current area, or items that are frequently used by the user, or items that are small in size, not easy to find, and the like.
- the 3D model in the local 3D model set may be established according to the category of the item, for example, the "book” class corresponds to one or several 3D models, the "mobile phone” class corresponds to one or several 3D models, and the like.
- the entire content of the local 3D model set may be stored on the local server, or may be stored in the smart terminal itself, and may also be stored in the cloud server, which is not limited herein.
- the content in the local 3D model set can be updated according to a specific application scenario.
- the method for obtaining a 3D model corresponding to the item to be searched further includes:
- Step 224 If the 3D model corresponding to the item to be searched is not found in the local 3D model set, search for a 3D model corresponding to the item to be searched in the network;
- Step 225 Determine whether the network searches for a 3D model corresponding to the item to be searched;
- Step 226 If a 3D model corresponding to the item to be searched is found on the network, use the found 3D model as the 3D model corresponding to the item to be searched, and store the found 3D model in the local 3D model set.
- that is, the 3D model of the item to be searched is retrieved from the Internet.
- the 3D model corresponding to the keyword is searched on the network according to the keywords included in the search task. It can be understood that the number of 3D models searched is related to the searched keywords, and the more detailed the keywords, the more accurate the searched 3D model is. However, when there are multiple versions or multiple styles of certain items, the keywords can no longer distinguish the specific 3D model. At this time, the 3D model of the item to be searched can be determined by interacting with the user.
- for example, pictures of the book can be searched for according to the keywords, the found pictures of the book (i.e., cover styles) can be displayed to the user on a screen or by projection, and the user can be prompted by voice that the book has multiple versions and asked to select one.
- the user may select by voice or click on the screen, and after receiving the picture selected by the user, search for the corresponding 3D model according to the picture.
- the method for obtaining the 3D model corresponding to the item to be searched further includes:
- Step 227 If the 3D model corresponding to the item to be searched is not found in the network, search for a picture corresponding to the item to be searched in the network;
- Step 228 Establish, in the cloud, a 3D model of the item to be searched according to the picture, and store the established 3D model in the local 3D model set.
- a 3D model of the item to be searched is established.
- the detailed implementation of building a 3D model of an object from an image is described in the related art and is not limited herein.
- the process of performing object 3D modeling may be performed on a cloud server, and the smart terminal sends the searched image to the cloud server, the cloud server returns the modeling result to the smart terminal, and stores the established 3D model to the local 3D model. In the collection, to refine the content in the local 3D model collection.
- the process of performing 3D modeling of the article can also be performed at the smart terminal.
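- the acquisition fallback described above (try the local 3D model set first, then a network search, then cloud modeling from a picture) can be sketched as follows. This is an illustrative sketch only: the three callables stand in for the local-area-network, Internet, and cloud services the embodiment describes, and their names and signatures are assumptions, not part of the disclosure.

```python
def acquire_3d_model(keywords, local_models, search_network_model,
                     search_network_picture, build_model_in_cloud):
    """Hypothetical sketch of the fallback chain in steps 221-228.

    local_models: dict mapping text labels to 3D models (the local set).
    The three callables are placeholders for the network/cloud services.
    """
    # Steps 221-223: try the pre-established local 3D model set first,
    # matching keywords against each model's text label.
    for label, model in local_models.items():
        if all(kw in label for kw in keywords):
            return model

    # Steps 224-226: fall back to a network search for a ready-made model,
    # caching any hit in the local set for future searches.
    model = search_network_model(keywords)
    if model is not None:
        local_models[" ".join(keywords)] = model
        return model

    # Steps 227-228: last resort - find a picture of the item and have
    # the cloud build a 3D model from it, then store that model locally.
    picture = search_network_picture(keywords)
    model = build_model_in_cloud(picture)
    local_models[" ".join(keywords)] = model
    return model
```

each branch leaves the local 3D model set enriched, matching the embodiment's note that the set is updated over time.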
- Step 23 Determine a search task group that searches for the item to be searched.
- the search task group is a smart terminal set including at least two smart terminals, and the smart terminals in the set can be used to search for items to be searched.
- the search task group can be established by the following method.
- when the robot housekeeper receives the user's task of searching for the item to be searched, the robot housekeeper can notify the robots having a visual search function among all the local robots belonging to the user, thereby establishing the search task group from the selected robots.
- the robot that receives the same search task establishes a search task group.
- alternatively, the robots among the user's local robots that have a visual search function and are in an idle state are used to establish the search task group. Further, the priority of the search task sent by the user can be compared with the priority of the task currently executed by each local robot; if the priority of a robot's currently executed task is lower than the priority of the search task, that robot joins the search task group, so that these robots preferentially perform the search task while their currently executed tasks are cached.
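- a minimal sketch of the priority rule above, assuming each robot exposes a visual-search capability flag and a current task priority (both field names are illustrative, and "higher number means higher priority" is an assumption):

```python
def build_search_task_group(robots, search_priority):
    """Select robots for the search task group per the rules above.

    robots: list of dicts like
        {"name": ..., "has_visual_search": bool,
         "current_task_priority": int or None}   # None = idle
    """
    group = []
    for robot in robots:
        if not robot["has_visual_search"]:
            continue  # only robots with a visual search function qualify
        current = robot["current_task_priority"]
        # Idle robots join directly; busy robots join only if their
        # current task has lower priority than the search task, in
        # which case that task would be cached for later.
        if current is None or current < search_priority:
            group.append(robot["name"])
    return group
```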
- the above-mentioned searching for an item by establishing a search task group can not only improve search efficiency, but also more effectively improve the probability of finding an item.
- Step 24 Search for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares the search results in the process of searching for the item to be searched.
- the smart terminal that acquired the 3D model sends it to the other smart terminals in the search task group, so that the search task group jointly searches for the item to be searched according to the 3D model.
- each intelligent terminal in the search task group can communicate with each other to share the search process and search results.
- the smart terminals in the search task group are preferably robots, and a search task group established by robots can search for the item to be searched in the following manners.
- the searching for the to-be-searched item in association with the search task group according to the acquired 3D model includes:
- Step 24a1 determining a search area corresponding to the robot in the search task group
- Step 24a2 Send a search instruction to the robot in the search task group, so that each robot in the search task group searches in the corresponding search area according to the 3D model.
- that is, the search area is divided among the robots in the search task group, so that each robot searches in its corresponding search region according to the 3D model.
- the search area of the robot in the search task group can be determined according to the current position of the robot. For example, when the search task is received, the sweeping robot is located in the living room, and the article sorting robot is located in the room, it is determined that the search area of the sweeping robot is the living room, and the search area of the article sorting robot is the room. It is also possible to determine the search area of the robot in the search task group based on the number of robots in the search task group and the size of the search area. It is also possible to determine the corresponding search area according to the function property of each robot itself. For example, the search area corresponding to the kitchen robot is the kitchen, and the search area corresponding to the door opening robot is the living room. There are other ways to determine the search area of the robot in the search task group.
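- one simple way to realize the position-based assignment described above can be sketched as follows; the "assign each robot its current room, then distribute uncovered rooms round-robin" rule and all names are illustrative assumptions, since the embodiment leaves the exact division method open:

```python
def assign_search_areas(robot_rooms, areas):
    """Assign each robot the area it is currently in, then distribute
    any remaining areas round-robin so every area is covered.

    robot_rooms: dict robot name -> room the robot is currently in.
    areas: list of all rooms that must be searched.
    """
    # Step 1: a robot standing in a searchable room gets that room.
    assignment = {name: [room] for name, room in robot_rooms.items()
                  if room in areas}
    covered = {room for rooms in assignment.values() for room in rooms}
    # Step 2: hand out any uncovered rooms in round-robin order.
    leftovers = [a for a in areas if a not in covered]
    names = sorted(assignment) or sorted(robot_rooms)
    for i, area in enumerate(leftovers):
        assignment.setdefault(names[i % len(names)], []).append(area)
    return assignment
```

for the example above, a sweeping robot in the living room keeps the living room and an item-sorting robot in the bedroom keeps the bedroom, with any extra room going to one of them.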
- the search task group can be searched with the help of the monitoring system.
- the searching for the item to be searched jointly with the search task group according to the acquired 3D model includes:
- Step 24b1 Obtain a monitoring screen monitored by the monitoring system
- Step 24b2 Search for the item to be searched corresponding to the 3D model according to the monitoring screen.
- the monitor screen refers to the screen obtained by monitoring the search area.
- the monitoring screen may be obtained by the most computationally capable robot in the search task group, or by a robot without locomotion capability, or by a robot that has finished searching its current search range; that robot acquires the monitoring screen from the monitoring system and shares it with the other robots in the task group, so that each robot finds the item to be searched according to the 3D model and the monitoring screen. Alternatively, each robot in the search task group may obtain the monitoring screen from the monitoring system itself.
- the items that the user is looking for may be items that are often easily forgotten, such as glasses, headphones, etc., and the user will habitually place the items in several places. Therefore, the item can be searched for in conjunction with the location where the item is historically placed.
- the searching for the item to be searched jointly with the search task group according to the acquired 3D model includes:
- Step 24c1 Obtain a historical location group corresponding to the item to be searched, where the historical location group records the historical locations of the item to be searched and the number of times the item has been found at each historical location;
- Step 24c2 Send a search instruction to the robot in the search task group, so that each robot in the search task group searches for the item to be searched according to the 3D model and in combination with the historical location group.
- the historical location group is pre-established, which records the location when the item was found and the number of times it was found at the location.
- the historical time corresponding to the historical location group can be customized by the user.
- the historical location group can be stored on each robot, on the cloud server, or on the local server.
- searching for an item based on the historical location group includes sorting the historical locations by the number of times the item was found at each location, and then searching the locations in that order. For example, for glasses: if they were found on the computer desk 5 times in the past, on the bedside table 3 times, and on the washstand 2 times, the robot can search for the glasses in the order computer desk, bedside table, washstand.
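- the ordering rule in the glasses example can be expressed directly; a minimal sketch, assuming the historical location group is represented as a mapping from location to found count:

```python
def search_order(history):
    """Return historical locations sorted by found count, descending.

    history: dict location -> number of times the item was found there.
    """
    return [loc for loc, _ in
            sorted(history.items(), key=lambda kv: kv[1], reverse=True)]
```

for the glasses example, `search_order({"computer desk": 5, "bedside table": 3, "washstand": 2})` yields the order computer desk, bedside table, washstand.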
- the search area of each robot in the search task group may be determined first, and then when the robot searches for the item to be searched in its corresponding search area, the search is started according to the historical position group corresponding to the item to be searched in the area.
- for example, Robot A and Robot B jointly search for the book How the Steel Was Tempered. The search area corresponding to Robot A is the living room, and the search area corresponding to Robot B is the bedroom.
- the book's historical location group in the living room is {(coffee table, 3 times), (sofa, 2 times), (drawer, 1 time)}, and its historical location group in the bedroom is {(bedside table, 3 times), (under the pillow, 1 time)}.
- Robot A therefore searches the living room for the book in the order "coffee table, sofa, drawer", and Robot B searches the bedroom in the order "bedside table, under the pillow". In this process, Robot A and Robot B share their search results.
- after the search area of each robot in the search task group is determined, each robot acquires the monitoring screen of its corresponding area and searches for the item to be searched by combining scanning of the monitoring screen with its own search.
- alternatively, the monitoring screen of each historical location in the historical location group is retrieved, and whether the item to be searched is present is judged from the monitoring screen.
- in addition, the positions already searched and the positions already scanned by the monitoring system can be marked on an indoor three-dimensional map, and the marked indoor three-dimensional map can be shared among the robots; each robot can then selectively filter its search area according to the records on the indoor three-dimensional map.
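- the map-based filtering above amounts to excluding already-marked areas; a minimal sketch, under the simplifying assumption that the shared indoor map records searched areas as a set of area names rather than a full 3D map:

```python
def remaining_areas(assigned_areas, shared_map_searched):
    """Filter out areas already marked as searched on the shared map.

    assigned_areas: ordered list of areas a robot plans to search.
    shared_map_searched: set of areas already searched or scanned,
    as recorded on the shared indoor map (a stand-in representation).
    """
    return [a for a in assigned_areas if a not in shared_map_searched]
```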
- the method further includes:
- Step 25 Determine whether the location where the item to be searched was found belongs to the historical location group
- Step 26 If yes, update the found count corresponding to the location where the item to be searched was found;
- Step 27 If not, record in the historical location group the location where the item to be searched was found, together with a found count for that location.
- for example, if the glasses being searched for are found on the washstand, the found location belongs to the historical location group, and the found count corresponding to "washstand" is updated, for example incremented by 1; if the glasses are found on the coffee table, the found location does not belong to the historical location group, so the "coffee table" location is recorded in the historical location group together with a found count, for example a count of 1.
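- steps 25 to 27 amount to a counter update on the historical location group; a minimal sketch, again representing the group as a mapping from location to found count:

```python
def update_history(history, found_location):
    """Update the historical location group after an item is found.

    history: dict location -> found count (the historical location group).
    """
    if found_location in history:
        # Step 26: the location is already recorded - bump its count.
        history[found_location] += 1
    else:
        # Step 27: new location - record it with an initial count of 1.
        history[found_location] = 1
    return history
```

with the glasses example, finding them on the washstand raises its count from 2 to 3, while finding them on the coffee table adds "coffee table" with a count of 1.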
- the embodiment of the present application provides an item search method that searches for an item by means of a search task group and the 3D model corresponding to the item to be searched.
- during the search process, the item is searched for by dividing areas, invoking the monitoring system, and taking historical locations into account. This embodiment not only improves search efficiency, but also improves the accuracy of finding items.
- FIG. 8 is a schematic flowchart diagram of an item search method according to another embodiment of the present application. As shown in Figure 8, the method includes:
- Step 31 Receive a search task for searching for an item to be searched
- Step 32 Acquire a 3D model corresponding to the item to be searched according to the search task.
- Step 33 Determine a search task group that searches for the item to be searched
- Step 34 Searching for the to-be-searched item in combination with the search task group according to the acquired 3D model, wherein the search task group shares the search result in the process of searching for the item to be searched.
- Step 35 Generate a prompt message for searching for the item to be searched
- Step 36 Receive a confirmation message that determines the item to be searched
- Step 37 The search task group is caused to share the confirmation message, so that the search task group confirms that the search task is completed.
- the smart terminal may prompt the user to find the item by generating a prompt message for searching for the item to be searched.
- the prompt message includes: a specific location of the item, a picture of the item, the item itself, and the like.
- the user then confirms the prompt message. If the found item is confirmed to be the item the user wants, the search task group shares the confirmation message; at this point the search task is completed, and the search task group can be dissolved. If the found item is confirmed not to be the item the user wants, the user can input more features of the item to be searched to the smart terminal, and the search task group searches again. If the item is still not found after the space has been searched, the user is notified that it was not found, the search task group is dissolved, and the task ends.
- the embodiment of the present application provides an item search method that searches for an item via a search task group and the 3D model corresponding to the item to be searched, feeds the search result back to the user, and determines whether to terminate the search task based on the user's confirmation.
- this embodiment not only improves the efficiency of searching for items, but also ensures that the items finally found better match the user's needs.
- FIG. 9 is a schematic structural diagram of an item search apparatus according to an embodiment of the present application.
- the device 40 includes a first receiving module 41, an obtaining module 42, a determining module 43, and a searching module 44.
- the first receiving module 41 is configured to receive a search task for searching for an item to be searched; the obtaining module 42 is configured to acquire a 3D model corresponding to the item to be searched according to the search task; the determining module 43 is configured to determine a search task group for searching for the item to be searched; and the search module 44 is configured to search for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares the search results in the process of searching for the item to be searched.
- the first receiving module 41 sends the received search task for the item to be searched to the obtaining module 42. The obtaining module 42 acquires the 3D model corresponding to the item to be searched according to the search task, the determining module 43 determines the search task group, and the search module 44 searches for the item to be searched according to the 3D model acquired by the obtaining module 42 and the search task group determined by the determining module 43.
- the obtaining module 42 includes a searching unit 421, a first obtaining unit 422, a first searching unit 423, a second obtaining unit 424, a second searching unit 425, and a third obtaining unit 426.
- the searching unit 421 is configured to search for a 3D model corresponding to the item to be searched in the local 3D model set;
- the first obtaining unit 422 is configured to, if a 3D model corresponding to the item to be searched is found in the local 3D model set, use the found 3D model as the 3D model corresponding to the item to be searched.
- a first search unit 423 configured to search for a 3D model corresponding to the item to be searched in the network if the 3D model corresponding to the item to be searched is not found in the local 3D model set; and the second obtaining unit 424 uses If the 3D model corresponding to the item to be searched is searched in the network, the searched 3D model is used as a 3D model corresponding to the item to be searched, and the searched 3D model is stored in the
- the local search unit 425 is configured to search for a picture corresponding to the item to be searched in the network if the 3D model corresponding to the item to be searched is not found in the network; the third obtaining unit 426 And acquiring a 3D model of the item to be searched according to the picture established in the cloud, and storing the established 3D model to the local 3D model set.
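The fallback chain implemented by units 421–426 can be sketched as below. This is a minimal illustration only; the function and variable names (and the dictionary used as the local model set) are assumptions for exposition and not part of the disclosed embodiment.

```python
# Sketch of the fallback chain in obtaining module 42: local set first, then
# the network, then cloud reconstruction from a picture. Every result found
# outside the local set is cached back into it, as the embodiment describes.

def acquire_3d_model(item, local_models, search_network_model,
                     search_network_picture, build_model_in_cloud):
    """Return a 3D model for `item`, trying local cache, network, then cloud."""
    # Searching unit 421 / first obtaining unit 422: look in the local set.
    if item in local_models:
        return local_models[item]

    # First searching unit 423 / second obtaining unit 424: search the
    # network for a model, caching any hit locally.
    model = search_network_model(item)
    if model is not None:
        local_models[item] = model
        return model

    # Second searching unit 425 / third obtaining unit 426: fall back to a
    # picture of the item and a cloud-built model, again cached locally.
    picture = search_network_picture(item)
    model = build_model_in_cloud(picture)
    local_models[item] = model
    return model
```

A second request for the same item is then served from the local set without touching the network.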
- the determining module 43 is specifically configured to determine a search task group for searching for the item to be searched according to the priority of the search task and the priority of tasks currently performed by other robots.
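One way the determining module 43 could weigh priorities is sketched below: a robot joins the group only if the search task outranks what it is currently doing. The numeric priority scheme and all names here are illustrative assumptions, not taken from the patent.

```python
# Illustrative priority comparison for forming the search task group
# (determining module 43). Higher numbers mean more important tasks.

def form_search_task_group(search_priority, robots):
    """`robots` maps robot id -> priority of that robot's current task.
    A robot is recruited only when the search task outranks its current task."""
    return [robot_id for robot_id, current_priority in robots.items()
            if search_priority > current_priority]
```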
- the search module 44 includes a determining unit 441, a third searching unit 442, a fourth obtaining unit 443, a fourth searching unit 444, a fifth obtaining unit 445, and a fifth searching unit 446.
- The determining unit 441 is configured to determine the search area corresponding to each robot in the search task group; the third searching unit 442 is configured to send a search instruction to the robots in the search task group, so that each robot in the search task group searches its corresponding search area according to the 3D model.
- The fourth obtaining unit 443 is configured to acquire a monitoring screen captured by a monitoring system; the fourth searching unit 444 is configured to search the monitoring screen for the item to be searched corresponding to the 3D model.
- The fifth obtaining unit 445 is configured to acquire a historical location group corresponding to the item to be searched, where the historical location group records the historical locations of the item to be searched and the number of times the item to be searched has been found at each historical location.
- The fifth searching unit 446 is configured to send a search instruction to the robots in the search task group, so that each robot in the search task group searches for the item to be searched according to the 3D model in combination with the historical location group.
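A simple way to combine the area assignment of units 441–442 with the historical location group of units 445–446 is to hand high-count areas out first. The sketch below is an assumed illustration; its names, and the simplification that each historical location names the area containing it, are not from the patent.

```python
# Sketch of dispatch in search module 44: rank candidate search areas by how
# often the item was previously found there, then assign one area per robot,
# most promising areas first. Robots beyond the number of areas stay idle.

def dispatch_search(robots, areas, history):
    """`history` maps location -> number of times the item was found there.
    Returns a robot -> area assignment, high-count areas assigned first."""
    ranked = sorted(areas, key=lambda area: history.get(area, 0), reverse=True)
    return dict(zip(robots, ranked))
```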
- When the search result includes information indicating that the item to be searched has been found, the apparatus further includes a judging module 45, a first processing module 46, and a second processing module 47.
- The judging module 45 is configured to judge whether the location where the item to be searched was found belongs to the historical location group; the first processing module 46 is configured to, if it does, update the search count corresponding to that location; the second processing module 47 is configured to, if it does not, record in the historical location group the location where the item was found and the search count corresponding to that location.
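The bookkeeping performed by modules 45–47 reduces to a counter update, sketched below with assumed names (the patent does not prescribe a data structure for the historical location group):

```python
# Minimal sketch of history maintenance: increment the count for a known
# location, or create a new entry with a count of one for a new location.

def update_history(history, found_location):
    """`history` maps location -> number of times the item was found there."""
    if found_location in history:        # judging module 45: known location?
        history[found_location] += 1     # first processing module 46: update count
    else:
        history[found_location] = 1      # second processing module 47: new record
    return history
```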
- The item search apparatus may be a robot, which may be a member of the search task group or a robot independent of the members of the search task group; alternatively, the item search apparatus may be a control mechanism.
- An embodiment of the present application provides an item search apparatus that searches for the item to be searched via a search task group and the 3D model corresponding to the item. During the search, the item is located by dividing the search area, invoking the monitoring system, and drawing on historical locations. This embodiment not only improves search efficiency, but also improves the accuracy of finding the item.
- FIG. 11 is a schematic structural diagram of an item search device according to another embodiment of the present application.
- the device 50 includes a first receiving module 51, an obtaining module 52, a determining module 53, a searching module 54, a generating module 55, a second receiving module 56, and a third processing module 57.
- The first receiving module 51 is configured to receive a search task for searching for an item to be searched; the obtaining module 52 is configured to acquire, according to the search task, a 3D model corresponding to the item to be searched; the determining module 53 is configured to determine a search task group for searching for the item to be searched; the searching module 54 is configured to search for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares search results during the search; the generating module 55 is configured to generate a prompt message indicating that the item to be searched has been found; the second receiving module 56 is configured to receive a confirmation message confirming the item to be searched; and the third processing module 57 is configured to share the confirmation message with the search task group, so that the search task group confirms completion of the search task.
- The first receiving module 51 sends the received search task for the item to be searched to the obtaining module 52; the obtaining module 52 acquires the 3D model corresponding to the item to be searched according to the search task; the determining module 53 determines the search task group; and the searching module 54 searches for the item to be searched according to the 3D model acquired by the obtaining module 52 and the search task group determined by the determining module 53.
- The generating module 55 generates a prompt message indicating that the item to be searched has been found, the prompt message is presented to the user, and the second receiving module 56 receives the user's confirmation message. The confirmation message is passed to the third processing module 57, which processes it to determine whether the search task has ended.
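The prompt-and-confirm flow of modules 55–57 can be sketched as below. The function names, the prompt wording, and the way group members are notified are all illustrative assumptions, not the patent's prescribed interface.

```python
# Sketch of the confirmation loop of device 50: generate a prompt for a found
# candidate, collect the user's confirmation, and share a positive
# confirmation with every robot in the search task group so all of them can
# conclude the search task.

def confirm_search_result(candidate, ask_user, group):
    """`ask_user` shows a prompt and returns True/False; `group` is a list of
    per-robot state dicts. Returns whether the search task is finished."""
    prompt = f"Found a candidate item: {candidate}. Is this it?"  # module 55
    confirmed = ask_user(prompt)                                  # module 56
    if confirmed:
        for robot in group:                                       # module 57
            robot["task_done"] = True   # shared confirmation ends the task
    return confirmed
```

If the user rejects the candidate, no robot's state changes and the group keeps searching.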
- An embodiment of the present application provides an item search device that searches for the item to be searched via a search task group and the corresponding 3D model, feeds the search result back to the user, and determines whether the search task terminates based on the user's confirmation. This embodiment not only improves search efficiency, but also ensures that the item finally found better matches the user's needs.
- FIG. 12 is a schematic structural diagram of a robot according to an embodiment of the present application.
- The robot can execute the item search method described above and includes various types of robots, such as a robot housekeeper, a sweeping robot, an item-sorting robot, and a care robot.
- The robot 60 includes one or more processors 61 and a memory 62; one processor 61 is taken as an example in FIG. 12.
- The processor 61 and the memory 62 can be connected by a bus or other means, as exemplified by the bus connection in FIG. 12.
- the robot that executes the item search method may further include: an input device 63 and an output device 64.
- The memory 62, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the item search method in the embodiments of the present application (for example, the first receiving module 41, the obtaining module 42, the determining module 43, and the searching module 44 shown in FIG. 9). The processor 61 runs the non-volatile software programs, instructions, and modules stored in the memory 62, thereby performing the various functional applications and data processing of the server, that is, implementing the item search method of the above method embodiments.
- The memory 62 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application required by at least one function, and the data storage area may store data created according to the use of the item search device. In addition, the memory 62 can include high-speed random access memory and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 62 can optionally include memory remotely located relative to the processor 61, which can be connected to the item search device over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
- The input device 63 can receive input numeric or character information and generate key signal inputs related to user settings and function control of the item search device.
- Output device 64 can include a display device such as a display screen.
- The one or more modules are stored in the memory 62 and, when executed by the one or more processors 61, perform the item search method in any of the above method embodiments, for example, performing method steps 21 to 24 in FIG. 2, method steps 221 to 226 in FIG. 3, method steps 24a1 to 24a2 in FIG. 4, method steps 24b1 to 24b2 in FIG. 5, method steps 24c1 to 24c2 in FIG. 6, method steps 21 to 27 in FIG. 7, and method steps 31 to 37 in FIG. 8, and implementing the functions of modules 41-44, units 421-426, and units 441-446 in FIG. 9, modules 41-47, units 421-426, and units 441-446 in FIG. 10, and modules 51-57 in FIG. 11.
- An embodiment of the present application provides a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by an electronic device, cause the electronic device to perform the item search method in any of the above method embodiments, for example, performing method steps 21 to 24 in FIG. 2, method steps 221 to 226 in FIG. 3, method steps 24a1 to 24a2 in FIG. 4, method steps 24b1 to 24b2 in FIG. 5, method steps 24c1 to 24c2 in FIG. 6, method steps 21 to 27 in FIG. 7, and method steps 31 to 37 in FIG. 8, and implementing the functions of modules 41-44, units 421-426, and units 441-446 in FIG. 9, modules 41-47, units 421-426, and units 441-446 in FIG. 10, and modules 51-57 in FIG. 11.
- An embodiment of the present application provides a computer program product, including a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the item search method in any of the above method embodiments, for example, performing method steps 21 to 24 in FIG. 2, method steps 221 to 226 in FIG. 3, method steps 24a1 to 24a2 in FIG. 4, method steps 24b1 to 24b2 in FIG. 5, method steps 24c1 to 24c2 in FIG. 6, method steps 21 to 27 in FIG. 7, and method steps 31 to 37 in FIG. 8, and implementing the functions of modules 41-44, units 421-426, and units 441-446 in FIG. 9, modules 41-47, units 421-426, and units 441-446 in FIG. 10, and modules 51-57 in FIG. 11.
- The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Manipulator (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims (21)
- An item search method, comprising: receiving a search task for searching for an item to be searched; acquiring, according to the search task, a 3D model corresponding to the item to be searched; determining a search task group for searching for the item to be searched; and searching for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares search results during the process of searching for the item to be searched.
- The method according to claim 1, wherein the acquiring a 3D model corresponding to the item to be searched comprises: looking up the 3D model corresponding to the item to be searched in a local 3D model set; and if the 3D model corresponding to the item to be searched is found in the local 3D model set, using the found 3D model as the 3D model corresponding to the item to be searched.
- The method according to claim 2, further comprising: if the 3D model corresponding to the item to be searched is not found in the local 3D model set, searching the network for the 3D model corresponding to the item to be searched; if the 3D model corresponding to the item to be searched is found on the network, using the found 3D model as the 3D model corresponding to the item to be searched and storing the found 3D model into the local 3D model set; if the 3D model corresponding to the item to be searched is not found on the network, searching the network for a picture corresponding to the item to be searched; and acquiring a 3D model of the item to be searched built in the cloud according to the picture, and storing the built 3D model into the local 3D model set.
- The method according to claim 1, wherein the determining a search task group for searching for the item to be searched comprises: determining the search task group for searching for the item to be searched according to the priority of the search task and the priorities of the tasks currently performed by other robots.
- The method according to claim 1, wherein the searching for the item to be searched jointly with the search task group according to the acquired 3D model comprises: determining the search area corresponding to each robot in the search task group; and sending a search instruction to the robots in the search task group, so that each robot in the search task group searches its corresponding search area according to the 3D model.
- The method according to claim 1, wherein the searching for the item to be searched jointly with the search task group according to the acquired 3D model comprises: acquiring a monitoring screen captured by a monitoring system; and searching the monitoring screen for the item to be searched corresponding to the 3D model.
- The method according to claim 1, wherein the searching for the item to be searched jointly with the search task group according to the acquired 3D model comprises: acquiring a historical location group corresponding to the item to be searched, the historical location group recording the historical locations of the item to be searched and the number of times the item to be searched has been found at each historical location; and sending a search instruction to the robots in the search task group, so that each robot in the search task group searches for the item to be searched according to the 3D model in combination with the historical location group.
- The method according to claim 7, wherein, when the search result includes information indicating that the item to be searched has been found, the method further comprises: judging whether the location where the item to be searched was found belongs to the historical location group; if so, updating the search count corresponding to the location where the item was found; and if not, recording in the historical location group the location where the item was found and the search count corresponding to that location.
- The method according to any one of claims 1 to 7, wherein, when the search result includes information indicating that the item to be searched has been found, the method further comprises: generating a prompt message indicating that the item to be searched has been found; receiving a confirmation message confirming the item to be searched; and sharing the confirmation message with the search task group, so that the search task group confirms completion of the search task.
- An item search apparatus, comprising: a first receiving module, configured to receive a search task for searching for an item to be searched; an obtaining module, configured to acquire, according to the search task, a 3D model corresponding to the item to be searched; a determining module, configured to determine a search task group for searching for the item to be searched; and a searching module, configured to search for the item to be searched jointly with the search task group according to the acquired 3D model, wherein the search task group shares search results during the process of searching for the item to be searched.
- The apparatus according to claim 10, wherein the obtaining module comprises: a lookup unit, configured to look up the 3D model corresponding to the item to be searched in a local 3D model set; and a first obtaining unit, configured to, if the 3D model corresponding to the item to be searched is found in the local 3D model set, use the found 3D model as the 3D model corresponding to the item to be searched.
- The apparatus according to claim 11, wherein the obtaining module further comprises: a first searching unit, configured to search the network for the 3D model corresponding to the item to be searched if it is not found in the local 3D model set; a second obtaining unit, configured to, if the 3D model corresponding to the item to be searched is found on the network, use the found 3D model as the 3D model corresponding to the item to be searched and store the found 3D model into the local 3D model set; a second searching unit, configured to search the network for a picture corresponding to the item to be searched if the 3D model corresponding to the item to be searched is not found on the network; and a third obtaining unit, configured to acquire the 3D model of the item to be searched built in the cloud according to the picture, and store the built 3D model into the local 3D model set.
- The apparatus according to claim 10, wherein the determining module is specifically configured to determine the search task group for searching for the item to be searched according to the priority of the search task and the priorities of the tasks currently performed by other robots.
- The apparatus according to claim 10, wherein the searching module comprises: a determining unit, configured to determine the search area corresponding to each robot in the search task group; and a third searching unit, configured to send a search instruction to the robots in the search task group, so that each robot in the search task group searches its corresponding search area according to the 3D model.
- The apparatus according to claim 10, wherein the searching module comprises: a fourth obtaining unit, configured to acquire a monitoring screen captured by a monitoring system; and a fourth searching unit, configured to search the monitoring screen for the item to be searched corresponding to the 3D model.
- The apparatus according to claim 10, wherein the searching module comprises: a fifth obtaining unit, configured to acquire a historical location group corresponding to the item to be searched, the historical location group recording the historical locations of the item to be searched and the number of times the item to be searched has been found at each historical location; and a fifth searching unit, configured to send a search instruction to the robots in the search task group, so that each robot in the search task group searches for the item to be searched according to the 3D model in combination with the historical location group.
- The apparatus according to claim 16, wherein, when the search result includes information indicating that the item to be searched has been found, the apparatus further comprises: a judging module, configured to judge whether the location where the item to be searched was found belongs to the historical location group; a first processing module, configured to, if so, update the search count corresponding to the location where the item was found; and a second processing module, configured to, if not, record in the historical location group the location where the item was found and the search count corresponding to that location.
- The apparatus according to any one of claims 10 to 16, wherein, when the search result includes information indicating that the item to be searched has been found, the apparatus further comprises: a generating module, configured to generate a prompt message indicating that the item to be searched has been found; a second receiving module, configured to receive a confirmation message confirming the item to be searched; and a third processing module, configured to share the confirmation message with the search task group, so that the search task group confirms completion of the search task.
- A robot, comprising: at least one processor; and a memory communicatively connected to the at least one processor, wherein the memory stores an instruction program executable by the at least one processor, the instruction program being executed by the at least one processor to cause the at least one processor to perform the method according to any one of claims 1 to 9.
- A non-transitory computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions for causing a computer to perform the method according to any one of claims 1 to 9.
- A computer program product, comprising: a non-transitory computer-readable storage medium and computer program instructions embedded in the non-transitory computer-readable storage medium, the computer program instructions comprising instructions for causing a processor to perform the method according to any one of claims 1 to 9.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019561904A JP6905087B2 (ja) | 2017-05-11 | 2017-05-11 | 物品探索方法、装置及びロボット |
CN201780000587.8A CN107466404B (zh) | 2017-05-11 | 2017-05-11 | 物品搜索方法、装置及机器人 |
PCT/CN2017/083965 WO2018205230A1 (zh) | 2017-05-11 | 2017-05-11 | 物品搜索方法、装置及机器人 |
US16/679,692 US11389961B2 (en) | 2017-05-11 | 2019-11-11 | Article searching method and robot thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/083965 WO2018205230A1 (zh) | 2017-05-11 | 2017-05-11 | 物品搜索方法、装置及机器人 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/679,692 Continuation US11389961B2 (en) | 2017-05-11 | 2019-11-11 | Article searching method and robot thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018205230A1 true WO2018205230A1 (zh) | 2018-11-15 |
Family
ID=60554183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/083965 WO2018205230A1 (zh) | 2017-05-11 | 2017-05-11 | 物品搜索方法、装置及机器人 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11389961B2 (zh) |
JP (1) | JP6905087B2 (zh) |
CN (1) | CN107466404B (zh) |
WO (1) | WO2018205230A1 (zh) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109961074A (zh) * | 2017-12-22 | 2019-07-02 | 深圳市优必选科技有限公司 | 一种查找物品的方法、机器人及计算机可读存储介质 |
CN110019863B (zh) * | 2017-12-26 | 2021-09-17 | 深圳市优必选科技有限公司 | 一种物体查找方法、装置、终端设备和存储介质 |
CN108555909A (zh) * | 2018-04-17 | 2018-09-21 | 子歌教育机器人(深圳)有限公司 | 一种目标寻找方法、ai机器人以及计算机可读存储介质 |
CN108858207A (zh) * | 2018-09-06 | 2018-11-23 | 顺德职业技术学院 | 一种基于远程控制的多机器人协同目标搜索方法及系统 |
CN109472825B (zh) * | 2018-10-16 | 2021-06-25 | 维沃移动通信有限公司 | 一种对象搜索方法及终端设备 |
CN110853135A (zh) * | 2019-10-31 | 2020-02-28 | 天津大学 | 基于养老机器人的室内场景实时重建跟踪服务方法 |
CN113510716A (zh) * | 2021-04-28 | 2021-10-19 | 哈尔滨理工大学 | 一种基于球形电机的仿人型护理机器人 |
CN113542689A (zh) * | 2021-07-16 | 2021-10-22 | 金茂智慧科技(广州)有限公司 | 基于无线物联网的图像处理方法及相关设备 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102982170A (zh) * | 2012-12-14 | 2013-03-20 | 上海斐讯数据通信技术有限公司 | 失物追踪系统及方法 |
CN105573326A (zh) * | 2016-02-03 | 2016-05-11 | 南京聚立工程技术有限公司 | 移动巡检极地机器人自主充电系统及其方法 |
CN106314728A (zh) * | 2016-09-18 | 2017-01-11 | 河海大学常州校区 | 水下搜救机器人、协同控制系统及其工作方法 |
CN205969038U (zh) * | 2016-07-06 | 2017-02-22 | 山东海旭物联网有限公司 | 智能机器人系统 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4254588B2 (ja) * | 2004-03-18 | 2009-04-15 | 沖電気工業株式会社 | 自律ロボットおよびその制御方法 |
US8447863B1 (en) * | 2011-05-06 | 2013-05-21 | Google Inc. | Systems and methods for object recognition |
KR101255950B1 (ko) * | 2011-06-13 | 2013-05-02 | 연세대학교 산학협력단 | 위치기반 건설 현장 관리 방법 및 시스템 |
US9079315B2 (en) * | 2011-08-29 | 2015-07-14 | Neil Davey | Banking automation using autonomous robot |
US8386079B1 (en) * | 2011-10-28 | 2013-02-26 | Google Inc. | Systems and methods for determining semantic information associated with objects |
US10406686B2 (en) * | 2012-12-14 | 2019-09-10 | Abb Schweiz Ag | Bare hand robot path teaching |
JP6370038B2 (ja) * | 2013-02-07 | 2018-08-08 | キヤノン株式会社 | 位置姿勢計測装置及び方法 |
US9355368B2 (en) * | 2013-03-14 | 2016-05-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | Computer-based method and system for providing active and automatic personal assistance using a robotic device/platform |
US9607584B2 (en) * | 2013-03-15 | 2017-03-28 | Daqri, Llc | Real world analytics visualization |
US9342785B2 (en) * | 2013-11-15 | 2016-05-17 | Disney Enterprises, Inc. | Tracking player role using non-rigid formation priors |
US9643779B2 (en) * | 2014-03-31 | 2017-05-09 | Panasonic Intellectual Property Corporation Of America | Article management system and transport robot |
AU2014274647B2 (en) * | 2014-12-12 | 2021-05-20 | Caterpillar Of Australia Pty Ltd | Determining terrain model error |
US20170046965A1 (en) * | 2015-08-12 | 2017-02-16 | Intel Corporation | Robot with awareness of users and environment for use in educational applications |
US10282591B2 (en) * | 2015-08-24 | 2019-05-07 | Qualcomm Incorporated | Systems and methods for depth map sampling |
US9827677B1 (en) * | 2016-05-16 | 2017-11-28 | X Development Llc | Robotic device with coordinated sweeping tool and shovel tool |
US9827678B1 (en) * | 2016-05-16 | 2017-11-28 | X Development Llc | Kinematic design for robotic arm |
US10137567B2 (en) * | 2016-09-20 | 2018-11-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Inventory robot |
US10430657B2 (en) * | 2016-12-12 | 2019-10-01 | X Development Llc | Object recognition tool |
US10140773B2 (en) * | 2017-02-01 | 2018-11-27 | Accenture Global Solutions Limited | Rendering virtual objects in 3D environments |
CN106874092A (zh) * | 2017-02-10 | 2017-06-20 | 深圳市笨笨机器人有限公司 | 机器人任务托管方法及系统 |
CN111741513B (zh) * | 2020-06-18 | 2023-04-07 | 深圳市晨北科技有限公司 | 一种物联网设备的配网方法及相关设备 |
- 2017
  - 2017-05-11 JP JP2019561904A patent/JP6905087B2/ja active Active
  - 2017-05-11 CN CN201780000587.8A patent/CN107466404B/zh active Active
  - 2017-05-11 WO PCT/CN2017/083965 patent/WO2018205230A1/zh active Application Filing
- 2019
  - 2019-11-11 US US16/679,692 patent/US11389961B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102982170A (zh) * | 2012-12-14 | 2013-03-20 | 上海斐讯数据通信技术有限公司 | 失物追踪系统及方法 |
CN105573326A (zh) * | 2016-02-03 | 2016-05-11 | 南京聚立工程技术有限公司 | 移动巡检极地机器人自主充电系统及其方法 |
CN205969038U (zh) * | 2016-07-06 | 2017-02-22 | 山东海旭物联网有限公司 | 智能机器人系统 |
CN106314728A (zh) * | 2016-09-18 | 2017-01-11 | 河海大学常州校区 | 水下搜救机器人、协同控制系统及其工作方法 |
Also Published As
Publication number | Publication date |
---|---|
JP6905087B2 (ja) | 2021-07-21 |
CN107466404B (zh) | 2023-01-31 |
JP2020520508A (ja) | 2020-07-09 |
US20200070348A1 (en) | 2020-03-05 |
CN107466404A (zh) | 2017-12-12 |
US11389961B2 (en) | 2022-07-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018205230A1 (zh) | 物品搜索方法、装置及机器人 | |
CN104714981B (zh) | 语音消息搜索方法、装置及系统 | |
CN107430853B (zh) | 将用于具有选择性离线能力的话音动作的数据本地保存在支持话音的电子装置中 | |
US20150170664A1 (en) | Compartmentalized self registration of external devices | |
WO2019027546A1 (en) | KNOWLEDGE GRAPH FOR SEMANTIC CONVERSATIONAL RESEARCH | |
CN109155748A (zh) | 互联网云托管的自然语言交互式消息传送系统服务器协作 | |
US20140245178A1 (en) | Communication device and method for profiling and presentation of message threads | |
CN109155749A (zh) | 互联网云托管的自然语言交互式消息传送系统会话器 | |
AU2014369911A1 (en) | Providing access to a cloud based content management system on a mobile device | |
JP6404351B2 (ja) | 商品情報を通信および提示するための方法、装置、および、システム | |
CN109076010A (zh) | 互联网云托管的自然语言交互式消息传送系统用户解析器 | |
CN103365893B (zh) | 一种用于实现搜索用户的个体信息的方法和设备 | |
CN109661662A (zh) | 利用外部上下文针对相关性将查询结果进行排名 | |
US9459933B1 (en) | Contention and selection of controlling work coordinator in a distributed computing environment | |
JP2023520483A (ja) | 検索コンテンツ表示方法、装置、電子機器及び記憶媒体 | |
US20160292291A1 (en) | Methods and apparatuses for opening a webpage, invoking a client, and creating a light application | |
CN111970189B (zh) | 一种内容分享控制方法、装置、电子设备和存储介质 | |
WO2018005204A1 (en) | Providing communication ranking scheme based on relationship graph | |
CN104424304A (zh) | 一种基于情景感知信息的个性化推荐与导览系统及控制方法 | |
WO2014152088A2 (en) | Simplified collaborative searching through pattern recognition | |
CN103678624A (zh) | 搜索方法、搜索服务器、搜索请求执行方法及终端 | |
CN103973884A (zh) | 信息的显示方法、装置及终端 | |
US11893427B2 (en) | Method for determining and notifying users of pending activities on CRM data | |
US20150370908A1 (en) | Method, system and computer program for managing social networking service information | |
CN112352401A (zh) | 生成涉及图像文件的智能回复 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17909583 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019561904 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/04/2020) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17909583 Country of ref document: EP Kind code of ref document: A1 |