WO2020044097A1 - Method and apparatus for implementing a location-based service - Google Patents

Method and apparatus for implementing a location-based service

Info

Publication number
WO2020044097A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
information
user
location
terminal
Prior art date
Application number
PCT/IB2018/057165
Other languages
English (en)
Chinese (zh)
Inventor
廖科海
Original Assignee
优视科技新加坡有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 优视科技新加坡有限公司
Publication of WO2020044097A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal

Definitions

  • this application claims priority to the Chinese patent application filed with the Chinese Patent Office on August 27, 2018 under application number 201810986044.4 and entitled "A Method and Device for Realizing Location Services", the entire contents of which are incorporated herein by reference.
  • the present application relates to the field of computer technology, particularly to the field of Internet technology, and in particular to a method and device for implementing location services.
  • Background Art
  • LBS (Location Based Service) is a location-based service.
  • when a user uses LBS, the service usually recommends nearby stores based on the user's location, or the user directly enters information about the target place (such as the name of the target place) to search for and display its corresponding position.
  • the purpose of this application is to provide a method and device for implementing location services, which are used to solve the inconvenience of using the traditional LBS method in some application scenarios.
  • an embodiment of the present application provides a method for implementing a location service, including:
  • the terminal collects image data of a target object at an actual location to obtain real scene data
  • the geographic identification information is provided to a user.
  • an embodiment of the present application further provides an apparatus for implementing a location service, including:
  • An image acquisition module is configured to perform image acquisition for a target object at an actual location to obtain real scene data.
  • the information determining module is configured to determine geographic identification information that matches the target object according to the real scene data.
  • An information providing module is configured to provide the geographic identification information to a user.
  • an embodiment of the present application further provides an electronic device, including: one or more processors;
  • a storage device for storing one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the foregoing method of the present application.
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the foregoing method of the present application is implemented.
  • the method and device for implementing location services provided in this application are different from the traditional LBS method.
  • a user can directly use a terminal to perform image acquisition on a target object such as a building or a store at a certain actual location, and obtain corresponding real scene data after the acquisition.
  • the geographic identification information corresponding to the target object can be determined through the real-world data, so that the geographic identification information can be provided to the user.
  • this method can, to a certain extent, reduce or avoid the tedious steps of searching, viewing, and inferring the corresponding position of the target object on an electronic map through an electronic map application, making it easier for the user to learn relevant information about the target object. It helps to improve the user experience and can minimize the deviation caused by the user's subjective inference.
  • FIG. 1 is a schematic diagram of an interaction architecture between a terminal and a server provided in an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a method for implementing a location service provided by an embodiment of the present application on the basis shown in FIG. 1;
  • FIG. 3 is a schematic diagram of a method for displaying geographic identification information provided by an embodiment of the present application;
  • FIG. 4a is a schematic diagram of displaying the location service information provided by the embodiment of the present application on the basis shown in FIG. 3;
  • FIG. 4b is a schematic diagram of displaying more detailed location service information provided by the embodiment of the present application on the basis of FIG. 4a;
  • FIG. 4c is a schematic diagram of another manner of displaying location service information provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of another interaction architecture provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a method for implementing a location service based on a first terminal side according to an embodiment of the present application
  • FIG. 7 is a schematic flowchart of a method for implementing a location service based on a second terminal side according to an embodiment of the present application
  • FIG. 9 is a schematic structural diagram of a device for implementing location services provided by an embodiment of the present application
  • FIG. 10 is a schematic structural diagram of a device for implementing location services on a first terminal side based on the architecture shown in FIG. 5 according to an embodiment of the present application;
  • FIG. 11 is a schematic structural diagram of a device for implementing location services on a second terminal side based on the architecture shown in FIG. 5 according to an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram of a device for implementing location services on the server side based on the architecture shown in FIG. 5 according to an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • Detailed Description
  • in the traditional LBS method, locations are usually recommended based on the user's position, or the corresponding location is determined by searching based on the information entered by the user and is displayed to the user in the form of an electronic map; however, this way has a certain degree of inconvenience.
  • for example, the user can launch the electronic map application on the terminal, estimate the corresponding position of a building in the electronic map based on the user's location displayed on the electronic map and the actual location of the building seen by the user, and further obtain relevant information such as its name, specific address, or postal code, or plan a route to the building on the electronic map.
  • the above method is relatively inconvenient for the user, and determining the corresponding position of the building in the electronic map mostly depends on the user's subjective judgment, which may cause deviation.
  • in the embodiments of the present application, the terminal may directly perform image collection at an actual location and provide relevant information about a target object at that location to the user, thereby implementing location services; while simplifying the user's operation, it can provide relevant information to the user more accurately.
  • FIG. 1 shows a schematic diagram of an architecture on which a method for implementing a location service according to an embodiment of the present application is based. As shown in FIG. 1, it includes a terminal and a server, where:
  • the terminal can be a mobile communication terminal with an image acquisition function, such as a mobile phone, tablet, laptop, smart watch, smart glasses, or smart camera, or it can be connected to an external image acquisition device (that is, one that is not part of the terminal itself, such as a camera).
  • in practical applications, users can use the terminal to collect images of actual target objects such as buildings, and the terminal can send the collected real scene data to the server.
  • the terminal itself has a positioning function, and the positioning function can optimize the results of the foregoing method in this application.
  • in other embodiments, the terminal may not have a positioning function; this does not constitute a limitation on this application.
  • the server can identify the corresponding target object based on the real scene data captured by the terminal, and determine the relevant information to feed back to the terminal.
  • the server may adopt a cluster architecture or a distributed architecture, or may be only a single server, which may be specifically determined according to actual application requirements and is not limited herein.
  • the server can be considered as a back-end service system for service providers such as navigation service providers, telecom operators, and commercial websites. It should be understood that a service system composed of distributed computers or computing devices should also be considered to be within the scope covered by the server in this application.
  • the interaction mode between the terminal and the server shown in FIG. 1 is not the only way. In other embodiments, if the terminal itself has sufficient processing performance and storage space, the terminal can directly process the real scene data it collects without uploading it to the server. It can be understood that the types and numbers of terminals and servers shown in FIG. 1 are only exemplary; in actual application, the specific interaction method and the types and numbers of equipment used between the terminal and the server will be determined according to the actual application situation, which should not be construed as limiting this application.
  • FIG. 2 a method for implementing a location service provided in an embodiment of the present application is shown, which specifically includes the following steps:
  • Step S201 The terminal collects image data of a target object at an actual location to obtain real-world data.
  • the target object at the actual location described herein can be considered a spatial place at a real location in the real world, and can include but is not limited to: buildings (such as office buildings, shopping malls, etc.), streets, squares, or offline stores (e.g., restaurants, coffee shops, etc.).
  • in practical applications, the user can directly use the terminal to perform image acquisition on the target object at the actual location.
  • the image acquisition may be performed by taking a picture, taking a video, or simply aiming a framing frame at a target object.
  • the user can trigger the terminal to perform image acquisition by issuing a trigger operation on the image acquisition function in the location service application that supports image acquisition installed or running on the terminal.
  • the trigger operation may include, but is not limited to, a tap operation, a slide operation, or a voice input operation.
  • for example, an electronic map application or an application with an electronic map function provides a corresponding real scene collection function, and the user can trigger the real scene collection function in the above application to start the camera of the terminal for image acquisition of the target object at the actual location.
  • the collection of real scene images may also be implemented by a camera application carried in the terminal's operating system; that is, the user launches the camera application and points the lens at the target object at the actual location, so as to achieve image acquisition.
  • the real scene data may be image data or video data, which is not specifically limited herein.
  • Step S203 Determine geographic identification information that matches the target object according to the real scene data.
  • the geographic identification information may include at least: the name of the target object (such as a building name, a store name, etc.), an identification number (such as a number, an ID, etc.), coordinate data, or specific address information.
  • in some embodiments, the server can process the real scene data; that is, the terminal sends the collected real scene data to the server, so that the server determines the geographic identification information matched with the real scene data and feeds it back to the terminal.
  • the terminal itself may also perform the above step S203.
  • Step S205 Provide the geographic identification information to the user.
  • the manner in which the geographic identification information is provided to the user may be diversified: the geographic identification information may be displayed to the user, or may be broadcast to the user in the form of voice (for example, announcing to the user: "The current building is XX Tower A").
  • the display may be performed in a specified interface of the terminal.
  • the specified interface described herein may include, but is not limited to, a photo interface, a viewfinder interface, or a video interface.
  • FIG. 3 shows a display manner of geographic identification information in the embodiment of the present application.
  • the framing interface in the figure can display, in real time, the real scene captured by the camera, and a framing frame selects the target object in the real scene. In FIG. 3, the geographic identification information is the specific name of the building; for the building selected by the framing frame, its specific name is displayed in real time in the form of text.
  • the method shown in FIG. 3 is only an example, and is not the only display method. In actual applications, other display methods can also be used.
  • in some embodiments, the geographic identification information also includes a standard picture and profile information of the target object; in that case, a separate interface can be used on the terminal to present the standard picture and profile information of the target object.
  • when a user uses a smart wearable device supporting virtual reality (VR) or augmented reality (AR) (such as AR glasses, VR helmets, etc.), the geographic identification information can also be rendered and displayed in real time on the screen of the smart wearable device.
  • in this way, the user can directly use the terminal to perform image acquisition on target objects such as buildings or stores at a certain actual location, and the geographic identification information corresponding to the target object can be determined through the real scene data, so that the geographic identification information can be provided to the user.
  • this method can, to a certain extent, reduce or avoid the tedious steps of searching, viewing, and inferring the corresponding position of the target object on an electronic map through an electronic map application, making it easier for the user to learn relevant information about the target object. It helps to improve the user experience and can minimize the deviation caused by the user's subjective inference.
  • the above steps may be executed by the terminal itself, an application running on the terminal, or a server.
  • the execution subject may also change.
  • the execution subject of step S201 may be the terminal itself
  • the execution subject of step S203 may be the server.
  • the specifics will be determined according to the actual application situation, which should not be understood as limiting the present application.
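  • the three steps above (S201, S203, and S205) can be sketched in simplified form as follows; all function and field names here are illustrative assumptions rather than terms from this application, and the plain string comparison merely stands in for real image matching:

```python
# Hypothetical end-to-end sketch of steps S201-S205; names are
# illustrative, not taken from this application.

def collect_real_scene_data(camera):
    """S201: the terminal collects image data of the target object."""
    return camera()  # e.g. a captured frame or a video clip

def determine_geo_identification(real_scene_data, feature_db):
    """S203: match the real scene data against a pre-established feature
    database and return the matching geographic identification
    information (name, ID, coordinates, address)."""
    for record in feature_db:
        # stand-in for image-feature matching:
        if record["feature_data"] == real_scene_data:
            return record["geo_info"]
    return None

def provide_to_user(geo_info):
    """S205: display or voice-broadcast the information to the user."""
    return f"Current building: {geo_info['name']}" if geo_info else "No match"

# Usage with stub data:
feature_db = [{"feature_data": "frame-001",
               "geo_info": {"name": "XX Tower A", "address": "1 Example Rd"}}]
message = provide_to_user(
    determine_geo_identification(collect_real_scene_data(lambda: "frame-001"),
                                 feature_db))
```

  • in practice, step S203 could run either on the terminal or on the server, as the surrounding text notes; the sketch is agnostic about where each function executes.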
  • in some embodiments, image features of the target object are first identified from the real scene data; matching feature data is then found in a pre-established feature database according to the identified image features, and the geographic identification information corresponding to the matched feature data is determined as the geographic identification information matching the target object in the real scene data.
  • a pre-trained recognition model combined with recognition algorithms can be used to realize target object recognition and feature extraction, for example: local feature point extraction, blob detection, corner detection, binary string feature descriptors, and other image recognition algorithms.
  • the corresponding recognition technology is relatively mature and will not be described in detail here.
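  • as a toy illustration of one technique named above, binary string feature descriptors can be compared by Hamming distance; the descriptors and threshold below are made-up values, and a real system would use descriptors produced by algorithms such as ORB or BRIEF:

```python
# Toy illustration of matching binary string feature descriptors by
# Hamming distance; descriptor values and the threshold are invented.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def best_match(query: int, candidates: dict, max_distance: int = 8):
    """Return the candidate name whose descriptor is closest to the
    query, or None if nothing is within max_distance bits."""
    name, dist = None, max_distance + 1
    for cand_name, descriptor in candidates.items():
        d = hamming(query, descriptor)
        if d < dist:
            name, dist = cand_name, d
    return name

# 32-bit toy descriptors for two target objects:
db = {"XX Tower A": 0b10110010_01101100_11110000_00001111,
      "YY Mall":    0b01001101_10010011_00001111_11110000}
query = 0b10110010_01101100_11110000_00001011  # one bit off Tower A
match = best_match(query, db)
```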
  • as for the pre-established feature database, corresponding image data is obtained in advance for different target objects, and the pre-obtained image data is processed through steps such as feature extraction to obtain relevant feature data, thereby establishing the feature database.
  • in order to establish the feature database, the acquired image data of target objects can be derived from a variety of sources, such as real scene maps, field collection, network pictures, and user uploads, which may be determined according to the actual situation.
  • the data format stored in the feature database may be, for example: target object name - feature data - geographic identification information.
  • in other embodiments, the data format stored in the feature database may also take other forms, but it is understandable that the data format used should be conducive to determining the geographic identification information.
  • in some embodiments, the current location information of the terminal may also be obtained; that is, for the method shown in FIG. 2, before determining the geographic identification information that matches the target object according to the real scene data, the method may further include: obtaining location information of the terminal.
  • correspondingly, the process of determining geographic identification information that matches the target object may be: determining, based on the location information of the terminal and the real scene data, the geographic identification information that matches the target object.
  • specifically, the location information of the terminal can be used to filter, in the feature database, the feature data of target objects that match the location information, and the filtered feature data is then compared against the image features extracted from the real scene data for recognition.
  • the location of the terminal can also be considered the location of the user; using the terminal's location as a reference, the range of feature data to be queried can be effectively reduced, that is, only the target object feature data within a certain range of the user's location needs to be compared, so that the target object corresponding to the real scene data and its matched geographic identification information can be determined quickly.
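  • the filtering described above can be sketched as follows; the coordinates, the 500 m radius, and the equirectangular distance approximation are illustrative assumptions rather than details from this application:

```python
import math

# Sketch of narrowing the feature database by the terminal's location
# before image comparison; coordinates and radius are invented.

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres (equirectangular
    approximation, adequate for short ranges)."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000  # mean Earth radius in metres

def nearby_candidates(terminal_pos, feature_db, radius_m=500):
    """Keep only target objects within radius_m of the terminal, so the
    image features need comparing against far fewer records."""
    lat, lon = terminal_pos
    return [r for r in feature_db
            if distance_m(lat, lon, r["lat"], r["lon"]) <= radius_m]

feature_db = [
    {"name": "XX Tower A", "lat": 1.3001, "lon": 103.8000},
    {"name": "Distant Mall", "lat": 1.3500, "lon": 103.9000},
]
candidates = nearby_candidates((1.3000, 103.8001), feature_db)
```

  • only the records surviving this filter would then go through the image-feature comparison sketched earlier.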
  • the method shown in FIG. 2 can also be applied to different application scenarios.
  • for example, when a user wants to know information about offline stores such as restaurants, the user can learn about the offline stores near a corresponding place through an application such as an electronic map application or a consumer service application (such as a business review application).
  • however, multiple offline stores may be distributed in a certain building (such as an office building or a shopping mall). If a user wants to know the offline stores contained in the building, the usual method is to launch an electronic map application and use the electronic map to view information about each store at a certain place.
  • in this case, the aforementioned problem may still occur: if the user is near the above-mentioned building but does not know its name, detailed address, or other information in advance, the user needs to subjectively infer, by observing the actual environment, the possible corresponding position of the building to be queried on the electronic map, and then check the information about the stores contained in the building displayed at that position.
  • through the method in the embodiments of the present application, information such as the stores and merchants corresponding to the building can be provided to users.
  • the method may further include the following steps:
  • the location service information may be considered as various types of information related to location services.
  • the location service information may also include: information about the stores, merchants, and businesses in the building, as well as their customer traffic, popularity ratings, recommendations, rankings, and more.
  • location service information may be obtained from different information sources, and the information sources may be, for example, users, service providers, enterprises, third-party data statistics agencies, and the like, which are not specifically limited herein.
  • the method of providing the location service information to the user is similar to that of the above-mentioned geographic identification information: it can use voice broadcast, or the location service information can be displayed on the terminal interface.
  • when the location service information is displayed, it may be specifically displayed in the form of text, numbers, symbols, graphics, icons, or a combination thereof.
  • the location service information of the target object can be displayed in real time in the framing interface of the terminal.
  • the building may be marked by a framing frame.
  • the location service information can also be displayed based on the corresponding framing frame.
  • the display method of the location service information may be as shown in FIG. 4a; that is, in the framing interface shown in FIG. 4a, the location service information of the building is displayed under its name and connected to the framing frame by an indication line.
  • the location service information shown in FIG. 4a is abbreviated information, and the user can click it to display more detailed location service information.
  • as shown in FIG. 4b, after the user performs an operation (e.g., a click) on the location service information in FIG. 4a, more detailed location service information related to the building is displayed in the viewfinder interface, in a designated information display area. As shown in FIG. 4b, the information display area uses a floating window to display the detailed location service information on the left side of the selected building. In the display state of FIG. 4b, if the user clicks the information display area again, the information display area disappears and the interface returns to the state shown in FIG. 4a.
  • the information display area can also use floating windows with higher transparency, or be fully transparent.
  • a method such as jumping to a new page may also be adopted; that is, when the user clicks the thumbnail information in FIG. 4a, the terminal may jump from the currently displayed viewfinder interface to a details page.
  • FIG. 4a and FIG. 4b only show the display of the location service information of the shopping mall contained in the building.
  • in some embodiments, multiple merchants may be displayed, as shown in FIG. 4c.
  • the different merchants shown in FIG. 4c may not be all the merchants in the building; the merchants displayed may be determined after a server-side back-end algorithm sorts them according to popularity, ratings, and the like, which is not specifically limited here.
  • FIG. 5 is a schematic diagram of an architecture adopted by the foregoing method in a scenario of providing location service information according to an embodiment of the present application. As can be seen from FIG. 5, it includes a first terminal, a second terminal, and a server.
  • the first terminal may be considered as a terminal device that performs image collection on a target object at an actual location. For details, refer to the terminal shown in FIG. 1, which is not described in detail here.
  • the second terminal may be considered as a terminal that uploads location service information related to the actual location target object.
  • the second terminal is usually located in the target object at the actual location, and may include, but is not limited to, a mobile phone, a tablet computer, a smart watch, a notebook computer, a computer, a network-enabled cash register device, a number-taking device, a card machine used by a merchant, and so on.
  • for example, if the mobile phone used by a user can upload location information and rating information for a restaurant to the server, then that mobile phone can be regarded as a second terminal.
  • similarly, the number-taking device of a restaurant can upload information such as the queue numbers at the current time to the server, so the number-taking device can also be regarded as a second terminal.
  • the server can feed back the location service information uploaded by the second terminal to the first terminal.
  • for the specific composition, structure, and type of the server, please refer to the corresponding content of FIG. 1, which will not be repeated here.
  • the number of second terminals is usually large, thereby forming a large number of data sources, and providing data support for implementing the above method in this application.
  • FIG. 5 only illustrates the second terminal by way of example, and does not limit the number of the second terminal to one, nor does it limit the type of the second terminal to a mobile phone.
  • the number of the first terminals may also be large, and is not limited to that shown in FIG. 5.
  • Step S601 The first terminal collects image data of a target object at an actual location to obtain real scene data.
  • Step S603 Send the real scene data to a server, so that the server determines, based on the real scene data, geographic identification information and location service information that match the target object, and feeds it back to the first terminal.
  • the specific manner in which the first terminal sends the real scene data to the server may be a request manner; that is, the first terminal generates a service information acquisition request according to the real scene data and sends the service information acquisition request to the server.
  • Step S605 Receive the geographic identification information and location service information fed back by the server and provide them to the user.
  • for the specific method adopted by the first terminal when displaying the geographic identification information and location service information of the target object, reference may be made to the foregoing content, and details are not described herein again.
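  • the request-and-feedback exchange of steps S601 to S605 can be sketched as follows; the JSON message fields and the stub lookup are illustrative assumptions, not a format defined in this application:

```python
import json

# Hypothetical sketch of steps S601-S605: the first terminal wraps the
# real scene data in a service information acquisition request, and the
# server answers with geographic identification and location service
# information. Field names are invented for illustration.

def build_service_info_request(real_scene_data, terminal_location=None):
    """S603: generate a service information acquisition request."""
    request = {"real_scene_data": real_scene_data}
    if terminal_location is not None:  # optional positioning support
        request["terminal_location"] = terminal_location
    return json.dumps(request)

def server_handle(request_json, lookup):
    """Server side: match the real scene data and feed back the result."""
    request = json.loads(request_json)
    geo_info, service_info = lookup(request["real_scene_data"])
    return {"geo_identification": geo_info,
            "location_service_info": service_info}

# Usage with a stub lookup in place of real image matching:
stub = lambda data: ("XX Tower A", {"stores": ["Cafe B"], "queue": 3})
response = server_handle(
    build_service_info_request("frame-001", (1.3, 103.8)), stub)
```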
  • Step S701 The second terminal obtains basic information related to the location service.
  • since there are various types of second terminals, the information uploaded to the server by different types of second terminals also differs. As mentioned above, if the second terminal is a mobile communication terminal such as a mobile phone, it may upload location information, evaluation information, and the like to the server; if the second terminal is a number-taking device used by a merchant, it may upload queue number information, table-waiting information, and the like to the server.
  • the above information uploaded by the second terminal to the server can serve as the basis on which the server generates location service information; therefore, the information uploaded by the second terminal can be regarded as basic information related to the location service.
  • the above-mentioned basic information can be obtained by the second terminal itself; for example, the location information of a mobile phone can be obtained by the mobile phone itself through related positioning services.
  • Step S703 Upload the basic information to a server, so that the server generates location service information according to the basic information.
  • for a mobile communication terminal such as a mobile phone or a tablet computer, location information can be uploaded to the server directly; for a non-mobile communication terminal such as a cash register device or a number-taking device, the corresponding location information can be uploaded to the server together with the basic information.
  • Step S801 Receive the basic information, uploaded by the second terminal, related to the location service of the target object at the actual location.
  • Step S803 Generate corresponding location service information for the target object according to the basic information.
  • after receiving the basic information, the server can further perform data processing such as classification and statistics, so as to generate corresponding location service information for each target object. For example, for restaurant A located at a certain place, after receiving the queue number information and table-waiting information uploaded by the restaurant's number-taking device, the server can generate location service information for restaurant A (specifically, the current queue situation at restaurant A).
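  • a minimal sketch of this server-side generation step, assuming hypothetical field names for the basic information uploaded by second terminals:

```python
# Hypothetical sketch: the server turns basic information uploaded by
# second terminals (e.g. a restaurant's number-taking device) into
# location service information for each target object. Field names are
# illustrative, not from this application.

def generate_location_service_info(basic_reports):
    """Group reports by target object and summarise the queue state."""
    info = {}
    for report in basic_reports:
        entry = info.setdefault(report["target"],
                                {"waiting": 0, "sources": 0})
        entry["waiting"] += report.get("parties_waiting", 0)
        entry["sources"] += 1  # how many second terminals reported
    return info

reports = [
    {"target": "Restaurant A", "parties_waiting": 7},  # number-taking device
    {"target": "Restaurant A", "parties_waiting": 2},  # a second device
    {"target": "Cafe B", "parties_waiting": 0},
]
service_info = generate_location_service_info(reports)
```

  • the aggregated result could then be fed back to the first terminal in step S805 once the corresponding target object is recognized in the real scene data.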
  • Step S805 After receiving the real scene data sent by the first terminal, feed back the location service information related to the target object in the real scene data to the first terminal.
  • in a practical scenario, the user can use the first terminal to perform image acquisition on a target object such as a building at the actual location, and the first terminal can send a corresponding information acquisition request to the server based on the collected real scene data; after that, the location service information related to the target object can be provided to the user.
  • the above method in the embodiment of the present application may be applied to an electronic map application, a browser, or a location service information application.
  • the foregoing is a method for implementing a location service provided by an embodiment of the present application. Based on the same inventive concept, the embodiment of the present application also provides a corresponding apparatus for implementing a location service.
  • FIG. 9 shows an apparatus, provided in an embodiment of the present application, for implementing the location service of the embodiment shown in FIG. 1.
  • the apparatus includes:
  • the image acquisition module 901 is configured to perform image acquisition for a target object at an actual location to obtain real scene data.
  • the information determining module 902 is configured to determine geographic identification information matching the target object according to the real scene data.
  • the information providing module 903 is configured to provide the geographic identification information to a user.
  • the information determining module 902 is configured to extract image characteristics of the target object from the real scene data, and query a pre-established database according to the image characteristics of the target object to determine the geographic identification information of the target object.
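One hedged way to realize the query of module 902 is a nearest-neighbor lookup over stored feature vectors. The cosine-similarity choice and the database shape are assumptions; the feature extraction itself (e.g. a learned image embedding) is out of scope for this sketch:

```python
import math

def match_target_object(query_features, database):
    """Query a pre-built feature database (cf. module 902, illustrative).

    `database` maps a geographic identifier to a stored feature vector.
    The query returns the identifier whose stored vector is most similar
    to the extracted image features, using cosine similarity.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    # Pick the geographic identifier with the highest similarity score.
    return max(database, key=lambda geo_id: cosine(query_features, database[geo_id]))
```

In a real deployment the linear scan would be replaced by an approximate nearest-neighbor index, but the interface stays the same.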
  • the image acquisition module 901 is configured to perform image acquisition on a target object at the actual location according to a triggering operation of a user.
  • the triggering operation includes at least: a click operation, a slide operation, or a voice input operation directed at the image acquisition function of a location service application that supports image acquisition.
  • the information providing module 903 is configured to display, in a specified interface, the geographic identification information of the target object to the user.
  • the specified interface includes at least: a photo interface, a viewfinder interface, or a video interface on which the target object is displayed.
  • the information determining module 902 is further configured to determine location service information that matches the target object according to the real scene data;
  • the information providing module 903 is further configured to provide the location service information to the user.
  • the geographic identification information at least includes: a name, an identification number, and/or specific address information of the target object;
  • the location service information at least includes: information on stores, merchants, and enterprises located in the building, and/or merchant traffic, hotspot scores, recommendations, and rankings of those merchants.
  • the architecture shown in FIG. 5 may also be used. Therefore, for the architecture shown in FIG. 5, a corresponding apparatus is also provided in the embodiment of the present application. Specifically:
  • the apparatus includes:
  • the image acquisition module 1001 is configured to perform image acquisition for a target object at an actual location to obtain real scene data.
  • the sending module 1002 is configured to send the real scene data to a server, so that the server determines, based on the real scene data, geographic identification information and location service information that match the target object, and feeds them back to the first terminal.
  • the information providing module 1003 is configured to receive the geographic identification information and location service information fed back by the server, and provide them to the user.
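The three-module split of the first terminal can be wired together as below. The three callables stand in for modules 1001, 1002, and 1003; all interfaces are illustrative assumptions rather than the disclosed implementation:

```python
def first_terminal_flow(capture_image, send_to_server, display):
    """End-to-end flow of the first-terminal apparatus (illustrative).

    capture_image  -- image acquisition module 1001: returns real scene data
    send_to_server -- sending module 1002: returns (geo_info, service_info)
                      as determined and fed back by the server
    display        -- information providing module 1003: shows both to the user
    """
    real_scene_data = capture_image()                          # module 1001
    geo_info, service_info = send_to_server(real_scene_data)   # module 1002
    display(geo_info, service_info)                            # module 1003
    return geo_info, service_info
```

Because the modules are passed in as callables, each can be replaced by a stub for testing or by a platform-specific implementation on a real device.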
  • a device for implementing location services based on a second terminal side in an embodiment of the present application includes:
  • the basic information acquisition module 1101 is configured to acquire basic information related to a location service.
  • the uploading module 1102 is configured to upload the basic information to a server, so that the server generates location service information according to the basic information.
  • the apparatus includes:
  • the receiving module 1201 is configured to receive the basic information, related to the location service of a target object at an actual location, uploaded by the second terminal.
  • the information processing module 1202 is configured to generate corresponding location service information for the target object according to the basic information.
  • the information feedback module 1203 is configured to, after receiving the real scene data sent by the first terminal, feed back the location service information related to the target object corresponding to the real scene data to the first terminal.
  • a corresponding computer-readable storage medium is also provided in the embodiments of the present application; the computer-readable medium may be included in the apparatus described in the foregoing embodiments, or may exist separately without being assembled into the apparatus.
  • the computer-readable storage medium may be configured to store one or more computer programs; when the one or more computer programs are executed by one or more processors, the one or more processors are caused to implement the functions of the apparatus described in the foregoing embodiments.
  • the electronic device may include: one or more processors, a communication interface, a storage medium, and a communication bus, where:
  • the processor, the communication interface, and the storage medium communicate with one another through the communication bus.
  • the communication interface may be an interface of a communication module, such as an interface of a GSM module.
  • the storage medium may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), etc.
  • the processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • the general-purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the processor may be specifically configured to: perform image acquisition on a target object at an actual location to obtain real scene data; send the real scene data to the server, so that the server determines, based on the real scene data, geographic identification information and location service information that match the target object and feeds them back to the first terminal; and receive the geographic identification information and location service information fed back by the server and provide them to the user.
  • the processor may be further configured to: obtain basic information related to the location service, and upload the basic information to the server, so that the server generates location service information according to the basic information.
  • the processor may be further configured to: receive the basic information, related to the location service of a target object at an actual location, uploaded by the second terminal; generate corresponding location service information for the target object according to the basic information; and, after receiving the real scene data sent by the first terminal, feed back the location service information related to the target object corresponding to the real scene data to the first terminal.
  • the process described above with reference to the flowchart may be implemented as a computer software program.
  • embodiments of the present disclosure include a computer program product including a computer program carried on a computer-readable medium, the computer program containing program code for performing a method shown in a flowchart.
  • the computer program may be downloaded and installed from a network through a communication section, and/or installed from a removable medium.
  • when the computer program is executed by a central processing unit (CPU), the above-mentioned functions defined in the method of the present application are performed.
  • the computer-readable medium described in this application may be a computer-readable signal medium or a computer-readable storage medium or any combination of the foregoing.
  • the computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer-readable storage medium may be any tangible medium containing or storing a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, and also conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing a specified logical function.
  • the functions noted in the blocks may also occur in an order different from that marked in the drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
  • each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented in a dedicated hardware-based system that performs the specified function or operation, or by a combination of dedicated hardware and computer instructions.
  • the modules described in the embodiments of the present application may be implemented in a software manner, or may be implemented in a hardware manner.
  • the described module may also be provided in a processor, for example, it may be described as:
  • a processor includes an image acquisition module, a sending module, and an information providing module.
  • the names of these modules do not in any way constitute a limitation on the modules themselves.
  • For example, the image acquisition module can also be described as a "module that obtains real scene data by performing image acquisition on a target object at an actual location".
  • terms such as "first", "second", "the first", or "the second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the corresponding components.
  • the above expressions are used only for the purpose of distinguishing one component from another.
  • the first user equipment and the second user equipment represent different user equipments, although both are user equipments.
  • a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element.
  • when an element (for example, a first element) is referred to as being "coupled with" or "connected to" another element (for example, a second element), the one element may be directly connected to the other element, or may be indirectly connected to the other element via another element (for example, a third element).
  • when an element (for example, the first element) is referred to as being "directly connected to" another element (for example, the second element), no element (for example, the third element) is inserted between the two elements.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

According to some embodiments, the present invention relates to a method and apparatus for implementing a location-based service (LBS). A particular implementation of the method comprises a terminal that: performs image acquisition on a target object at an actual location to obtain real scene data; determines, according to the real scene data, geographic identification information matching the target object; and provides the geographic identification information to a user. In some particular scenes, by means of a mode different from a conventional LBS mode, the user can directly perform, using a terminal, image acquisition on a target object, such as a building or a store, located at the actual location, and information associated with an LBS of the target object is then displayed on the terminal, so that the user learns the information associated with the target object easily, thereby helping to improve user experience and reducing, as much as possible, deviation generated by the user's subjective inference.
PCT/IB2018/057165 2018-08-27 2018-09-18 Procédé et appareil destinés à mettre en œuvre un service en fonction d'un emplacement WO2020044097A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810986044.4A CN109040960A (zh) 2018-08-27 2018-08-27 一种实现位置服务的方法和装置
CN201810986044.4 2018-08-27

Publications (1)

Publication Number Publication Date
WO2020044097A1 true WO2020044097A1 (fr) 2020-03-05

Family

ID=64625544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/057165 WO2020044097A1 (fr) 2018-08-27 2018-09-18 Procédé et appareil destinés à mettre en œuvre un service en fonction d'un emplacement

Country Status (2)

Country Link
CN (1) CN109040960A (fr)
WO (1) WO2020044097A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111432001A (zh) * 2020-03-24 2020-07-17 北京字节跳动网络技术有限公司 用于跳转场景的方法、装置、电子设备和计算机可读介质
CN111737599A (zh) * 2020-05-07 2020-10-02 北京城市网邻信息技术有限公司 一种房源对象的验证方法和装置
CN112819965A (zh) * 2020-12-31 2021-05-18 北京嘀嘀无限科技发展有限公司 一种车辆信息的显示方法、装置以及电子设备
CN113301138A (zh) * 2021-05-20 2021-08-24 苏州达家迎信息技术有限公司 目标服务节点的位置确定方法、装置及电子设备
CN113572850A (zh) * 2021-07-29 2021-10-29 上海浦东发展银行股份有限公司 一种数据同步方法、装置、服务器及存储介质
WO2023138029A1 (fr) * 2022-01-19 2023-07-27 上海商汤智能科技有限公司 Procédé et appareil de traitement de données de détection à distance, dispositif, support d'enregistrement et produit programme d'ordinateur

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108701158B (zh) 2016-12-30 2023-03-10 谷歌有限责任公司 对信息资源上的内容的基于散列的动态限制
CN109873992B (zh) * 2019-03-26 2021-12-24 联想(北京)有限公司 信息处理方法和装置
EP3827222B1 (fr) * 2019-05-24 2022-07-06 Google LLC Dispositif permettant de guider au moins deux utilisateurs vers un lieu de rencontre
CN110287358A (zh) * 2019-06-18 2019-09-27 深圳市中诺通讯有限公司 一种摄取实景获取周围目标信息的方法及系统
CN111158556B (zh) * 2019-12-31 2022-03-25 维沃移动通信有限公司 一种显示控制方法及电子设备
CN111159460A (zh) * 2019-12-31 2020-05-15 维沃移动通信有限公司 一种信息处理方法及电子设备
CN111337015B (zh) * 2020-02-28 2021-05-04 重庆特斯联智慧科技股份有限公司 一种基于商圈聚合大数据的实景导航方法与系统
CN113779184A (zh) * 2020-06-09 2021-12-10 大众问问(北京)信息科技有限公司 一种信息交互方法、装置及电子设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103440318A (zh) * 2013-08-29 2013-12-11 王靖洲 移动终端的景观识别系统
CN103473257A (zh) * 2012-06-06 2013-12-25 三星电子株式会社 基于图像跟踪无线终端的位置的设备和方法
CN104422439A (zh) * 2013-08-21 2015-03-18 希姆通信息技术(上海)有限公司 导航方法、装置、服务器、导航系统及其使用方法
US9449228B1 (en) * 2008-01-31 2016-09-20 Google Inc. Inferring locations from an image

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916417A (zh) * 2010-09-03 2010-12-15 李占胜 一种基于位置的信息共享系统
CN102752336B (zh) * 2011-04-22 2016-02-03 腾讯科技(深圳)有限公司 基于地理位置服务的ugc的共享方法及系统
CN102708193A (zh) * 2012-05-15 2012-10-03 中国科学技术大学 一种基于位置服务的餐饮信息聚合搜索与互动的方法及系统
CN104180814A (zh) * 2013-05-22 2014-12-03 北京百度网讯科技有限公司 移动终端上实景功能中的导航方法和电子地图客户端
CN105160327A (zh) * 2015-09-16 2015-12-16 小米科技有限责任公司 建筑物识别方法和装置
CN107729528A (zh) * 2017-10-30 2018-02-23 珠海市魅族科技有限公司 一种建筑物信息获取方法及装置、计算机装置和计算机可读存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9449228B1 (en) * 2008-01-31 2016-09-20 Google Inc. Inferring locations from an image
CN103473257A (zh) * 2012-06-06 2013-12-25 三星电子株式会社 基于图像跟踪无线终端的位置的设备和方法
CN104422439A (zh) * 2013-08-21 2015-03-18 希姆通信息技术(上海)有限公司 导航方法、装置、服务器、导航系统及其使用方法
CN103440318A (zh) * 2013-08-29 2013-12-11 王靖洲 移动终端的景观识别系统

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111432001A (zh) * 2020-03-24 2020-07-17 北京字节跳动网络技术有限公司 用于跳转场景的方法、装置、电子设备和计算机可读介质
CN111432001B (zh) * 2020-03-24 2023-06-30 抖音视界有限公司 用于跳转场景的方法、装置、电子设备和计算机可读介质
CN111737599A (zh) * 2020-05-07 2020-10-02 北京城市网邻信息技术有限公司 一种房源对象的验证方法和装置
CN112819965A (zh) * 2020-12-31 2021-05-18 北京嘀嘀无限科技发展有限公司 一种车辆信息的显示方法、装置以及电子设备
CN112819965B (zh) * 2020-12-31 2023-12-12 北京嘀嘀无限科技发展有限公司 一种车辆信息的显示方法、装置以及电子设备
CN113301138A (zh) * 2021-05-20 2021-08-24 苏州达家迎信息技术有限公司 目标服务节点的位置确定方法、装置及电子设备
CN113301138B (zh) * 2021-05-20 2024-05-07 苏州达家迎信息技术有限公司 目标服务节点的位置确定方法、装置及电子设备
CN113572850A (zh) * 2021-07-29 2021-10-29 上海浦东发展银行股份有限公司 一种数据同步方法、装置、服务器及存储介质
CN113572850B (zh) * 2021-07-29 2024-05-03 上海浦东发展银行股份有限公司 一种数据同步方法、装置、服务器及存储介质
WO2023138029A1 (fr) * 2022-01-19 2023-07-27 上海商汤智能科技有限公司 Procédé et appareil de traitement de données de détection à distance, dispositif, support d'enregistrement et produit programme d'ordinateur

Also Published As

Publication number Publication date
CN109040960A (zh) 2018-12-18

Similar Documents

Publication Publication Date Title
WO2020044097A1 (fr) Procédé et appareil destinés à mettre en œuvre un service en fonction d'un emplacement
CN110225369B (zh) 视频选择播放方法、装置、设备和可读存储介质
US9407815B2 (en) Location aware photograph recommendation notification
US9564175B2 (en) Clustering crowdsourced videos by line-of-sight
KR101810578B1 (ko) 셔터 클릭을 통한 자동 미디어 공유
US8483715B2 (en) Computer based location identification using images
CN109981695B (zh) 内容推送方法、装置及设备
CN110858134B (zh) 数据、显示处理方法、装置、电子设备和存储介质
WO2020044099A1 (fr) Procédé et appareil de traitement de service basés sur une reconnaissance d'objets
WO2017080173A1 (fr) Système et procédé de push basé sur la reconnaissance d'informations de nature, et client
EP2827617A1 (fr) Procédé de recherche, client, serveur et système de recherche associés à une technologie de réalité augmentée mobile
US9600720B1 (en) Using available data to assist in object recognition
US10674183B2 (en) System and method for perspective switching during video access
EP3328088A1 (fr) Fourniture coopérative de fonctions d'utilisateur personnalisées à l'aide de dispositifs partagés et personnels
CN109767257B (zh) 基于大数据分析的广告投放方法、系统及电子设备
US20230316529A1 (en) Image processing method and apparatus, device and storage medium
WO2022134555A1 (fr) Procédé de traitement vidéo et terminal
US10360246B2 (en) Method, system, and apparatus for searching and displaying user generated content
CN113891105A (zh) 画面显示方法和装置、存储介质及电子设备
CN110020150B (zh) 信息推荐方法及装置
WO2016008759A1 (fr) Procédé de détermination de zones stables à l'intérieur d'un flux d'images, et dispositif portatif pour la mise en œuvre dudit procédé
WO2021031909A1 (fr) Procédé et appareil de production de contenu de données, dispositif électronique et support lisible par ordinateur
US20170109365A1 (en) File processing method, file processing apparatus and electronic equipment
KR20130126203A (ko) 클라이언트 단말기를 이용한 동영상 콘텐츠 공유 서비스 제공 시스템 및 방법
US20200409521A1 (en) Method for obtaining vr resource and terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18932066

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18932066

Country of ref document: EP

Kind code of ref document: A1