WO2020202347A1 - Information providing system and information terminal

Information providing system and information terminal

Info

Publication number
WO2020202347A1
WO2020202347A1 (PCT/JP2019/014254)
Authority
WO
WIPO (PCT)
Prior art keywords
article
display
information
information terminal
user
Prior art date
Application number
PCT/JP2019/014254
Other languages
English (en)
Japanese (ja)
Inventor
山岡 大祐
田中 一彦
祐 瀧口
瞳 濱村
Original Assignee
本田技研工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to CN201980090738.2A priority Critical patent/CN113383363A/zh
Priority to JP2021511722A priority patent/JP7237149B2/ja
Priority to PCT/JP2019/014254 priority patent/WO2020202347A1/fr
Publication of WO2020202347A1 publication Critical patent/WO2020202347A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to an information providing system and an information terminal that provide information to a user.
  • Conventionally, applications that provide services by connecting to a network via a mobile terminal such as a smartphone have been devised (Patent Document 1).
  • Patent Document 1 discloses that a captured image is displayed on the display of a portable wireless communication terminal (smartphone), that guidance (the name) of a component included in the captured image is superimposed and displayed on the display, and that the operation manual of the component is displayed on the display when the superimposed guidance is pressed.
  • When a user is considering purchasing an in-vehicle part, it is desirable for the user to be able to easily grasp whether the part can be mounted on or attached to the vehicle; likewise, when the user is considering purchasing a large item, whether the item can be loaded into the vehicle. That is, it is desirable for the user to be able to easily grasp whether an article the user is considering purchasing is suitable for the vehicle (the suitability of the article for the vehicle). In addition, the user may want to easily obtain information on articles that fit the structure of the vehicle.
  • an object of the present invention is to provide the user with information desired by the user easily and intuitively.
  • The information providing system as one aspect of the present invention is an information providing system that uses an information terminal having a camera and a display to provide a user with information on the compatibility between a designated article and a target location where the designated article is to be mounted. The information terminal comprises display control means for displaying the captured image obtained by the camera on the display, acquisition means for acquiring image data of the designated article, and generation means for generating an augmented reality image of the designated article based on the image data acquired by the acquisition means. The system is characterized in that the display control means superimposes the augmented reality image of the designated article generated by the generation means, as the information, on the target location in the captured image and displays it on the display.
  • the information desired by the user can be provided to the user through augmented reality, so that the user can easily and intuitively grasp the information.
  • Brief description of the drawings: a block diagram showing the configuration of the information providing system; a flowchart showing the process of accepting the designation of an article; a diagram showing a display example of article information; a diagram showing the inside of a car being photographed with the information terminal; and diagrams showing display examples of the augmented reality image of the designated article.
  • FIG. 1 is a block diagram showing a configuration of the information providing system 100 of the present embodiment.
  • The information providing system 100 of the present embodiment includes, for example, an information terminal 10 and a server device 20 that are communicably connected to each other via a network NTW, and is a system for providing a user with information on the compatibility between an article designated by the user and a target location of a vehicle on which the article is to be mounted.
  • In the following, the article designated by the user may be referred to as the "designated article," and the target location where the designated article is to be mounted may be referred to as the "planned mounting location."
  • a four-wheeled vehicle will be illustrated as a vehicle.
  • the information terminal 10 may include, for example, a processing unit 11, a storage unit 12, a camera 13, a display 14, a position detection sensor 15, a posture detection sensor 16, and a communication unit 17. Each part of the information terminal 10 is connected to each other so as to be able to communicate with each other via the system bus 18. Examples of the information terminal 10 include a smartphone and a tablet terminal. In the present embodiment, an example in which a smartphone is used as the information terminal 10 will be described. Smartphones and tablet terminals are mobile terminals having various functions other than the call function, but the dimensions of the displays are different from each other. In general, tablet terminals have larger display dimensions than smartphones.
  • the processing unit 11 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like.
  • The storage unit 12 stores programs executed by the processor, data used by the processor for processing, and the like; the processing unit 11 can read a program stored in the storage unit 12 into a storage device such as a memory and execute it.
  • In the present embodiment, the storage unit 12 stores an application program (information providing program) for providing the user with information on the compatibility between the designated article and the planned mounting location of the vehicle, and the processing unit 11 can read the information providing program from the storage unit 12 into a storage device such as a memory and execute it.
  • the camera 13 has a lens and an image sensor, and captures a subject to acquire a captured image.
  • the camera 13 may be provided, for example, on an outer surface opposite to the outer surface on which the display 14 is provided.
  • the display 14 notifies the user of information by displaying an image.
  • the display 14 can display the captured image acquired by the camera 13 in real time.
  • the display 14 of the present embodiment includes, for example, a touch panel type LCD (Liquid Crystal Display), and has a function of receiving information input from a user in addition to a function of displaying an image.
  • the present invention is not limited to this, and the display 14 may have only the function of displaying an image, and an input unit (for example, a keyboard, a mouse, etc.) may be provided independently of the display 14.
  • the position detection sensor 15 detects the position and orientation of the information terminal 10.
  • As the position detection sensor 15, for example, a GPS sensor that receives signals from GPS satellites to acquire the current position of the information terminal 10, or an orientation sensor that detects, based on geomagnetism or the like, the direction in which the camera 13 of the information terminal 10 is pointed, can be used.
  • The posture detection sensor 16 detects the posture of the information terminal 10; as the posture detection sensor 16, an acceleration sensor, a gyro sensor, or the like can be used.
  • The communication unit 17 is communicably connected to the server device 20 via the network NTW. Specifically, the communication unit 17 functions as a transmission unit that transmits information to the server device 20 via the network NTW and as a reception unit that receives information from the server device 20 via the network NTW.
  • The first acquisition unit 11a acquires the data of the captured image obtained by the camera 13.
  • the second acquisition unit 11b acquires the article information and the like stored in the server device 20 in association with the article from the server device 20.
  • the identification unit 11c analyzes the captured image by performing image processing such as a pattern matching method, and identifies a planned mounting location in the captured image displayed on the display 14.
  • the generation unit 11d generates an Augmented Reality (AR) image of the designated article based on the image data of the appearance of the article.
  • the reception unit 11e executes a process of accepting the designation of the article by the user.
  • The display control unit 11f displays the captured image acquired by the first acquisition unit 11a on the display 14. Further, when the augmented reality image of the designated article has been generated by the generation unit 11d, the display control unit 11f superimposes the augmented reality image of the designated article on the captured image and displays it on the display 14, based on the position and orientation of the information terminal 10 detected by the position detection sensor 15 and the posture detection sensor 16.
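The alignment step described in the bullet above can be sketched in a few lines: a minimal pinhole-projection model that recomputes the on-screen position of the planned mounting location from the terminal's detected position and orientation each frame, so the superimposed image appears fixed to the real location. The function name, focal length, and screen size below are illustrative assumptions, not taken from the patent.

```python
import math

def project_to_screen(anchor_xyz, cam_xyz, cam_yaw_rad,
                      focal_px=800.0, screen_w=1080, screen_h=1920):
    """Project a world-space anchor point into screen pixels for a camera
    at cam_xyz rotated by cam_yaw_rad about the vertical (y) axis.
    Returns None when the point is behind the camera (not drawable)."""
    # Translate into camera-centered coordinates.
    dx = anchor_xyz[0] - cam_xyz[0]
    dy = anchor_xyz[1] - cam_xyz[1]
    dz = anchor_xyz[2] - cam_xyz[2]
    # Rotate by -yaw so the camera looks down the +z axis.
    cx = dx * math.cos(-cam_yaw_rad) + dz * math.sin(-cam_yaw_rad)
    cz = -dx * math.sin(-cam_yaw_rad) + dz * math.cos(-cam_yaw_rad)
    if cz <= 0:
        return None  # anchor is behind the camera; nothing to draw
    u = screen_w / 2 + focal_px * cx / cz
    v = screen_h / 2 - focal_px * dy / cz
    return (u, v)
```

Re-running this projection as the sensors report new pose values is what keeps the overlay attached to the planned mounting location while the terminal moves.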
  • the server device 20 may include a processing unit 21, a storage unit 22, and a communication unit 23.
  • the processing unit 21 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like.
  • The storage unit 22 stores programs executed by the processor, data used by the processor for processing, and the like; the processing unit 21 can read a program stored in the storage unit 22 into a storage device such as a memory and execute it.
  • The communication unit 23 is communicably connected to the information terminal 10 via the network NTW. Specifically, the communication unit 23 functions as a receiving unit that receives information from the information terminal 10 via the network NTW and as a transmitting unit that transmits information to the information terminal 10 via the network NTW.
  • the server device 20 stores article information for each of a plurality of types of articles.
  • the article may be, for example, an in-vehicle part attached to the vehicle, or a transported object (including a packaging container) loaded in the luggage compartment of the vehicle for transportation by the vehicle.
  • the article information includes, for example, information indicating the model and dimensions of the article, image data of the appearance of the article, facility where the article is sold and its location information, information posted by a user of the article, and the like.
  • The posted information is information posted by users of each article and may include, for example, the models of vehicles on which the article can be loaded, the orientation of the article when it is loaded in the luggage compartment of a vehicle, and how to collapse the corrugated cardboard when the article is loaded on a vehicle.
  • the posted information may include information that is not described in the operation manual of the article, such as a method of using the article more comfortably and the ease of use of the article.
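The article information described above can be pictured as a simple per-article record. The following sketch uses a hypothetical schema; all field names are illustrative assumptions, since the patent does not specify a data format.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ArticleInfo:
    """One record of the per-article information held by the server device.
    Field names are illustrative; the patent fixes no schema."""
    model: str
    dimensions_mm: Tuple[int, int, int]        # width, depth, height
    appearance_image: bytes                    # image data of the article's appearance
    is_in_vehicle_part: bool                   # in-vehicle part vs. transported object
    mounting_location: Optional[str] = None    # e.g. "windshield near rearview mirror"
    sold_at: List[str] = field(default_factory=list)      # facilities selling the article
    posted_info: List[str] = field(default_factory=list)  # user-posted tips

# Example record for a drive recorder (values are invented for illustration).
rec = ArticleInfo(
    model="DR-100",
    dimensions_mm=(90, 40, 50),
    appearance_image=b"",
    is_in_vehicle_part=True,
    mounting_location="windshield near rearview mirror",
    posted_info=["Fits compact models; route the cable along the A-pillar."],
)
```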
  • FIG. 2A is a flowchart showing a process of accepting the designation of the article by the user
  • FIG. 2B is a flowchart showing the process of providing the user with information on the suitability of the designated article with respect to the vehicle.
  • First, the process of accepting the designation of an article by the user will be described with reference to the flowchart shown in FIG. 2A.
  • the processing of the flowchart shown in FIG. 2A can be performed by the reception unit 11e of the processing unit 11.
  • FIGS. 3 to 7 are diagrams showing images displayed on the display 14 of the information terminal 10 and can be used to explain the processing performed by the processing unit 11 of the information terminal 10.
  • FIG. 3 shows a state in which the initial screen is displayed on the display 14 of the information terminal 10.
  • The initial screen can be provided with an input field 31a into which the user enters the category (classification, type) of the article, and a search button 31b for receiving the user's instruction to start searching for the article.
  • In the example shown in FIG. 3, "vehicle-mounted parts" has been entered by the user in the input field 31a as the article category.
  • In S12, the processing unit 11 determines whether or not the search button 31b has been touched on the display 14 by the user. If the search button 31b has been touched, the process proceeds to S13; otherwise, S12 is repeated. In S13, the processing unit 11 determines whether or not an article category has been entered in the input field 31a by the user. If the search button 31b was touched while no article category was entered in the input field 31a, the process proceeds to S14, and selection processes (S14 to S15) for allowing the user to select an article category are executed. On the other hand, if the search button 31b was touched while an article category was entered in the input field 31a, the selection processes of S14 to S15 are omitted and the process proceeds to S16.
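The branching among S12, S13, S14 to S15, and S16 described above can be summarized in a small decision function. The step labels follow the flowchart; the function itself is an illustrative sketch.

```python
def next_step(search_touched: bool, category_entered: bool) -> str:
    """Decide where the flow goes after the initial screen (S12-S13):
    no touch -> keep polling; touch without a category -> facility and
    category selection (S14-S15); touch with a category -> candidate list (S16)."""
    if not search_touched:
        return "S12"   # keep waiting for the search button
    if not category_entered:
        return "S14"   # let the user pick a facility, then a category
    return "S16"       # show the candidate list directly
```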
  • In S14, the processing unit 11 displays on the display 14 a list of facilities, such as stores, existing around the information terminal 10.
  • the storage unit 22 of the server device 20 stores map information indicating the locations of a plurality of facilities.
  • the processing unit 11 transmits the current position of the information terminal 10 detected by the position detection sensor 15 to the server device 20 via the communication unit 17.
  • The server device 20, having received the current position of the information terminal 10, searches for facilities existing within a predetermined range from the current position of the information terminal 10 based on the map information stored in the storage unit 22, and transmits the list of facilities obtained as the search result to the information terminal 10 via the communication unit 23.
  • The information terminal 10 displays an image showing the received list of facilities on the display 14. In the example shown in FIG. 4, the area where the name of each facility is displayed serves as a selection button 32 for the user to select that facility, and the user can select a facility by touching one of the selection buttons 32 on the display 14. When a facility is selected by the user, the process proceeds to S15.
  • The predetermined range can be set arbitrarily by the user in advance, and may be a range of distance from the information terminal 10 or a range of travel time until arrival at the facility.
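The server-side facility search described above, which finds facilities within a predetermined range of the terminal's current position, might look like the following sketch. It assumes positions are (latitude, longitude) pairs and that the range is expressed as a distance in kilometres; the time-to-arrival variant would substitute a travel-time estimate.

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def facilities_in_range(current_pos, facilities, range_km):
    """Return the names of facilities within range_km of current_pos,
    nearest first - the list sent back to the terminal for display in S14."""
    hits = [(haversine_km(current_pos, pos), name) for name, pos in facilities]
    return [name for d, name in sorted(hits) if d <= range_km]
```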
  • In S15, the processing unit 11 displays on the display 14 a list of categories of articles provided (sold) at the facility selected in S14. Specifically, the storage unit 22 of the server device 20 stores, for each of the plurality of facilities, information on the articles that the facility can provide, organized by article category in association with that facility. By acquiring the article information stored in the server device 20 (storage unit 22) in association with the facility selected in S14, the processing unit 11 can display a list of article categories on the display 14 as shown in FIG. 5. In the example shown in FIG. 5, the area where each category name is displayed serves as a selection button 33 for the user to select an article category, and the user can select a category by touching one of the selection buttons 33 on the display 14.
  • In S16, the processing unit 11 displays on the display 14 a candidate list of articles for the article category specified by the user.
  • the article category specified by the user is an article category input by the user in the input field 31a of the initial screen, or an article category selected by the user in S15.
  • The processing unit 11 acquires from the server device 20 the article information stored in the storage unit 22 of the server device 20 in association with the article category specified by the user, and can thereby display the candidate list of articles on the display 14.
  • In this example, "drive recorder" has been specified by the user as the article category, and the area where the model of each article is displayed serves as a selection button 34 for the user to select the article.
  • the user can specify (select) an article by touching one of the selection buttons 34 on the display 14.
  • the processing unit 11 can display a candidate list of articles provided (sold) at the selected facility on the display 14.
  • In S17, the processing unit 11 determines whether or not an article has been designated by the user. If an article has been designated, the process proceeds to S18; otherwise, S17 is repeated.
  • In S18, the processing unit 11 (second acquisition unit 11b) acquires from the server device 20 the article information regarding the article designated in S17 (the designated article), and displays the acquired article information on the display 14.
  • As described above, the article information includes information indicating the model and dimensions of the article, image data of the appearance of the article, information posted by users of the article, and the like, and this information and data are displayed on the display 14.
  • FIG. 7 shows an example of a display screen of article information regarding a drive recorder as a designated article.
  • The article information display screen is provided with a model display field 35a for the designated article, an appearance display field 35b for the designated article, a dimension display field 35c, and a posted-information display field 35d. The article information display screen is further provided with an OK button 35e and a cancel button 35f. When the OK button 35e is touched by the user, the process proceeds to S21 in FIG. 2B; when the cancel button 35f is touched, the process ends.
  • FIGS. 8 to 10B are diagrams showing images displayed on the display 14 of the information terminal 10 and can be used to explain the processing performed by the processing unit 11 of the information terminal 10.
  • In S21, the processing unit 11 determines, based on the article information of the designated article, the target location of the vehicle on which the designated article is to be mounted (for example, the position of a structure of the vehicle).
  • In the following, the target location where the designated article is to be mounted may be referred to as the "planned mounting location."
  • the article information may include information on whether or not the article is an in-vehicle component, and if the article is an in-vehicle component, information indicating a planned mounting location of the article. Therefore, the processing unit 11 can determine whether or not the designated article is an in-vehicle component and determine the planned mounting location of the designated article based on the article information of the designated article.
  • For example, when the designated article is a drive recorder, the processing unit 11 can determine, based on the article information, the planned mounting location of the drive recorder to be the windshield near the rearview mirror.
  • When the processing unit 11 determines, based on the article information of the designated article, that the designated article is not an in-vehicle part, the processing unit 11 can determine the planned mounting location of the designated article to be the luggage compartment (loading section) of the vehicle.
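The determination in S21 can be reduced to a small rule, sketched here under the assumption that the article information carries an in-vehicle-part flag and, for in-vehicle parts, a named mounting location (the dictionary keys are illustrative):

```python
def planned_mounting_location(article: dict) -> str:
    """Determine the planned mounting location from the article information:
    an in-vehicle part carries its own location; anything else is treated
    as a transported object bound for the luggage compartment."""
    if article.get("is_in_vehicle_part"):
        return article["mounting_location"]
    return "luggage compartment"
```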
  • the processing unit 11 requests the user to take a picture of the planned mounting location determined in S21. For example, when the designated article is a drive recorder, the processing unit 11 displays a comment requesting a picture of the windshield near the rearview mirror, which is the planned mounting location, on the display 14. If it is determined that the designated article is not an in-vehicle part, a comment requesting a photograph of the luggage compartment of the vehicle, which is the planned mounting location, is displayed on the display 14.
  • the processing unit 11 (first acquisition unit 11a) causes the camera 13 to start shooting and acquires a shot image from the camera 13.
  • The processing unit 11 (display control unit 11f) sequentially displays the captured images acquired from the camera 13 on the display 14.
  • FIG. 8 shows a state in which the inside of the vehicle is photographed by the information terminal 10 (camera 13) so as to include the rearview mirror and the center console.
  • In S25, the processing unit 11 identifies the planned mounting location determined in S21 in the captured image displayed on the display 14. For example, the processing unit 11 can determine which part of the vehicle the image captured by the camera 13 shows by performing known image processing. As an example of known image processing, there is a method of detecting, in the captured image, portions (feature points) having feature amounts such as corners, curvatures, changes in brightness, or changes in color, and recognizing the vehicle portion photographed by the camera 13 from feature information indicating the feature amounts and positional relationships of the detected feature points. By such a method, the processing unit 11 can identify the planned mounting location in the captured image.
  • In S26, the processing unit 11 determines whether or not the planned mounting location has been identified in the captured image displayed on the display 14. If the planned mounting location has been identified in the captured image, the process proceeds to S27; otherwise, the process returns to S25.
  • In S27, the processing unit 11 acquires the dimensional information of the planned mounting location identified in S25 to S26.
  • For example, the storage unit 12 of the information terminal 10 stores feature information for each of a plurality of vehicle models, and the processing unit 11 can identify the vehicle model whose feature information has a high degree of agreement (that is, a degree of agreement exceeding a predetermined value) with the feature information obtained in S25 to S26.
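The model identification by degree of agreement can be sketched as follows. Representing feature information as sets of descriptor labels and measuring agreement as simple set overlap is an illustrative simplification of the pattern matching the patent leaves to known image processing.

```python
def agreement(observed: set, reference: set) -> float:
    """Degree of agreement between observed and stored feature information,
    here simply the fraction of reference features seen in the image."""
    return len(observed & reference) / len(reference) if reference else 0.0

def identify_model(observed_features: set, model_db: dict, threshold: float = 0.7):
    """Return the vehicle model whose stored feature information agrees most
    with the observed features, provided the degree of agreement exceeds
    the predetermined threshold; otherwise None."""
    best_model, best_score = None, 0.0
    for model, ref in model_db.items():
        score = agreement(observed_features, ref)
        if score > best_score:
            best_model, best_score = model, score
    return best_model if best_score > threshold else None
```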
  • In the storage unit 22 of the server device 20, dimensional information of each part of the vehicle is stored for each vehicle model in association with that model.
  • The processing unit 11 of the information terminal 10 transmits information on the identified vehicle model to the server device 20, and receives from the server device 20 the dimensional information of each part stored in the server device 20 in association with that model. As a result, the processing unit 11 of the information terminal 10 can acquire the dimensional information of the planned mounting location identified in S25 to S26.
  • The processing unit 11 (generation unit 11d) generates an augmented reality image 40 of the designated article based on the image data of the appearance of the designated article. The processing unit 11 (display control unit 11f) then superimposes the generated augmented reality image 40 of the designated article on the planned mounting location in the captured image obtained by the camera 13 and displays it on the display 14. At this time, based on the position and orientation information of the information terminal 10 detected by the position detection sensor 15 and the posture detection sensor 16, the processing unit 11 aligns the image captured by the camera 13 and the augmented reality image 40 of the designated article so that the augmented reality image 40 moves in accordance with the movement of the information terminal 10.
  • That is, based on the information on the position and orientation of the information terminal 10, the processing unit 11 displays the augmented reality image 40 of the designated article on the display 14 so as to match the position of the planned mounting location in the displayed captured image. Further, based on the dimensional information of the designated article and the dimensional information of the planned mounting location acquired in S27, the processing unit 11 displays the augmented reality image 40 of the designated article on the display 14 so that the actual dimensional relationship between the designated article and the planned mounting location is reflected.
  • The image data of the appearance of the designated article and the dimensional information of the designated article are included in the article information acquired from the server device 20 in S18.
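Reflecting the actual dimensional relationship on screen amounts to scaling the augmented reality image by the pixels-per-millimetre ratio of the identified mounting location. A minimal sketch, assuming planar width/height dimensions only:

```python
def overlay_size_px(article_mm, location_mm, location_px):
    """Scale the augmented reality image so the real dimensional relationship
    between the designated article and the planned mounting location is
    preserved on screen. article_mm and location_mm are real (width, height)
    in millimetres; location_px is the location's on-screen size in pixels."""
    px_per_mm = location_px[0] / location_mm[0]   # pixels per millimetre
    return (round(article_mm[0] * px_per_mm),
            round(article_mm[1] * px_per_mm))
```

For example, if a 1400 mm wide luggage compartment spans 700 pixels on the display, a 90 mm wide drive recorder is drawn 45 pixels wide.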
  • FIG. 9 shows an example in which, when the designated article is a drive recorder, the augmented reality image 40a of the drive recorder is superimposed on the planned mounting location (the windshield near the rearview mirror) in the captured image and displayed on the display 14.
  • In this example, a captured image of the inside of the vehicle including the rearview mirror 41 and the center console 42 is displayed on the display 14, and the augmented reality image 40a of the drive recorder is displayed on the windshield near the rearview mirror identified as the planned mounting location.
  • the user can easily and intuitively grasp the suitability of the drive recorder for the vehicle, such as the positional relationship and the dimensional relationship between the vehicle and the drive recorder, before purchasing the drive recorder.
  • As shown in FIGS. 10A and 10B, when the designated article is a transported object (for example, a packaged item such as a desk or a bed), the augmented reality image 40b of the transported object is superimposed on the captured image of the vehicle 43. In this example, a captured image of the rear part (luggage compartment) of the vehicle 43 with the back door open is displayed on the display 14, and the augmented reality image 40b of the transported object is displayed in the luggage compartment identified as the planned mounting location.
  • FIG. 10A shows the case where the size of the transported object is smaller than the size of the luggage compartment of the vehicle 43, and FIG. 10B shows the case where the size of the transported object is larger than the size of the luggage compartment of the vehicle 43.
  • Thereby, the user can easily and intuitively grasp, before purchasing the transported object, the suitability of the transported object for the vehicle, such as whether or not the transported object as the designated article can be loaded onto the loading platform of the vehicle 43.
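The fits/does-not-fit distinction between FIG. 10A and FIG. 10B can be sketched as a box-in-box check that tries each axis-aligned orientation of the transported object; this is a simplification that ignores diagonal placement.

```python
from itertools import permutations

def fits_in_compartment(item_mm, compartment_mm):
    """Check whether a transported object fits the luggage compartment in
    at least one axis-aligned orientation - the comparison behind the
    FIG. 10A (fits) / FIG. 10B (does not fit) display examples."""
    return any(all(i <= c for i, c in zip(orientation, compartment_mm))
               for orientation in permutations(item_mm))
```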
  • As described above, in the present embodiment, the augmented reality image of the article designated by the user is superimposed, as information indicating the suitability of the designated article with respect to the planned mounting location, on the planned mounting location in the captured image obtained by the camera 13 and displayed on the display 14.
  • FIG. 11 is a diagram showing a state in which the information terminal 10 (camera 13) is photographing the vicinity of the key insertion portion 51 of the motorcycle 50.
  • the augmented reality image 40c of the cover of the key insertion portion 51 is superimposed on the key insertion portion 51 in the captured image and displayed on the display 14.
  • Thereby, the user can easily and intuitively grasp, before purchasing the cover, the suitability of the cover for the motorcycle, such as the positional relationship and the dimensional relationship between the key insertion portion 51 of the motorcycle 50 owned by the user and the cover.
  • FIG. 12 is a diagram showing a state in which the cultivator 60 is photographed by the information terminal 10 (camera 13).
  • the augmented reality image 40d of the wheel is superimposed on the claw portion 61 in the captured image and displayed on the display 14.
  • the model of the cultivator 60 (vehicle) is specified in order to acquire the dimensional information of the planned mounting location.
  • When the processing unit 11 determines, based on the article information of the wheel, that the wheel does not fit (that is, cannot be attached to) the identified model of the cultivator 60, a comment indicating that the wheel does not fit, or a mark such as a cross mark over the wheel, may be displayed on the display 14.
  • a third embodiment according to the present invention will be described.
  • This embodiment basically follows the first and second embodiments, and the terms, definitions, and the like are as described in the first and second embodiments.
  • Here, as a process performed by the information terminal 10 when the information providing program is executed, a process of identifying a structure included in the captured image obtained by the camera 13 and providing the user with information on articles that fit the identified structure will be described.
  • For example, the information terminal 10 displays on the display 14 a screen for allowing the user to select either the first mode or the second mode, and operates in the mode selected by the user.
  • the first mode is a mode for acquiring article information about an article designated by the user, and when the first mode is selected, the processes described in the first to second embodiments are executed.
  • the second mode is a mode for identifying a structure in a captured image and acquiring article information about an article conforming to the specified structure, and the process described below is executed.
  • FIG. 13 is a flowchart showing processing performed by the processing unit 11 of the information terminal 10 when the second mode is selected by the user.
  • the processing unit 11 first acquisition unit 11a
  • the processing unit 11 causes the camera 13 to start shooting and acquires a shot image from the camera 13.
  • the processing unit 11 (display control unit 11f) displays the captured image acquired from the camera 13 on the display 14.
  • the processing unit 11 identifies the structure included in the captured image displayed on the display 14. For example, the processing unit 11 first recognizes the structure contained in the captured image by performing known image processing. As an example of known image processing, as described in S25 of the flowchart of FIG. 2B, there is a method of detecting feature points in the captured image and recognizing a structure from feature information indicating the feature amounts and positional relationships of the detected feature points. In the storage unit 22 of the server device 20, feature information on each of a plurality of types of structures is stored in association with the model of each structure, and the processing unit 11 of the information terminal 10 collates the feature information of the recognized structure against this stored feature information.
  • in this way, the processing unit 11 can specify the model of the structure included in the captured image. As an example, as shown in FIG. 8, when the user photographs the inside of the vehicle with the information terminal 10 (camera 13) so as to include in-vehicle parts such as the rearview mirror and the center console, the processing unit 11 can specify the rearview mirror as a structure.
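The feature-based recognition described above can be sketched as follows. The descriptor format, the contents of `FEATURE_DB`, and the matching threshold are all illustrative assumptions standing in for the feature information stored in the storage unit 22 of the server device 20; they are not the embodiment's actual data.

```python
import math

# Hypothetical feature database: structure model -> stored descriptor vectors.
FEATURE_DB = {
    "rearview_mirror_A": [(0.9, 0.1), (0.8, 0.2), (0.7, 0.3)],
    "center_console_B":  [(0.1, 0.9), (0.2, 0.8), (0.3, 0.7)],
}

def identify_structure(detected_descriptors, db=FEATURE_DB, max_dist=0.15):
    # Score each stored model by how many detected descriptors have a
    # sufficiently close match among the model's stored descriptors,
    # and return the best-scoring model (None when nothing matched).
    best_model, best_score = None, 0
    for model, stored in db.items():
        score = sum(
            1 for d in detected_descriptors
            if min(math.dist(d, s) for s in stored) <= max_dist
        )
        if score > best_score:
            best_model, best_score = model, score
    return best_model
```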
  • the processing unit 11 determines whether or not the structure has been specified in the captured image displayed on the display 14. If the structure is specified in the captured image, the process proceeds to S35, and if the structure is not specified in the captured image, the process returns to S33.
  • the processing unit 11 acquires a list of articles that can be provided at facilities such as stores existing around the information terminal 10. For example, the processing unit 11 may acquire a list of articles recommended by such a facility (a list of recommended articles) as the article list.
  • the storage unit 22 of the server device 20 stores map information indicating the locations of a plurality of facilities.
  • the processing unit 11 transmits the current position of the information terminal 10 detected by the position detection sensor 15 to the server device 20 via the communication unit 17.
  • the server device 20 that has received the current position of the information terminal 10 searches, based on the map information stored in the storage unit 22, for facilities existing within a predetermined range from the current position of the information terminal 10, and transmits a list of a plurality of articles (an article list) that can be provided at the facilities obtained as the search result to the information terminal 10 via the communication unit 23.
  • the information terminal 10 can acquire a list of articles that can be provided at the surrounding facilities.
  • the predetermined range may be set in advance, or may be set arbitrarily by the user.
  • the predetermined range may be the range of the distance from the information terminal 10 or the range of the time until arrival at the facility.
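The facility search within a predetermined range can be sketched as a distance filter over the map information. The haversine formula, the facility record fields, and the kilometre-based range below are assumptions; as noted above, the range could equally be expressed as the time until arrival at the facility.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def facilities_in_range(current_pos, facilities, max_km):
    # Return the facilities lying within max_km of the terminal's position.
    lat, lon = current_pos
    return [
        f for f in facilities
        if haversine_km(lat, lon, f["lat"], f["lon"]) <= max_km
    ]
```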
  • the processing unit 11 determines whether or not the article list acquired in S35 contains an article conforming to the structure specified in S33 (hereinafter sometimes referred to as a conforming article). If there is a conforming article, the process proceeds to S37, and if there is no conforming article, the process returns to S33. Further, in S37, the processing unit 11 (second acquisition unit 11b) acquires, from the server device 20, the article information stored in the server device 20 (storage unit 22) in association with the conforming article. As described above, the article information may include information indicating the model and dimensions of the article, image data of the appearance of the article, and the facility where the article is sold and its location information.
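The conformity check in S36 can be sketched as a filter over the acquired article list. The `compatible_with` field is a hypothetical representation of the compatibility data carried in the article information; the embodiment does not specify its actual form.

```python
def find_conforming_articles(structure_model, article_list):
    # An article conforms when the identified structure model appears in
    # its (assumed) list of compatible structure models.
    return [
        a for a in article_list
        if structure_model in a.get("compatible_with", [])
    ]
```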
  • the processing unit 11 (generation unit 11d) generates an augmented reality image of the conforming article based on the image data included in the acquired article information.
  • the processing unit 11 superimposes the augmented reality image of the conforming article on the structure specified in S33 in the captured image obtained by the camera 13, and displays it on the display 14.
  • the processing unit 11 moves the augmented reality image of the conforming article in accordance with the movement of the information terminal 10, based on the position and orientation information of the information terminal 10 detected by the position detection sensor 15 and the attitude detection sensor 16, so that the image captured by the camera 13 and the augmented reality image of the conforming article remain aligned.
  • the processing unit 11 displays the augmented reality image of the conforming article on the display 14 so as to match the position of the structure in the captured image displayed on the display 14, based on the information on the position and orientation of the information terminal 10. Further, based on the dimensional information of the conforming article and the dimensional information of the structure specified in S33, the processing unit 11 may display the augmented reality image 40 of the conforming article on the display 14 so that the actual dimensional relationship between the conforming article and the structure is reflected.
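The dimensional alignment above can be sketched, under a single-axis simplification, as deriving the on-screen size of the conforming article from the structure's known real-world dimension and its measured size in pixels. The function and parameter names are assumptions introduced for illustration.

```python
def overlay_width_px(article_width_mm, structure_width_mm, structure_width_px):
    # Pixels per millimetre implied by the structure's known real width
    # and its measured width in the captured image; applying it to the
    # article's real width preserves the actual dimensional relationship.
    px_per_mm = structure_width_px / structure_width_mm
    return article_width_mm * px_per_mm
```

For example, if a structure known to be 400 mm wide spans 800 pixels in the captured image, a 200 mm article would be rendered 400 pixels wide.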
  • the processing unit 11 may display information on the facility where the conforming article is provided on the display 14.
  • the facility information may include, for example, information such as the homepage and telephone number of the facility, and information on the route to the facility.
  • the information of such a facility may be displayed on the display 14 when the user touches the augmented reality image of the conforming article on the display 14.
  • for example, when the user photographs the inside of the vehicle with the information terminal 10 (camera 13) so as to include in-vehicle parts such as the rearview mirror and the center console, the rearview mirror, or the windshield glass near the rearview mirror, is identified as a structure in S33.
  • when the processing unit 11 has acquired, in S35, a list of articles that can be provided at a car accessory store (facility) existing within a predetermined range from the information terminal 10, it can acquire, in S37, article information on a drive recorder compatible with the rearview mirror or with the windshield glass near the rearview mirror.
  • in this case, the augmented reality image 40a of the drive recorder generated based on the acquired article information (image data) is superimposed on the structure in the captured image (for example, the windshield glass near the rearview mirror) and displayed on the display 14.
  • the structure in the photographed image is specified, and the article information regarding the article (conforming article) conforming to the specified structure is acquired.
  • an augmented reality image of the conforming article is generated based on the article information (image data), and the augmented reality image of the conforming article is superimposed on the structure in the captured image and displayed on the display.
  • the information providing system of the above embodiment is an information providing system (for example, 100) that uses an information terminal (for example, 10) having a camera (for example, 13) and a display (for example, 14) to provide a user with information on the compatibility between a designated article and a target location where the designated article is to be mounted.
  • the information terminal has display control means (for example, 11f) for displaying a captured image obtained by the camera on the display, acquisition means (for example, 11b) for acquiring image data of the designated article, and generation means (for example, 11d) for generating an augmented reality image of the designated article based on the image data acquired by the acquisition means.
  • the display control means superimposes the augmented reality image of the designated article generated by the generation means on the target location in the captured image as the information and displays it on the display.
  • according to this configuration, when the user designates an article on the information terminal and photographs, with the camera of the information terminal, the target location where the designated article (designated article) is to be mounted, the user can easily and intuitively grasp the compatibility of the designated article with the target location.
  • the display control means displays the augmented reality image of the designated article on the display so that the actual dimensional relationship between the designated article and the target location is reflected. According to this configuration, the user can more easily grasp the dimensional relationship between the designated article and the target location, and can therefore grasp the compatibility of the designated article with the target location more easily and intuitively.
  • the information terminal further includes a specific means (for example, 11c) for identifying the target location in the captured image.
  • the display control means superimposes and displays an augmented reality image of the designated article on the target location in the captured image specified by the specific means.
  • according to this configuration, the target location is specified on the information terminal side without the user having to indicate the position of the target location in the captured image, so that the convenience of the user in grasping the compatibility of the designated article with the target location can be improved.
  • the information terminal further has a receiving means (for example, 11e) for displaying a candidate list of articles on the display and accepting designation of articles by a user.
  • the designated article is an article that has been designated by the user by the receiving means. According to this configuration, it is possible to improve the convenience of the user when designating the article.
  • the reception means displays on the display as the candidate list the articles that can be provided at the facility existing within a predetermined range from the current position of the information terminal.
  • according to this configuration, information on articles provided (sold) at nearby facilities can be presented to the user, for example while the user is moving, so that the user can, based on the information, drop in at a facility to obtain a necessary article, which improves the convenience of the user.
  • the reception means displays a list of articles belonging to the category input by the user on the display as the candidate list. According to this configuration, the user can search for the necessary articles by category, so that the convenience of the user can be improved.
  • the information providing system of the above embodiment is an information providing system (for example, 100) that provides information to a user using an information terminal (for example, 10) having a camera (for example, 13) and a display (for example, 14).
  • the information terminal has display control means (for example, 11f) for displaying a captured image obtained by the camera on the display, specific means (for example, 11c) for identifying a structure contained in the captured image displayed on the display, acquisition means (for example, 11b) for acquiring image data of an article conforming to the structure specified by the specific means among a plurality of articles that can be provided at a facility existing within a predetermined range from the current position of the information terminal, and generation means (for example, 11d) for generating an augmented reality image of the article based on the image data acquired by the acquisition means.
  • the display control means superimposes the augmented reality image of the article generated by the generation means on the structure in the captured image and displays it on the display. According to this configuration, the user can easily and intuitively obtain information on an article that matches the structure in the captured image obtained by the camera of the information terminal, without having to designate the article on the information terminal.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Traffic Control Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

This information providing system uses an information terminal having a camera and a display to provide a user with information on the compatibility of a designated article with the target location on which the designated article is to be mounted, the information terminal comprising: display control means for displaying images captured by the camera on the display; acquisition means for acquiring image data of the designated article; and generation means for generating an augmented reality image of the designated article on the basis of the image data acquired by the acquisition means. The display control means displays, on the display, the augmented reality image of the designated article generated by the generation means, as the information, superimposed on the target location in the captured image.
PCT/JP2019/014254 2019-03-29 2019-03-29 Système de fourniture d'informations et terminal d'informations WO2020202347A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201980090738.2A CN113383363A (zh) 2019-03-29 2019-03-29 信息提供系统以及信息终端
JP2021511722A JP7237149B2 (ja) 2019-03-29 2019-03-29 情報提供システム、および情報端末
PCT/JP2019/014254 WO2020202347A1 (fr) 2019-03-29 2019-03-29 Système de fourniture d'informations et terminal d'informations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/014254 WO2020202347A1 (fr) 2019-03-29 2019-03-29 Système de fourniture d'informations et terminal d'informations

Publications (1)

Publication Number Publication Date
WO2020202347A1 true WO2020202347A1 (fr) 2020-10-08

Family

ID=72666511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014254 WO2020202347A1 (fr) 2019-03-29 2019-03-29 Système de fourniture d'informations et terminal d'informations

Country Status (3)

Country Link
JP (1) JP7237149B2 (fr)
CN (1) CN113383363A (fr)
WO (1) WO2020202347A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003044704A (ja) * 2001-07-31 2003-02-14 Honda Motor Co Ltd サービス提供方法
JP2003331075A (ja) * 2002-05-09 2003-11-21 Honda Motor Co Ltd サービス提供システム
JP2011222000A (ja) * 2010-03-25 2011-11-04 Choushin Inc 画像合成サービスシステム
US9928544B1 (en) * 2015-03-10 2018-03-27 Amazon Technologies, Inc. Vehicle component installation preview image generation

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9449342B2 (en) * 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
CN108255304B (zh) * 2018-01-26 2022-10-04 腾讯科技(深圳)有限公司 基于增强现实的视频数据处理方法、装置和存储介质

Also Published As

Publication number Publication date
CN113383363A (zh) 2021-09-10
JP7237149B2 (ja) 2023-03-10
JPWO2020202347A1 (fr) 2020-10-08

Similar Documents

Publication Publication Date Title
JP5280475B2 (ja) 情報処理システム、情報処理方法及びプログラム
EP2418621B1 (fr) Appareil et procédé pour fournir des informations de réalité améliorées
CN108885452A (zh) 多轴控制器
US20170205889A1 (en) Individually interactive multi-view display system for non-stationary viewing locations and methods therefor
JP6177998B2 (ja) 情報表示方法および情報表示端末
CN110478901A (zh) 基于增强现实设备的交互方法及系统
JP2007080060A (ja) 対象物特定装置
CN111400610B (zh) 车载社交方法及装置、计算机存储介质
CN111742281B (zh) 用于针对显示在显示器上的第一内容提供根据外部对象的移动的第二内容的电子装置及其操作方法
JP7209474B2 (ja) 情報処理プログラム、情報処理方法及び情報処理システム
CN105383411B (zh) 用于操作运输工具中的多媒体-内容的方法和装置
CN111243200A (zh) 购物方法、穿戴式设备及介质
CN114758100A (zh) 显示方法、装置、电子设备和计算机可读存储介质
CN113891166B (zh) 数据处理方法、装置、计算机设备及介质
US20170186073A1 (en) Shopping cart display
WO2020202347A1 (fr) Système de fourniture d'informations et terminal d'informations
WO2021029043A1 (fr) Système de fourniture d'informations, terminal d'informations, et procédé de fourniture d'informations
WO2018100631A1 (fr) Appareil de traitement d'informations
KR20180063877A (ko) 단말 장치 및 제어 방법
WO2020202346A1 (fr) Système de fourniture d'informations et terminal d'informations
JP6040804B2 (ja) 対象物識別システム、対象物識別方法
US11556976B2 (en) Server apparatus, mobile shop, and information processing system
US20210224926A1 (en) Server apparatus, control apparatus, medium, mobile shop, and operation method for information processing system
JP6833472B2 (ja) 車載装置、及び情報処理システム
WO2018179312A1 (fr) Dispositif et procédé de génération d'images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19922237

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021511722

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19922237

Country of ref document: EP

Kind code of ref document: A1