WO2016125307A1 - Information delivery device and information delivery program - Google Patents

Information delivery device and information delivery program

Info

Publication number
WO2016125307A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image data
user
category
unit
Prior art date
Application number
PCT/JP2015/053408
Other languages
French (fr)
Japanese (ja)
Inventor
宗利 津田
祐也 山岸
一紀 下村
厚 髙木
涼子 端村
一夢 菅原
逸雄 川島
真有 工藤
優 八木
泰志 高橋
井上 智之
Original Assignee
株式会社ぐるなび
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ぐるなび
Priority to JP2016573162A
Priority to PCT/JP2015/053408
Publication of WO2016125307A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor

Definitions

  • The present invention relates to an information distribution apparatus and an information distribution program.
  • Captured image data is stored in the storage unit of such terminals, and the stored image data is displayed in a list, for example arranged in chronological order of imaging time.
  • A technique is known in which subjects are classified into predetermined categories by so-called image recognition processing, which analyzes predetermined feature amounts of image data, and the image data is organized and displayed for each category to which its subject belongs (for example, Patent Document 1).
  • The purpose of the information distribution apparatus and the information distribution program described in the present application is to provide information to a user by utilizing image categorization information.
  • The information distribution device includes: a storage unit that stores categories to which image data belong in association with user identification information; a categorizing unit that acquires image data and user identification information from a user terminal, determines the category to which the image data belongs by performing image recognition processing on the acquired image data, and registers the determined category in the storage unit in association with the user identification information; an analysis unit that analyzes user preferences based on the categories associated with the user identification information; and an information providing unit that provides information to the user based on the analyzed user preferences.
  • The information distribution program is a program for a computer having a storage unit that stores categories to which image data belong in association with user identification information. The program causes the computer to: acquire image data and user identification information from a user terminal; determine the category to which the image data belongs by performing image recognition processing on the acquired image data; register the determined category in the storage unit in association with the user identification information; analyze user preferences based on the categories associated with the user identification information; and provide information to the user based on the analyzed user preferences.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an information distribution system 1.
  • FIG. 2 is a diagram illustrating an example of a schematic configuration of a mobile terminal 2.
  • FIG. 3 is a diagram illustrating an example of a data structure related to a specific image.
  • FIG. 4 is a diagram illustrating an example of a hierarchical structure of categories used when categorizing image data.
  • FIG. 5 is a diagram illustrating an example of a schematic configuration of a server 3.
  • FIG. 6 is a diagram illustrating an example of a data structure related to a specific user.
  • FIG. 7 is a diagram illustrating an example of an operation sequence for image classification.
  • FIGS. 10A to 10C are diagrams illustrating examples of the advertisement described in step S204 of FIG. 9.
  • The information distribution system 1 includes at least one mobile terminal 2 and a server 3.
  • The mobile terminal 2 and the server 3 are connected to each other via a communication network, for example via a base station 4, a mobile communication network 5, a gateway 6, and the Internet 7.
  • The program executed on the mobile terminal 2 and the program executed on the server 3 communicate with each other using a communication protocol such as the Hypertext Transfer Protocol (HTTP).
  • FIG. 2 is a diagram illustrating an example of a schematic configuration of the mobile terminal 2.
  • The mobile terminal 2 includes a terminal communication unit 21, a terminal storage unit 22, an operation unit 23, a display unit 24, an imaging unit 25, a GPS sensor 26, a terminal processing unit 27, and the like.
  • In this embodiment, the mobile terminal 2 is assumed to be a multi-function mobile phone (a so-called smartphone), but it is not limited to this; any device to which the present invention can be applied may be used. For example, the mobile terminal 2 may be a personal computer (PC), a mobile phone (a so-called feature phone), a personal digital assistant (PDA), a portable music player, a tablet PC, or the like.
  • The terminal communication unit 21 includes a communication interface circuit with an antenna whose sensitivity band is a predetermined frequency band, and connects the mobile terminal 2 to the wireless communication network.
  • The terminal communication unit 21 establishes a wireless signal line with the base station 4 using a method such as W-CDMA (Wideband Code Division Multiple Access) over the channel assigned by the base station 4, and communicates with the base station 4.
  • The terminal communication unit 21 outputs the data supplied from the terminal processing unit 27 to the server 3 and the like, and supplies the data acquired from the server 3 and the like to the terminal processing unit 27.
  • The terminal storage unit 22 includes, for example, at least one of a semiconductor memory, a magnetic disk device, and an optical disk device.
  • The terminal storage unit 22 stores the operating system program, driver programs, application programs, data, and the like used for processing in the terminal processing unit 27. As driver programs, it stores an input device driver program for controlling the operation unit 23, an output device driver program for controlling the display unit 24, and the like. As an application program, it stores a program for acquiring and displaying image data.
  • The terminal storage unit 22 also stores image data and the bibliographic information accompanying the image data, and may temporarily store temporary data related to predetermined processing.
  • The operation unit 23 may be any device that allows operation of the mobile terminal 2, for example a touch pad or a keyboard. The user can input characters, numbers, and the like using the operation unit 23.
  • When operated by the user, the operation unit 23 generates a signal corresponding to the operation. The generated signal is supplied to the terminal processing unit 27 as the user's instruction.
  • The display unit 24 may be any device capable of displaying video, images, and the like, for example a liquid crystal display or an organic EL (Electro-Luminescence) display. The display unit 24 displays video and images corresponding to the image data supplied from the terminal processing unit 27.
  • The imaging unit 25 may be any device capable of capturing images and reading a QR code (registered trademark), and includes, for example, a lens and an imaging element.
  • The GPS sensor 26 is an example of a position sensor; it connects to satellites and acquires position information (GPS (Global Positioning System) information) including latitude information and longitude information. Note that the mobile terminal 2 does not necessarily have to include the GPS sensor 26.
  • The position information may also be position information other than GPS information. For example, it may be position information specified based on the intensity of radio waves that the position sensor of the mobile terminal 2 receives from a plurality of base stations, or position information acquired by the mobile terminal 2 through iBeacon (registered trademark).
  • The terminal processing unit 27 includes one or more processors and their peripheral circuits. The terminal processing unit 27 centrally controls the overall operation of the mobile terminal 2 and is, for example, a CPU (Central Processing Unit).
  • The terminal processing unit 27 controls the operations of the terminal communication unit 21, the display unit 24, the imaging unit 25, the GPS sensor 26, and the like so that the programs stored in the terminal storage unit 22 are executed by appropriate procedures in response to operations on the operation unit 23.
  • The terminal processing unit 27 executes processing based on the programs (operating system program, driver programs, application programs, etc.) stored in the terminal storage unit 22, and can execute a plurality of programs (application programs, etc.) in parallel.
  • The terminal processing unit 27 includes an image processing unit 271, an image browsing unit 272, an advertisement display unit 273, and a QR code processing unit 274. Each of these units is a functional module realized by a program executed by the processor included in the terminal processing unit 27, or may be implemented in the mobile terminal 2 as firmware.
  • The image processing unit 271 generates image data and bibliographic information on the image data based on imaging by the imaging unit 25.
  • The bibliographic information includes the user ID, the image type, the imaging date and time, and the like, and may include other information such as Exif (Exchangeable image file format) information. The image data and the bibliographic information are stored in the terminal storage unit 22.
  • The image browsing unit 272 displays an image browsing screen on the display unit 24 based on a program stored in the terminal storage unit 22. The image browsing screen can take various forms, such as a list display screen of thumbnail images based on image data or a display screen of one specific piece of image data.
  • When the advertisement display unit 273 acquires advertisement data from the server 3, it displays the advertisement on the display unit 24.
  • The QR code processing unit 274 is an example of a code information processing unit, and associates information on a QR code read by the imaging unit 25 with image data stored in the terminal storage unit 22 by a predetermined method. In this embodiment the code information is a QR code, but it may be other code information, such as a two-dimensional barcode other than the QR code or a one-dimensional barcode.
  • FIG. 3 is a diagram illustrating an example of a data structure related to a specific image.
  • The data structure shown in FIG. 3 represents a series of data linked to an image ID stored in the terminal storage unit 22.
  • The series of data includes, for example, the user ID, the image type, the imaging date and time, the category to which the image belongs, and the image data. The series of data described above is an example; in addition, Exif information, other image-related metadata, and the like may be included as data related to the image.
  • Image types include "imaging", "download", "capture", and the like, but are not limited to these. "Imaging" indicates image data captured by the imaging unit 25 of the mobile terminal 2. "Download" indicates image data downloaded from a network to which the mobile terminal 2 is connected. "Capture" indicates image data generated by capturing an arbitrary display on the display unit 24 of the mobile terminal 2. A sketch of such a record follows below.
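  • As a non-authoritative illustration, the per-image record of FIG. 3 could be modeled as follows; the field names, types, and the ImageType values are assumptions drawn from the description above, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class ImageType(Enum):
    IMAGING = "imaging"    # captured by the imaging unit 25
    DOWNLOAD = "download"  # downloaded from a network the terminal is connected to
    CAPTURE = "capture"    # screen capture of the display unit 24

@dataclass
class ImageRecord:
    image_id: str            # key to which the series of data is linked
    user_id: str
    image_type: ImageType
    imaged_at: datetime      # imaging date and time
    category: str | None     # filled in after classification (step S110)
    image_data: bytes        # the image itself; Exif information etc. could be added
```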
  • FIG. 4 is a diagram illustrating an example of a hierarchical structure of categories used when categorizing image data.
  • "Image" has, for example, the subcategories "person", "meal", "landscape", "facility", and "other".
  • "Person" has, for example, the subcategories "male" and "female", and "male" and "female" each have age subcategories such as "under 10", "10s", "20s", and "30s".
  • "Meal" has, for example, the subcategories "Japanese food", "Western food", and "Chinese food". "Japanese food" has, for example, lower categories such as "sushi", "soba", and "yakitori". "Western food" has, for example, lower categories such as "pasta", "salad", and "pizza". "Chinese food" has, for example, lower categories such as "ramen", "gyoza", and "spring roll".
  • "Landscape" has, for example, the lower categories "mountain", "river", and "sea". "Facility" has, for example, the lower categories "tower", "shrines and churches", and "theme park".
  • The above categorization is an example, and categorization can be performed by any other method. For example, categories such as "Italian", "French", and "Spanish cuisine" may be included among the lower categories of "Western food". The categories may also form a structure other than a tree structure. A sketch of this tree follows below.
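  • A minimal sketch of the FIG. 4 hierarchy, assuming nested dictionaries; the node names mirror the examples in the text, while the representation and the leaf-enumeration helper are illustrative. Enumerating leaves matches the "lowest category" granularity that the categorizing unit 331 assigns, described below.

```python
# The FIG. 4 category hierarchy as nested dicts (illustrative representation).
CATEGORY_TREE = {
    "image": {
        "person": {
            "male":   {"under 10": {}, "10s": {}, "20s": {}, "30s": {}},
            "female": {"under 10": {}, "10s": {}, "20s": {}, "30s": {}},
        },
        "meal": {
            "Japanese food": {"sushi": {}, "soba": {}, "yakitori": {}},
            "Western food":  {"pasta": {}, "salad": {}, "pizza": {}},
            "Chinese food":  {"ramen": {}, "gyoza": {}, "spring roll": {}},
        },
        "landscape": {"mountain": {}, "river": {}, "sea": {}},
        "facility":  {"tower": {}, "shrines and churches": {}, "theme park": {}},
        "other": {},
    }
}

def lowest_categories(tree: dict) -> list[str]:
    """Enumerate the leaf (lowest-level) categories of the tree."""
    leaves: list[str] = []
    for name, children in tree.items():
        leaves.extend(lowest_categories(children) if children else [name])
    return leaves
```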
  • FIG. 5 is a diagram illustrating an example of a schematic configuration of the server 3.
  • The server 3 includes a server communication unit 31, a server storage unit 32, and a server processing unit 33. The server 3 classifies image data into categories in response to a request from the mobile terminal 2 and outputs the classification results to the mobile terminal 2.
  • The server communication unit 31 includes a communication interface circuit for transmitting and receiving data via the Internet 7, and communicates with the mobile terminal 2.
  • The server storage unit 32 includes, for example, at least one of a magnetic tape device, a magnetic disk device, and an optical disk device, and stores the operating system program, driver programs, application programs, data, and the like used for processing in the server processing unit 33. The server storage unit 32 also stores advertisement data. The advertisement data includes the advertisement data displayed on the display unit 24 of the mobile terminal 2 and the URL of a web page related to the advertised object. The server storage unit 32 further includes a buffer for temporarily storing temporary data related to predetermined processing.
  • The server processing unit 33 includes one or more processors and their peripheral circuits, centrally controls the overall operation of the server 3, and includes, for example, a CPU. The server processing unit 33 controls the operations of the server communication unit 31 and the like by appropriate procedures based on the programs (operating system program, driver programs, application programs, etc.) stored in the server storage unit 32, and may execute a plurality of programs (application programs, etc.) in parallel.
  • The server processing unit 33 includes a categorizing unit 331, a user preference analyzing unit 332, an advertisement providing unit 333, a related information extracting unit 334, and the like. These are functional modules realized by a program executed by the processor included in the server processing unit 33, and may be installed in the server 3 as firmware.
  • When the categorizing unit 331 acquires an image classification request from the mobile terminal 2 via the server communication unit 31, it determines, for each piece of image data, one category to which the image data belongs, for example one of the lowest categories shown in FIG. 4.
  • For example, a method of categorizing an image using boundary lines, described in JP 2007-133746 A, can be used. In this method, the image is divided into a plurality of regions, the boundary lines of each region are extracted, the figure composed of the extracted boundary lines is compared with figures registered in advance for each category, and the category is determined based on whether the two are approximate. For example, when a figure composed of boundary lines extracted from an image approximates a figure registered in advance for the "landscape" category (for example, a triangular figure indicating a mountain), the image data is classified into the "landscape" category.
  • The category can also be determined using color. In this method, color information of each part of the image is extracted, the extracted color is compared with colors registered in advance for each category, and the category is determined based on whether the two are approximate. For example, when a color extracted from an image approximates a color registered in advance for the "person" category (for example, a yellowish color indicating skin tones), the image data is classified into the "person" category. A toy sketch of this color-based test follows below.
  • Alternatively, a method of categorizing images using the face recognition processing described in JP-T-2014-517371 can be used. This method determines the category based on, for example, geometric analysis (examining prominent features) or optical analysis (a statistical approach that removes unnecessary elements from an image and compares it with templates to eliminate inconsistencies). By the face recognition processing, the sex of the subject and an age group such as "under 10", "10s", or "20s" can be determined.
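  • The cited publications describe the actual recognition methods; the following toy sketch, under stated assumptions, only illustrates the color-comparison idea above: extract a color from the image, compare it with colors registered in advance per category, and classify when the two are approximate. The reference colors and the threshold are invented for illustration.

```python
# Toy color-based category test (illustrative; not the cited methods).
REFERENCE_COLORS = {            # (R, G, B) registered in advance per category
    "person":    (224, 172, 105),   # a skin tone
    "landscape": ( 80, 140, 200),   # sky/sea blue
    "meal":      (200, 120,  60),   # warm food tones
}

def classify_by_color(mean_rgb: tuple[int, int, int],
                      threshold: float = 120.0) -> str | None:
    """Return the category whose reference color best approximates the
    image's mean color, or None if no reference color is close enough."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best = min(REFERENCE_COLORS, key=lambda c: dist(mean_rgb, REFERENCE_COLORS[c]))
    return best if dist(mean_rgb, REFERENCE_COLORS[best]) <= threshold else None
```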
  • The user preference analysis unit 332 analyzes user preferences based on the number of images by category and the number of advertisement views by category, described later. A user preference is a concept representing the user's hobbies, tastes, interests, and the like with respect to things.
  • As an example of the user preference analysis method, the number of images by category and the number of advertisement views by category are added together for each category, and the category with the largest total is regarded as the user preference. For example, in FIG. 6, the "pasta" category has 28 images by category and 10 advertisement views by category, for a total of 38. If, after performing this calculation for all categories, the total of 38 for the "pasta" category is the maximum, the user preference is regarded as "pasta".
  • The above analysis method is an example; the weighting given to the number of images by category and the number of advertisement views by category in the user preference analysis may be changed, and other analysis methods may be used.
  • The advertisement providing unit 333 is an example of an information providing unit. Based on the user preference analyzed by the user preference analysis unit 332, the advertisement providing unit 333 extracts related advertisement data from the advertisement data stored in the server storage unit 32.
  • The related information extraction unit 334 extracts related information on image data based on acquired GPS information or QR code information. The related information will be described later.
  • FIG. 6 is a diagram illustrating an example of a data structure related to a specific user.
  • The data structure shown in FIG. 6 represents a series of data linked to a user ID stored in the server storage unit 32.
  • The series of data includes, for example, the member type, name, sex, date of birth, image information, number of images by category, number of advertisement views by category, and the like. The series of data described above is an example and may include other information about the user.
  • The member type indicates the user's type of membership, for example "paid", "free", or "premium" (indicating a member who pays an additional membership fee on top of the regular paid membership fee).
  • The image information includes the image IDs associated with the user ID and the subject category for each image ID. The server storage unit 32 therefore stores the categories to which images belong in association with the user identification information.
  • The number of images by category represents the number of pieces of image data belonging to each category. For example, the entry "pasta (28)" indicates that there are 28 images belonging to the "pasta" category among the images collected by the user. The entry "female in twenties (15)" indicates that, among the images taken by the user, there are 15 images whose subject belongs to the "20s" subcategory of "female".
  • The number of advertisement views by category indicates, for each category, the number of times the user has chosen to display a screen to which an advertisement leads. For example, the entry "pasta (10)" indicates that the user has selected advertisements belonging to the "pasta" category and browsed the screens they lead to 10 times. A sketch of the preference computation based on these counts follows below.
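  • A minimal sketch of the preference analysis, assuming the per-category counts are available as plain dictionaries: the two counts are summed per category and the category with the largest total is taken as the user preference, matching the "pasta" example above (28 + 10 = 38).

```python
def analyze_user_preference(images_by_category: dict[str, int],
                            ad_views_by_category: dict[str, int]) -> str:
    """Equal weighting, as in the example; the text notes the weighting
    may be changed or another analysis method used."""
    totals = {
        category: images_by_category.get(category, 0)
                  + ad_views_by_category.get(category, 0)
        for category in images_by_category.keys() | ad_views_by_category.keys()
    }
    return max(totals, key=totals.get)

# The FIG. 6 example: "pasta" wins with 28 images + 10 ad views = 38.
assert analyze_user_preference({"pasta": 28, "sushi": 12},
                               {"pasta": 10, "ramen": 3}) == "pasta"
```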
  • FIG. 7 is a diagram showing an example of an operation sequence for image classification.
  • The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33 in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32. The mobile terminal 2 may be automatically placed in an environment in which the following operations can be executed by launching a specific application.
  • First, the terminal processing unit 27 of the mobile terminal 2 displays a login screen on the display unit 24 (step S100).
  • Next, the terminal processing unit 27 outputs a login request including the user ID and password to the server 3 via the terminal communication unit 21 (step S101).
  • The server processing unit 33 of the server 3 performs user authentication based on the acquired user ID and password (step S102). Next, the server processing unit 33 outputs an authentication permission notification to the mobile terminal 2 (step S103).
  • The terminal processing unit 27 of the mobile terminal 2 then displays an image browsing screen on the display unit 24 based on the image data stored in the terminal storage unit 22 (step S104).
  • Next, the terminal processing unit 27 outputs the image data stored in the terminal storage unit 22 and an image classification request to the server 3 via the terminal communication unit 21 (step S105). The image data output to the server 3 in step S105 may be a single piece of image data or a plurality of pieces of image data, selected by the user or otherwise, from the set of image data stored in the terminal storage unit 22.
  • The categorizing unit 331 processes each piece of acquired image data using the image recognition processing methods described above and determines the category to which each piece of image data belongs (step S106).
  • Next, the categorizing unit 331 updates the image information associated with the user stored in the server storage unit 32 and registers a category for each image ID (step S107).
  • Next, the categorizing unit 331 totals the number of pieces of image data for each category and updates the number of images by category stored in the server storage unit 32 (step S108).
  • Next, the server processing unit 33 outputs the category data for each piece of image data to the mobile terminal 2 via the server communication unit 31 (step S109).
  • The terminal processing unit 27 of the mobile terminal 2 registers the category of each piece of image data stored in the terminal storage unit 22 (step S110).
  • Next, the terminal processing unit 27 displays an image classification screen, which displays the image data for each category, on the display unit 24 based on the image data and categories stored in the terminal storage unit 22 (step S111), and ends the series of operations. A sketch of the server-side steps follows below.
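  • A hedged sketch of the server side of steps S106 to S109, assuming in-memory dictionaries stand in for the server storage unit 32 and recognize_category stands in for the image recognition processing described above; all names are illustrative, not part of the disclosure.

```python
from collections import Counter

image_categories: dict[str, str] = {}        # image ID -> category (step S107)
images_by_category: dict[str, Counter] = {}  # user ID -> per-category counts (step S108)

def handle_classification_request(user_id: str,
                                  images: dict[str, bytes],
                                  recognize_category) -> dict[str, str]:
    """Classify each image, register its category, update the per-category
    counts for the user, and return the category data (step S109)."""
    result = {}
    for image_id, data in images.items():
        category = recognize_category(data)       # step S106
        image_categories[image_id] = category     # step S107
        result[image_id] = category
    counts = images_by_category.setdefault(user_id, Counter())
    counts.update(result.values())                # step S108
    return result
```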
  • FIG. 8A is a diagram showing an example of the login screen 1100 described in step S100 of FIG. 7.
  • In the center of the login screen 1100, a user ID input unit 1101, a password input unit 1102, and a login determination icon 1103 are arranged. When the login determination icon 1103 is selected, the terminal processing unit 27 of the mobile terminal 2 outputs a login request including the entered user ID and password to the server 3.
  • FIG. 8B is a diagram showing an example of the image browsing screen 1200 described in step S104 of FIG. 7.
  • On the image browsing screen 1200, images 1201 are arranged in 3 rows × 3 columns. Each image 1201 is an image captured by the user with the imaging unit 25 of the mobile terminal 2, an image downloaded from a network to which the mobile terminal 2 is connected, or an image generated by capturing an arbitrary display on the display unit 24 of the mobile terminal 2. The images 1201 are arranged in the 3 × 3 grid from top to bottom and left to right, starting from the most recent time of imaging, download, or capture. The image browsing screen 1200 is an example, and the display method of the images 1201 is not limited to this.
  • FIG. 8C is a diagram showing an example of the image classification request selection screen 1300 described in step S105 of FIG. 7.
  • The image browsing unit 272 of the mobile terminal 2 displays a pop-up 1301 asking "Do you want to organize the images in the camera roll?". If "Yes" 1302 is selected, the terminal processing unit 27 outputs the one or more pieces of image data selected by the user and an image classification request to the server 3 via the terminal communication unit 21. If "No" 1303 is selected, the image classification process ends.
  • FIG. 8D is a diagram showing an example of the image classification screen 1400 described in step S111 of FIG. 7.
  • After acquiring the category data from the server 3 via the terminal communication unit 21 and registering the categories, the terminal processing unit 27 displays the image classification screen 1400. On the image classification screen 1400, following the tree structure of FIG. 4, a category name 1401 and the images 1402 classified into that category are displayed, for example in chronological order.
  • FIG. 8D shows, as an example, the image classification screen 1400 for the "pasta" category. In the category name 1401, the notations "cooking" and "western food", the upper categories of "pasta", are displayed together with the number of images included in each category. The display "Cooking (45) > Western food (25) > Pasta (10)" shown in FIG. 8D indicates that there are 45 "cooking" images, of which 25 are "Western food" images, of which 10 are "pasta" images.
  • FIG. 9 is a diagram showing an example of an operation sequence for providing advertisements.
  • The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33 in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32.
  • The following processing starts when the server processing unit 33 of the server 3 outputs the category data for each piece of image data to the mobile terminal 2 via the server communication unit 31 as described above (step S109). However, the processing is not limited to this and may be started at any timing.
  • First, the user preference analysis unit 332 of the server 3 analyzes the user preference by the analysis method described above, based on the number of images by category and the number of advertisement views by category (step S201).
  • Next, the advertisement providing unit 333 extracts, from the advertisement data stored in the server storage unit 32, the data of advertisements related to the analyzed user preference (step S202). Next, the advertisement providing unit 333 outputs the extracted advertisement data to the mobile terminal 2 (step S203).
  • As the number of pieces of image data related to a category (the number of images by category) increases, the advertisement providing unit 333 may increase the frequency with which advertisements are provided, or may distribute advertisement data with a larger display area or larger displayed characters. A sketch of this selection follows below.
  • When the terminal processing unit 27 of the mobile terminal 2 acquires the advertisement data via the terminal communication unit 21, it displays an advertisement based on the advertisement data on the display unit 24 (step S204). In this way, advertisements that strongly match the user's preference can be provided effectively.
  • When the user selects the displayed advertisement, the terminal processing unit 27 starts a browser and outputs a page browsing request indicating the URL related to the selected advertisement to an external server (not shown) (step S205).
  • The terminal processing unit 27 also outputs identification information on the selected advertisement to the server 3 (step S206).
  • The user preference analysis unit 332 updates the number of advertisement views by category stored in the server storage unit 32 based on the identification information on the advertisement acquired from the mobile terminal 2 via the server communication unit 31 (step S207), and ends the operation sequence for providing advertisements.
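  • A minimal sketch of the extraction of steps S202 and S203 combined with the scaling note above; the Ad record and the linear weighting rule are illustrative assumptions, not the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    ad_id: str
    category: str
    url: str      # web page related to the advertised object

def select_ads(ads: list[Ad], preference: str,
               images_in_category: int) -> list[tuple[Ad, float]]:
    """Pick ads matching the analyzed preference (step S202) and attach a
    display weight that grows with the number of images by category, e.g.
    for a larger display area, larger characters, or higher frequency."""
    weight = 1.0 + 0.1 * images_in_category
    return [(ad, weight) for ad in ads if ad.category == preference]
```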
  • FIG. 10A is a diagram illustrating an example of the advertisement described in step S204 of FIG. 9.
  • An advertisement 2101 is displayed at the center of a standby screen 2100 displayed on the display unit 24. In the advertisement 2101, a description of a new menu item offered by a store is displayed as text. The display area of the advertisement 2101 may be increased as the number of pieces of image data related to the category (the number of images by category) increases.
  • The menu offered by a store is one example of a product offered by a store; a description of a new product offered by a store may likewise be displayed as text.
  • FIG. 10B is a diagram showing another example of the advertisement described in step S204 of FIG. 9.
  • An advertisement 2201 is displayed at the bottom of an image classification screen 2200 displayed on the display unit 24. In the advertisement 2201, the contents of a new menu item offered by a store are displayed as text. The display area of the advertisement 2201 may be increased as the number of pieces of image data related to the category (the number of images by category) increases.
  • FIG. 10C is a diagram showing still another example of the advertisement described in step S204 of FIG. 9.
  • An advertisement 2301 is displayed together with images 2302 on an image classification screen 2300 displayed on the display unit 24. In the advertisement 2301, the name of a new menu item offered by a store is displayed together with the characters "PR" in the upper left. The size of the characters displayed in the advertisement 2301 may be increased as the number of pieces of image data related to the category (the number of images by category) increases.
  • The advertisements 2101, 2201, and 2301 are examples, and the display mode of advertisements is not limited to the above. The advertisement may be a display including various contents, such as images and moving images, in addition to text information.
  • The advertisement is also not limited to one provided by a restaurant matching the analyzed user preference, and may be an advertisement provided by any store, business type, organization, or the like. For example, advertisements for baby goods and children's clothing may be provided to users with many images of infants, and advertisements for outdoor goods may be provided to users with many images of mountains, the sea, and the like. To users with many images of women in their twenties, fashion or health advertisements related to women in their twenties may be provided. To users with many images of towers, shrines and temples, or theme parks, advertisements about sightseeing spots may be provided.
  • In the second embodiment, detailed information can be registered for each piece of image data. Detailed information refers to information that develops the category of the image data more specifically. For example, for image data in the "meal" category, "store name", "menu name", and the like can be used as detailed information. Detailed information can be set arbitrarily according to the category of the image data.
  • In the second embodiment, related information on specific image data can be extracted based on GPS information or QR code information as candidates for the detailed information.
  • The related information is information that serves as candidates for the detailed information, and is a set (possibly containing only one item) of facility information on facilities having a predetermined relationship with bibliographic information such as GPS information or QR code information.
  • For example, the set of store names of one or more stores within a certain distance range of the GPS position related to specific image data is related information on that image data. Likewise, the store name related to a QR code read within a predetermined time before the imaging date and time of specific image data is related information on that image data.
  • "Store name" and "menu name" are examples of facility information, and in the second embodiment the case where the facility information is "store name" and "menu name" is described. The facility information may be other facility information, for example "place name", "product name", "sightseeing spot name", or "description of a sightseeing spot".
  • In the second embodiment, related information is extracted using either GPS information or QR code information; however, related information may be extracted using both GPS information and QR code information.
  • In the second embodiment, the mobile terminal 2 and the server 3 described in the first embodiment are used as they are.
  • FIG. 11 is a diagram illustrating an example of a data structure related to a specific image used in the second embodiment.
  • The data structure shown in FIG. 11 represents a series of data linked to an image ID stored in the terminal storage unit 22. Compared with the data structure shown in FIG. 3, it further includes GPS information, QR code information, a store name, a menu name, and the like.
  • When imaging is performed by the imaging unit 25, the terminal processing unit 27 acquires the GPS information obtained by the GPS sensor 26 and stores it in the terminal storage unit 22 in association with the image data.
  • When a QR code is read, the QR code processing unit 274 stores information on the QR code (for example, QR code identification information) and the reading date and time in the buffer of the terminal storage unit 22. The buffer stores only the information on the QR code with the most recent reading date and time, together with that reading date and time. When imaging is performed, the QR code processing unit 274 stores the information on the QR code and the reading date and time in the terminal storage unit 22 in association with the image data. Therefore, each piece of image data stored in the terminal storage unit 22 is associated with the information of the most recent QR code among the QR codes read before its imaging date and time. A sketch of this rule follows below.
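  • A minimal sketch of this buffering rule, assuming a single-slot buffer; the record shapes and function names are illustrative.

```python
from datetime import datetime

latest_qr: dict | None = None  # single-slot buffer in the terminal storage unit 22

def on_qr_read(qr_id: str, read_at: datetime) -> None:
    """Keep only the most recent QR read, overwriting any older one."""
    global latest_qr
    latest_qr = {"qr_id": qr_id, "read_at": read_at}

def on_image_captured(record: dict) -> None:
    """Tag a newly captured image with the most recent QR read, if any."""
    if latest_qr is not None:
        record["qr_info"] = dict(latest_qr)
```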
  • The store name is an example of the detailed information described above, and is, for example, the name of the store where the subject of the image data was provided. The menu name is likewise an example of the detailed information, and is, for example, the menu name of the subject.
  • FIG. 12 is a diagram illustrating an example of a data structure related to a specific user used in the second embodiment.
  • The data structure shown in FIG. 12 represents a series of data linked to a user ID stored in the server storage unit 32. Compared with the data structure shown in FIG. 6, it further includes a store ID and a menu ID associated with each image ID, and includes the number of images by price range.
  • The number of images by price range represents the number of pieces of image data belonging to each price range. For example, the entry "1,000 yen (25)" indicates that there are 25 images, among the images collected by the user, belonging to the 1,000-yen price range (1,000 to 1,999 yen).
  • The number of images by price range may also serve as a basis for user preference analysis. The number of images by price range represents, for example, how much the user pays for restaurant meals, and can be said to represent the user's hobbies, tastes, interests, and the like from a viewpoint different from the number of images by category and the number of advertisement views by category.
  • Other arbitrary parameters, like the number of images by price range, may be set as a basis for user preference analysis. Parameters serving as a basis for user preference analysis can be set based on arbitrary items included in the store information table and menu information table described later, and in other tables.
  • FIG. 13A is a diagram illustrating an example of the store information table. The store information table is stored in the server storage unit 32 of the server 3.
  • In the store information table, information such as the store name, address, QR code ID (QR code identification information), and menu information (menu IDs) is stored in association with each store ID.
  • The QR code is installed, for example, in a store. The QR code may be unique to the store, or the same QR code may be shared by affiliated stores. The QR code may also include table identification information assigned to each table used when the user has a meal in the store or the like.
  • FIG. 13B is a diagram showing an example of the menu information table. The menu information table is stored in the server storage unit 32 of the server 3.
  • In the menu information table, information such as the category, menu name, price, and store image data (an image of the menu item provided by the store) is stored in association with each menu ID.
  • FIG. 14 is a diagram illustrating an example of an operation sequence for registering detailed information.
  • The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33 in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32.
  • First, the terminal processing unit 27 of the mobile terminal 2 displays a login screen on the display unit 24 (step S300).
  • Next, the terminal processing unit 27 outputs a login request including the user ID and password to the server 3 via the terminal communication unit 21 (step S301).
  • The server processing unit 33 of the server 3 performs user authentication based on the acquired user ID and password (step S302).
  • Next, the server processing unit 33 outputs an authentication permission notification to the mobile terminal 2 (step S303). When the detailed information registration process is performed as a continuation of the image classification process described with reference to FIG. 7, the above user authentication process (steps S300 to S303) is not performed.
  • Next, the terminal processing unit 27 of the mobile terminal 2 displays an image classification screen on the display unit 24 (step S304).
  • The terminal processing unit 27 then displays a detailed information registration screen for the selected image data on the display unit 24 (step S305).
  • Next, the terminal processing unit 27 outputs a detailed information registration request including the user ID, the selected image data, bibliographic information, and the like to the server 3 (step S306). The bibliographic information includes GPS information and QR code information.
  • The related information extraction unit 334 of the server 3 refers to the store information table based on the bibliographic information acquired from the mobile terminal 2 and extracts related information on the store name (store name related information) (step S307). The extraction of store name related information will be described later.
  • Next, the server processing unit 33 of the server 3 outputs the extracted store name related information data to the mobile terminal 2 (step S308).
  • The terminal processing unit 27 of the mobile terminal 2 displays the acquired store name related information on the detailed information registration screen (step S309).
  • When a store name is selected, the terminal processing unit 27 associates the selection result with the image ID and outputs it to the server 3 via the terminal communication unit 21 (step S310).
  • The terminal processing unit 27 also registers the store name for the image data stored in the terminal storage unit 22 (step S311).
  • The related information extraction unit 334 refers to the store information table and, based on the acquired store name selection result, registers the store ID in the server storage unit 32 in association with the image ID (step S312).
  • Next, the related information extraction unit 334 of the server 3 refers to the store information table and the menu information table and extracts related information on the menu name (menu name related information) (step S313).
  • For example, the related information extraction unit 334 extracts, as menu name related information, the menu names that have the same category as the category registered in step S107, from among the menu names of the menu IDs related to the store ID registered in step S312.
  • Next, the server processing unit 33 of the server 3 outputs the menu name related information data to the mobile terminal 2 (step S314).
  • The terminal processing unit 27 of the mobile terminal 2 displays the menu name related information on the detailed information registration screen (step S315).
  • When a menu name is selected, the terminal processing unit 27 associates the selection result with the image ID and outputs it to the server 3 via the terminal communication unit 21 (step S316).
  • The terminal processing unit 27 also registers the menu name for the image data stored in the terminal storage unit 22 (step S317).
  • Next, the terminal processing unit 27 displays a detailed information screen showing the category, store name, menu name, and the like of the image data on the display unit 24 (step S318).
  • The server processing unit 33 refers to the store information table and the menu information table and, based on the acquired menu name selection result, registers the menu ID in the server storage unit 32 in association with the image ID (step S319).
  • Next, the categorizing unit 331 of the server 3 refers to the menu information table, totals the number of pieces of image data for each price range of the menus whose menu IDs are linked to image IDs, updates the number of images by price range stored in the server storage unit 32 (step S320), and ends the detailed information registration operation sequence.
  • FIG. 15 is a diagram illustrating an example of the flow of the process for extracting store name related information, i.e. the extraction process of step S307.
  • First, the related information extraction unit 334 of the server 3 acquires bibliographic information including GPS information or QR code information via the server communication unit 31 (step S400).
  • Next, the related information extraction unit 334 refers to the bibliographic information and determines whether the imaging date and time falls within a predetermined time after the reading date and time of the QR code information (step S401).
  • In the data shown in FIG. 11, for example, the imaging date and time is 19:20 on January 16, 2015, and the reading date and time is 19:02 on January 16, 2015. If the predetermined time is, for example, one hour, the predetermined time after the reading date and time runs from 19:02 to 20:02 on January 16, 2015. The imaging date and time (19:20 on January 16, 2015) falls within this window, so in the case of the data shown in FIG. 11 the related information extraction unit 334 determines that the imaging date and time is included within the predetermined time after the reading date and time. The predetermined time can be set arbitrarily.
  • When it is determined that the imaging date and time is included within the predetermined time after the reading date and time (step S401; Yes), the related information extraction unit 334 refers to the store information table based on the QR code information in the bibliographic information, extracts the store name related to the QR code ID as related information (step S402), and ends the store name related information extraction process.
  • When it is determined in step S401 that the imaging date and time is not included within the predetermined time after the reading date and time (step S401; No), the related information extraction unit 334 refers to the store information table based on the GPS information in the bibliographic information, extracts as related information the store names of stores within a certain distance range of the GPS position (step S403), and ends the store name related information extraction process. A sketch of this decision follows below.
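  • A sketch of the FIG. 15 decision under stated assumptions: the store rows are dictionaries with hypothetical name, qr_code_id, and gps fields, and the GPS branch uses a crude flat-earth distance in place of whatever distance measure an actual implementation would use.

```python
from datetime import datetime, timedelta
from math import hypot

def extract_store_related_info(imaged_at: datetime,
                               qr_read_at: datetime | None,
                               qr_code_id: str | None,
                               gps: tuple[float, float],
                               stores: list[dict],
                               window: timedelta = timedelta(hours=1),
                               max_km: float = 0.5) -> list[str]:
    # Step S401: was the image taken within the window after the QR read?
    if qr_read_at is not None and qr_read_at <= imaged_at <= qr_read_at + window:
        # Step S402: the store associated with the QR code ID.
        return [s["name"] for s in stores if s["qr_code_id"] == qr_code_id]
    # Step S403: stores within a fixed distance of the GPS position
    # (flat-earth approximation; one degree is roughly 111 km).
    def km(a: tuple[float, float], b: tuple[float, float]) -> float:
        return hypot(a[0] - b[0], a[1] - b[1]) * 111.0
    return [s["name"] for s in stores if km(gps, s["gps"]) <= max_km]
```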
  • FIG. 16A is an example of the detailed information registration screen 3100 described in step S309 of FIG. 14.
  • On the detailed information registration screen 3100, a store name column 3101 and a menu name column 3102 are displayed. Below them, store name related information 3104 is displayed together with a question sentence 3103 asking "Which store did you use?". A store name can be selected by pressing the area of a specific store name in the store name related information 3104. Below the store name related information 3104, a search unit 3105 capable of searching for an arbitrary store is displayed.
  • FIG. 16B is an example of the detailed information registration screen 3200 described in step S315 of FIG. 14.
  • On the detailed information registration screen 3200, the selected store name 3206 is displayed in the store name column 3201. Menu name related information 3204 is displayed together with a question sentence 3203 asking "What is the name of this menu item?". A menu name can be selected by pressing the area of a specific menu name in the menu name related information 3204. Below the menu name related information 3204, a search unit 3205 capable of searching for an arbitrary menu is displayed.
  • FIG. 16C is an example of the detailed information screen 3300 described in step S318 of FIG. 14.
  • On the detailed information screen 3300, the category 3302 to which the image data belongs is displayed in the category column 3301, the selected store name 3304 is displayed in the store name column 3303, and the selected menu name 3306 is displayed in the menu name column.
  • FIGS. 17A and 17B are diagrams showing examples of advertisement display.
  • An advertisement related to an image may be displayed in a form associated with that image. That is, the advertisement providing unit 333 may add related information on the image data, as advertiser information, to the advertisement data extracted in step S202 described above.
  • For example, the related information extraction unit 334 refers to the store information table and, based on the GPS information related to the image data, extracts the store name and/or menu information of stores included within a predetermined range of the imaging location. The advertisement providing unit 333 then superimposes the extracted store name and/or menu information on the image data as advertiser information, and outputs the composited image data to the mobile terminal 2.
  • Alternatively, the related information extraction unit 334 refers to the store information table and the menu information table, and extracts the store name and/or menu information of a specific store based on a QR code read within a predetermined time of the imaging time related to the image data. The advertisement providing unit 333 then superimposes the extracted store name and/or menu information on the image data as advertiser information, and outputs the composited image data to the mobile terminal 2. In this way, the terminal processing unit 27 of the mobile terminal 2 can display information on the store related to the provided image data as the advertisement.
  • The menu information of a store is one example of the product information of a store; the menu information may be replaced with product information.
  • In FIG. 17A, the terminal processing unit 27 of the mobile terminal 2 displays an advertisement 4101A superimposed on the upper-left image 4101 of a category list screen 4100. The advertisement 4101A consists of the store name and menu information related to the image data, and its advertiser is the store where the user captured the image 4101. Similarly, an advertisement 4102A whose advertiser is the store where the user captured the image 4102 is displayed.
  • FIG. 17B shows an example of an advertisement displayed on a detailed information screen 4200. The detailed information screen 4200 is a screen that displays the image data and bibliographic information related to an image 4201. The advertiser of an advertisement 4201A is the store "Ristorante ..." where the user captured the image 4201.
  • In this way, the advertisement of a store visited in the past can be displayed over an image captured at that store, which increases the possibility that the user will revisit the store. By displaying the advertisement related to a specific image in a manner associated with that image, the advertisement can be made to appeal to the user more effectively.
  • The advertisement providing unit 333 may superimpose the store name and/or menu information in such a manner that the user can still confirm the portion of the image data overlapped by the store name and/or menu information.
  • In a modification, information may be provided to the user based on the price ranges associated with the user ID. In this case, the user preference analysis unit 332 refers to the server storage unit 32 and identifies a price range for which the number of images by price range associated with the user ID is the maximum, or is equal to or greater than a predetermined number. The advertisement providing unit 333 may then refer to the store information table and the menu information table, identify stores in the identified price range, and output advertisements of the identified stores to the mobile terminal 2. The stores in the identified price range may be stores related to the user preference analyzed by the user preference analysis unit 332 in the first embodiment.
  • In the embodiments described above, the server processing unit 33 of the server 3 classifies the image data acquired from the mobile terminal 2. However, the terminal processing unit 27 of the mobile terminal 2 may instead classify the image data and output information on the categories, such as the number of images for each category, to the server 3.
  • In the embodiments described above, the advertisement providing unit 333, as an example of the information providing unit, outputs advertisements to the mobile terminal 2 based on the analyzed user preference. However, the information providing unit may provide information to a contact address associated with the user identification information based on the analyzed user preference. The contact address associated with the user identification information may be, for example, an e-mail address, a mailing list, or an e-mail magazine.
  • In the embodiments described above, the store names of stores included within a predetermined range of the imaging location are extracted based on the GPS information related to the image data, and the extracted store names are superimposed on the image data and output to the mobile terminal 2. The store name is one example of facility information, and may be replaced with other facility information, for example a sightseeing spot name and/or a description of a sightseeing spot. For example, the facility information may include a sightseeing spot name (such as a zoo name) and/or a description of the sightseeing spot (such as the types of animals).
  • Likewise, the store name based on a QR code read within a predetermined time of the imaging time related to the image data is extracted, and the extracted store name is superimposed on the image data and output to the mobile terminal 2. Here too, the store name is one example of facility information, and may be replaced with other facility information, for example a sightseeing spot name and/or a description of a sightseeing spot. For example, the facility information may include a sightseeing spot name (such as a zoo name) and/or a description of the sightseeing spot (such as the types of animals). In this case, the QR code is installed, for example, in or around the sightseeing spot. The QR code may be printed on the paper of a sightseeing map to be read by the QR code processing unit 274 in or around the sightseeing spot, and may be unique to the sightseeing spot.
  • A computer program for causing a computer to realize the functions of the terminal processing unit 27 and the server processing unit 33 may be provided in a form recorded on a computer-readable recording medium such as a magnetic recording medium or an optical recording medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • General Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Databases & Information Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Provided are an information delivery device 3 and an information delivery program capable of providing advertisements by utilizing image categorization information. The information delivery device 3 has: a storage unit 32 which stores, in association with user identification information, categories to which image data belong; a categorization part 331 which, upon acquiring the image data and the user identification information from a user terminal, determines the categories by means of image recognition processing of the acquired image data and registers the determined categories in the storage unit in association with the user identification information; an analysis part 332 which analyzes user preferences on the basis of the categories associated with the user identification information; and an information provision part 333 which provides information to the user on the basis of the analyzed user preferences.

Description

Information distribution apparatus and information distribution program
The present invention relates to an information distribution apparatus and an information distribution program.
In recent years, camera functions have been installed not only in camera devices such as digital cameras but also in various devices such as mobile phones, smartphones, and tablets. Captured image data is stored in the storage unit of these terminals, and the stored image data is displayed in a list, for example arranged in chronological order of imaging time.
A technique is known in which subjects are classified into predetermined categories by so-called image recognition processing, which analyzes predetermined feature amounts of the image data, and in which the image data is organized and displayed for each category to which its subject belongs (for example, Patent Literature 1).
Patent Literature 1: JP 2007-133746 A
Categorizing and displaying accumulated images is useful for the browsing user, for example because it makes a desired image easier to find. However, no method has been proposed for effectively utilizing information about how the images accumulated by a particular user were categorized for purposes other than the categorization itself.
The purpose of the information distribution apparatus and the information distribution program described in the present application is to provide information to a user by using image categorization information.
The information distribution apparatus includes: a storage unit that stores a category to which image data belongs in association with user identification information; a categorizing unit that acquires image data and user identification information from a user terminal, determines a category by performing image recognition processing on the acquired image data, and registers the determined category in the storage unit in association with the user identification information; an analysis unit that analyzes a user preference based on the category associated with the user identification information; and an information providing unit that provides information to the user based on the analyzed user preference.
The information distribution program is an information distribution program for a computer having a storage unit that stores a category to which image data belongs in association with user identification information. The program causes the computer to: acquire image data and user identification information from a user terminal; determine the category to which the image data belongs by performing image recognition processing on the acquired data; register the determined category in the storage unit in association with the user identification information; analyze a user preference based on the category associated with the user identification information; and provide information to the user based on the analyzed user preference.
According to the above information distribution apparatus and information distribution program, information matching a user's strong preferences can be provided effectively.
The objects and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. Both the foregoing general description and the following detailed description are exemplary and explanatory, and do not restrict the invention as claimed.
FIG. 1 is a diagram illustrating an example of a schematic configuration of the information distribution system 1.
FIG. 2 is a diagram illustrating an example of a schematic configuration of the mobile terminal 2.
FIG. 3 is a diagram illustrating an example of a data structure related to a specific image.
FIG. 4 is a diagram illustrating an example of a hierarchical structure of categories into which image data is categorized.
FIG. 5 is a diagram illustrating an example of a schematic configuration of the server 3.
FIG. 6 is a diagram illustrating an example of a data structure related to a specific user.
FIG. 7 is a diagram illustrating an example of an operation sequence for image classification.
FIG. 8A is a diagram illustrating an example of the login screen 1100 described in step S100 of FIG. 7.
FIG. 8B is a diagram illustrating an example of the image browsing screen 1200 described in step S104 of FIG. 7.
FIG. 8C is a diagram illustrating an example of the image classification request selection screen 1300 described in step S105 of FIG. 7.
FIG. 8D is a diagram illustrating an example of the image classification screen 1400 described in step S111 of FIG. 7.
FIG. 9 is a diagram illustrating an example of an operation sequence for providing advertisements.
FIG. 10A is a diagram illustrating an example of the advertisement described in step S204 of FIG. 9.
FIG. 10B is a diagram illustrating another example of the advertisement described in step S204 of FIG. 9.
FIG. 10C is a diagram illustrating still another example of the advertisement described in step S204 of FIG. 9.
FIG. 11 is a diagram illustrating an example of a data structure related to a specific image used in the second embodiment.
FIG. 12 is a diagram illustrating an example of a data structure related to a specific user used in the second embodiment.
FIG. 13A is a diagram illustrating an example of a store information table.
FIG. 13B is a diagram illustrating an example of a menu information table.
FIG. 14 is a diagram illustrating an example of an operation sequence for detailed information registration.
FIG. 15 is a diagram illustrating an example of a flow of store name related information extraction processing.
FIG. 16 is an example of the detailed information registration screen 3100 described in step S309 of FIG. 14.
FIG. 17 is an example of the detailed information registration screen 3200 described in step S315 of FIG. 14.
FIG. 18 is an example of the detailed information screen 3300 described in step S318 of FIG. 14.
FIG. 19 is a diagram illustrating an example of advertisement display.
FIG. 20 is a diagram illustrating another example of advertisement display.
[First Embodiment]
FIG. 1 is a diagram illustrating an example of a schematic configuration of the information distribution system 1.
The information distribution system 1 includes at least one mobile terminal 2 and a server 3. The mobile terminal 2 and the server 3 are connected to each other via a communication network, for example via a base station 4, a mobile communication network 5, a gateway 6, and the Internet 7. The program executed on the mobile terminal 2 and the program executed on the server 3 communicate with each other using a communication protocol such as the Hypertext Transfer Protocol (HTTP).
FIG. 2 is a diagram illustrating an example of a schematic configuration of the mobile terminal 2.
The mobile terminal 2 includes a terminal communication unit 21, a terminal storage unit 22, an operation unit 23, a display unit 24, an imaging unit 25, a GPS sensor 26, a terminal processing unit 27, and the like. A multi-function mobile phone (a so-called smartphone) is assumed as the mobile terminal 2, but the mobile terminal 2 is not limited to this. The mobile terminal 2 may be any device to which the present invention is applicable, for example a personal computer (PC), a mobile phone (a so-called feature phone), a personal digital assistant (PDA), a portable game machine, a portable music player, or a tablet PC.
The terminal communication unit 21 includes a communication interface circuit, including an antenna whose sensitive band is a predetermined frequency band, and connects the mobile terminal 2 to a wireless communication network. The terminal communication unit 21 establishes a wireless signal line with the base station 4 by a scheme such as WCDMA (registered trademark) (Wideband Code Division Multiple Access) via a channel assigned by the base station 4, and communicates with the base station 4. The terminal communication unit 21 outputs data supplied from the terminal processing unit 27 to the server 3 and the like, and supplies data acquired from the server 3 and the like to the terminal processing unit 27.
The terminal storage unit 22 includes, for example, at least one of a semiconductor memory, a magnetic disk device, and an optical disk device. The terminal storage unit 22 stores operating system programs, driver programs, application programs, data, and the like used for processing in the terminal processing unit 27. For example, the terminal storage unit 22 stores, as driver programs, an input device driver program that controls the operation unit 23, an output device driver program that controls the display unit 24, and the like. The terminal storage unit 22 also stores, as an application program, a program for acquiring and displaying image data, and stores, as data, image data and its accompanying bibliographic information. Further, the terminal storage unit 22 may temporarily store temporary data related to predetermined processing.
The operation unit 23 may be any device capable of operating the mobile terminal 2, for example a touch pad or a keyboard. The user can input characters, numbers, and the like using the operation unit 23. When operated by the user, the operation unit 23 generates a signal corresponding to the operation, and the generated signal is supplied to the terminal processing unit 27 as the user's instruction.
The display unit 24 may be any device capable of displaying video, images, and the like, for example a liquid crystal display or an organic EL (Electro-Luminescence) display. The display unit 24 displays images and the like corresponding to the image data supplied from the terminal processing unit 27.
The imaging unit 25 may be any device capable of imaging and of reading a QR code (registered trademark), and includes, for example, a lens and an imaging element.
The GPS sensor 26 is an example of a position sensor; it connects to communication satellites and acquires position information (GPS (Global Positioning System) information) including latitude information and longitude information. Note that the mobile terminal 2 does not necessarily need to include the GPS sensor 26. Furthermore, the position information may be position information other than GPS information. For example, the position information may be specified based on the intensity of radio waves received by the position sensor of the mobile terminal 2 from a plurality of base stations, or may be position information acquired by the mobile terminal 2 via iBeacon (registered trademark).
The terminal processing unit 27 includes one or more processors and their peripheral circuits. The terminal processing unit 27 centrally controls the overall operation of the mobile terminal 2 and is, for example, a CPU (Central Processing Unit). The terminal processing unit 27 controls the operations of the terminal communication unit 21, the display unit 24, the imaging unit 25, the GPS sensor 26, and the like so that the programs stored in the terminal storage unit 22 are executed in an appropriate procedure according to operations on the operation unit 23. The terminal processing unit 27 executes processing based on the programs (operating system programs, driver programs, application programs, and the like) stored in the terminal storage unit 22, and can execute a plurality of programs (application programs and the like) in parallel.
The terminal processing unit 27 includes an image processing unit 271, an image browsing unit 272, an advertisement display unit 273, and a QR code processing unit 274. Each of these units is a functional module realized by a program executed on a processor of the terminal processing unit 27. Alternatively, these units may be implemented in the mobile terminal 2 as firmware.
The image processing unit 271 generates image data and bibliographic information related to the image data based on imaging by the imaging unit 25. The bibliographic information includes a user ID, an image type, an imaging date and time, and the like, and may include other information such as Exif (Exchangeable image file format) information. The image data and the bibliographic information are stored in the terminal storage unit 22.
The image browsing unit 272 displays an image browsing screen on the display unit 24 based on a program stored in the terminal storage unit 22. The image browsing screen can take various forms, such as a list display screen of thumbnail images based on the image data or a display screen of a single specific image data item.
When the advertisement display unit 273 acquires advertisement data from the server 3, it displays the advertisement on the display unit 24.
The QR code processing unit 274 is an example of a code information processing unit, and associates information related to a QR code read by the imaging unit 25 with image data stored in the terminal storage unit 22 by a predetermined method. In this embodiment, the case where the code information is a QR code is described; however, the code information may be other code information, such as a two-dimensional barcode other than the QR code or a one-dimensional barcode.
FIG. 3 is a diagram illustrating an example of a data structure related to a specific image.
The data structure shown in FIG. 3 represents a series of data linked to an image ID and stored in the terminal storage unit 22. The series of data includes, for example, a user ID, an image type, an imaging date and time, the category to which the image belongs, the image data, and the like. This series of data is an example; it may additionally include Exif information and other image-related metadata.
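For illustration only, this per-image series of data might be modeled as in the following minimal sketch; the field names and types are assumptions chosen for readability, not terms prescribed by the embodiment:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ImageRecord:
    """One series of data keyed by an image ID in the terminal storage unit 22."""
    image_id: str
    user_id: str
    image_type: str          # "imaging", "download", or "capture"
    captured_at: datetime    # imaging date and time
    category: Optional[str]  # filled in once the server has categorized the image
    data: bytes              # the image data itself
    exif: dict = field(default_factory=dict)  # optional Exif and other metadata
```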
The image types include, but are not limited to, "imaging", "download", and "capture". "Imaging" indicates image data captured by the imaging unit 25 of the mobile terminal 2. "Download" indicates image data downloaded over the network to which the mobile terminal 2 is connected. "Capture" indicates image data generated by capturing an arbitrary display on the display unit 24 of the mobile terminal 2.
FIG. 4 is a diagram illustrating an example of a hierarchical structure of categories into which image data is categorized.
As shown in FIG. 4, the categories form a single tree structure. "Image" has, for example, the subcategories "person", "meal", "landscape", and "other".
"Person" has, for example, the subcategories "male" and "female", and "male" and "female" each have age subcategories such as "under 10", "teens", "20s", and "30s". "Meal" has, for example, the subcategories "Japanese food", "Western food", and "Chinese food". "Japanese food" has, for example, subcategories such as "sushi", "soba", and "yakitori"; "Western food" has, for example, subcategories such as "pasta", "salad", and "pizza"; and "Chinese food" has, for example, subcategories such as "ramen", "gyoza", and "spring roll". "Landscape" has, for example, the subcategories "mountain", "river", and "sea". "Facility" has, for example, the subcategories "tower", "shrines, temples, and churches", and "theme park".
The above categorization is only an example, and the categories can be divided in any other way. For example, the subcategories of "Western food" may include categories such as "Italian", "French", and "Spanish cuisine". The categories may also form a structure other than a tree.
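As a sketch only (the publication does not prescribe a concrete representation, and the category names are the English renderings used in this description), the tree of FIG. 4 could be held as a nested mapping, which also makes it easy to recover a category's ancestors for hierarchical displays:

```python
# Each key is a category name; each value maps to its subcategories
# (a leaf has an empty dict). "facility" is included per the description above.
CATEGORY_TREE = {
    "image": {
        "person": {
            "male": {"under 10": {}, "teens": {}, "20s": {}, "30s": {}},
            "female": {"under 10": {}, "teens": {}, "20s": {}, "30s": {}},
        },
        "meal": {
            "Japanese food": {"sushi": {}, "soba": {}, "yakitori": {}},
            "Western food": {"pasta": {}, "salad": {}, "pizza": {}},
            "Chinese food": {"ramen": {}, "gyoza": {}, "spring roll": {}},
        },
        "landscape": {"mountain": {}, "river": {}, "sea": {}},
        "facility": {"tower": {}, "shrines and temples": {}, "theme park": {}},
        "other": {},
    }
}

def path_to(tree: dict, target: str, path: tuple = ()) -> tuple:
    """Return the path from the root down to `target`, e.g.
    ('image', 'meal', 'Western food', 'pasta'); () if the category is absent."""
    for name, sub in tree.items():
        if name == target:
            return path + (name,)
        found = path_to(sub, target, path + (name,))
        if found:
            return found
    return ()
```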
FIG. 5 is a diagram illustrating an example of a schematic configuration of the server 3.
The server 3 includes a server communication unit 31, a server storage unit 32, and a server processing unit 33. The server 3 categorizes image data in response to a request from the mobile terminal 2 and outputs the categorized image data to the mobile terminal 2.
The server communication unit 31 includes a communication interface circuit for transmitting and receiving data via the Internet 7, and communicates with the mobile terminal 2.
The server storage unit 32 includes, for example, at least one of a magnetic tape device, a magnetic disk device, and an optical disk device, and stores operating system programs, driver programs, application programs, data, and the like used for processing in the server processing unit 33. As data, the server storage unit 32 stores advertisement data, which includes the advertisement content displayed on the display unit 24 of the mobile terminal 2 and the URL of a web page about the advertised object. The server storage unit 32 further includes a buffer for temporarily storing temporary data related to predetermined processing.
The server processing unit 33 includes one or more processors and their peripheral circuits, centrally controls the overall operation of the server 3, and is configured by, for example, a CPU. The server processing unit 33 controls the operation of the server communication unit 31 and the like in an appropriate procedure based on the programs (operating system programs, driver programs, application programs, and the like) stored in the server storage unit 32, and may execute a plurality of programs (application programs and the like) in parallel.
The server processing unit 33 includes a categorizing unit 331, a user preference analysis unit 332, an advertisement providing unit 333, a related information extraction unit 334, and the like. These are functional modules realized by a program executed on a processor of the server processing unit 33, and they may also be implemented in the server 3 as firmware.
The categorizing unit 331 acquires an image classification request from the mobile terminal 2 via the server communication unit 31 and determines, by a known image recognition processing method, the category to which each image data item belongs, for example one of the lowest-level categories shown in FIG. 4.
As a known image recognition processing method, the technique of categorizing an image using boundary lines described in JP 2007-133746 A can be used. In this method, an image is divided into a plurality of regions, the boundary line of each region is extracted, the figure formed by the extracted boundary lines is compared with figures registered in advance for each category, and the category is determined based on whether the two approximate each other. For example, when the figure formed by the boundary lines extracted from an image approximates a figure registered in advance for the "landscape" category (for example, a triangular figure representing a mountain), the image data is classified into the "landscape" category.
In the above method, the category can also be determined using color. In that case, color information of each part of the image is extracted, the extracted colors are compared with colors registered in advance for each category, and the category is determined based on whether the two approximate each other. For example, when a color extracted from an image approximates a color registered in advance for the "person" category (for example, a yellowish tone representing skin color), the image data is classified into the "person" category.
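As an illustration of this color-based variant only, the comparison could look like the following sketch; the RGB representation, the registered values, and the Euclidean threshold are all assumptions, since the publication does not fix these details:

```python
# Colors registered in advance per category (assumed RGB triples).
REGISTERED_COLORS = {
    "person": (224, 172, 105),   # a skin-like tone
    "landscape": (34, 139, 34),  # a foliage-like green
}

def color_distance(c1, c2) -> float:
    """Euclidean distance between two RGB triples."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def categorize_by_color(dominant_color, threshold: float = 60.0):
    """Return the category whose registered color best approximates the
    extracted dominant color, or None if nothing is close enough."""
    best = min(REGISTERED_COLORS,
               key=lambda cat: color_distance(dominant_color, REGISTERED_COLORS[cat]))
    if color_distance(dominant_color, REGISTERED_COLORS[best]) <= threshold:
        return best
    return None
```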
As another known image recognition processing method, the technique of categorizing images using the face recognition processing described in JP-T-2014-517371 can be used. This method determines the category based on, for example, geometric analysis (which examines prominent features) and photometric analysis (a statistical approach that reduces an image to values with unnecessary elements removed, so as to exclude mismatches, and compares those values against templates). The face recognition processing can determine the sex of a subject and an age bracket such as "under 10", "teens", or "20s".
The user preference analysis unit 332 analyzes the user preference based on the per-category image counts and the per-category advertisement view counts described later. Here, the user preference is a concept representing the user's tastes, likings, interests, and the like with respect to things.
As a method of analyzing the user preference, for example, the per-category image count and the per-category advertisement view count are added together for each category, and the category with the largest sum is taken to be the user preference. For example, in FIG. 6 the "pasta" category has a per-category image count of 28 and a per-category advertisement view count of 10, which sum to 38. If, after performing this operation for all categories, the value 38 for the "pasta" category is the largest, the user preference is regarded as "pasta". This analysis method is an example: the weighting of the per-category image count and the per-category advertisement view count in the user preference analysis may be changed, or another analysis method may be used.
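A minimal sketch of this tally, with equal weights as in the example above (the embodiment notes that the weighting may be varied), might be:

```python
def analyze_preference(images_by_category: dict, ad_views_by_category: dict,
                       image_weight: float = 1.0, ad_weight: float = 1.0) -> str:
    """Return the category with the largest weighted sum of the per-category
    image count and the per-category advertisement view count."""
    categories = set(images_by_category) | set(ad_views_by_category)
    return max(categories,
               key=lambda c: image_weight * images_by_category.get(c, 0)
                             + ad_weight * ad_views_by_category.get(c, 0))

# The example from FIG. 6: 28 "pasta" images plus 10 "pasta" ad views gives 38.
assert analyze_preference({"pasta": 28, "ramen": 12},
                          {"pasta": 10, "ramen": 3}) == "pasta"
```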
The advertisement providing unit 333 is an example of an information providing unit: based on the user preference analyzed by the user preference analysis unit 332, it extracts related advertisement data from the advertisement data stored in the server storage unit 32.
The related information extraction unit 334 extracts related information relevant to the image data based on the acquired GPS information or QR code information. The related information is described later.
FIG. 6 is a diagram illustrating an example of a data structure related to a specific user.
The data structure shown in FIG. 6 represents a series of data linked to a user ID and stored in the server storage unit 32. The series of data includes, for example, a member type, name, sex, date of birth, image information, per-category image counts, per-category advertisement view counts, and the like. This series of data is an example and may include other information about the user.
The member type represents the kind of membership the user holds and includes, for example, "paid", "free", and "premium" (indicating a member who pays an additional fee on top of the regular paid membership fee).
The image information includes the image IDs linked to the user ID and the subject category of each image ID. The server storage unit 32 therefore stores the categories to which the images belong in association with the user identification information.
The per-category image count represents the number of image data items belonging to each category. For example, the entry "pasta (28)" indicates that, of the images the user has accumulated, 28 belong to the "pasta" category. Likewise, the entry "female in her 20s (15)" indicates that 15 of the user's images show a person belonging to the "20s" subcategory of "female".
The per-category advertisement view count indicates, for each category, the number of times the user chose to display the screen to which an advertisement leads. For example, the entry "pasta (10)" indicates that the user selected advertisements belonging to the "pasta" category and viewed the screens they lead to ten times.
FIG. 7 is a diagram illustrating an example of an operation sequence for image classification.
The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33, in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32. On the mobile terminal 2 side, launching a specific application may automatically set up an environment in which the following operations can be executed.
First, the terminal processing unit 27 of the mobile terminal 2 displays a login screen on the display unit 24 (step S100). Next, when the user inputs a user ID and a password on the login screen by operating the operation unit 23 and selects login execution, the terminal processing unit 27 outputs a login request including the user ID and the password to the server 3 via the terminal communication unit 21 (step S101). Upon acquiring the login request via the server communication unit 31, the server processing unit 33 of the server 3 performs user authentication based on the acquired user ID and password (step S102). When the user authentication succeeds, the server processing unit 33 outputs an authentication permission notification to the mobile terminal 2 (step S103).
Upon acquiring the authentication permission notification via the terminal communication unit 21, the terminal processing unit 27 of the mobile terminal 2 displays an image browsing screen on the display unit 24 based on the image data stored in the terminal storage unit 22 (step S104). Next, when the user operates the operation unit 23 and selects image classification on the image browsing screen, the terminal processing unit 27 outputs the image data stored in the terminal storage unit 22 and an image classification request to the server 3 via the terminal communication unit 21 (step S105). The image data output to the server 3 in step S105 may be a single image data item or a plurality of image data items, selected by the user or otherwise, from the set of image data stored in the terminal storage unit 22.
Upon acquiring the image classification request via the server communication unit 31, the categorizing unit 331 processes each acquired image data item using the image recognition processing methods described above and determines the category to which each image data item belongs (step S106). Next, the categorizing unit 331 updates the image information linked to the user and stored in the server storage unit 32, registering a category for each image ID (step S107). The categorizing unit 331 then totals the number of image data items per category and updates the per-category image counts stored in the server storage unit 32 (step S108). Next, the server processing unit 33 outputs the category data for each image data item to the mobile terminal 2 via the server communication unit 31 (step S109).
Upon acquiring the category data for each image data item via the terminal communication unit 21, the terminal processing unit 27 of the mobile terminal 2 registers the category of each image data item stored in the terminal storage unit 22 (step S110). The terminal processing unit 27 then displays on the display unit 24 an image classification screen that presents the image data per category, based on the image data and the categories stored in the terminal storage unit 22 (step S111), and the operation sequence ends.
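The server side of this sequence (steps S106 to S109) might be sketched as follows; `recognize_category` is a hypothetical stand-in for the image recognition processing described earlier, and the record key names are illustrative assumptions:

```python
from collections import Counter

def recognize_category(image_bytes: bytes) -> str:
    """Stand-in for the image recognition processing of step S106; a real
    implementation would apply the boundary-line or color analysis above
    and return one of the lowest-level categories of FIG. 4."""
    return "other"

def handle_classification_request(user_record: dict, images: list) -> dict:
    """Sketch of steps S106-S109: determine a category per image, register
    it per image ID, refresh the per-category image counts, and return the
    category data to be sent back to the terminal."""
    categories = {}
    for image in images:
        category = recognize_category(image["data"])              # S106
        user_record["image_info"][image["image_id"]] = category   # S107
        categories[image["image_id"]] = category
    user_record["images_by_category"] = dict(
        Counter(user_record["image_info"].values()))              # S108
    return categories                                             # S109
```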
FIG. 8A is a diagram illustrating an example of the login screen 1100 described in step S100 of FIG. 7.
A user ID input field 1101, a password input field 1102, and a login decision icon 1103 are arranged in the center of the login screen 1100. When the login decision icon 1103 is selected, the terminal processing unit 27 of the mobile terminal 2 outputs a login request including the entered user ID and password to the server 3.
FIG. 8B is a diagram illustrating an example of the image browsing screen 1200 described in step S104 of FIG. 7.
On the image browsing screen 1200 shown in FIG. 8B, images 1201 are arranged in three rows by three columns. Each image 1201 is an image captured by the user with the imaging unit 25 of the mobile terminal 2, an image downloaded over the network to which the mobile terminal 2 is connected, or an image generated by capturing an arbitrary display on the display unit 24 of the mobile terminal 2. The images 1201 are arranged in the 3-by-3 grid from top to bottom and from left to right, starting with the most recent imaging, download, or capture time. The image browsing screen 1200 is an example, and the display method of the images 1201 is not limited to this one.
FIG. 8C is a diagram illustrating an example of the image classification request selection screen 1300 described in step S105 of FIG. 7.
When the user performs a predetermined operation on the operation unit 23, the image browsing unit 272 of the mobile terminal 2 displays a pop-up 1301 asking "Do you want to organize the images in the camera roll?". When "Yes" 1302 is selected, the terminal processing unit 27 outputs the one or more image data items selected by the user and an image classification request to the server 3 via the terminal communication unit 21. When "No" 1303 is selected, the image classification processing ends.
FIG. 8D is a diagram illustrating an example of the image classification screen 1400 described in step S111 of FIG. 7.
After acquiring the category data from the server 3 via the terminal communication unit 21 and registering the categories, the terminal processing unit 27 displays the image classification screen 1400. On the image classification screen 1400, a category name 1401 and the images 1402 classified into that category are displayed, for example in chronological order, following the tree structure of FIG. 4. As an example, FIG. 8D shows the image classification screen 1400 for the "pasta" category. The category name 1401 shows the parent categories of "pasta", namely "cooking" and "Western food", together with the number of images included in each category. For example, the display "cooking (45) > Western food (25) > pasta (10)" in FIG. 8D indicates that there are 45 "cooking" images, of which 25 are "Western food" images, of which 10 are "pasta" images.
FIG. 9 is a diagram illustrating an example of an operation sequence for providing advertisements.
The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33, in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32.
As one example, the following processing starts when, as described above, the server processing unit 33 of the server 3 outputs the category data for each image data item to the mobile terminal 2 via the server communication unit 31 (step S109). However, the processing is not limited to this and may be started at an arbitrary timing.
The user preference analysis unit 332 of the server 3 analyzes the user preference by the analysis method described above, based on the per-category image counts and the per-category advertisement view counts (step S201).
Next, based on the analyzed user preference, the advertisement providing unit 333 extracts, from the advertisement data stored in the server storage unit 32, the data of advertisements related to that preference (step S202). The advertisement providing unit 333 then outputs the extracted advertisement data to the mobile terminal 2 (step S203). The advertisement providing unit 333 may distribute the advertisement data such that, the larger the number of image data items in a category (the per-category image count), the more frequently advertisements are provided, or the larger the display area of the advertisement or the size of the displayed characters becomes.
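A rough sketch of step S202 and of the emphasis rule just mentioned follows; the linear growth and the cap are assumptions, since the publication only states that a larger per-category image count may mean more frequent or more prominent advertisements:

```python
def select_advertisements(ads: list, preference: str) -> list:
    """Step S202: extract the stored advertisements whose category matches
    the analyzed user preference."""
    return [ad for ad in ads if ad["category"] == preference]

def display_scale(category_image_count: int, base: float = 1.0,
                  step: float = 0.05, cap: float = 2.0) -> float:
    """One possible emphasis rule: the display area (or character size)
    grows with the per-category image count, capped so the layout stays usable."""
    return min(base + step * category_image_count, cap)
```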
When the terminal processing unit 27 of the mobile terminal 2 acquires the advertisement data via the terminal communication unit 21, it displays an advertisement based on the advertisement data on the display unit 24 (step S204). This makes it possible to effectively provide advertisements matching the user's strong preferences. Next, when the user selects an advertisement for viewing, for example by pressing the area in which the advertisement is displayed, the terminal processing unit 27 launches a browser and outputs a browsing request for the page indicated by the URL of the selected advertisement to an external server (not shown) (step S205).
Next, the terminal processing unit 27 outputs identification information of the selected advertisement to the server 3 (step S206). The user preference analysis unit 332 then updates the per-category advertisement view counts stored in the server storage unit 32 based on the advertisement identification information acquired from the mobile terminal 2 via the server communication unit 31 (step S207), and the operation sequence for providing advertisements ends.
FIG. 10A is a diagram illustrating an example of the advertisement described in step S204 of FIG. 9.
The advertisement 2101 is displayed at the center of a standby screen 2100 shown on the display unit 24. In the advertisement 2101, a description of a new menu item offered by a store is displayed as text. The display area of the advertisement 2101 may be enlarged as the number of image data items in the category (the per-category image count) increases. A menu item offered by a store is one example of a product offered by a store, so a description of a new product offered by a store may likewise be displayed as text.
FIG. 10B is a diagram illustrating another example of the advertisement described in step S204 of FIG. 9.
The advertisement 2201 is displayed at the bottom of the image classification screen 2200 shown on the display unit 24. In the advertisement 2201, the content of a new menu item offered by a store is displayed as text. The display area of the advertisement 2201 may be enlarged as the per-category image count increases.
FIG. 10C is a diagram illustrating still another example of the advertisement described in step S204 of FIG. 9.
The advertisement 2301 is displayed alongside images 2302 on the image classification screen 2300 shown on the display unit 24. The advertisement 2301 shows, together with the characters "PR" at the upper left, the name and other details of a new menu item offered by a store. The size of the characters displayed in the advertisement 2301 may be enlarged as the per-category image count increases.
The advertisements 2101, 2201, and 2301 are examples, and the display mode of an advertisement is not limited to the above. An advertisement may be a display including various kinds of content, such as images and videos in addition to text information. Moreover, according to the analyzed user preference, advertisements are not limited to those offered by restaurants and may be offered by any store, industry, organization, and so on. For example, a user with many images of infants might be provided with advertisements for baby goods or children's clothing, while a user with many images of mountains or the sea might be provided with advertisements for outdoor goods. Likewise, a user with many images of women in their twenties might be provided with fashion or health advertisements aimed at women in their twenties, and a user with many images of towers, shrines, temples, churches, or theme parks might be provided with advertisements about sightseeing destinations.
[Second Embodiment]
In the second embodiment, detailed information can be registered for each image data item. Here, the detailed information refers to detailed information that develops the category of the image data more concretely. For example, for the "meal" category, a "store name", a "menu name", and the like can serve as detailed information. The detailed information can be set arbitrarily according to the category of the image data and other factors.
Further, in the second embodiment, related information relevant to specific image data can be extracted as candidates for the detailed information, based on GPS information, QR code information, or the like. Here, the related information is information that is a candidate for the detailed information, namely a set of facility information items (possibly a single item) about facilities having a predetermined relationship with bibliographic information such as the GPS information or QR code information. For example, the set of store names of one or more stores within a certain distance of the location indicated by the GPS information of a specific image data item is related information for that image data. Likewise, the store name associated with a QR code read within a predetermined time before the imaging date and time of a specific image data item is related information for that image data. Here, "store name" and "menu name" are examples of facility information, and the second embodiment describes the case where the facility information is a "store name" and a "menu name". The facility information may instead be other information, for example a "place name", "product name", "sightseeing spot name", or "description of a sightseeing spot". Moreover, although in the second embodiment the related information is extracted using either GPS information or QR code information, it may be extracted using both.
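The GPS-based extraction could be sketched as follows, using the haversine great-circle distance; the radius and the shape of the store records are assumptions:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two latitude/longitude points, in meters."""
    rlat1, rlon1, rlat2, rlon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((rlat2 - rlat1) / 2) ** 2
         + cos(rlat1) * cos(rlat2) * sin((rlon2 - rlon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

def stores_near(image_gps, stores, radius_m: float = 200.0) -> list:
    """Return the names of stores within `radius_m` of the imaging location,
    i.e. the store name related information offered as detail candidates."""
    lat, lon = image_gps
    return [s["name"] for s in stores
            if haversine_m(lat, lon, s["lat"], s["lon"]) <= radius_m]
```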
In the second embodiment, the mobile terminal 2 and the server 3 described in the first embodiment are used as they are.
FIG. 11 is a diagram illustrating an example of a data structure related to a specific image used in the second embodiment.
The data structure shown in FIG. 11 represents a series of data linked to an image ID and stored in the terminal storage unit 22. Compared with the data structure shown in FIG. 3, it further includes GPS information, QR code information, a store name, a menu name, and the like.
When imaging is performed by the imaging unit 25, the terminal processing unit 27 obtains the GPS information acquired by the GPS sensor 26 and stores it in the terminal storage unit 22 in association with the image data.
When a QR code is read by the imaging unit 25, the QR code processing unit 274 stores information related to the QR code (for example, the identification information of the QR code) and its read date and time in the buffer of the terminal storage unit 22. The buffer holds only the information and the read date and time of the most recently read QR code. When image data is then generated by the image processing unit 271, the QR code processing unit 274 stores the buffered QR code information and its read date and time in the terminal storage unit 22 in association with the image data. Consequently, every image data item stored in the terminal storage unit 22 is associated with the information of the latest QR code read before its imaging date and time.
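The single-slot buffer behavior described here might look like the following sketch (the class and method names are illustrative):

```python
from datetime import datetime
from typing import Optional

class QRCodeBuffer:
    """Holds only the most recently read QR code and its read date and time."""

    def __init__(self):
        self._code: Optional[str] = None
        self._read_at: Optional[datetime] = None

    def on_qr_read(self, code_id: str, read_at: datetime) -> None:
        # A newer read simply overwrites the previous entry.
        self._code, self._read_at = code_id, read_at

    def attach_to_image(self, image_record: dict) -> None:
        # Called when image data is generated: link the latest QR code
        # (if any) and its read time to the new image record.
        if self._code is not None:
            image_record["qr_code_id"] = self._code
            image_record["qr_read_at"] = self._read_at
```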
The store name is an example of the detailed information described above, for example the name of the store at which the object of the image data is offered. The menu name is likewise an example of the detailed information, for example the menu name of the object.
FIG. 12 is a diagram illustrating an example of a data structure related to a specific user used in the second embodiment.
The data structure shown in FIG. 12 represents a series of data linked to a user ID and stored in the server storage unit 32. Compared with the data structure shown in FIG. 6, a store ID and a menu ID are further linked to each image ID, and per-price-band image counts are included. The per-price-band image count represents the number of image data items belonging to each price band. For example, the entry "1,000 yen band (25)" indicates that, of the images the user has accumulated, 25 belong to the 1,000 yen price band (1,000 to 1,999 yen).
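The per-price-band tally could be computed as in the following sketch, assuming bands 1,000 yen wide as in the example above:

```python
from collections import Counter

def price_band(price_yen: int, width: int = 1000) -> str:
    """Map a price to its band label, e.g. 1,280 yen -> '1,000-1,999 yen'."""
    lower = (price_yen // width) * width
    return f"{lower:,}-{lower + width - 1:,} yen"

def images_by_price_band(prices: list) -> Counter:
    """Count images per price band from the prices of the pictured menu items."""
    return Counter(price_band(p) for p in prices)

assert images_by_price_band([1280, 1500, 800]) == Counter(
    {"1,000-1,999 yen": 2, "0-999 yen": 1})
```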
 価格帯別画像数は、ユーザ嗜好の解析の基礎としてもよい。価格帯別画像数には、ユーザが例えば飲食店の料理にどの程度の費用を支払うかを表しており、ユーザの趣味、好み、及び興味等をカテゴリ別画像数及びカテゴリ別広告閲覧数とは異なった観点から表しているといえる。当該価格帯別画像数のような、ユーザ嗜好の解析の基礎となるパラメータは、他に任意のものを設定してもよい。後述の店舗情報テーブル及びメニュー情報テーブルや、その他のテーブルに含まれる任意の項目に基づいて、ユーザ嗜好解析の基礎となるパラメータを設定することができる。 The number of images by price range may be the basis for user preference analysis. The number of images by price range represents how much the user pays for, for example, the food of a restaurant, and the user's hobbies, preferences, interests, etc. are the number of images by category and the number of browsing advertisements by category. It can be said that it represents from a different point of view. Other arbitrary parameters may be set as the basis of user preference analysis, such as the number of images by price range. Based on arbitrary items included in a store information table and menu information table, which will be described later, and other tables, parameters serving as a basis for user preference analysis can be set.
 図13Aは、店舗情報テーブルの一例を示す図である。 FIG. 13A is a diagram illustrating an example of a store information table.
 店舗情報テーブルは、サーバ3のサーバ記憶部32に記憶される。店舗情報テーブルには、店舗IDに紐付けられて、店舗名、住所、QRコードID(QRコードの識別情報)、メニュー情報(メニューID)等の情報が格納されている。QRコードは、例えば店舗内に設置されている。QRコードは、店舗固有のものであってもよい。また、QRコードは、系列店舗同士で同一のものであってもよい。更に、QRコードは、ユーザが店舗内等で食事をする際に利用されるテーブルごとに与えられたテーブル識別情報を含んでもよい。 The store information table is stored in the server storage unit 32 of the server 3. In the store information table, information such as store name, address, QR code ID (QR code identification information), menu information (menu ID) and the like is stored in association with the store ID. The QR code is installed in, for example, a store. The QR code may be unique to the store. The QR code may be the same for affiliate stores. Furthermore, the QR code may include table identification information given for each table used when the user has a meal in a store or the like.
FIG. 13B is a diagram illustrating an example of the menu information table.
The menu information table is stored in the server storage unit 32 of the server 3. In the menu information table, information such as the category, menu name, price, and store image data (a menu image provided by the store) is stored in association with each menu ID.
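The two tables can be pictured as keyed records, roughly as follows. The column names and sample values are assumptions for illustration only; a gps field is added to the store records because the extraction flow of FIG. 15, described later, needs store coordinates.

    store_table = {
        "S010": {"name": "Ristorante A", "address": "Tokyo ...",
                 "qr_code_id": "QR-777", "gps": (35.68, 139.76),
                 "menu_ids": ["M100", "M101"]},
    }
    menu_table = {
        "M100": {"category": "pasta", "menu_name": "Carbonara",
                 "price": 1200, "store_image": "m100.jpg"},
        "M101": {"category": "pasta", "menu_name": "Peperoncino",
                 "price": 1000, "store_image": "m101.jpg"},
    }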
FIG. 14 is a diagram illustrating an example of an operation sequence for registering detailed information.
The operation sequence described below is executed mainly by the terminal processing unit 27 and the server processing unit 33, in cooperation with the other elements of the mobile terminal 2 and the server 3, based on programs stored in advance in the terminal storage unit 22 and the server storage unit 32.
First, the terminal processing unit 27 of the mobile terminal 2 displays a login screen on the display unit 24 (step S300). Next, when the user enters a user ID and password on the login screen by operating the operation unit 23 and selects login execution, the terminal processing unit 27 outputs a login request containing the user ID and password to the server 3 via the terminal communication unit 21 (step S301). Upon acquiring the login request via the server communication unit 31, the server processing unit 33 of the server 3 performs user authentication based on the acquired user ID and password (step S302). When the user is authenticated, the server processing unit 33 outputs an authentication permission notice to the mobile terminal 2 (step S303). Note that when the detailed information registration process is performed, for example, as a continuation of the image classification process described with reference to FIG. 7, the above user authentication process (steps S300 to S303) is not performed.
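A server-side check corresponding to steps S302 and S303 might look like the following; the function and the credential store are hypothetical simplifications, and a real deployment would compare hashed credentials rather than plain-text passwords.

    def handle_login(credentials, user_id, password):
        # Steps S302-S303 (illustrative): authenticate, and signal whether
        # an authentication permission notice should be sent to the terminal.
        stored = credentials.get(user_id)
        return stored is not None and stored == password

    credentials = {"U0001": "secret"}
    assert handle_login(credentials, "U0001", "secret")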
Upon acquiring the authentication permission notice via the terminal communication unit 21, the terminal processing unit 27 of the mobile terminal 2 displays an image classification screen on the display unit 24 (step S304). Next, when the user selects specific image data, the terminal processing unit 27 displays a detailed information registration screen for the selected image data on the display unit 24 (step S305). The terminal processing unit 27 then outputs a detailed information registration request, containing the user ID, the selected image data, bibliographic information, and the like, to the server 3 (step S306). The bibliographic information includes GPS information, QR code information, and the like.
Upon acquiring the detailed information registration request via the server communication unit 31, the related information extraction unit 334 of the server 3 refers to the store information table based on the bibliographic information acquired from the mobile terminal 2 and extracts related information concerning store names (store name related information) (step S307). The extraction of store name related information is described later. The server processing unit 33 of the server 3 then outputs the extracted store name related information to the mobile terminal 2 (step S308).
Upon acquiring the store name related information via the terminal communication unit 21, the terminal processing unit 27 of the mobile terminal 2 displays it on the detailed information registration screen (step S309). Next, when the user operates the operation unit 23 to select a specific store name from the store name related information on the detailed information registration screen, the terminal processing unit 27 associates the selection result with the image ID and outputs it to the server 3 via the terminal communication unit 21 (step S310). The terminal processing unit 27 then registers the store name for the image data stored in the terminal storage unit 22 (step S311).
Upon acquiring the store name selection result associated with the image ID via the server communication unit 31, the related information extraction unit 334 refers to the store information table and, based on the acquired selection result, registers the store ID in the server storage unit 32 in association with the image ID (step S312). Next, the related information extraction unit 334 of the server 3 refers to the store information table and the menu information table and extracts related information concerning menu names (menu name related information) (step S313). Specifically, among the menu names of the menu IDs associated with the store ID registered in step S312, the related information extraction unit 334 extracts, as menu name related information, those menu names having the same category as the category registered in step S107. The server processing unit 33 of the server 3 then outputs the menu name related information to the mobile terminal 2 (step S314).
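Step S313 reduces to a filter over the two tables sketched earlier; menu_name_candidates is a hypothetical helper introduced for this example.

    def menu_name_candidates(store_table, menu_table, store_id, image_category):
        # Of the menus offered by the selected store, keep those whose
        # category matches the category registered for the image.
        return [menu_table[m]["menu_name"]
                for m in store_table[store_id]["menu_ids"]
                if menu_table[m]["category"] == image_category]

With the sample tables above, menu_name_candidates(store_table, menu_table, "S010", "pasta") would return both pasta dishes.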
Upon acquiring the menu name related information via the terminal communication unit 21, the terminal processing unit 27 of the mobile terminal 2 displays it on the detailed information registration screen (step S315). Next, when the user operates the operation unit 23 to select a specific menu name from the menu name related information on the detailed information registration screen, the terminal processing unit 27 associates the selection result with the image ID and outputs it to the server 3 via the terminal communication unit 21 (step S316). The terminal processing unit 27 then registers the menu name for the image data stored in the terminal storage unit 22 (step S317), and displays on the display unit 24 a detailed information screen showing the category, store name, menu name, and so on of the image data (step S318).
Upon acquiring the menu name selection result associated with the image ID via the server communication unit 31, the server processing unit 33 refers to the store information table and the menu information table and, based on the acquired selection result, registers the menu ID in the server storage unit 32 in association with the image ID (step S319). Next, the categorizing unit 331 of the server 3 refers to the menu information table, tallies the number of image data items for each price band of the menus whose menu IDs are associated with image IDs, updates the number of images per price band stored in the server storage unit 32 (step S320), and the operation sequence for registering detailed information ends.
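The tally of step S320 comes down to bucketing prices into bands and counting. In this sketch the band width of 1,000 yen matches the "1,000 to 1,999 yen" example above, and both helper names are assumptions.

    def price_band(price):
        # 1,000-yen-wide bands, e.g. 1200 -> "1000~1999".
        low = (price // 1000) * 1000
        return f"{low}~{low + 999}"

    def update_price_band_count(user_record, menu_table, menu_id):
        # Increment the band of the newly linked menu's price.
        band = price_band(menu_table[menu_id]["price"])
        counts = user_record["images_per_price_band"]
        counts[band] = counts.get(band, 0) + 1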
FIG. 15 is a diagram illustrating an example of the flow of the store name related information extraction process.
The flow of the store name related information extraction process in step S307 is described below with reference to FIG. 15.
First, the related information extraction unit 334 of the server 3 acquires, via the server communication unit 31, bibliographic information including GPS information or QR code information (step S400).
Next, the related information extraction unit 334 refers to the bibliographic information and determines whether the imaging date and time falls within a predetermined time after the reading date and time of the QR code information (step S401). For example, in the case of the data shown in FIG. 11, the imaging date and time is 19:20 on January 16, 2015, and the reading date and time is 19:02 on January 16, 2015. Suppose the predetermined time is one hour. The predetermined period after the reading date and time then runs from 19:02 on January 16, 2015 to 20:02 on January 16, 2015, and the imaging date and time (19:20 on January 16, 2015) falls within it. Accordingly, for the data shown in FIG. 11, the related information extraction unit 334 determines that the imaging date and time falls within the predetermined time after the reading date and time. The predetermined time can be set arbitrarily.
When it is determined in step S401 that the imaging date and time falls within the predetermined time after the reading date and time (step S401; Yes), the related information extraction unit 334 refers to the store information table based on the QR code information in the bibliographic information, extracts the store name associated with that QR code ID as related information (step S402), and the store name related information extraction process ends.
On the other hand, when it is determined in step S401 that the imaging date and time does not fall within the predetermined time after the reading date and time (step S401; No), the related information extraction unit 334 refers to the store information table based on the GPS information in the bibliographic information, extracts as related information the store names of stores within a certain distance of the position given by the GPS information (step S403), and the store name related information extraction process ends.
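Putting steps S401 to S403 together, the branch might be sketched as below. The one-hour window and 0.5 km radius are assumed values, biblio is a hypothetical dict carrying the bibliographic information, and a small haversine helper stands in for whatever distance measure is actually used.

    import math
    from datetime import timedelta

    def distance_km(a, b):
        # Great-circle distance between (lat, lon) pairs in degrees.
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2 +
             math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))

    def store_name_candidates(biblio, store_table,
                              window=timedelta(hours=1), radius_km=0.5):
        read_at = biblio.get("qr_read_at")
        # Step S401: was the image captured within the window after the read?
        if read_at is not None and read_at <= biblio["captured_at"] <= read_at + window:
            # Step S402: the store carrying that QR code.
            return [s["name"] for s in store_table.values()
                    if s["qr_code_id"] == biblio["qr_id"]]
        # Step S403: fall back to stores near the GPS fix.
        return [s["name"] for s in store_table.values()
                if distance_km(biblio["gps"], s["gps"]) <= radius_km]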
FIG. 16A shows an example of the detailed information registration screen 3100 described in step S309 of FIG. 14.
As shown in FIG. 16A, the detailed information registration screen 3100 displays a store name field 3101 and a menu name field 3102. Below the menu name field 3102, store name related information 3104 is displayed together with the question 3103 "Is this the store you visited?". A store name can be selected by tapping the area of a specific store name in the store name related information 3104. Below the store name related information 3104, a search section 3105 for searching for an arbitrary store is displayed.
FIG. 16B shows an example of the detailed information registration screen 3200 described in step S315 of FIG. 14.
As shown in FIG. 16B, on the detailed information registration screen 3200, the selected store name 3206 is displayed in the store name field 3201. Below the menu name field 3202, menu name related information 3204 is displayed together with the question 3203 "Is this the menu name?". A menu name can be selected by tapping the area of a specific menu name in the menu name related information 3204. Below the menu name related information 3204, a search section 3205 for searching for an arbitrary menu is displayed.
FIG. 16C shows an example of the detailed information screen 3300 described in step S318 of FIG. 14.
As shown in FIG. 16C, on the detailed information screen 3300, the category 3302 to which the image data belongs is displayed in the category field 3301, the selected store name 3304 is displayed in the store name field 3303, and the selected menu name 3306 is displayed in the menu name field 3305.
FIGS. 17A and 17B are diagrams showing examples of advertisement display.
In the second embodiment, an advertisement related to an image may be displayed in a manner associated with that image. That is, the advertisement providing unit 333 may add related information concerning the image data, as advertiser information, to the advertisement data extracted in step S202 and the like described above. For example, the related information extraction unit 334 first refers to the store information table and extracts the store names and/or menu information of stores within a predetermined range of the imaging location given by the GPS information of the image data. The advertisement providing unit 333 then superimposes the extracted store name and/or menu information on the image data as advertiser information and outputs the composited image data to the mobile terminal 2. Alternatively, the related information extraction unit 334 first refers to the store information table and the menu information table and extracts a store name, and/or menu information of a specific store, based on a QR code read within a predetermined time of the imaging time of the image data. The advertisement providing unit 333 then superimposes the extracted store name and/or menu information on the image data as advertiser information and outputs the composited image data to the mobile terminal 2. In this way, the terminal processing unit 27 of the mobile terminal 2 can display information about a store related to the provided image data as the advertisement. Since menu information at a store is one example of product information at a store, the menu information may be replaced with product information.
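Under those assumptions, selecting the advertiser information to superimpose could reuse the earlier sketches; related_ad_info below is hypothetical and simply pairs each candidate store with its menu names, relying on store_name_candidates() from the FIG. 15 sketch.

    def related_ad_info(biblio, store_table, menu_table):
        # Candidate stores come from the QR/GPS branch of FIG. 15.
        names = set(store_name_candidates(biblio, store_table))
        return [{"store": s["name"],
                 "menus": [menu_table[m]["menu_name"] for m in s["menu_ids"]]}
                for s in store_table.values() if s["name"] in names]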
For example, as shown in FIG. 17A, the terminal processing unit 27 of the mobile terminal 2 displays an advertisement 4101A so as to overlap the upper-left image 4101 on the category list screen 4100. Here, the advertisement 4101A consists of the store name and menu information related to the image data, and its advertiser is the store at which the user captured the image 4101. Similarly, on the image 4102, an advertisement 4102A whose advertiser is the store at which the user captured the image 4102 is displayed.
FIG. 17B shows an example of an advertisement displayed on the detailed information screen 4200. The detailed information screen 4200 is a screen that displays the image data and bibliographic information of an image 4201. Here, the advertiser of the advertisement 4201A is the store "Ristorante □□" at which the user captured the image 4201.
With the advertisement display described above, for example, an advertisement for a store the user has visited in the past can be displayed superimposed on an image captured at that store, which increases the likelihood that the user will visit the store again. Displaying an advertisement related to a specific image in a manner associated with that image thus makes it possible to present the advertisement to the user more effectively.
When superimposing the store name and/or menu information on the image data as advertiser information, the advertisement providing unit 333 may render the store name and/or menu information semi-transparently. This allows the user to see the portion of the image data that the store name and/or menu information overlaps.
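A minimal compositing sketch using the Pillow library, assuming Pillow is available; the band height, the alpha value of 128, and the text position are arbitrary choices made for the example, not values from the embodiment.

    from PIL import Image, ImageDraw

    def superimpose_ad(photo_path, text, alpha=128):
        # Draw the advertiser text on a semi-transparent band so the
        # underlying photo remains visible through the overlay.
        base = Image.open(photo_path).convert("RGBA")
        layer = Image.new("RGBA", base.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(layer)
        band_top = base.height - base.height // 8
        draw.rectangle([0, band_top, base.width, base.height],
                       fill=(0, 0, 0, alpha))
        draw.text((10, band_top + 5), text, fill=(255, 255, 255, 255))
        return Image.alpha_composite(base, layer)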
Information may also be provided to the user based on the price bands associated with the user ID. For example, the user preference analysis unit 332 refers to the server storage unit 32 and identifies the price band for which the number of images associated with the user ID is the largest, or is at or above a predetermined number. The advertisement providing unit 333 may then refer to the store information table and the menu information table, identify stores in the identified price band, and output advertisements for those stores to the mobile terminal 2. This makes it possible to effectively provide advertisements matching a strong preference of the user. Here, the stores in the identified price band may be stores related to the user preference analyzed by the user preference analysis unit 332 in the first embodiment.
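One way to act on the dominant price band, reusing price_band() from the step S320 sketch; stores_in_dominant_band is a hypothetical helper and the matching rule (any menu in the band) is an assumption.

    def stores_in_dominant_band(user_record, store_table, menu_table):
        counts = user_record["images_per_price_band"]
        top_band = max(counts, key=counts.get)  # band with the most images
        # Stores offering at least one menu priced in that band.
        return [s["name"] for s in store_table.values()
                if any(price_band(menu_table[m]["price"]) == top_band
                       for m in s["menu_ids"])]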
[Other Embodiments]
In the first and second embodiments, the server processing unit 33 of the server 3 classifies the image data acquired from the mobile terminal 2. Alternatively, the terminal processing unit 27 of the mobile terminal 2 may classify the image data and output category-related information, such as the number of images per category, to the server 3.
In the first and second embodiments, the advertisement providing unit 333, as an example of the information providing unit, outputs advertisements to the mobile terminal 2 based on the analyzed user preferences. However, instead of the advertisement providing unit 333, an information providing unit may provide information, based on the analyzed user preferences, to a contact address associated with the user identification information. The contact address associated with the user identification information may be, for example, an e-mail address, a mailing list, or an e-mail magazine.
In the second embodiment, the store names of stores within a predetermined range of the imaging location given by the GPS information of the image data are extracted, superimposed on the image data, and the composited image data is output to the mobile terminal 2. However, since a store name is one example of facility information, the store name may be replaced with other facility information, for example the name of a tourist site and/or a description of the tourist site. Specifically, when a user photographs an animal at a zoo, the facility information may be the name of the tourist site (the zoo's name) and/or a description of the tourist site (the kind of animal).
Likewise, in the second embodiment, a store name based on a QR code read within a predetermined time of the imaging time of the image data is extracted, superimposed on the image data, and the composited image data is output to the mobile terminal 2. Here too, the store name may be replaced with other facility information, for example the name of a tourist site and/or a description of the tourist site, such as the zoo's name and/or the kind of animal when a user photographs an animal at a zoo. In this case, the QR code is installed, for example, within or around the tourist site. Alternatively, the QR code may be printed on a paper tourist map and read by the QR code processing unit 274 within or around the tourist site. The QR code may also be unique to the tourist site.
A computer program for causing a computer to realize the functions of the terminal processing unit 27 and the server processing unit 33 may be provided in a form recorded on a computer-readable recording medium, such as a magnetic recording medium or an optical recording medium.
It should be understood by those skilled in the art that various changes, substitutions, and modifications can be made without departing from the spirit and scope of the present invention, and that the embodiments may be combined as appropriate.
DESCRIPTION OF SYMBOLS
1  Information delivery system
2  Mobile terminal
21  Terminal communication unit
22  Terminal storage unit
23  Operation unit
24  Display unit
25  Imaging unit
26  GPS sensor
27  Terminal processing unit
271  Image processing unit
272  Image browsing unit
273  Advertisement display unit
274  QR code processing unit
3  Server
31  Server communication unit
32  Server storage unit
33  Server processing unit
331  Categorizing unit
332  User preference analysis unit
333  Advertisement providing unit
334  Related information extraction unit
4  Base station
5  Mobile communication network
6  Gateway
7  Internet

Claims (10)

  1.  An information delivery device comprising:
     a storage unit that stores a category to which image data belongs in association with user identification information;
     a categorizing unit that acquires the image data and the user identification information from a user terminal, determines the category by performing image recognition processing on the acquired image data, and registers the determined category in the storage unit in association with the user identification information;
     an analysis unit that analyzes user preferences based on the category associated with the user identification information; and
     an information providing unit that provides information to the user based on the analyzed user preferences.
  2.  The information delivery device according to claim 1, wherein the categories form a tree structure.
  3.  The information delivery device according to claim 1 or 2, further comprising a related information extraction unit that extracts related information concerning the image data based on position information of the image data,
     wherein the information providing unit provides the information to the user based on the extracted related information.
  4.  The information delivery device according to claim 3, wherein the related information is facility information concerning a facility within a predetermined range of an imaging location based on the position information, and
     the information providing unit composites the extracted related information with the acquired image data and provides the composited image data to the user.
  5.  The information delivery device according to claim 1 or 2, further comprising a related information extraction unit that extracts related information concerning the image data based on code information read before the imaging of the image data,
     wherein the information providing unit provides the information to the user based on the extracted related information.
  6.  The information delivery device according to claim 5, wherein the related information is facility information concerning a facility based on the code information read within a predetermined time of the imaging time of the image data, and
     the information providing unit composites the extracted related information with the acquired image data and provides the composited image data to the user.
  7.  The information delivery device according to any one of claims 1 to 6, wherein the information providing unit increases the frequency of information provision, or enlarges the display of the information, as the number of image data items belonging to the category increases.
  8.  The information delivery device according to any one of claims 1 to 7, wherein the analysis unit acquires the number of advertisements the user has viewed and analyzes the user preferences based on the acquired number of advertisement views and the category.
  9.  The information delivery device according to any one of claims 1 to 8, wherein the categorizing unit determines the gender and age category of a person shown in the image data by performing face recognition processing on the acquired image data, and
     the information providing unit provides the information to the user based on the determined gender and age category of the person shown in the image data.
  10.  An information delivery program for a computer having a storage unit that stores a category to which image data belongs in association with user identification information, the program causing the computer to:
     acquire the image data and the user identification information from a user terminal;
     determine the category to which the image data belongs by performing image recognition processing on the acquired image data;
     register the determined category in the storage unit in association with the user identification information;
     analyze user preferences based on the category associated with the user identification information; and
     provide information to the user based on the analyzed user preferences.