CN104504155B - Data acquisition method and system based on image search - Google Patents

Data acquisition method and system based on image search

Info

Publication number
CN104504155B
CN104504155B CN201510019426.6A
Authority
CN
China
Prior art keywords
target object
information
data
image
decoder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510019426.6A
Other languages
Chinese (zh)
Other versions
CN104504155A (en)
Inventor
Liu Chang (刘畅)
Original Assignee
Liu Chang International Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liu Chang International Co Ltd filed Critical Liu Chang International Co Ltd
Priority to CN201510019426.6A priority Critical patent/CN104504155B/en
Publication of CN104504155A publication Critical patent/CN104504155A/en
Application granted granted Critical
Publication of CN104504155B publication Critical patent/CN104504155B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention relates to a data acquisition system and method based on image search. The system comprises: a terminal device, which captures a target object to create a target object image, performs image recognition analysis on the target object image to classify the target object, decodes the target object image by automatically invoking the decoder corresponding to the classified type of the target object to obtain target object information, and sends the target object information to a search server; the search server, which receives the target object information, performs a search according to the target object information to obtain identification information related to the target object information, and sends the identification information to a content server; and the content server, which receives the identification information, determines presentation data related to the target object according to the identification information, and sends the presentation data to the terminal device, wherein the presentation data is a list of item information and acquisition addresses.

Description

Data acquisition method and system based on image search
Technical field
The present invention relates to the field of data acquisition, and more particularly to a data acquisition method and system based on image search.
Background art
Image recognition technology refers to processing, analyzing and understanding images with a computer in order to identify targets and objects of various patterns. In typical industrial use, an industrial camera takes pictures, and software then performs further recognition processing according to the gray scale of the pictures. In geography, image recognition technology also refers to techniques for classifying remote sensing images.
However, with the rapid development of mobile terminals and Internet applications, there is a growing demand for image recognition of various objects in daily life. When a user finds an object he or she likes (for example, a person or a thing) while watching a film or an advertisement on a television set or while traveling, or when the user finds a predetermined object of interest, the user wishes to remember the object, and hopes to obtain data related to the object by making a memo or by capturing the object with the camera of a mobile terminal.
When a user finds a building, a scenic spot or an article but does not know the name of the building, the location of the scenic spot or the details of the article, the user usually takes a photo and posts it on the Internet to obtain related information. Alternatively, the user has to manually input the information appearing in the image in order to search for the desired object with the captured picture. For example, the user has to input the name or product identification, shape, color and so on of the product appearing in the picture.
However, on the one hand, the time needed to obtain an answer in this way cannot be determined; on the other hand, the user is required to input a large amount of descriptive information. In general, both aspects lead to a poor user experience.
Summary of the invention
To solve the above problems, the present invention provides a method and system for performing image recognition using a mobile terminal and a wireless network, thereby meeting the user's need for image recognition.
According to one aspect of the present invention, a data acquisition system based on image search is provided, the system comprising:
a terminal device, which captures a target object to create a target object image, performs image recognition analysis on the target object image to classify the target object, decodes the target object image by automatically invoking the decoder corresponding to the classified type of the target object to obtain target object information, and sends the target object information to a search server;
the search server, which receives the target object information, performs a search according to the target object information to obtain identification information related to the target object information, and sends the identification information to a content server; and
the content server, which receives the identification information, determines presentation data related to the target object according to the identification information, and sends the presentation data to the terminal device, wherein the presentation data is a list of item information and acquisition addresses.
According to another aspect of the present invention, a data acquisition method based on image search is provided, the method comprising:
capturing, by a terminal device, a target object to create a target object image, performing image recognition analysis on the target object image to classify the target object, decoding the target object image by automatically invoking the decoder corresponding to the classified type of the target object to obtain target object information, and sending the target object information to a search server;
receiving, by the search server, the target object information, performing a search according to the target object information to obtain identification information related to the target object information, and sending the identification information to a content server; and
receiving, by the content server, the identification information, determining presentation data related to the target object according to the identification information, and sending the presentation data to the terminal device, wherein the presentation data is a list of item information and acquisition addresses.
Preferably, after the mobile terminal receives the presentation data, it performs augmented reality processing on the presentation data using additional data related to augmented reality processing that is pre-stored in the terminal device, so as to obtain presentation data subjected to augmented reality processing, and displays the augmented-reality-processed presentation data.
Preferably, performing image recognition analysis on the target object image to classify the target object comprises: extracting feature points from the target object image based on image recognition technology, and classifying the type of the target object based on the extracted feature points.
Preferably, the terminal device comprises a plurality of decoders capable of decoding according to the type of the object.
Preferably, the plurality of decoders comprises all of the following decoders: a bar code decoder, a quick response (QR) code decoder, a near-field communication (NFC) decoder, a radio frequency identification (RFID) decoder, an optical character recognition (OCR) decoder, a face recognition decoder and an image recognition decoder.
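The type-driven decoder dispatch claimed above can be sketched as a small registry. This is not the patent's implementation; the function names and the `DECODERS` table are illustrative assumptions standing in for real bar code, QR, OCR and other decoding engines:

```python
# Hypothetical sketch of type-driven decoder dispatch: each classified
# object type maps to the decoder able to handle it, and the terminal
# "automatically invokes" the matching one.
from typing import Callable, Dict

# Illustrative decoder stubs; a real terminal would wrap actual bar code,
# QR, NFC, RFID, OCR, face recognition and image recognition engines.
def decode_barcode(image: bytes) -> str:
    return "barcode:" + image.decode(errors="ignore")

def decode_qr(image: bytes) -> str:
    return "qr:" + image.decode(errors="ignore")

def decode_ocr(image: bytes) -> str:
    return "text:" + image.decode(errors="ignore")

DECODERS: Dict[str, Callable[[bytes], str]] = {
    "barcode": decode_barcode,
    "qr_code": decode_qr,
    "text": decode_ocr,
}

def decode_target_object(object_type: str, image: bytes) -> str:
    """Automatically invoke the decoder matching the classified type."""
    try:
        decoder = DECODERS[object_type]
    except KeyError:
        raise ValueError(f"no decoder registered for type {object_type!r}")
    return decoder(image)
```

A registry like this keeps the classification step decoupled from the individual decoders, so new object types only require adding an entry.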
Brief description of the drawings
These and other features and advantages of the present invention may be better understood by reading the following detailed description of the preferred exemplary embodiments in conjunction with the accompanying drawings, in which:
Fig. 1 shows a structural diagram of a mobile terminal in the prior art;
Fig. 2 shows a structural diagram of a mobile terminal according to a preferred embodiment of the present invention;
Fig. 3 shows a structural diagram of a search server according to a preferred embodiment of the present invention;
Fig. 4 shows a structural diagram of a content server according to a preferred embodiment of the present invention;
Fig. 5 shows a structural diagram of a data acquisition system according to a preferred embodiment of the present invention; and
Fig. 6 shows a flowchart of a data acquisition method according to a preferred embodiment of the present invention.
It should be noted that these drawings are intended to describe the general characteristics of the methods, structures and/or materials used in certain exemplary embodiments, and to supplement the description provided below. However, these drawings are not to scale, do not accurately reflect the precise structure or performance characteristics of any given embodiment, and should not be construed as defining or limiting the numerical ranges or properties encompassed by the exemplary embodiments. The use of similar or identical reference numerals in the various figures is intended to indicate the presence of similar or identical elements or features.
Detailed description
Although the exemplary embodiments are susceptible to various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the exemplary embodiments are not to be limited to the particular forms disclosed; on the contrary, the exemplary embodiments are intended to cover all modifications, equivalents and alternatives falling within their scope. Throughout the description of the drawings, identical reference numerals denote identical elements.
Fig. 1 shows a structural diagram of a mobile terminal in the prior art. As shown in Fig. 1, the mobile terminal 100 comprises a shooting unit 101, a processor 102, a storage unit 103 and an interface unit 104. When an image needs to be recognized, the shooting unit 101 is usually used to take a photo, which is stored in the storage unit 103. The photo is then published to the Internet through the interface unit 104 in order to seek related information. In general, the user has to manually input the information appearing in the image in order to search for the desired object with the captured picture. For example, the user has to input the name or product identification, shape, color and so on of the product appearing in the picture.
Fig. 2 shows a structural diagram of a mobile terminal according to a preferred embodiment of the present invention. As shown in Fig. 2, the mobile terminal 200 comprises: a shooting unit 201, a classification unit 202, a recognition unit 203, an input unit 204, a processing unit 205, an interface unit 206, a storage unit 207 and a display unit 208. The mobile terminal 200 can obtain an image of a target object that is desired to be searched and, after performing preliminary processing on the image, send it to the search server via a communication network to obtain a query response.
As long as the mobile terminal 200 is a terminal capable of sending and receiving data via a communication network and obtaining an image of a target object, the mobile terminal 200 may be any type of device capable of running and storing various applications, such as a personal digital assistant (PDA), a smart phone, a tablet computer, a wireless telephone, a mobile computing device, a camera, a video recorder, an audio/video player, a positioning device (for example, a global positioning system (GPS) device), a game device, a wireless device, or various other similar devices or combinations thereof.
Preferably, the shooting unit 201 is used to obtain the image of the target object. For example, when a user finds a commodity, the user can scan the bar code or QR code of the commodity or take a picture of it, so that the address at which the commodity can be obtained, information related to the commodity, or similar items can be found with the mobile phone. When the user finds a building or a scenic spot but does not know the name of the building or the location of the scenic spot, the user can take a photo of the building or scenery with the shooting unit 201. The user can also use the shooting unit 201 to shoot various planar materials, such as a picture poster of a building or scenery. Preferably, the shooting unit 201 may be any type of image acquisition device.
Preferably, the classification unit 202 is used to classify the image of the target object obtained by the shooting unit 201. For example, the image of the target object can be classified as a person, landscape, building, text, picture poster, bar code or QR code. The classification unit 202 classifies the type of the target object by analyzing the data type, color, predetermined pattern and the like of the obtained image, and a decoder can be selected based on the classified type so that integrated recognition can be performed for a variety of objects. The classification unit 202 applies a feature-point-based image recognition method to the image of the target object captured by the shooting unit 201, analyzes the data type, color and characteristic pattern (shape, pattern and position values of the feature points) of the target object, and classifies the type of the object through this analysis. The target object includes image feature points based on the object type. For example, when the target object is a bar code, the target object includes unique feature point information by which the bar code format can be identified. When the object is a QR code, the object includes unique feature point information by which the QR code format can be identified. As described above, when feature points are extracted from the image of the object using image recognition technology, the type of each object can be verified based on the feature point information.
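The feature-point-based classification just described can be illustrated with a minimal sketch. The reference signatures below (for example `finder_square` for a QR code's finder pattern) are invented stand-ins for real extracted feature descriptors, not part of the patent:

```python
# Illustrative sketch of feature-point-based type classification: each
# known object type has a reference feature signature, and an image's
# extracted feature points are scored against every signature.
from typing import Dict, Set

TYPE_SIGNATURES: Dict[str, Set[str]] = {
    "barcode":  {"parallel_lines", "quiet_zone", "high_contrast"},
    "qr_code":  {"finder_square", "timing_pattern", "high_contrast"},
    "building": {"straight_edges", "windows_grid", "vanishing_point"},
}

def classify(feature_points: Set[str]) -> str:
    """Return the type whose signature overlaps most with the points."""
    def overlap(type_name: str) -> int:
        return len(TYPE_SIGNATURES[type_name] & feature_points)
    best = max(TYPE_SIGNATURES, key=overlap)
    if overlap(best) == 0:
        return "unknown"
    return best
```

Real systems would use numeric descriptors and a learned classifier, but the structure (signatures per type, best match wins) mirrors the verification step described above.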
Preferably, the recognition unit 203 can read the decoder and the feature point information needed to identify the target object from the storage unit 207. The feature point information and the decoders are classified according to the type of the target object. The feature point information is supplied to the classification unit 202 and is thereby applied to identifying the type of the object. The recognition unit 203 selects a decoder based on the type of the target object, so that the image of the target object can be decoded and recognized.
Preferably, the input unit 204 is used to input additional data. When the target object is a commodity, the additional data may include details about the item, shopping purchase information (pricing information, selling location information and so on), price comparison information, and the like. When the target object is a person, the additional data may include details related to that person (for example, the clothes worn by the person, accessory information, birthday, age, experience and so on). When the target object is a building, the additional data may be the year the building was completed, related tasks, related historical events, architectural features, and so on. When the target object is a landscape, the additional data may be the scenic spot name, scenic spot grade, introductions of the main attractions, parking lot information, restaurant information, and so on. Preferably, the additional data may also be the user's photos, pictures the user likes, audio recorded by the user, visual content designed by the user, and the like. Preferably, the input unit 204 can be used to select presentation data from the presentation data list according to user input. The presentation data is a list of item information and acquisition addresses. For example, the presentation data may be: (item information A, acquisition address A), (item information B, acquisition address B), and so on. When the user wishes to obtain item information A, the user selects acquisition address A through the input unit 204, so as to obtain the data content corresponding to item information A.
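The presentation data list described above pairs each item's information with an acquisition address. A minimal sketch of that structure and of the input unit's selection step, under the assumption that item information is a plain string key:

```python
# Sketch of the presentation data list: (item information, acquisition
# address) pairs, with selection by item as the input unit performs it.
from typing import List, NamedTuple, Optional

class PresentationEntry(NamedTuple):
    item_info: str
    acquisition_address: str

def select_address(data: List[PresentationEntry],
                   item_info: str) -> Optional[str]:
    """Return the acquisition address for the chosen item, or None
    when no entry in the presentation data matches."""
    for entry in data:
        if entry.item_info == item_info:
            return entry.acquisition_address
    return None
```

For example, selecting "item information A" from the list returns acquisition address A, from which the terminal then fetches the corresponding data content.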
Preferably, the processing unit 205 controls the overall operation of each unit, and controls the sending and receiving of data using the interface unit 206. The processing unit 205 controls the image of the target object captured by the shooting unit 201 to be stored in the storage unit 207.
Preferably, the interface unit 206 is used for data communication between the mobile terminal 200 and other devices. The interface unit 206 can receive or send data according to various protocols. The interface unit 206 can use the wired Internet, and can communicate through a wireless data network (the Internet, the IP multimedia subsystem (IMS), etc.) connected by a mobile communication network (code division multiple access (CDMA), wideband code division multiple access (W-CDMA), etc.), a satellite communication network, the Internet connected by Wireless Fidelity (Wi-Fi), and so on. In particular, in the case of a CDMA network, the interface unit 206 can access the data network through a packet data serving node (PDSN). In the case of a W-CDMA network, the interface unit 206 can access the data network through a gateway GPRS support node (GGSN). In addition, in an area such as a hotspot, the interface unit 206 can access the Internet through short-range communication such as Wi-Fi.
Preferably, the storage unit 207 can store the image of the target object captured by the shooting unit 201, and can store the target object information identified by the classification unit 202 and the recognition unit 203. The storage unit 207 can also store additional data. The additional data may be input by the user in advance or input after the target object information is obtained.
Preferably, the display unit 208 is used to display information to the user. The display unit 208 may be any of various types of liquid crystal screens or other screens. The display unit 208 can be used to display the presentation data, wherein the presentation data may be a list of item information and acquisition addresses. For example, the presentation data may be: (item information A, acquisition address A), (item information B, acquisition address B), and so on. Preferably, the presentation data displayed by the display unit 208 may be presentation data that has not undergone augmented reality processing (referred to simply as presentation data) or presentation data that has undergone augmented reality processing. According to a preferred embodiment of the present invention, the presentation data may be subjected to augmented reality processing on the server side to obtain augmented-reality-processed presentation data, which is then sent to the mobile terminal; in this case, the server side stores in advance the additional data related to augmented reality processing sent by the user terminal. Alternatively, after the mobile terminal receives presentation data that has not undergone augmented reality processing, it performs augmented reality processing on the presentation data using the additional data related to augmented reality processing pre-stored in the terminal device, so as to obtain the augmented-reality-processed presentation data.
Fig. 3 shows a structural diagram of a search server according to a preferred embodiment of the present invention. As shown in Fig. 3, the search server 300 comprises: a query unit 301, an interface unit 302, a processing unit 303 and a storage unit 304. The search server 300 receives a query request containing the target object information from the mobile terminal via the communication network. In response to the query request, the search server 300 performs a data query in the storage unit 304 and sends the query result to the content server, wherein the query result is the identification information of the target object.
Preferably, the query unit 301 performs a query in the storage unit 304 according to the query request processed by the processing unit 303. The query unit 301 analyzes the target object information in the query request to obtain the classification of the target object, the image of the target object, and other content. First, the query unit 301 determines, according to the classification of the target object, which database in the storage unit 304 the content to be queried relates to. Then, by means of image recognition technology, the image of the target object is compared with the content of the relevant database in the storage unit 304 in order to find a match. When a match is found, the query unit 301 obtains the query data, wherein the query data may be the identification information of the target object. Preferably, the identification information may be classification information, directory information, name information, string data or other information identifying the target object.
Preferably, the interface unit 302 is used for data communication between the search server 300 and the mobile terminal or other devices. The interface unit 302 can receive or send data according to various protocols. The interface unit 302 can use the wired Internet, and can communicate through a wireless data network (the Internet, the IP multimedia subsystem (IMS), etc.) connected by a mobile communication network (code division multiple access (CDMA), wideband code division multiple access (W-CDMA), etc.), a satellite communication network, the Internet connected by Wireless Fidelity (Wi-Fi), and so on. In particular, in the case of a CDMA network, the interface unit 302 can access the data network through a packet data serving node (PDSN). In the case of a W-CDMA network, the interface unit 302 can access the data network through a gateway GPRS support node (GGSN). In addition, in an area such as a hotspot, the interface unit 302 can access the Internet through short-range communication such as Wi-Fi.
Preferably, the processing unit 303 controls the overall operation of each unit, and controls the sending and receiving of data through the interface unit 302.
Preferably, the storage unit 304 can store a plurality of libraries, wherein each library corresponds to a respective classification. The query unit 301 searches in the plurality of libraries in the storage unit 304 according to the target object information. The storage unit 304 may include various information libraries such as a person library, a landscape library, a building library, a text library, a picture poster library, a bar code library or a QR code library.
Fig. 4 shows a structural diagram of a content server according to a preferred embodiment of the present invention. As shown in Fig. 4, the content server 400 comprises: a query unit 401, an integrated unit 402, an interface unit 403, a processing unit 404 and a storage unit 405. The content server 400 receives the identification information identifying the target object from the search server via the communication network. The content server 400 performs a data query in the storage unit 405 to obtain the presentation data, and sends the presentation data to the mobile terminal.
Preferably, the content server 400 can perform augmented reality processing on the presentation data to obtain augmented-reality-processed presentation data. The content server 400 can then send the augmented-reality-processed data to the mobile terminal in a response. The user terminal can send the additional data related to augmented reality processing to the content server 400 before data acquisition is performed, so that it is stored in the content server 400.
Alternatively, after the mobile terminal receives the presentation data, it performs augmented reality processing on the presentation data using the additional data related to augmented reality processing pre-stored in the terminal device, so as to obtain the augmented-reality-processed presentation data.
Preferably, the query unit 401 performs a query in the storage unit 405 according to the identification information processed by the processing unit 404. The query unit 401 analyzes the identification information to obtain the classification of the target object, the image of the target object, and other content. Preferably, the identification information may be classification information, directory information, name information, string data, image information or other information identifying the target object. The query unit 401 compares the identification information of the target object with the content of the relevant database in the storage unit 405 in order to find a match, thereby determining the presentation data. Preferably, the presentation data may be a list of item information and acquisition addresses. For example, the presentation data may be: (item information A, acquisition address A), (item information B, acquisition address B), and so on. The item information may be, for example, the Summer Palace in Beijing, and the acquisition address may be the address of the Summer Palace's home page or a network address from which information related to the Summer Palace can be obtained. Preferably, the query unit 401 can send the presentation data to the integrated unit 402.
Preferably, the integrated unit 402 receives the presentation data sent by the query unit 401. The integrated unit 402 can obtain additional data from the storage unit 405, and performs augmented reality processing on the presentation data according to the additional data, so as to obtain augmented-reality-processed data. For example, when the target object is a film and the additional data is movie reviews, leading actor information, release time, cinema information and so on, the integrated unit 402 can fuse the movie reviews, leading actor information, release time, cinema information and other information into the film introduction video, to obtain an augmented-reality-processed film introduction video, which allows the user to obtain the content he or she wants to know while watching the highlights. For example, when the target object is a commodity and the additional data includes details about the commodity, shopping purchase information (pricing information, selling location information, etc.) and price comparison information, the integrated unit 402 can fuse the details about the commodity, the shopping purchase information (pricing information, selling location information, etc.), the price comparison information and so on into the commodity introduction short video, to obtain an augmented-reality-processed commodity introduction short video, which allows the user to learn about the commodity's other attributes while viewing its appearance. For example, when the target object is a person and the additional data includes details related to that person (for example, the clothes worn by the person, accessory information, birthday, age, experience and so on), the integrated unit 402 can fuse those details into the personal profile video, to obtain an augmented-reality-processed personal profile video, which allows the viewer to see the person's related information at all times, without omitting it because some part of the video was missed. For example, when the target object is a building and the additional data is the year the building was completed, related tasks, related historical events, architectural features and so on, the integrated unit 402 can fuse that information into the video introducing the exterior and interior of the building, to obtain an augmented-reality-processed building introduction video, which allows the user to fully understand the building and gain a direct feel for its internal structure. For example, when the target object is a landscape and the additional data is the scenic spot name, scenic spot grade, introductions of the main attractions, parking lot information, restaurant information and so on, the integrated unit 402 can fuse that information into the scenic spot introduction short video, to obtain an augmented-reality-processed scenic spot introduction short video, through which the user can learn the scenic spot name, scenic spot grade, main attraction introductions, parking lot information, restaurant information and so on while browsing the beautiful scenery of the scenic spot. Preferably, the additional data may also be the user's photos, pictures the user likes, audio recorded by the user, visual content designed by the user and the like, which the integrated unit 402 can fuse into the various data described above.
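The fusion step performed by the integrated unit can be sketched very simply: additional data fields become overlay captions attached to the presentation data. This stands in for the actual audio/video compositing; the field names are illustrative assumptions:

```python
# Simplified sketch of the integrated unit's fusion step: additional
# data for the target object is attached to the presentation data as
# caption overlays, standing in for full augmented reality compositing.
from typing import Dict, List

def fuse(presentation: Dict[str, str],
         additional: Dict[str, str]) -> Dict[str, object]:
    """Return the presentation data augmented with one caption overlay
    per additional-data field."""
    overlays: List[str] = [f"{key}: {value}"
                           for key, value in additional.items()]
    fused: Dict[str, object] = dict(presentation)
    fused["overlays"] = overlays
    return fused
```

For a scenic spot, for instance, the grade and parking information would appear as overlays on the introduction short video while the original presentation fields are preserved.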
Preferably, the user's photos, pictures the user likes, audio recorded by the user, visual content designed by the user and the like can be pre-stored in the storage unit 405 of the content server.
Preferably, the interface unit 403 is used for data communication between the content server 400 and the mobile terminal or other devices. The interface unit 403 can receive or send data according to various protocols. The interface unit 403 can use the wired Internet, and can communicate through a wireless data network (the Internet, the IP multimedia subsystem (IMS), etc.) connected by a mobile communication network (code division multiple access (CDMA), wideband code division multiple access (W-CDMA), etc.), a satellite communication network, the Internet connected by Wireless Fidelity (Wi-Fi), and so on. In particular, in the case of a CDMA network, the interface unit 403 can access the data network through a packet data serving node (PDSN). In the case of a W-CDMA network, the interface unit 403 can access the data network through a gateway GPRS support node (GGSN). In addition, in an area such as a hotspot, the interface unit 403 can access the Internet through short-range communication such as Wi-Fi.
Preferably, processing unit 404 controls the overall operation of each unit and controls the sending and receiving of data through interface unit 403.
Preferably, storage unit 405 can store multiple libraries, each corresponding to a respective category. Storage unit 405 also stores the user's additional data, such as the user's photos, pictures the user likes, audio the user has recorded, and visual content the user has designed.
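The per-category library layout of storage unit 405 can be sketched as below. The patent only states that each library corresponds to a category; the dict-of-dicts layout and the class and method names are assumptions made for illustration.

```python
# Hypothetical sketch of storage unit 405: one library per category, plus a
# separate area for the user's own additional data (photos, audio, designs...).

class StorageUnit:
    def __init__(self, categories):
        self.libraries = {c: {} for c in categories}  # one library per category
        self.user_additional_data = {}                # user's pre-stored AR data

    def store(self, category, key, record):
        self.libraries[category][key] = record

    def lookup(self, category, key):
        return self.libraries[category].get(key)      # None if not present

store = StorageUnit(["commodity", "person", "building", "scenery"])
store.store("scenery", "spot-001", {"name": "West Lake", "grade": "AAAAA"})
assert store.lookup("scenery", "spot-001")["name"] == "West Lake"
assert store.lookup("building", "spot-001") is None
```

Keying the lookup by category first mirrors the classification step on the terminal: the classified object type selects the library before the identification information selects the record.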
Fig. 5 shows the structure of a data acquisition system 500 according to a preferred embodiment of the present invention. As shown in Fig. 5, data acquisition system 500 includes multiple mobile terminals 501-1, 501-2, ..., 501-N, a communication network 502, a search server 503, and a content server 504.
Mobile terminals 501-1, 501-2, ..., 501-N can obtain an image of the target object to be searched, preprocess the image, and send it via the communication network to search server 503 to obtain a query response. Search server 503 receives, via communication network 502, a query request containing the target object information from a mobile terminal. In response to the query request, search server 503 performs a data query and sends the query result, namely the identification information of the target object, to content server 504. Content server 504 receives the identification information identifying the target object from the search server via the communication network, performs a data query in its storage unit to obtain presentation data, and sends the presentation data to the mobile terminal.
Preferably, communication network 502 can be the wired Internet; a wireless data network (the Internet, an IP Multimedia Subsystem (IMS), etc.) connected through a mobile communication network (Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), etc.); a satellite communication network; the Internet connected through Wireless Fidelity (Wi-Fi); or the like.
Fig. 6 shows a flowchart of a data acquisition method according to a preferred embodiment of the present invention. As shown in Fig. 6, method 600 starts at step 601. Method 600 then proceeds to step 602: a terminal device captures a target object to create a target object image, performs image recognition analysis on the target object image to classify the target object, automatically drives the decoder corresponding to the classified type of the target object to decode the target object image so as to obtain target object information, and sends the target object information to a search server. The terminal device includes multiple decoders capable of decoding according to the type of the object. The multiple decoders include all of the following: a barcode decoder, a Quick Response (QR) code decoder, a Near-Field Communication (NFC) decoder, a Radio Frequency Identification (RFID) decoder, an Optical Character Recognition (OCR) decoder, a face recognition decoder, and an image recognition decoder.
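Step 602's classify-then-dispatch logic can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the decoders are stubs, `classify` stands in for the feature-point-based classification, and all names are hypothetical.

```python
# Hypothetical sketch of step 602: classify the captured image, then
# automatically drive the decoder that matches the classified type.
# Real decoders (barcode, QR, NFC, RFID, OCR, face, generic image) are stubbed.

DECODERS = {
    "barcode": lambda img: f"barcode:{img}",
    "qr":      lambda img: f"qr:{img}",
    "ocr":     lambda img: f"text:{img}",
    "face":    lambda img: f"face-id:{img}",
    "image":   lambda img: f"features:{img}",
}

def classify(image):
    # Stand-in for the classification step; a real terminal would extract
    # feature points from the image and match them against per-type models.
    return "qr" if image.startswith("QR") else "image"

def decode_target(image):
    object_type = classify(image)
    decoder = DECODERS[object_type]   # drive the matching decoder automatically
    return decoder(image)             # -> target object information

assert decode_target("QR-ticket") == "qr:QR-ticket"
assert decode_target("landscape-photo") == "features:landscape-photo"
```

The table-driven dispatch captures the design point: classification chooses the decoder, so the user never has to select barcode mode, OCR mode, and so on by hand.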
Method 600 then proceeds to step 603: the search server receives the target object information, searches according to the target object information to obtain identification information related to it, and sends the identification information to a content server.
Method 600 then proceeds to step 604: the content server receives the identification information, determines presentation data related to the target object according to the identification information, and sends the presentation data to the terminal device, where the presentation data is a list of item information and acquisition addresses. After receiving the presentation data, the mobile terminal performs augmented reality processing on the presentation data using additional data pre-stored in the terminal device and related to augmented reality processing, obtains the augmented-reality-processed presentation data, and displays it.
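The shape of the presentation data and the terminal-side AR step can be sketched as below. All names and example values are hypothetical; the patent states only that the presentation data is a list of item information and acquisition addresses, merged with additional data pre-stored on the terminal.

```python
# Hypothetical sketch of step 604's data: each presentation-data entry pairs
# item information with an acquisition address; the terminal then merges its
# pre-stored additional data to obtain the AR-processed presentation data.

presentation_data = [
    {"item": "Museum ticket", "address": "https://example.com/ticket"},
    {"item": "Audio guide",   "address": "https://example.com/guide"},
]

additional_data = {"user_note": "Visit before noon"}  # pre-stored on the terminal

def apply_ar_processing(entries, extra):
    # Merge the terminal's additional data into every entry for display.
    return [{**entry, **extra} for entry in entries]

ar_data = apply_ar_processing(presentation_data, additional_data)
assert ar_data[0]["user_note"] == "Visit before noon"
assert ar_data[1]["address"] == "https://example.com/guide"
```

Merging happens on the terminal rather than the server, matching the description: the additional data is the user's own pre-stored content and never leaves the device.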
Finally, method 600 ends at step 605.

Claims (6)

1. A picture-search-based data acquisition system, the system comprising:
a terminal device, which captures a target object to create a target object image, performs image recognition analysis on the target object image to classify the target object, automatically drives the decoder corresponding to the classified type of the target object to decode the target object image so as to obtain target object information, and sends the target object information to a search server;
the search server, which receives the target object information, searches according to the target object information to obtain identification information related to the target object information, and sends the identification information to a content server; and
the content server, which receives the identification information, determines presentation data related to the target object according to the identification information, and sends the presentation data to the terminal device, wherein the presentation data is a list of item information and acquisition addresses;
wherein, after receiving the presentation data, the mobile terminal performs augmented reality processing on the presentation data using additional data pre-stored in the terminal device and related to augmented reality processing, to obtain the augmented-reality-processed presentation data, and displays the augmented-reality-processed presentation data;
wherein performing image recognition analysis on the target object image to classify the target object comprises: extracting feature points from the target object image based on image recognition technology, and classifying the type of the target object based on the extracted feature points.
2. The system according to claim 1, wherein the terminal device includes multiple decoders capable of decoding according to the type of the object.
3. The system according to claim 2, wherein the multiple decoders include one or more of the following: a barcode decoder, a Quick Response (QR) code decoder, a Near-Field Communication (NFC) decoder, a Radio Frequency Identification (RFID) decoder, an Optical Character Recognition (OCR) decoder, a face recognition decoder, and an image recognition decoder.
4. A picture-search-based data acquisition method, the method comprising:
capturing, by a terminal device, a target object to create a target object image, performing image recognition analysis on the target object image to classify the target object, automatically driving the decoder corresponding to the classified type of the target object to decode the target object image so as to obtain target object information, and sending the target object information to a search server;
receiving, by the search server, the target object information, searching according to the target object information to obtain identification information related to the target object information, and sending the identification information to a content server; and
receiving, by the content server, the identification information, determining presentation data related to the target object according to the identification information, and sending the presentation data to the terminal device, wherein the presentation data is a list of item information and acquisition addresses;
wherein, after receiving the presentation data, the mobile terminal performs augmented reality processing on the presentation data using additional data pre-stored in the terminal device and related to augmented reality processing, to obtain the augmented-reality-processed presentation data, and displays the augmented-reality-processed presentation data;
wherein performing image recognition analysis on the target object image to classify the target object comprises: extracting feature points from the target object image based on image recognition technology, and classifying the type of the target object based on the extracted feature points.
5. The method according to claim 4, wherein the terminal device includes multiple decoders capable of decoding according to the type of the object.
6. The method according to claim 5, wherein the multiple decoders include all of the following: a barcode decoder, a Quick Response (QR) code decoder, a Near-Field Communication (NFC) decoder, a Radio Frequency Identification (RFID) decoder, an Optical Character Recognition (OCR) decoder, a face recognition decoder, and an image recognition decoder.
CN201510019426.6A 2015-01-15 2015-01-15 A kind of data capture method and system based on picture search Expired - Fee Related CN104504155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510019426.6A CN104504155B (en) 2015-01-15 2015-01-15 A kind of data capture method and system based on picture search

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510019426.6A CN104504155B (en) 2015-01-15 2015-01-15 A kind of data capture method and system based on picture search

Publications (2)

Publication Number Publication Date
CN104504155A CN104504155A (en) 2015-04-08
CN104504155B true CN104504155B (en) 2018-06-08

Family

ID=52945552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510019426.6A Expired - Fee Related CN104504155B (en) 2015-01-15 2015-01-15 A kind of data capture method and system based on picture search

Country Status (1)

Country Link
CN (1) CN104504155B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102658873B1 (en) * 2015-06-24 2024-04-17 매직 립, 인코포레이티드 Augmented reality devices, systems and methods for purchasing
US10984373B2 (en) * 2016-03-07 2021-04-20 Sony Corporation System and method for information processing
CN105893613B (en) * 2016-04-27 2019-12-10 宇龙计算机通信科技(深圳)有限公司 image identification information searching method and device
CN107979572A (en) * 2016-10-25 2018-05-01 鄂尔多斯市赛博购商贸有限公司 The method of augmented reality service is provided and uses end
CN110119650A (en) * 2018-02-06 2019-08-13 优酷网络技术(北京)有限公司 Information displaying method and device
KR102546026B1 (en) 2018-05-21 2023-06-22 삼성전자주식회사 Electronic apparatus and method of obtaining contents recognition information thereof
CN109711502B (en) * 2018-12-28 2022-07-15 电能易购(北京)科技有限公司 Information processing method, electronic equipment and storage medium
US10966342B2 (en) * 2019-01-31 2021-03-30 Dell Products, L.P. System and method for determining location and navigating a datacenter using augmented reality and available sensor data
CN111428121B (en) * 2020-03-17 2021-07-02 百度在线网络技术(北京)有限公司 Method and device for searching information

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049728A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103049729A (en) * 2012-12-30 2013-04-17 成都理想境界科技有限公司 Method, system and terminal for augmenting reality based on two-dimension code
CN103119593A (en) * 2010-08-09 2013-05-22 Sk普兰尼特有限公司 Integrated image search system and a service method therewith
CN103778261A (en) * 2014-03-04 2014-05-07 福建瑞恒信息技术有限公司 Self-guided tour method based on mobile cloud computing image recognition

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5776903B2 (en) * 2012-03-02 2015-09-09 カシオ計算機株式会社 Image processing apparatus, image processing method, and program


Also Published As

Publication number Publication date
CN104504155A (en) 2015-04-08


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180608

Termination date: 20210115

CF01 Termination of patent right due to non-payment of annual fee