CN107239567A - Object scene recognition method, device, and computer-readable storage medium - Google Patents
Object scene recognition method, device, and computer-readable storage medium
- Publication number
- CN107239567A (application number CN201710480671.6A)
- Authority
- CN
- China
- Prior art keywords
- object scene
- image
- street view image
- image to be checked
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
The application describes an object scene recognition method, a device, and a computer-readable storage medium. The method includes: acquiring an image to be checked, and determining the object scene in the image to be checked; obtaining, through a map application on the mobile terminal, street view images based on the terminal user's current location information, and searching the street view images for the object scene; and, when the object scene is found in a street view image, displaying the object scene's information from the street view image in the image to be checked. The application combines the camera function and the map application of a mobile terminal to quickly identify the name and location of an object scene, improving the user experience and the usability of the terminal.
Description
Technical field
The application relates to the field of communication technology, and in particular to an object scene recognition method, a device, and a computer-readable storage medium.
Background art
With the development of electronic devices, terminals with camera functions have become commonplace in daily life. Increasingly feature-rich terminals greatly facilitate people's lives. In recent years, image processing technology has advanced rapidly, terminal cameras have grown ever more capable, and terminals are easy to carry, so more and more users prefer to take photos with their terminals. At the same time, with the continued development of GPS (Global Positioning System), map applications are installed on virtually every smartphone, and a terminal user can search for an object scene and plan a route through a map application. However, when the terminal user does not know the name or location of an object scene and has only an image of it, the object scene cannot be found through the map application.
Summary of the invention
The main purpose of the application is to propose an object scene recognition method, a device, and a computer-readable storage medium that combine the camera function and the map application of a mobile terminal to quickly identify the name and location of an object scene, improving the user experience and the usability of the terminal.
To achieve the above purpose, the application provides an object scene recognition method applied to a mobile terminal. The method includes:
Acquiring an image to be checked, and determining the object scene in the image to be checked;
Obtaining, through a map application on the mobile terminal, street view images based on the terminal user's current location information, and searching the street view images for the object scene;
When the object scene is found in a street view image, displaying the object scene's information from the street view image in the image to be checked.
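A minimal sketch of the three claimed steps, in Python for illustration only; the `StreetView` type, `find_object_scene`, and the feature-set matching are assumptions of this sketch, not part of the patent or of any real map application's API:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StreetView:
    """A street view image reduced to the information the method needs."""
    name: str             # name of the scenery shown in the street view
    position: tuple       # location of the scenery in the street view
    features: frozenset   # stand-in for the visual content to be matched

def find_object_scene(object_features: frozenset,
                      street_views: List[StreetView]) -> Optional[StreetView]:
    """Search street views fetched near the user's current location for the
    object scene; return the matching view (name and location) or None."""
    for view in street_views:
        # A naive feature-containment test stands in for real image matching.
        if object_features <= view.features:
            return view
    return None
```

When a match is returned, its `name` and `position` are what the method displays in the image to be checked.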
Optionally, acquiring the image to be checked includes:
Capturing the image to be checked with the camera of the mobile terminal; or selecting the image to be checked from the images pre-stored on the mobile terminal.
Optionally, obtaining street view images based on the terminal user's current location information through the map application on the mobile terminal, and searching the street view images for the object scene, includes:
Using the map application, taking the terminal user's current location as the starting point, obtaining street view images within ranges of increasing distance, and searching those street view images for the object scene in near-to-far order.
Optionally, the object scene's information in the street view image includes: the name of the object scene in the street view image, and the location of the object scene in the street view image.
Optionally, the method further includes:
When the object scene is found in a street view image, determining, through the map application, route information from the terminal user to the object scene according to the terminal user's current location information and the object scene's location in the street view image, and displaying the route information in the image to be checked.
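The patent leaves the route computation to the map application; purely as an illustration, the straight-line distance between the user's current location and the object scene's location can be estimated with the haversine formula (the function below is an assumption of this sketch, not the claimed routing step):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

A real implementation would instead request walking or driving directions from the map application and overlay them on the image to be checked.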
In addition, to achieve the above purpose, the application also proposes an object scene recognition device applied to a mobile terminal. The device includes: a processor, a memory, and a communication bus;
The communication bus is used to realize the connection and communication between the processor and the memory;
The processor is used to execute the object scene recognition program stored in the memory, so as to realize the following steps:
Acquiring an image to be checked, and determining the object scene in the image to be checked;
Obtaining, through a map application on the mobile terminal, street view images based on the terminal user's current location information, and searching the street view images for the object scene;
When the object scene is found in a street view image, displaying the object scene's information from the street view image in the image to be checked.
Optionally, when realizing the step of obtaining street view images based on the terminal user's current location information through the map application on the mobile terminal and searching the street view images for the object scene, the processor specifically realizes:
Using the map application, taking the terminal user's current location as the starting point, obtaining street view images within ranges of increasing distance, and searching those street view images for the object scene in near-to-far order.
Optionally, the object scene's information in the street view image includes: the name of the object scene in the street view image, and the location of the object scene in the street view image.
Optionally, the processor is further used to execute the object scene recognition program, so as to realize the following steps:
When the object scene is found in a street view image, determining, through the map application, route information from the terminal user to the object scene according to the terminal user's current location information and the object scene's location in the street view image, and displaying the route information in the image to be checked.
In addition, to achieve the above purpose, the application also proposes a computer-readable storage medium storing an object scene recognition program;
When the object scene recognition program is executed by at least one processor, it causes the at least one processor to perform the steps of the object scene recognition method introduced above.
The object scene recognition method, device, and computer-readable storage medium proposed by the application combine the camera function and the map application of a mobile terminal: once the object scene is determined from the image to be checked, it is located through the street view mode of the map application, thereby determining the name and location of the object scene. Using only an image of the object scene, the terminal user can quickly find its name and location information, which improves the user experience and the usability of the terminal.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of a mobile terminal implementing the embodiments of the application;
Fig. 2 is an architecture diagram of the communication network system of the mobile terminal shown in Fig. 1;
Fig. 3 is a flowchart of the object scene recognition method of the first embodiment of the application;
Fig. 4 is a flowchart of the object scene recognition method of the second embodiment of the application;
Fig. 5 is a flowchart of the object scene recognition method of the third embodiment of the application;
The realization, functional features, and advantages of the purpose of the application will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be understood that the specific embodiments described here are merely illustrative of the invention and are not intended to limit it.
In the following description, suffixes such as "module", "part", or "unit" used to denote elements merely aid the explanation of the invention and have no specific meaning of their own. Therefore, "module", "part", and "unit" may be used interchangeably.
A terminal can be implemented in various forms. For example, the terminals described in the invention include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example. Those skilled in the art will understand that, except for elements used specifically for mobile purposes, the construction according to the embodiments of the invention can also be applied to fixed-type terminals.
Referring to Fig. 1, which is a hardware structure diagram of a mobile terminal implementing the embodiments of the invention, the mobile terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other parts. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not limit the mobile terminal, which may include more or fewer parts than shown, combine some parts, or arrange the parts differently.
The parts of the mobile terminal are introduced below with reference to Fig. 1:
The radio frequency unit 101 can be used to receive and send signals during messaging or a call. Specifically, after receiving downlink information from a base station, it passes the information to the processor 110 for handling; it also sends uplink data to the base station. The radio frequency unit 101 typically includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 101 can communicate with the network and other devices over wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
When the mobile terminal 100 is in a mode such as call signal reception, call, recording, speech recognition, or broadcast reception, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and so on.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in an operating mode such as a call mode, recording mode, or speech recognition mode, and can process such sound into audio data. In the call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to remove (or suppress) the noise or interference produced while receiving and sending audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As a kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the phone's posture (such as landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The phone can also be configured with other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be repeated here.
The display unit 106 is used to display information input by the user or provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal inputs related to the user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects the user's touch operations on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connected devices according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, and a joystick; no specific limitation is made here.
Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, after which the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 realize the input and output functions of the mobile terminal as two independent parts, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal; no specific limitation is made here.
The interface unit 108 serves as the interface through which at least one external device can connect to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transfer the received input to one or more elements in the mobile terminal 100, or to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area can store the operating system and the application programs needed for at least one function (such as a sound playing function or an image playing function), and the data storage area can store data created according to the use of the phone (such as audio data and a phone book). In addition, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, for example at least one disk storage device, flash memory device, or other solid-state storage part.
The processor 110 is the control center of the mobile terminal. Using various interfaces and lines to connect all parts of the whole mobile terminal, it performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 can integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and so on, and the modem processor mainly handles wireless communication. It is understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) that powers all the parts. Preferably, the power supply 111 can be logically connected to the processor 110 through a power management system, so that functions such as charging, discharging, and power consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which will not be repeated here.
To facilitate the understanding of the embodiments of the invention, the communication network system on which the mobile terminal of the invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the invention. The communication network system is an LTE system of the universal mobile communication technology, and the LTE system includes, connected in communication in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and the operator's IP services 204.
Specifically, the UE 201 may be the terminal 100 described above, which will not be repeated here.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022. The eNodeB 2021 can be connected with the other eNodeBs 2022 through a backhaul (for example, an X2 interface), and the eNodeB 2021 is connected to the EPC 203 and can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on. The MME 2031 is the control node that handles the signaling between the UE 201 and the EPC 203 and provides bearer and connection management. The HSS 2032 provides registers such as the home location register (not shown) and stores user-specific information about service features, data rates, and the like. All user data can be transmitted through the SGW 2034; the PGW 2035 can provide the UE 201's IP address allocation and other functions; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
The IP services 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem), or other IP services.
Although the above description takes an LTE system as an example, those skilled in the art should understand that the invention applies not only to LTE systems but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems; no limitation is made here.
Based on the above mobile terminal hardware structure and communication network system, the embodiments of the method of the invention are proposed below.
The first embodiment of the application proposes an object scene recognition method applied to the mobile terminal introduced above. As shown in Fig. 3, the method specifically includes the following steps:
Step S301: Acquire an image to be checked, and determine the object scene in the image to be checked.
Specifically, acquiring the image to be checked includes:
Capturing the image to be checked with the camera of the mobile terminal; or selecting the image to be checked from the images pre-stored on the mobile terminal.
Further, determining the object scene in the image to be checked includes:
Displaying the image to be checked on the display screen of the mobile terminal, where the user determines the object scene from the image to be checked by touching the display screen.
The position where the user touches the display screen is obtained, and based on that position the region of the object scene is determined using an existing edge detection algorithm, such as the Roberts, Prewitt, Sobel, Kirsch, or Laplacian edge algorithm.
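As an illustration of this step, a minimal pure-Python sketch of the Roberts cross operator named above; the list-of-lists grayscale image format and the function name are assumptions of the sketch:

```python
def roberts_edges(img):
    """Roberts cross gradient magnitude for a 2-D grayscale image.

    `img` is a list of equal-length rows of pixel intensities; the result is
    one row and one column smaller, with large values along intensity edges.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * (w - 1) for _ in range(h - 1)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x] - img[y + 1][x + 1]      # diagonal difference
            gy = img[y][x + 1] - img[y + 1][x]      # anti-diagonal difference
            out[y][x] = (gx * gx + gy * gy) ** 0.5  # gradient magnitude
    return out
```

The region of the object scene would then be taken as the connected area around the touch position bounded by strong edge responses.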
Step S302: Obtain, through the map application on the mobile terminal, street view images based on the terminal user's current location information, and search the street view images for the object scene.
Specifically, step S302 includes:
Using the map application, taking the terminal user's current location as the starting point, obtaining street view images within ranges of increasing distance, and searching those street view images for the object scene in near-to-far order.
In this embodiment, the map application installed on the mobile terminal is called; the terminal user's current location information is obtained through it, and street view images within different distance ranges of that location are obtained from near to far. The street view images are obtained through the street view mode (or panorama mode) of the map application. The object scene is then searched for, by comparison, in the street view images within the different distance ranges, so as to determine whether the object scene is present in them. For example, street view images at 100 meters, 500 meters, and 1000 meters from the terminal user's current location are obtained through the map application. The object scene is first searched for in the street view images at 100 meters; if it is not found, it is then searched for in the street view images at 500 meters; and if it is still not found, it is finally searched for in the street view images at 1000 meters.
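The near-to-far search described above can be sketched as follows; `fetch_street_views` and the membership test are placeholders for the map application's street view query and for the actual image comparison:

```python
def search_near_to_far(object_scene, fetch_street_views,
                       radii=(100, 500, 1000)):
    """Search street view batches at increasing radii (in meters).

    `fetch_street_views(radius)` is assumed to return the street views within
    `radius` meters of the user's current location. Returns (radius, view)
    for the first match, or None when the scene is found at no radius.
    """
    for radius in radii:
        for view in fetch_street_views(radius):
            if object_scene in view:  # stand-in for real image matching
                return radius, view
    return None
```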
Step S303: When the object scene is found in a street view image, display the object scene's information from the street view image in the image to be checked.
Specifically, the object scene's information in the street view image includes: the name of the object scene in the street view image, and the location of the object scene in the street view image.
By searching for and comparing the object scene in the street view images within the different distance ranges, if the object scene is present in a street view image, related information such as the object scene's location in the street view image, its name, and its introduction is obtained.
In this embodiment, the camera function of the mobile terminal and the street view mode of the map application are used together: the object scene chosen from the photo is compared with the scenery in street view mode, thereby determining the related information of the object scene. When the mobile terminal user has only an image of the object scene but does not know its specific name or location, that information can be determined by the method of this embodiment.
The second embodiment of the application proposes an object scene recognition method applied to the mobile terminal introduced above. As shown in Fig. 4, the method specifically includes the following steps:
Step S401: Acquire an image to be checked, and determine the object scene in the image to be checked.
Specifically, acquiring the image to be checked includes:
Capturing the image to be checked with the camera of the mobile terminal; or selecting the image to be checked from the images pre-stored on the mobile terminal.
Further, determining the object scene in the image to be checked includes:
Performing scene-feature recognition on the image to be checked, determining each scene-feature region in the image to be checked, and then determining the object scene according to the terminal user's selection.
Further, performing scene-feature recognition on the image to be checked and determining each scene-feature region in the image to be checked includes:
Determining the edge region of each scenery object in the image to be checked, and determining the feature region of each scenery object according to the edge information. The edge contour of each object is determined first, and the feature region of each object is then determined from its edge contour. When determining the edge region of each scenery object in the image to be checked, that is, when detecting the edges of each scenery object, an existing edge detection algorithm, such as the Roberts, Prewitt, Sobel, Kirsch, or Laplacian edge algorithm, can be used to detect the edge contours in the image.
In this embodiment, the object scene is determined as follows: first determine all the scene-feature regions contained in the image to be checked, then obtain the position at which the user taps the display screen of the mobile terminal, and determine the scene-feature region to which that position belongs as the object scene.
Step S402: through the map application on the mobile terminal, obtain street view images based on the current location information of the terminal user, and search for the object scene in the street view images.
Specifically, step S402 includes: through the map application, taking the current location of the terminal user as the starting point, obtaining street view images within different distance ranges from near to far, and searching for the object scene in those street view images in near-to-far order.
In this embodiment, the map application installed on the mobile terminal is called; the current location information of the terminal user is obtained through the map application, and street view images within different distance ranges are obtained from near to far based on that location. The street view images are obtained through the street view mode (or panorama mode) of the map application. The object scene is then searched for, by comparison, in the street view images at the different distance ranges, so as to determine whether the object scene is present in them. For example, street view images at 100 m, 500 m and 1000 m from the terminal user's current location are obtained through the map application. The object scene is first searched for in the street view images at 100 m; if it is not found, it is searched for in the street view images at 500 m; if it is still not found, it is finally searched for in the street view images at 1000 m.
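The near-to-far search order described above can be sketched as a short loop. `fetch_street_views` and `contains_target` are hypothetical stand-ins for the map application's street-view query and the image-comparison step, respectively; neither name comes from the patent.

```python
# Sketch of the near-to-far search: query street view imagery at increasing
# radii (100 m, 500 m, 1000 m, as in the example above) and stop at the
# first radius whose imagery contains the object scene.
def find_target(location, radii, fetch_street_views, contains_target):
    for radius in sorted(radii):  # near-to-far order
        for image in fetch_street_views(location, radius):
            if contains_target(image):
                return radius, image
    return None  # object scene not found at any radius

# Toy example: street view images are labelled strings; the target ("tower")
# first appears at the 500 m radius, so the 1000 m tier is never queried.
views = {100: ["shop"], 500: ["tower", "bridge"], 1000: ["park"]}
result = find_target("here", [100, 500, 1000],
                     lambda loc, r: views[r],
                     lambda img: img == "tower")
```

Stopping at the first hit is what makes the near-to-far order useful: nearby imagery is both more likely to contain the scene the user just photographed and cheaper to fetch.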
Step S403: when the object scene is found in a street view image, determine, through the map application, the route information from the terminal user to the object scene according to the current location information of the terminal user and the position information of the object scene in the street view image.
Step S404: display the route information in the image to be checked.
In this embodiment, by using the camera function of the mobile terminal together with the street view mode of the map application, the object scene chosen from the photograph is compared with the scenery in street view mode, so that the specific location of the object scene is determined, and a route from the terminal user's current location to the object scene is planned with the map application. When a mobile terminal user has only an image of the object scene but does not know its specific name or location, the route information leading to the scenery can be determined by the method of this embodiment.
The third embodiment of the application provides a recognition method for an object scene, applied to the mobile terminal introduced above. As shown in Fig. 5, the method includes the following steps:
Step S501: obtain an image to be checked through the camera of the mobile terminal, and determine the object scene in the image to be checked.
In this embodiment, the camera function of the mobile terminal is used: the image to be checked is obtained through the camera, and the terminal user selects the object scene from the image to be checked. Preferably, the terminal user may select the object scene from the image to be checked by focusing, or by tapping the display screen.
Specifically, the terminal user selecting the object scene from the image to be checked by focusing includes: recognizing the focal position in the image to be checked and, based on that focal position, determining the edge contour of the object scene with an edge-detection algorithm, thereby determining the feature region of the object scene in the image to be checked.
In this embodiment, an existing edge-detection algorithm may be used, such as the Roberts, Prewitt, Sobel, Kirsch or Laplacian operator, to detect the edge contours in the image.
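One plausible way to turn the focal position plus an edge map into a feature region, sketched under stated assumptions: grow the region outward from the focal pixel and stop at edge pixels. The flood-fill approach below is an illustrative choice, not the patent's specified algorithm.

```python
# Hedged sketch: grow the object-scene region outward from the focal
# position, stopping at edge pixels (an edge map as produced by any of the
# detectors named above). A simple iterative flood fill serves as
# illustration.
def region_from_focus(edges, focus):
    """Return the set of (y, x) pixels reachable from focus without
    crossing an edge pixel (4-connectivity)."""
    h, w = len(edges), len(edges[0])
    seen, stack = set(), [focus]
    while stack:
        y, x = stack.pop()
        if (y, x) in seen or not (0 <= y < h and 0 <= x < w) or edges[y][x]:
            continue
        seen.add((y, x))
        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return seen

# 5x5 edge map with a closed contour around the centre pixel: focusing
# inside the contour yields just the enclosed region.
edges = [[0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0],
         [0, 1, 0, 1, 0],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 0, 0]]
region = region_from_focus(edges, (2, 2))
```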
Step S502: call the map application on the mobile terminal, obtain the current location information of the terminal user, and obtain street view images based on the current location information.
Specifically, step S502 includes:
Step A1: call the map application on the mobile terminal and enter street view mode (or panorama mode).
Step A2: obtain the current location information of the terminal user.
Step A3: taking the current location of the terminal user as the starting point, obtain street view images within different distance ranges from near to far. For example, street view images whose distances from the current location are a first distance value, a second distance value and a third distance value are obtained, where the first distance value is smaller than the second distance value, and the second distance value is smaller than the third distance value.
Step S503: compare the object scene with the street view images, and judge whether the object scene is present in the street view images.
Specifically, step S503 includes: searching for the object scene in the street view images within the different distance ranges in near-to-far order. For example, first judge whether the object scene is present in the street view images at the first distance value; if not, continue to judge whether it is present in the street view images at the second distance value; if still not, continue to judge whether it is present in the street view images at the third distance value.
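The patent does not specify how the comparison in step S503 is performed; as a hedged illustration only, a brute-force patch match by sum of absolute differences (SAD) is sketched below. A real system would more likely use feature matching, but the "compare and judge presence" logic is the same.

```python
# Illustrative comparison step (an assumption, not the patent's method):
# slide the object-scene patch over a street view image and report whether
# any placement matches within a tolerance (sum of absolute differences).
def contains_patch(image, patch, tol=0):
    ih, iw = len(image), len(image[0])
    ph, pw = len(patch), len(patch[0])
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            sad = sum(abs(image[y + dy][x + dx] - patch[dy][dx])
                      for dy in range(ph) for dx in range(pw))
            if sad <= tol:
                return True  # patch found at this placement
    return False

# Toy grayscale data: the 2x2 patch appears exactly inside the street image.
street = [[9, 9, 9, 9],
          [9, 1, 2, 9],
          [9, 3, 4, 9],
          [9, 9, 9, 9]]
patch = [[1, 2],
         [3, 4]]
```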
Step S504: when the object scene is present in a street view image, display the name and location information of the object scene in the image to be checked through the map application.
Step S505: when the object scene is present in a street view image, determine, through the map application, the route information from the terminal user to the object scene according to the current location information of the terminal user and the position information of the object scene in the street view image, and display the route information in the image to be checked.
Steps S502-S503 are processed in the background of the mobile terminal; that is, once the terminal user opens the camera function and determines the object scene, the relevant information of the object scene obtained by the above background processing can be displayed directly on the display screen of the mobile terminal, for the convenience of the terminal user.
The fourth embodiment of the application provides a recognition device for an object scene, for the above mobile terminal. As shown in Fig. 1, the recognition device specifically includes: a processor 110, a memory 109 and a communication bus;
the communication bus is used to realize the connection and communication between the processor 110 and the memory 109;
the processor 110 is used to execute the recognition program of the object scene stored in the memory 109, to realize the following steps:
obtaining an image to be checked, and determining the object scene in the image to be checked;
through the map application on the mobile terminal, obtaining street view images based on the current location information of the terminal user, and searching for the object scene in the street view images;
when the object scene is found in a street view image, displaying the information of the object scene in the street view image in the image to be checked.
Specifically, when realizing the step of obtaining the image to be checked, the processor 110 specifically: obtains the image to be checked through the camera of the mobile terminal; or obtains the image to be checked from the images pre-stored on the mobile terminal.
Further, when realizing the step of determining the object scene in the image to be checked, the processor 110 specifically: displays the image to be checked through the display screen of the mobile terminal, obtains the position at which the user taps the display screen and, based on that position, determines the region of the object scene using an existing edge-detection algorithm such as the Roberts, Prewitt, Sobel, Kirsch or Laplacian operator.
Further, when realizing the step of obtaining, through the map application on the mobile terminal, street view images based on the current location information of the terminal user and searching for the object scene in the street view images, the processor 110 specifically: through the map application, taking the current location of the terminal user as the starting point, obtains street view images within different distance ranges from near to far, and searches for the object scene in those street view images in near-to-far order.
In this embodiment, the map application installed on the mobile terminal is called; the current location information of the terminal user is obtained through the map application, and street view images within different distance ranges are obtained from near to far based on that location. The street view images are obtained through the street view mode (or panorama mode) of the map application. The object scene is then searched for, by comparison, in the street view images at the different distance ranges, so as to determine whether the object scene is present in them.
Further, the information of the object scene in the street view image includes: the name information of the object scene in the street view image, and the position information of the object scene in the street view image.
By searching for the object scene in the street view images at the different distance ranges and comparing, if the object scene is present in a street view image, relevant information such as the position of the object scene in the street view image, the scenery name and a scenery introduction is obtained.
Further, the processor 110 is also used to execute the recognition program of the object scene, to realize the following steps: when the object scene is found in a street view image, determining, through the map application, the route information from the terminal user to the object scene according to the current location information of the terminal user and the position information of the object scene in the street view image, and displaying the route information in the image to be checked.
The fifth embodiment of the application provides a computer-readable storage medium storing a recognition program of an object scene; when the recognition program of the object scene is executed by at least one processor, the at least one processor is caused to perform the following operations:
obtaining an image to be checked, and determining the object scene in the image to be checked;
through the map application on the mobile terminal, obtaining street view images based on the current location information of the terminal user, and searching for the object scene in the street view images;
when the object scene is found in a street view image, displaying the information of the object scene in the street view image in the image to be checked.
It should be noted that, herein, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes that element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments may be implemented by software plus a necessary general hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk or an optical disc), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to perform the method described in each embodiment of the present invention.
The embodiments of the invention are described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are only illustrative rather than restrictive. Under the enlightenment of the present invention, those of ordinary skill in the art may also devise many other forms without departing from the concept of the invention and the scope of protection of the claims, all of which fall within the protection of the present invention.
Claims (10)
1. A recognition method for an object scene, characterized in that it is applied in a mobile terminal, the recognition method of the object scene including:
obtaining an image to be checked, and determining the object scene in the image to be checked;
through the map application on the mobile terminal, obtaining street view images based on the current location information of the terminal user, and searching for the object scene in the street view images;
when the object scene is found in a street view image, displaying the information of the object scene in the street view image in the image to be checked.
2. The recognition method of the object scene according to claim 1, characterized in that obtaining the image to be checked includes:
obtaining the image to be checked through the camera of the mobile terminal; or obtaining the image to be checked from the images pre-stored on the mobile terminal.
3. The recognition method of the object scene according to claim 1, characterized in that obtaining, through the map application on the mobile terminal, street view images based on the current location information of the terminal user and searching for the object scene in the street view images includes:
through the map application, taking the current location of the terminal user as the starting point, obtaining street view images within different distance ranges from near to far, and searching for the object scene in those street view images in near-to-far order.
4. The recognition method of the object scene according to claim 1, characterized in that the information of the object scene in the street view image includes: the name information of the object scene in the street view image, and the position information of the object scene in the street view image.
5. The recognition method of the object scene according to claim 4, characterized in that the method further includes:
when the object scene is found in a street view image, determining, through the map application, the route information from the terminal user to the object scene according to the current location information of the terminal user and the position information of the object scene in the street view image, and displaying the route information in the image to be checked.
6. A recognition device for an object scene, characterized in that it is applied in a mobile terminal, the recognition device of the object scene including: a processor, a memory and a communication bus;
the communication bus is used to realize the connection and communication between the processor and the memory;
the processor is used to execute the recognition program of the object scene stored in the memory, to realize the following steps:
obtaining an image to be checked, and determining the object scene in the image to be checked;
through the map application on the mobile terminal, obtaining street view images based on the current location information of the terminal user, and searching for the object scene in the street view images;
when the object scene is found in a street view image, displaying the information of the object scene in the street view image in the image to be checked.
7. The recognition device of the object scene according to claim 6, characterized in that, when realizing the step of obtaining, through the map application on the mobile terminal, street view images based on the current location information of the terminal user and searching for the object scene in the street view images, the processor specifically:
through the map application, taking the current location of the terminal user as the starting point, obtains street view images within different distance ranges from near to far, and searches for the object scene in those street view images in near-to-far order.
8. The recognition device of the object scene according to claim 6, characterized in that the information of the object scene in the street view image includes: the name information of the object scene in the street view image, and the position information of the object scene in the street view image.
9. The recognition device of the object scene according to claim 8, characterized in that the processor is also used to execute the recognition program of the object scene, to realize the following steps:
when the object scene is found in a street view image, determining, through the map application, the route information from the terminal user to the object scene according to the current location information of the terminal user and the position information of the object scene in the street view image, and displaying the route information in the image to be checked.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a recognition program of an object scene;
when the recognition program of the object scene is executed by at least one processor, the at least one processor is caused to perform the steps of the recognition method of the object scene according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710480671.6A CN107239567A (en) | 2017-06-22 | 2017-06-22 | A kind of recognition methods of object scene, equipment and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107239567A true CN107239567A (en) | 2017-10-10 |
Family
ID=59987734
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710480671.6A Pending CN107239567A (en) | 2017-06-22 | 2017-06-22 | A kind of recognition methods of object scene, equipment and computer-readable recording medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107239567A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108765970A (en) * | 2018-08-07 | 2018-11-06 | 普行智能停车(深圳)有限公司 | The method and apparatus of curb parking automatic identification automotive license plate |
CN109302670A (en) * | 2018-10-18 | 2019-02-01 | 珠海格力电器股份有限公司 | Facility information display methods, server, device and system |
CN110019628A (en) * | 2017-12-27 | 2019-07-16 | 努比亚技术有限公司 | Localization method, mobile terminal and computer readable storage medium |
CN111292372A (en) * | 2020-02-24 | 2020-06-16 | 当家移动绿色互联网技术集团有限公司 | Target object positioning method, target object positioning device, storage medium and electronic equipment |
CN111832560A (en) * | 2020-06-23 | 2020-10-27 | 维沃移动通信有限公司 | Information output method, device, equipment and medium |
CN112100418A (en) * | 2020-09-11 | 2020-12-18 | 北京百度网讯科技有限公司 | Method and device for inquiring historical street view, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102565831A (en) * | 2012-02-13 | 2012-07-11 | 惠州市德赛西威汽车电子有限公司 | System and method for utilizing portable intelligent terminal to assist in automatic navigation |
CN103903428A (en) * | 2014-03-24 | 2014-07-02 | 宇龙计算机通信科技(深圳)有限公司 | Method, terminal and system for booking taxi |
CN104281840A (en) * | 2014-09-28 | 2015-01-14 | 无锡清华信息科学与技术国家实验室物联网技术中心 | Method and device for positioning and identifying building based on intelligent terminal |
CN105045935A (en) * | 2015-09-09 | 2015-11-11 | 北京奇虎科技有限公司 | Method for recommending position information and electronic equipment |
US9460120B2 (en) * | 2010-10-01 | 2016-10-04 | Microsoft Licensing Technology, LLC | Travel route planning using geo-tagged photographs |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107239567A (en) | A kind of recognition methods of object scene, equipment and computer-readable recording medium | |
CN106961706A (en) | Method, mobile terminal and the computer-readable recording medium of communication pattern switching | |
CN108234295A (en) | Display control method, terminal and the computer readable storage medium of group's functionality controls | |
CN107329682A (en) | Edge exchange method and mobile terminal | |
CN107770725A (en) | Arcade shop premises air navigation aid and mobile terminal | |
CN107566635A (en) | Screen intensity method to set up, mobile terminal and computer-readable recording medium | |
CN107145385A (en) | A kind of multitask interface display methods, mobile terminal and computer-readable storage medium | |
CN107704176A (en) | A kind of picture-adjusting method and terminal | |
CN106953684A (en) | A kind of method for searching star, mobile terminal and computer-readable recording medium | |
CN108241752A (en) | Photo display methods, mobile terminal and computer readable storage medium | |
CN107566980A (en) | The localization method and mobile terminal of a kind of mobile terminal | |
CN110007816A (en) | A kind of display area determines method, terminal and computer readable storage medium | |
CN107657583A (en) | A kind of screenshot method, terminal and computer-readable recording medium | |
CN107239205A (en) | A kind of photographic method, mobile terminal and storage medium | |
CN108172161A (en) | Display methods, mobile terminal and computer readable storage medium based on flexible screen | |
CN110180181A (en) | Screenshot method, device and the computer readable storage medium of Wonderful time video | |
CN107979667A (en) | Double-screen display method, mobile terminal and computer-readable recording medium | |
CN109525699A (en) | Good friend's adding method, device, mobile terminal and readable storage medium storing program for executing | |
CN107566608A (en) | A kind of system air navigation aid, equipment and computer-readable recording medium | |
CN107729104A (en) | A kind of display methods, mobile terminal and computer-readable storage medium | |
CN107948397A (en) | A kind of information-pushing method, device and computer-readable recording medium | |
CN107687854A (en) | A kind of indoor navigation method, terminal and computer-readable recording medium | |
CN107133350A (en) | Data-updating method, mobile terminal and storage medium based on search engine | |
CN107203278A (en) | One-handed performance input method, mobile terminal and storage medium | |
CN108322611A (en) | A kind of screen locking information-pushing method, equipment and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20171010 |