CN106778514A - A kind of method and device for identifying object - Google Patents
- Publication number
- CN106778514A (application CN201611050424.4A)
- Authority
- CN
- China
- Prior art keywords
- target site
- image
- information
- default
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides a method and device for marking an object, intended to solve the prior-art problem that a user must process a large amount of image information in augmented reality equipment usage scenarios. The method includes: reading an image of an object; performing feature recognition on the image according to preset image features to identify a target site of the object; and prompting the user about the target site of the object. The scheme reduces the amount of information the user must process while using augmented reality equipment.
Description
Technical field
The present invention relates to the field of communication technology, and in particular to a method and device for marking an object.
Background technology
Augmented reality (AR) is a technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that is otherwise difficult to experience within a given time and space in the real world (visual information, sound, taste, touch, and so on) is simulated by computers and other technology and then superimposed, so that virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience beyond reality. The real environment and virtual objects are superimposed in real time onto the same picture or space and exist simultaneously. At present, a user can act autonomously through augmented reality, for example playing design games, adventure-maze games, dancing games, and so on. Augmented reality thus provides the user with a fully immersive interactive experience based on the real world.
However, when a user acts autonomously while using AR, the brain's workload increases sharply because the user simultaneously receives virtual information from the computer and information from the real world, and it may be difficult to process the large amount of image information in time. For example, the user may confuse a door in front of them: is it a virtual-world door that can be swung freely, or a real-world door that must not be opened?
The content of the invention
The main object of the present invention is to provide a method and device for marking an object, intended to solve the prior-art problem that a user faces a large amount of image information in AR equipment usage scenarios.
According to a first aspect of the invention, a method for marking an object is provided. The method includes: reading an image of an object; performing feature recognition on the image according to preset image features to identify a target site of the object; and prompting the user about the target site of the object.
Optionally, prompting the user about the target site of the object includes: outputting, through the augmented reality equipment, a cue signal corresponding to the target site of the object.
Optionally, prompting the user about the target site of the object includes: prompting the target site using visual, auditory, or haptic information.
Optionally, performing feature recognition on the image of the object according to preset image features and identifying the target site of the object includes: performing feature recognition on the image according to the preset image features to determine the kind of the object; and determining, according to the kind of the object, the target site of the object and the preset category mark corresponding to the target site. Prompting the user about the target site then includes: presenting the preset category mark at the target site of the object.
Further, the above method also includes: before reading the image of the object, pre-storing the image features of the object, the image features of the object's target site, and the identification information corresponding to the target site.
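By way of illustration only, the three claimed steps (read, recognise, prompt) could be sketched as follows. The feature representation (sets of tags), the Jaccard similarity score, and all names here are hypothetical stand-ins, not the patent's actual feature-recognition method:

```python
# Preset image features per object kind; entries are illustrative only.
PRESET_FEATURES = {
    "door": {"features": {"rectangle", "handle", "hinges"},
             "target_site": "handle",
             "marker": "real-world: do not open"},
}

def similarity(a, b):
    """Jaccard similarity between two feature sets (a toy matching score)."""
    return len(a & b) / len(a | b)

def identify_and_prompt(image_features, threshold=0.6):
    """Read -> recognise -> prompt: the three claimed steps in order."""
    for kind, entry in PRESET_FEATURES.items():
        if similarity(image_features, entry["features"]) >= threshold:
            # Target site identified; return the prompt for that site.
            return {"kind": kind,
                    "target_site": entry["target_site"],
                    "prompt": entry["marker"]}
    return None  # no preset object matched
```

A door-like feature set that exceeds the preset similarity value would thus be recognised and its handle marked; an unrelated feature set would produce no prompt.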
According to a second aspect of the invention, a device for marking an object is provided. The device includes: a reading module for reading an image of an object; an identification module for performing feature recognition on the image according to preset image features to identify a target site of the object; and a prompting module for prompting the user about the target site of the object.
Optionally, the prompting module is specifically configured to output, through the augmented reality equipment, a cue signal corresponding to the target site of the object.
Optionally, the prompting module is specifically configured to prompt the target site using visual, auditory, or haptic information.
Optionally, the identification module includes: a recognition unit for performing feature recognition on the image of the object according to preset image features to determine the kind of the object; and a determining unit for determining, according to the kind of the object, the target site of the object and the preset category mark corresponding to the target site. The prompting module is then specifically configured to present the preset category mark at the target site of the object.
Further, the above device also includes: a storage module for pre-storing, before the image of the object is read, the image features of the object, the image features of the object's target site, and the identification information corresponding to the target site.
The method provided in the embodiments of the present invention marks the target site of the recognized object, thereby prompting the user and reducing the amount of information the user must process while using AR equipment.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of a mobile terminal for realizing the embodiments of the invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a flow chart of the method for marking an object provided by the first embodiment of the invention;
Fig. 4 is an application flow chart of the method for marking an object provided by the second embodiment of the invention;
Fig. 5 is a schematic diagram of the AR usage scenario involved in the second embodiment of the invention;
Fig. 6 is a schematic diagram of marking the target site of a recognized object in the second embodiment of the invention;
Fig. 7 is a structural block diagram of the device for marking an object provided by the third embodiment of the invention.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
A mobile terminal realizing the embodiments of the invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part", or "unit" used to denote elements serve only to aid the explanation and carry no specific meaning by themselves; "module" and "part" may therefore be used interchangeably.
Mobile terminals can be implemented in a variety of forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the constructions according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a hardware structure diagram of a mobile terminal for realizing the embodiments of the invention.
The mobile terminal 100 can include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are discussed in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit can include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel can include a satellite channel and/or a terrestrial channel. The broadcast management server can be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to a terminal. The broadcast signals can include TV broadcast signals, radio broadcast signals, data broadcast signals, and so on, and may further include broadcast signals combined with TV or radio broadcast signals. The broadcast-related information can also be provided via a mobile communication network, in which case it can be received by the mobile communication module 112. The broadcast signals can exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H). The broadcast receiving module 111 can receive signals and broadcasts using various types of broadcast systems. In particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the MediaFLO® forward link media data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit the various broadcast systems providing broadcast signals as well as the above digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 can be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a node B, and so on), an external terminal, and a server. Such radio signals can include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal and can be internally or externally coupled to the terminal. The wireless internet access technologies involved can include WLAN (Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 supports short-range communication. Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra wideband (UWB), ZigBee™, and so on.
The location information module 115 checks or obtains the position information of the mobile terminal; a typical example is GPS (global positioning system). According to current technology, the location information module 115 calculates distance information from three or more satellites together with accurate time information and applies triangulation to the calculated information, so as to accurately compute three-dimensional current position information in terms of longitude, latitude, and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time with a further satellite. In addition, the location information module 115 can calculate speed information by continuously computing the current position in real time.
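The triangulation step described above can be sketched with the usual textbook linearization. This is a minimal illustration assuming ideal ranges; a real receiver must also solve for its own clock error, which is not modelled here:

```python
import numpy as np

def trilaterate(sats, dists):
    """Estimate a 3-D position from satellite positions and measured ranges.

    sats:  (n, 3) array of satellite coordinates, n >= 4
    dists: (n,) array of measured distances to each satellite
    """
    sats = np.asarray(sats, dtype=float)
    dists = np.asarray(dists, dtype=float)
    p0, d0 = sats[0], dists[0]
    # Subtracting |x - p0|^2 = d0^2 from each |x - p_i|^2 = d_i^2 gives the
    # linear system 2 (p0 - p_i) . x = d_i^2 - d0^2 - |p_i|^2 + |p0|^2.
    A = 2.0 * (p0 - sats[1:])
    b = dists[1:] ** 2 - d0 ** 2 - np.sum(sats[1:] ** 2, axis=1) + np.sum(p0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With four satellites in general position the linear system is square and the fix is exact; with more satellites the least-squares solve averages out range noise.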
The A/V input unit 120 is used to receive audio or video signals and can include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151, stored in the memory 160 (or another storage medium), or sent via the wireless communication unit 110; two or more cameras 121 can be provided according to the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in operating modes such as a telephone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 can implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference produced while receiving and sending audio signals.
The user input unit 130 can generate key input data according to commands input by the user, to control various operations of the mobile terminal. It allows the user to input various types of information and can include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, and so on caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external devices can include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module can store various information for verifying the user of the mobile terminal 100 and can include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, a device having an identification module (hereinafter referred to as an "identifying device") can take the form of a smart card and can therefore be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 can be used to receive input (for example, data information, electric power, and so on) from an external device, transfer the received input to one or more elements within the mobile terminal 100, or transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which electric power is provided from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or electric power input from the cradle can serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audio, and/or tactile manner (for example, audio signals, video signals, alarm signals, vibration signals, and so on) and can include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, and so on). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of a layer to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 can include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent to allow viewing from the outside; these can be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. According to a particular desired implementation, the mobile terminal 100 can include two or more display units (or other display devices); for example, the mobile terminal can include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode, or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, and so on). The audio output module 152 can include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events can include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different manners to notify of the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, video, and so on). Moreover, the memory 160 can store data about the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 can include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 can cooperate with a network storage device that performs the storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 can include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 can be constructed within the controller 180 or separately from it. The controller 180 can also perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate electric power required to operate each element and component.
The various implementations described herein can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the implementations described herein can be realized using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such implementations can be realized in the controller 180. For a software implementation, implementations such as processes or functions can be realized with a separate software module that allows at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any appropriate programming language, stored in the memory 160, and executed by the controller 180.
The mobile terminal has so far been described in terms of its functions. In addition, the mobile terminal 100 in the embodiments of the present invention can be of various types such as a folding type, a bar type, a swing type, or a slide type, without specific limitation herein.
The mobile terminal 100 as shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal of the invention can operate is now described with reference to Fig. 2.
Such communication systems can use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description concerns a CDMA communication system, but such teaching applies equally to other types of systems.
With reference to Fig. 2, a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290, and also with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, European-standard/US-standard high-capacity digital circuits (E1/T1), asynchronous transfer mode (ATM), internet protocol (IP), point-to-point protocol (PPP), frame relay, high-bit-rate digital subscriber line (HDSL), asymmetric digital subscriber line (ADSL), or various types of digital subscriber line (xDSL). It will be appreciated that the system can include a plurality of BSCs 275 as shown in Fig. 2.
Each BS 270 can serve one or more sectors (or regions), each covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, each having a specific spectrum (for example, 1.25 MHz, 5 MHz, and so on).
The intersection of a sector and a frequency assignment can be called a CDMA channel. The BS 270 can also be called a base transceiver subsystem (BTS) or other equivalent term. In this case, the term "base station" can be used to broadly denote a single BSC 275 and at least one BS 270. A base station can also be called a "cell site". Alternatively, each sector of a specific BS 270 can be called a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends broadcast signals to the mobile terminals 100 operating in the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signals sent by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300, which help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful position information can be obtained with any number of satellites. The location information module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired position information. Instead of, or in addition to, GPS tracking technology, other technologies that can track the position of the mobile terminal can be used. In addition, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, a BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a given base station 270 is processed within that BS 270, and the obtained data is forwarded to the related BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of the soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides the extra routing service for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward link signals to the mobile terminals 100.
First embodiment
This embodiment provides a method for marking an object. Fig. 3 is a flow chart of the method. As shown in Fig. 3, the method includes the following processing and can be realized by AR equipment; that is, the following steps can be performed by AR equipment:
Step 301: read an image of an object;
The AR equipment involved in this embodiment can have a gyroscope, an acceleration sensor, a range sensor, an infrared sensor, a camera, an input device, an output device, and so on. The gyroscope and acceleration sensor of the AR equipment can detect information such as the motion state and body posture of the user, while the camera and infrared sensor of the AR equipment can read the image information of objects in the user's environment.
In this embodiment, the image of the object that is read can specifically be 3D image information or 2D image information of the object.
Step 302: perform feature recognition on the image of the object that was read according to preset image features, and identify the target site of the object;
The target site involved in this embodiment can specifically be a site of the object that needs to be marked; the site is marked in order to prompt the user about certain specific features of that site.
Performing feature recognition on the image of the object according to preset image features and identifying the target site of the object can specifically include: performing feature recognition on the image according to the preset image features (which can be a feature recognition method from image recognition). Specifically, the preset image features are matched against the image of the current object; when the similarity between the two reaches a preset value, it can be determined that the current object is the preset object being compared, thereby determining the kind of the object and, further, what the object is, that is, its name. According to the kind or name of the object, the target site of the object and the preset category mark corresponding to the target site are determined. Prompting the user about the target site then includes: presenting the preset category mark at the target site of the object.
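The "similarity reaching a preset value" test could be sketched as below. The embodiment does not fix a particular similarity measure, so normalized correlation between pixel patches is used here purely as an illustrative choice:

```python
import numpy as np

def matches_preset(image_patch, preset_feature, threshold=0.9):
    """Score a candidate image region against a preset image feature.

    Returns (matched, score): matched is True when the similarity score
    reaches the preset threshold value.
    """
    # Normalize each patch to zero mean and unit variance, then correlate.
    a = (image_patch - image_patch.mean()) / (image_patch.std() + 1e-9)
    b = (preset_feature - preset_feature.mean()) / (preset_feature.std() + 1e-9)
    score = float((a * b).mean())
    return score >= threshold, score
```

A patch identical to the preset feature scores about 1.0 and matches; an inverted or unrelated patch falls below the threshold, so the object kind is not determined.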
The preset image features can specifically include: the image features of the object, the image features of the object's target site, and the identification information of the object's target site. The image features of the object can specifically include two-dimensional image features of the object together with three-dimensional parameter features, or can directly include three-dimensional image features of the object; likewise, the image features of the target site can include two-dimensional image features of the target site together with three-dimensional parameter features, or directly the three-dimensional image features of the target site. The identification information of the object's target site can specifically include: the identification means and/or the identifier of the target site of the object.
Based on the preset image features above, the method provided in this embodiment may further include: before reading the image of the object, pre-storing the image features of the object, the image features of the target site of the object, and the identification information corresponding to the target site. Specifically, a material database may be set up in advance; the database contains image information of a large number of different objects, and the target sites of these objects are marked with features.
In this embodiment, the image features recorded in the material database can be updated. The update may be based on material newly added on the server side, or on marks that the user actively adds to certain positions of an object, or cancels from the marks the AR device provides, while using the AR device.
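The pre-store and update operations on the material database described above can be sketched as follows; the method names and the in-memory storage shape are assumptions, not part of the patent:

```python
# Sketch of the updatable material database (illustrative assumptions only).

class MaterialDatabase:
    def __init__(self):
        # object name -> {target site name -> identifier}
        self.records = {}

    def prestore(self, obj_name, site_marks):
        """Pre-store an object's target-site marks before any image is read."""
        self.records[obj_name] = dict(site_marks)

    def server_update(self, obj_name, site_marks):
        """Merge new material pushed from the server side."""
        self.records.setdefault(obj_name, {}).update(site_marks)

    def user_mark(self, obj_name, site, identifier):
        """User actively marks a position while using the AR device."""
        self.records.setdefault(obj_name, {})[site] = identifier

    def user_cancel(self, obj_name, site):
        """User cancels a mark that the AR device provided."""
        self.records.get(obj_name, {}).pop(site, None)

db = MaterialDatabase()
db.prestore("door", {"handle": "operable"})
db.user_mark("tap", "switch", "operable")
db.user_cancel("door", "handle")
```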
In this embodiment, different marks represent different meanings, and the meaning represented by each mark can be preset. For example, target sites can be divided, according to their degree of danger, into the grades "dangerous to operate", "potentially dangerous to operate", "fairly safe to operate" and "very safe to operate", with a different mark corresponding to each grade. Alternatively, target sites can be divided, according to the difficulty of operating them, into the grades "very difficult to operate", "fairly difficult to operate", "fairly easy to operate" and "very easy to operate", again with a different mark corresponding to each difficulty grade. In the same usage scenario, a single identification means may be used alone, or several identification means may be used in combination.
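One possible preset mapping from the four danger grades and four difficulty grades above to concrete marks is sketched below; the colors and symbols chosen are assumptions for illustration:

```python
# Hypothetical grade-to-mark mappings (the marks themselves are assumed).

DANGER_MARKS = {
    "dangerous": "red",
    "potentially dangerous": "orange",
    "fairly safe": "yellow",
    "very safe": "green",
}

DIFFICULTY_MARKS = {
    "very difficult": "double exclamation",
    "fairly difficult": "exclamation",
    "fairly easy": "check",
    "very easy": "double check",
}

def mark_for(grade, scheme=DANGER_MARKS):
    # Several identification means may be combined in one scenario;
    # here we simply look the grade up in one chosen scheme.
    return scheme[grade]
```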
Step 303: Prompt the target site of the object.
In this embodiment, the target site of the object may be prompted in the following ways: displaying a prompt mark at the target site of the object. For example, different meanings can be prompted by identifiers; specifically, different marks can be distinguished by different colors, e.g., red indicates that the target site is dangerous, while green indicates that the target site is safe and operable.
After detecting that the target site of the object has been touched, a prompt tone or vibration signal is emitted. Specifically, the distance sensor in the AR device can measure the distance between the user and the target site of the object to detect whether the user is touching the target site; after detecting that the user has touched it, the AR device issues a prompt to the user by voice or vibration. For example, if the target site the user touches is a danger zone, a voice signal warning of danger is played, or a sharp vibration is emitted.
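The touch-detection-and-prompt behaviour just described can be sketched as below. The 0.02 m touch threshold and the prompt strings are assumptions; the patent only specifies a distance sensor and voice/vibration prompts:

```python
# Sketch of the distance-sensor touch prompt (illustrative assumptions only).

TOUCH_THRESHOLD_M = 0.02  # assumed distance at which a touch is registered

def prompt_on_touch(distance_m, site_identifier):
    """Return the prompts the AR device would issue, or [] if no touch."""
    if distance_m > TOUCH_THRESHOLD_M:
        return []  # user is not touching the target site
    if site_identifier == "dangerous":
        # danger zone: voice warning plus a sharp vibration
        return ["voice:danger", "vibration:sharp"]
    return ["tone:prompt"]
```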
In the method provided by this embodiment, the target sites of the objects recognized by the AR device are marked, which serves as a prompt to the user and reduces the amount of information the user has to process while using the AR device, so that the user can make judgements more quickly. At the same time, it prevents the user from operating dangerous objects, or dangerous parts of objects, because of a mistaken judgement, thereby improving the user's safety when using the AR device.
Second embodiment
This embodiment mainly concerns the application flow of the method for identifying an object. In this embodiment, the method may be applied in a scenario where a user acts on his or her own while using an AR device. As shown in Fig. 4, the flow mainly includes the following processing:
Step 401: Read the image information of the objects in the real environment;
As shown in Fig. 5, 50 is a user holding AR device 51, i.e., the user of AR device 51. AR device 51 can read object 52, object 53 and another person 54 in the real environment, and user 50 can view object 52, object 53 and another person 54 through AR device 51.
Step 402: Perform matching analysis between the read image information of the objects and the preset image features, and identify the operable positions of the objects;
As shown in Fig. 5, the AR device performs matching analysis between the read image information of object 52, object 53 and another person 54 and the preset image information, and identifies the operable position 521 of object 52 and the operable position 531 of object 53.
Step 403: Assign different identification feature values to the recognized operable positions;
If 521 and 531 in Fig. 5 belong to different types of target site, for example 521 is an operable position and 531 is a dangerous position, then 521 and 531 are assigned different types of feature identifier.

Step 404: Reconstruct the image displayed by the AR device, and display the identification feature values in a set manner. For example, operable positions can be marked by different visual information, by different auditory information, or by different tactile forms.
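Steps 401-404 can be sketched end to end as follows; the scene contents, preset tables and color marks are all assumptions standing in for real image data:

```python
# End-to-end sketch of steps 401-404 (all data below is assumed).

PRESETS = {"door": {"handle": "operable"}, "tap": {"switch": "dangerous"}}
COLOR = {"operable": "green", "dangerous": "red"}

def read_scene():
    # Step 401: read image information from the real environment.
    # Stand-in for the AR camera; returns already-recognized labels here.
    return ["door", "tap", "person"]

def run_flow():
    overlays = []
    for obj in read_scene():
        # Step 402: matching analysis against preset image features.
        sites = PRESETS.get(obj)
        if not sites:
            continue  # e.g. the person has no preset operable positions
        for site, identifier in sites.items():
            # Step 403: assign an identification feature value.
            # Step 404: reconstruct the display with the set visual mark.
            overlays.append((obj, site, COLOR[identifier]))
    return overlays
```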
In the AR device, 521 and 531 can be marked with special marks, or presented in special colors, so as to prompt the user.
Assume that 52 and 53 in Fig. 5 are a door and a tap respectively. Then, as shown in Fig. 6, the handle 521 of door 52 belongs to an operable position: it can be identified by the AR device, which can prompt the user about this operable position. Similarly, the switch 531 of tap 53 belongs to an operable position: it can also be identified by the AR device, which prompts the user about the position.
It should be noted that this embodiment is illustrated with an operable position as the target site, but the scheme of this embodiment is not limited to prompting only the operable positions of an object: the target site may specifically be any position on the object that needs to be marked.
In the method provided by this embodiment, the AR device captures the objects in the real scene and analyzes them, so as to recognize the operable positions of the objects, and at the same time applies special marks to the operable positions according to a preset scheme. This method of recognizing and marking the operable positions of objects, on the one hand, reduces the amount of information the user has to process while using the AR device, so that the user can make correct judgements more quickly; on the other hand, it prevents hazardous events caused by the user mistakenly operating dangerous or unnecessary object parts.
3rd embodiment
This embodiment provides a device for identifying an object; the device can be applied in an AR device. Fig. 7 is a structural block diagram of the device. As shown in Fig. 7, the device 70 includes the following parts:
Read module 71, configured to read the image of an object;
In this embodiment, the image of the object that is read may specifically be 3D image information or 2D image information of the object.
Identification module 72, configured to perform feature recognition on the image of the object according to preset image features, and identify the target site of the object;
In this embodiment, identification module 72 may specifically include: a recognition unit, configured to perform feature recognition on the image of the object according to the preset image features, and determine the category of the object; and a determining unit, configured to determine, according to the category of the object, the target site of the object and the preset category identifier corresponding to the target site. The reminding module is specifically configured to present the preset category identifier at the target site of the object.
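The module split of device 70 (read module, identification module with its recognition and determining units, and reminding module) can be sketched structurally as below; the class names, lookup tables and placeholder bodies are assumptions for illustration:

```python
# Structural sketch of device 70 (illustrative; bodies are placeholders).

class ReadModule:
    def read(self, source):
        return source  # stand-in for reading 2D/3D image information

class IdentificationModule:
    def __init__(self, presets):
        # category -> (target site, preset category identifier)
        self.presets = presets

    def recognize(self, image):    # recognition unit: determine the category
        return image if image in self.presets else None

    def determine(self, category): # determining unit: site + identifier
        return self.presets[category]

class RemindingModule:
    def prompt(self, site, identifier):
        return f"{identifier} mark at {site}"

class Device70:
    def __init__(self, presets):
        self.reader = ReadModule()
        self.identifier = IdentificationModule(presets)
        self.reminder = RemindingModule()

    def process(self, source):
        category = self.identifier.recognize(self.reader.read(source))
        if category is None:
            return None
        site, mark = self.identifier.determine(category)
        return self.reminder.prompt(site, mark)
```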
The target site referred to in this embodiment may specifically be a position on the object that needs to be marked; the mark serves to prompt the user about certain specific features of that position.
Performing feature recognition on the image of the object according to the preset image features to identify the target site of the object may specifically include: performing feature recognition on the image of the object according to the preset image features (that is, using the feature recognition methods of image recognition). Specifically, the image of the current object is compared and matched against the preset image features; when the similarity between the two reaches a preset value, the current object can be determined to be the preset object it was compared with, thereby determining the category of the object, or further determining what the current object is, i.e., determining its name. The target site of the object and the preset category identifier corresponding to the target site are then determined according to the category or name of the object. Prompting the target site of the object includes: presenting the preset category identifier at the target site of the object.
The preset image features may specifically include: the image features of the object, the image features of the target site of the object, and the identification information of the target site of the object. The image features of the object may specifically include two-dimensional image features of the object together with three-dimensional parameter features, or may directly include three-dimensional image features of the object. Likewise, the image features of the target site may include two-dimensional image features of the target site together with three-dimensional parameter features, or may directly include three-dimensional image features of the target site. The identification information of the target site may specifically include: the identification means and/or identifier of the target site of the object.
Based on the preset image features above, the method provided in this embodiment may further include: before reading the image of the object, pre-storing the image features of the object, the image features of the target site of the object, and the identification information corresponding to the target site. Specifically, a material database may be set up in advance; the database contains image information of a large number of different objects, and the target sites of these objects are marked with features.
In this embodiment, the image features recorded in the material database can be updated. The update may be based on material newly added on the server side, or on marks that the user actively adds to certain positions of an object, or cancels from the marks the AR device provides, while using the AR device.
In this embodiment, different marks represent different meanings, and the meaning represented by each mark can be preset. For example, target sites can be divided, according to their degree of danger, into the grades "dangerous to operate", "potentially dangerous to operate", "fairly safe to operate" and "very safe to operate", with a different mark corresponding to each grade. Alternatively, target sites can be divided, according to the difficulty of operating them, into the grades "very difficult to operate", "fairly difficult to operate", "fairly easy to operate" and "very easy to operate", again with a different mark corresponding to each difficulty grade. In the same usage scenario, a single identification means may be used alone, or several identification means may be used in combination.
Reminding module 73, configured to prompt the target site of the object.
Reminding module 73 may specifically be configured to: output a cue corresponding to the target site of the object through the augmented reality (AR) device.
In this embodiment, the target site of the object may be prompted in the following ways: displaying a prompt mark at the target site of the object. For example, different meanings can be prompted by identifiers; specifically, different marks can be distinguished by different colors, e.g., red indicates that the target site is dangerous, while green indicates that the target site is safe and operable.

After detecting that the target site of the object has been touched, a prompt tone or vibration signal is emitted. Specifically, the distance sensor in the AR device can measure the distance between the user and the target site of the object to detect whether the user is touching the target site; after detecting that the user has touched it, the AR device issues a prompt to the user by voice or vibration. For example, if the target site the user touches is a danger zone, a voice signal warning of danger is played, or a sharp vibration is emitted.
The reminding module 73 involved in this embodiment is specifically configured to: prompt the target site using visual information, auditory information or touch information. Based on this, it may specifically include the following parts:

a first prompting unit, configured to display a prompt mark at the target site of the object;

a second prompting unit, configured to emit a prompt tone or vibration signal after detecting that the target site of the object has been touched.
Further, the device provided by this embodiment may also include: a storage module, configured to pre-store, before the image of the object is read, the image features of the object, the image features of the target site of the object, and the identification information corresponding to the target site.
It should be noted that, herein, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further restrictions, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes the element.
The numbering of the embodiments of the present invention above is for description only and does not represent the relative merits of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical scheme of the present invention, or in other words the part of it that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of its claims. Any equivalent structure or equivalent flow transformation made using the contents of the specification and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.
Claims (10)
1. A method for identifying an object, characterized in that the method comprises:
reading an image of an object;
performing feature recognition on the image of the object according to preset image features, and identifying a target site of the object;
prompting the target site of the object.
2. The method according to claim 1, characterized in that prompting the target site of the object comprises:
outputting a cue corresponding to the target site of the object through an augmented reality device.
3. The method according to claim 1, characterized in that prompting the target site of the object comprises:
prompting the target site using visual information, auditory information or touch information.
4. The method according to claim 1, characterized in that performing feature recognition on the image of the object according to preset image features and identifying the target site of the object comprises:
performing feature recognition on the image of the object according to the preset image features, and determining a category of the object;
determining, according to the category of the object, the target site of the object and a preset category identifier corresponding to the target site;
and prompting the target site of the object comprises: presenting the preset category identifier at the target site of the object.
5. The method according to any one of claims 1 to 4, characterized in that the method further comprises:
before reading the image of the object, pre-storing the image features of the object, the image features of the target site of the object, and the identification information corresponding to the target site.
6. A device for identifying an object, characterized in that the device comprises:
a read module, configured to read an image of an object;
an identification module, configured to perform feature recognition on the image of the object according to preset image features, and identify a target site of the object;
a reminding module, configured to prompt the target site of the object.
7. The device according to claim 6, characterized in that the reminding module is specifically configured to:
output a cue corresponding to the target site of the object through an augmented reality device.
8. The device according to claim 6, characterized in that the reminding module is specifically configured to:
prompt the target site using visual information, auditory information or touch information.
9. The device according to claim 6, characterized in that the identification module comprises:
a recognition unit, configured to perform feature recognition on the image of the object according to the preset image features, and determine a category of the object;
a determining unit, configured to determine, according to the category of the object, the target site of the object and a preset category identifier corresponding to the target site;
and the reminding module is specifically configured to: present the preset category identifier at the target site of the object.
10. The device according to any one of claims 6 to 9, characterized in that the device further comprises:
a storage module, configured to pre-store, before the image of the object is read, the image features of the object, the image features of the target site of the object, and the identification information corresponding to the target site.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611050424.4A CN106778514A (en) | 2016-11-24 | 2016-11-24 | A kind of method and device for identifying object |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106778514A true CN106778514A (en) | 2017-05-31 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107252495A (en) * | 2017-06-28 | 2017-10-17 | 太仓迪米克斯节能服务有限公司 | A kind of shared sterilization method and its system based on unmanned plane |
CN107281507A (en) * | 2017-06-28 | 2017-10-24 | 太仓迪米克斯节能服务有限公司 | A kind of intelligent classification sterilization method and its equipment based on sterilizing equipment decontaminating apparatus |
CN107320746A (en) * | 2017-08-23 | 2017-11-07 | 苏州浩哥文化传播有限公司 | Intelligent disinfection system and method based on disinfection equipment |
CN107832769A (en) * | 2017-11-09 | 2018-03-23 | 苏州铭冠软件科技有限公司 | Object is located at the visual identity method in environment |
CN109959969A (en) * | 2017-12-26 | 2019-07-02 | 同方威视技术股份有限公司 | Assist safety inspection method, device and system |
CN110458113A (en) * | 2019-08-14 | 2019-11-15 | 旭辉卓越健康信息科技有限公司 | A kind of non-small face identification method cooperated under scene of face |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102663448A (en) * | 2012-03-07 | 2012-09-12 | 北京理工大学 | Network based augmented reality object identification analysis method |
CN103577788A (en) * | 2012-07-19 | 2014-02-12 | 华为终端有限公司 | Augmented reality realizing method and augmented reality realizing device |
CN103679204A (en) * | 2013-12-23 | 2014-03-26 | 上海安琪艾可网络科技有限公司 | Image identification and creation application system and method based on intelligent mobile device platform |
CN103903013A (en) * | 2014-04-15 | 2014-07-02 | 复旦大学 | Optimization algorithm of unmarked flat object recognition |
CN103942558A (en) * | 2013-01-22 | 2014-07-23 | 日电(中国)有限公司 | Method and apparatus for obtaining object detectors |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | |

Application publication date: 20170531