CN105120050B - Detection method and terminal - Google Patents
Detection method and terminal
- Publication number
- CN105120050B (application CN201510355057.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- image acquisition
- terminal
- acquisition units
- range information
- Prior art date
- Legal status
- Active
Abstract
An embodiment of the invention discloses a terminal provided with at least two image acquisition units. The terminal further includes: a detection unit, configured to determine distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit of the at least two image acquisition units; a processing unit, configured to parse the distance information to obtain a parameter correspondence; a determination unit, configured to determine a detection chart according to the parameter correspondence and present a first image corresponding to the detection chart; an acquiring unit, configured to obtain input data based on the first image; and a first judging unit, configured to judge whether the input data matches the first image and record the judging result. An embodiment of the invention also discloses a detection method.
Description
Technical field
The present invention relates to terminal application technology, and more particularly to a detection method and a terminal.
Background art
With scientific and technological progress, electronic devices have become indispensable in people's lives; terminal devices, especially smart terminals, can provide users with more and more applications. However, existing terminal applications do not include any mode of detecting the eyesight of a target object. A detection method is therefore needed to enrich the user experience.
Summary of the invention
To solve the existing technical problem, embodiments of the present invention provide a detection method and a terminal.
The technical solutions of the embodiments of the present invention are achieved as follows:
An embodiment of the present invention provides a terminal provided with at least two image acquisition units. The terminal further includes:
a detection unit, configured to determine distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit of the at least two image acquisition units;
a processing unit, configured to parse the distance information to obtain a parameter correspondence;
a determination unit, configured to determine a detection chart according to the parameter correspondence and present a first image corresponding to the detection chart;
an acquiring unit, configured to obtain input data based on the first image; and
a first judging unit, configured to judge whether the input data matches the first image and record the judging result.
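For readers more comfortable with code than claim language, the five units above can be pictured as one small class. This is only a sketch: the method and attribute names are invented for illustration, and the bodies are placeholder logic rather than the patented implementation.

```python
class VisionTestTerminal:
    """Toy decomposition of the claimed units into methods (a sketch, not the
    patent's implementation): detection, processing, determination, acquiring,
    and judging."""

    def __init__(self, camera_a, camera_b):
        # The two image acquisition units, modeled here as callables that each
        # return a distance estimate to the target object (an assumption).
        self.camera_a, self.camera_b = camera_a, camera_b
        self.results = []  # recorded judging results

    def detect_distance(self):
        """Detection unit: combine the two units' estimates into one distance."""
        return (self.camera_a() + self.camera_b()) / 2

    def parse(self, distance_m):
        """Processing unit: parse the distance into a parameter correspondence."""
        return {"distance_m": distance_m}

    def choose_chart(self, params):
        """Determination unit: pick a chart for this distance; the 'first image'
        is a hypothetical optotype with an expected answer."""
        return {"chart": f"chart-{params['distance_m']:g}m", "expected": "E-up"}

    def judge(self, chart, input_data):
        """First judging unit: match the acquired input against the first image
        and record the judging result."""
        matched = input_data == chart["expected"]
        self.results.append(matched)
        return matched
```

A caller would wire real camera callbacks and UI input in place of the lambdas; the simple averaging in `detect_distance` is only a stand-in for whatever ranging the detection unit actually performs.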
An embodiment of the present invention also provides a detection method applied to a terminal, the terminal being provided with at least two image acquisition units. The method includes:
determining distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit of the at least two image acquisition units;
parsing the distance information to obtain a parameter correspondence;
determining a detection chart according to the parameter correspondence, and presenting a first image corresponding to the detection chart;
obtaining input data based on the first image; and
judging whether the input data matches the first image, and recording the judging result.
With the detection method and terminal described in the embodiments of the present invention, a first image acquisition unit and a second image acquisition unit of at least two image acquisition units arranged in the terminal determine distance information corresponding to a target object; the distance information is parsed to obtain a parameter correspondence; a detection chart is determined according to the parameter correspondence and a first image corresponding to the detection chart is presented; input data based on the first image is obtained; and whether the input data matches the first image is then judged and the judging result recorded. In this way, eyesight detection is carried out using the terminal, which enriches and improves the user experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a first schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 4 is a visual acuity chart corresponding to the distance information in an embodiment of the present invention;
Fig. 5 is a schematic diagram of the third image first presented by the terminal in an embodiment of the present invention;
Fig. 6 is a second schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a terminal provided with a first camera and a second camera according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the dual-camera ranging principle in an embodiment of the present invention;
Fig. 9 is a third schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 10 is a fourth schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 11 is a first schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 12 is a second schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 13 is a third schematic structural diagram of a terminal according to an embodiment of the present invention.
The realization of the objects, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
The mobile terminal of each embodiment of the present invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are given only to aid the description of the invention and have no specific meaning in themselves. Accordingly, "module" and "component" may be used interchangeably.
Mobile terminals may be implemented in a variety of forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, laptop computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, constructions according to embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 illustrates the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that implementing all of the illustrated components is not required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are discussed in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to a terminal. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals, and the like; moreover, a broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. Broadcast signals may exist in various forms, for example in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and so on. The broadcast receiving module 111 can receive signals broadcast by various types of broadcast systems. In particular, it can receive digital broadcasts from digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the MediaFLO™ forward link media data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for the various broadcast systems providing broadcast signals as well as the above digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (for example, an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messaging.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal and can be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), the Infrared Data Association (IrDA) standard, ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal, a typical example being a GPS (global positioning system) module. According to current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information to accurately compute three-dimensional current location information in terms of longitude, latitude, and altitude. Currently, methods for calculating position and time use three satellites and correct the error of the calculated position and time by using one additional satellite. In addition, the GPS module 115 can calculate speed information by continuously computing the current location in real time.
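The triangulation step described above can be illustrated in two dimensions: given ranges to three known anchor points, subtracting one circle equation from the other two leaves a small linear system. The anchor coordinates and ranges below are toy numbers, not real satellite ephemeris, and real GPS additionally solves for the receiver clock bias using the extra satellite mentioned above.

```python
def trilaterate_2d(anchors, ranges):
    """Solve for (x, y) from three anchor positions and measured ranges by
    linearizing the circle equations against the first anchor."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Subtracting the first circle equation from the other two yields a
    # 2x2 linear system A [x, y]^T = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With noisy real-world ranges the three circles do not meet in a point, so practical receivers solve an over-determined least-squares version of this system instead.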
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture apparatus in a video acquisition mode or an image capture mode. The processed image frames may be displayed on the display unit 151. Image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 can receive sound (audio data) via a microphone in operating modes such as a telephone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the telephone call mode, the processed audio (voice) data may be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a metal dome, a touchpad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by contact), a scroll wheel, a joystick, and so on. In particular, when the touchpad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 provides power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which is described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external devices may include wired or wireless headset ports, an external power supply (or battery charger) port, wired or wireless data ports, memory card ports, ports for connecting devices with an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and so on. The identification module may store various information for verifying a user of the mobile terminal 100 and may include a user identification module (UIM), a subscriber identification module (SIM), a universal subscriber identification module (USIM), and the like. In addition, a device with an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device can be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 can be used to receive input (for example, data, information, power, etc.) from an external device, transfer the received input to one or more elements within the mobile terminal 100, and transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (for example, audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (for example, text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display captured and/or received images, video or images together with a UI or GUI of related functions, and so on.
Meanwhile when display unit 151 and touch tablet in the form of layer it is superposed on one another to form touch screen when, display unit
151 may be used as input unit and output device.Display unit 151 may include liquid crystal display (LCD), thin film transistor (TFT)
In LCD (TFT-LCD), Organic Light Emitting Diode (OLED) display, flexible display, three-dimensional (3D) display etc. at least
It is a kind of.Some in these displays may be constructed such that transparence to allow user to watch from outside, this is properly termed as transparent
Display, typical transparent display can be, for example, TOLED (transparent organic light emitting diode) display etc..According to specific
Desired embodiment, mobile terminal 100 may include two or more display units (or other display devices), for example, moving
Dynamic terminal may include outernal display unit (not shown) and inner display unit (not shown).Touch screen can be used for detecting touch
Input pressure and touch input position and touch input area.
The audio output module 152 can convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode, or a similar mode. Moreover, the audio output module 152 can provide audio output related to a specific function executed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different ways to notify of the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide tactile output (that is, vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations executed by the controller 180, or temporarily store data that has been output or will be output (for example, a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data on the vibrations and audio signals of various modes output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disc, and so on. Moreover, the mobile terminal 100 can cooperate with a network storage device that executes the storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 executes control and processing related to voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 can execute pattern recognition processing to recognize handwriting input or picture-drawing input executed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein can be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to execute the functions described herein; in some cases, such an embodiment can be implemented in the controller 180. For a software implementation, an embodiment such as a process or function can be implemented with a separate software module that allows at least one function or operation to be executed. The software code can be implemented by a software application (or program) written in any appropriate programming language, and can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal will be taken as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals. Nevertheless, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that send data via frames or packets, as well as with satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention can operate is now described with reference to Fig. 2. Such communication systems can use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description concerns a CDMA communication system, but such teaching applies equally to other types of systems.
With reference to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with the public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be appreciated that the system may include a plurality of BSCs 275, as shown in Fig. 2.
Each BS 270 can serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a specific frequency spectrum (for example, 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" can be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a specific BS 270 may each be termed a cell site.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends a broadcast signal to the mobile terminals 100 operating in the system. The broadcast receiving module 111 shown in Fig. 1 is arranged at the mobile terminal 100 to receive the broadcast signal sent by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300, which help locate at least one of the plurality of mobile terminals 100. Although a plurality of satellites 300 is depicted in Fig. 2, it should be understood that useful location information can be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired location information. Instead of GPS tracking technology, or in addition to it, other technologies that can track the position of a mobile terminal can be used. In addition, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmission.
In a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a particular base station 270 is processed within that BS 270, and the resulting data is forwarded to the relevant BSC 275. The BSC provides call resource allocation and a mobility management function including coordination of soft handover processes between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward link signals to the mobile terminals 100.
Based on the above mobile terminal hardware structure and communication system, the various embodiments of the method of the present invention are proposed.
Embodiment one
Fig. 3 is a first schematic flowchart of the detection method of an embodiment of the present invention; the terminal is provided with at least two image acquisition units. As shown in Fig. 3, the method includes:
Step 301: determining distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit of the at least two image acquisition units;
Step 302: parsing the distance information to obtain a parameter correspondence;
Step 303: determining a detection chart according to the parameter correspondence, and presenting a first image corresponding to the detection chart;
Step 304: obtaining input data based on the first image;
Step 305: judging whether the input data matches the first image, and recording the judging result.
In this embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In this embodiment, after the terminal opens a first application based on a first operation, it triggers the first image acquisition unit and the second image acquisition unit of the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art will understand that the first operation may specifically be a click operation, such as a single click or a double click, or, when the terminal supports a touch display, the first operation may specifically be a touch operation applied to an icon corresponding to the first application.
In one embodiment, the first image acquisition unit is arranged at a corresponding first position of the terminal and the second image acquisition unit is arranged at a second position of the terminal, the first position and the second position both lying in the same plane of the terminal. For example, the first image acquisition unit and the second image acquisition unit are both arranged in the plane corresponding to the display unit of the terminal; specifically, the first image acquisition unit and the second image acquisition unit are front cameras, which makes it convenient for the first image acquisition unit and the second image acquisition unit to carry out image acquisition of the target object simultaneously.
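Two coplanar front cameras acquiring the target simultaneously is the classic setup for disparity-based ranging, which appears to be the principle Fig. 8 illustrates: depth = focal length × baseline / disparity. The focal length, baseline, and pixel coordinates below are made-up numbers for illustration, not values from the patent.

```python
def stereo_distance(focal_px, baseline_m, x_left_px, x_right_px):
    """Pinhole-stereo ranging: Z = f * B / d, where the focal length f and the
    target's horizontal image coordinates are in pixels and the baseline B
    (the spacing between the two cameras) is in metres."""
    disparity_px = x_left_px - x_right_px
    if disparity_px <= 0:
        raise ValueError("target must have positive disparity in this model")
    return focal_px * baseline_m / disparity_px
```

For example, an 800 px focal length, a 6 cm baseline, and a 16 px disparity place the target 3 m from the cameras; in practice the cameras would first be calibrated and the target (e.g. the user's face) matched between the two images.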
In another embodiment, the terminal is provided with three image acquisition units. The first image acquisition unit and the second image acquisition unit of the three are both arranged in the plane of the display unit of the terminal, i.e. they are front image acquisition units; the third image acquisition unit of the three is arranged in the plane opposite the display unit, i.e. it is a rear image acquisition unit. All of the image acquisition units can be realized by cameras.
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal performs distance measurement multiple times using the first image acquisition unit and the second image acquisition unit to determine multiple distances, such as a first distance, a second distance, up to an N-th distance, where N is a positive integer greater than or equal to 2; the terminal then determines an average value of the multiple distances and takes the average value as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
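The averaging scheme described above can be sketched as follows; `measure_distance` is a hypothetical stand-in for one dual-camera ranging pass, not an API named in this disclosure:

```python
def average_distance(measure_distance, n):
    """Average n >= 2 single measurements into one distance value.

    measure_distance: callable returning one ranging result
    (hypothetical stand-in for a dual-camera measurement pass).
    """
    if n < 2:
        raise ValueError("n must be a positive integer >= 2")
    samples = [measure_distance() for _ in range(n)]
    return sum(samples) / n

# Example: three noisy readings around 3 m average out.
readings = iter([2.98, 3.02, 3.00])
print(average_distance(lambda: next(readings), 3))  # 3.0
```

Averaging several passes suppresses per-frame measurement noise, which is what lets the later threshold comparison and chart selection work from a stable distance value.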
In this embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, through conversion, obtains a parameter correspondence corresponding to the distance information; the terminal then determines, according to the parameter correspondence, a detection table matching the distance value, in which the size and resolution of each image correspond to the distance information. For example, the terminal determines, according to the distance information between itself and the target object, an eyesight chart corresponding to the distance information, and thereby provides eyesight detection for the target object; as shown in Fig. 4, Fig. 4 is a standard eyesight chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents an image in the detection table, such as the first image, according to a preset rule; the preset rule may follow the detection rules of existing eyesight tests. When an image in the detection table is presented for the first time, such as a third image, the third image may or may not be identical to the first image; the third image may be any image in the detection table, that is, the third image may be determined by the terminal at random. Alternatively, the third image is any image in a particular row determined according to a first sub-rule in the preset rule; the first sub-rule specifies a particular row in the detection table but does not limit the column. That is, the third image is a random image in the particular row determined according to the first sub-rule; as shown in Fig. 5, Fig. 5 is the third image presented for the first time. Those skilled in the art will appreciate that "first" in the first image and "third" in the third image are used only to distinguish images, not to limit any order.
Those skilled in the art will appreciate that, in practical applications, the presented first image may also be determined by the terminal according to a second operation received from the user. For example, the terminal determines, according to an eyesight degree input into the terminal, the first row in the detection table corresponding to that degree, and then determines an image of the first row at random.
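The row-then-random-column selection described above might be realized as in the following sketch; the chart contents and the table layout here are illustrative assumptions, not fixed by this disclosure:

```python
import random

# Hypothetical detection table: each row holds the opening direction of
# every optotype in that row, rows ordered from largest to smallest.
DETECTION_TABLE = [
    ["up", "left"],                   # row 0: largest optotypes
    ["down", "right", "up"],          # row 1
    ["left", "up", "right", "down"],  # row 2: smallest optotypes
]

def pick_image(row_index, rng=random):
    """First sub-rule: the row is specified, the column is not limited."""
    return rng.choice(DETECTION_TABLE[row_index])

print(pick_image(1) in {"down", "right", "up"})  # True
```

The same helper covers both cases in the text: a row chosen by the first sub-rule, or a row chosen from a user-input eyesight degree.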
In addition, when the detection table is an eyesight chart, the images presented by the terminal should be black-and-white images; this facilitates eyesight detection and thereby improves detection accuracy.
In this embodiment, the terminal displays the first image in the detection table on its own display unit; the target object observes the first image and feeds back the information corresponding to the observed first image to the terminal by voice input, key input, touch input, or the like, so that the terminal can obtain the input data and then match the obtained input data against the image feature information corresponding to the first image, achieving the detection purpose.
In this embodiment, the input data may be audio data, text data corresponding to a pressed key, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example by speaking "up", "down", "left", or "right", so that the terminal obtains input data characterizing audio. Alternatively, four keys on the terminal correspond to the four directions (up, down, left, right), so that a direction is determined by a key press and the terminal obtains input data characterizing a direction. Alternatively, four different regions on the terminal correspond to the four directions respectively, so that input data characterizing a direction is determined by a touch operation applied to a specific region. Those skilled in the art will appreciate that if the terminal obtains the input data by voice, the terminal needs to enable the voice function so that the user can input by voice. Here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or the voice function is enabled according to an operation applied to the user interface; or the voice input function is enabled after step 303 is executed.
In this embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; the match may specifically be identity, i.e. judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, both the input characteristic parameter and the image characteristic parameter may characterize a direction parameter, such as up, down, left, or right.
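The identity match between direction parameters described above can be sketched in a few lines; the string encoding of directions is an assumption for illustration:

```python
def directions_match(input_param: str, image_param: str) -> bool:
    """Match means identity of the two direction parameters."""
    directions = {"up", "down", "left", "right"}
    if input_param not in directions or image_param not in directions:
        raise ValueError("direction parameter must be one of up/down/left/right")
    return input_param == image_param

print(directions_match("left", "left"))   # True
print(directions_match("left", "right"))  # False
```

Voice, key, and touch input would all be normalized into the same direction parameter before this comparison, which is why one matching routine serves all three input modes.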
In the detection method described in the embodiment of the present invention, the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units disposed in the terminal are used to determine the distance information corresponding to the target object; the distance information is parsed to obtain a parameter correspondence; a detection table is determined according to the parameter correspondence and the first image corresponding to the detection table is presented; the input data based on the first image is obtained; and then it is judged whether the input data matches the first image and the judgment result is recorded. In this way, the purpose of performing eyesight detection with a terminal is achieved, which enriches and improves the user experience. In addition, the detection method described in the embodiment of the present invention detects the distance between the terminal and the target object using two image acquisition units; therefore, compared with the distance detection of existing terminals, the method described in the embodiment of the present invention is more accurate and can further improve the detection accuracy.
Embodiment two
Based on the detection method described in Embodiment One, to further clarify how the distance information is determined, the embodiment of the present invention refines step 301. Specifically, as shown in Fig. 6, step 301 includes:
Step 301A: acquiring images of the target object using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, to obtain a second image and a third image;
Step 301B: obtaining positional relationship information between the first image acquisition unit and the second image acquisition unit;
Step 301C: determining the distance information corresponding to the target object according to the positional relationship information, the second image, and the third image.
In one embodiment, the terminal is a smartphone; Fig. 7 is a schematic structural diagram of a terminal provided with a first camera and a second camera. As shown in Fig. 7, the first image acquisition unit disposed in the smartphone is implemented by a first camera 71, and the second image acquisition unit is implemented by a second camera 72; both the first camera 71 and the second camera 72 are front cameras.
Further, the embodiment of the present invention implements the ranging process using a dual-camera ranging method; Fig. 8 is a schematic diagram of the dual-camera ranging principle. As shown in Fig. 7 and Fig. 8, specifically, images of the target object P are acquired using the first camera 71 and the second camera 72 to obtain a first image P_l and a second image P_r. Here, O_l shown in Fig. 8 is the optical center of the first camera 71, and O_r is the optical center of the second camera 72; a coordinate system is established by taking the line connecting O_l and O_r as the X-axis, as shown in Fig. 8, and taking the direction in the display plane perpendicular to the X-axis as the Y-axis, as shown in Fig. 8. In this way, the coordinate x_l of the first image P_l relative to the X-axis and the coordinate x_r of the second image P_r relative to the X-axis are obtained; formula (2) is derived from the following formula (1), and the distance information Z between the terminal and the target object P is calculated according to formula (2). Formula (1) and formula (2) are as follows:

(T - (x_l - x_r)) / T = (Z - f) / Z    (1)

Z = f · T / (x_l - x_r)    (2)

where T is the center distance between the first camera 71 and the second camera 72, and f is the focal length.
In this way, the distance information between the terminal and the target object determined by the above method is more accurate, which in turn lays a good foundation for adequately determining the parameter correspondence and determining the detection table.
Embodiment three
Fig. 9 is the third schematic flowchart of the detection method of the embodiment of the present invention; the terminal is provided with at least two image acquisition units. As shown in Fig. 9, the method comprises:
Step 901: determining the distance information corresponding to the target object using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units;
To further determine whether the distance information meets the requirement, the embodiment of the present invention also needs to judge whether the distance information is within a threshold range. Specifically, after the terminal enters the first application and enables the distance measurement mode, and the target object stands at a certain distance from the terminal, the terminal detects the distance between the target object and the terminal and compares the detected distance information with the threshold range. If the detected distance information does not fall within the threshold range, the terminal may prompt the target object, by displaying prompt information, playing prompt information through a loudspeaker, vibrating, or the like, to shorten or lengthen its distance to the terminal, until the distance information detected by the terminal falls within the preset range.
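The prompt loop described above could look like the following sketch; the threshold bounds and prompt strings are illustrative assumptions, not values from this disclosure:

```python
def check_distance(distance_mm, low=2500.0, high=3500.0):
    """Return a prompt string, or None when the distance is in range.

    low/high are illustrative threshold bounds (e.g. around a 3 m
    test distance); in practice they would come from the chart spec.
    """
    if distance_mm < low:
        return "Please move farther from the terminal"
    if distance_mm > high:
        return "Please move closer to the terminal"
    return None  # within threshold range: proceed to parse the distance

print(check_distance(3000.0))  # None
print(check_distance(2000.0))  # Please move farther from the terminal
```

Returning `None` for the in-range case lets the caller loop on re-measurement (step 901) until no prompt is needed, matching the flow of steps 902-904.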
Step 902: judging whether the distance information is within the threshold range;
Step 903: when the distance information is within the threshold range, parsing the distance information to obtain a parameter correspondence, then executing step 905;
Step 904: when the distance information is not within the threshold range, generating prompt information to prompt the target object to adjust its position, and continuing to execute step 901 until the distance information corresponding to the adjusted target object is within the threshold range;
Step 905: determining a detection table according to the parameter correspondence, and presenting a first image corresponding to the detection table;
Step 906: obtaining input data based on the first image;
Step 907: judging whether the input data matches the first image, and recording the judgment result.
In this embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or tablet computer, or a fixed terminal such as a computer.
In this embodiment, after the terminal opens the first application based on a first operation, it triggers the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object is used to characterize the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art will appreciate that the first operation may specifically be a click operation, such as a single-click or double-click operation; or, when the terminal supports touch display, the first operation may specifically be a touch operation applied to the icon corresponding to the first application.
In one embodiment, the first image acquisition unit is disposed at a first position of the terminal and the second image acquisition unit is disposed at a second position of the terminal; both the first position and the second position are located in the same plane of the terminal. For example, the first image acquisition unit and the second image acquisition unit are both disposed in the plane corresponding to the display unit of the terminal; specifically, the first image acquisition unit and the second image acquisition unit are front cameras, which makes it convenient for the first image acquisition unit and the second image acquisition unit to simultaneously acquire images of the target object.
In another embodiment, the terminal is provided with three image acquisition units. The first image acquisition unit and the second image acquisition unit among the three are both disposed in the plane of the display unit of the terminal, i.e. they are front-facing image acquisition units; the third image acquisition unit among the three is disposed in the plane opposite to the display unit, i.e. it is a rear-facing image acquisition unit. All the image acquisition units may be implemented by cameras.
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal performs distance measurement multiple times using the first image acquisition unit and the second image acquisition unit to determine multiple distances, such as a first distance, a second distance, up to an N-th distance, where N is a positive integer greater than or equal to 2; the terminal then determines an average value of the multiple distances and takes the average value as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
In this embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, through conversion, obtains a parameter correspondence corresponding to the distance information; the terminal then determines, according to the parameter correspondence, a detection table matching the distance value, in which the size and resolution of each image correspond to the distance information. For example, the terminal determines, according to the distance information between itself and the target object, an eyesight chart corresponding to the distance information, and thereby provides eyesight detection for the target object; as shown in Fig. 4, Fig. 4 is a standard eyesight chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents an image in the detection table, such as the first image, according to a preset rule; the preset rule may follow the detection rules of existing eyesight tests. When an image in the detection table is presented for the first time, such as a third image, the third image may or may not be identical to the first image; the third image may be any image in the detection table, that is, the third image may be determined by the terminal at random. Alternatively, the third image is any image in a particular row determined according to a first sub-rule in the preset rule; the first sub-rule specifies a particular row in the detection table but does not limit the column. That is, the third image is a random image in the particular row determined according to the first sub-rule; as shown in Fig. 5, Fig. 5 is the third image presented for the first time. Those skilled in the art will appreciate that "first" in the first image and "third" in the third image are used only to distinguish images, not to limit any order.
Those skilled in the art will appreciate that, in practical applications, the presented first image may also be determined by the terminal according to a second operation received from the user. For example, the terminal determines, according to an eyesight degree input into the terminal, the first row in the detection table corresponding to that degree, and then determines an image of the first row at random.
In addition, when the detection table is an eyesight chart, the images presented by the terminal should be black-and-white images; this facilitates eyesight detection and thereby improves detection accuracy.
In this embodiment, the terminal displays the first image in the detection table on its own display unit; the target object observes the first image and feeds back the information corresponding to the observed first image to the terminal by voice input, key input, touch input, or the like, so that the terminal can obtain the input data and then match the obtained input data against the image feature information corresponding to the first image, achieving the detection purpose.
In this embodiment, the input data may be audio data, text data corresponding to a pressed key, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example by speaking "up", "down", "left", or "right", so that the terminal obtains input data characterizing audio. Alternatively, four keys on the terminal correspond to the four directions (up, down, left, right), so that a direction is determined by a key press and the terminal obtains input data characterizing a direction. Alternatively, four different regions on the terminal correspond to the four directions respectively, so that input data characterizing a direction is determined by a touch operation applied to a specific region. Those skilled in the art will appreciate that if the terminal obtains the input data by voice, the terminal needs to enable the voice function so that the user can input by voice. Here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or the voice function is enabled according to an operation applied to the user interface; or the voice input function is enabled after step 903 is executed.
In this embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; the match may specifically be identity, i.e. judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, both the input characteristic parameter and the image characteristic parameter may characterize a direction parameter, such as up, down, left, or right.
In one embodiment, when the distance information between the target object and the terminal is within the threshold range and the first image acquisition unit and the second image acquisition unit are implemented by front cameras, the terminal displays the picture captured by the front cameras on the display unit. Here, to further improve the accuracy of eyesight detection with a mobile phone, a first display area is displayed on the display unit, and the first display area includes two sub-regions, so that the captured eyes of the target object are displayed in the two sub-regions, thereby ensuring that the target object and the terminal are substantially on the same horizontal plane. To further ensure that the eyes of the target object and the presented image of the detection table are on the same horizontal plane, the image corresponding to the detection table is also presented in the first display area, further improving the accuracy of eyesight detection with the terminal. Here, when the terminal presents the image corresponding to the detection table, the two sub-regions need not be repartitioned.
In the detection method described in the embodiment of the present invention, the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units disposed in the terminal are used to determine the distance information corresponding to the target object; the distance information is parsed to obtain a parameter correspondence; a detection table is determined according to the parameter correspondence and the first image corresponding to the detection table is presented; the input data based on the first image is obtained; and then it is judged whether the input data matches the first image and the judgment result is recorded. In this way, the purpose of performing eyesight detection with a terminal is achieved, which enriches and improves the user experience. In addition, the detection method described in the embodiment of the present invention detects the distance between the terminal and the target object using two image acquisition units; therefore, compared with the distance detection of existing terminals, the method described in the embodiment of the present invention is more accurate and can further improve the detection accuracy.
Embodiment four
Figure 10 is the fourth schematic flowchart of the detection method of the embodiment of the present invention; the terminal is provided with at least two image acquisition units. As shown in Figure 10, the method comprises:
Step 1001: determining the distance information corresponding to the target object using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units;
Step 1002: parsing the distance information to obtain a parameter correspondence;
Step 1003: determining a detection table according to the parameter correspondence, and presenting a first image corresponding to the detection table;
Step 1004: obtaining input data based on the first image;
Step 1005: judging whether the input data matches the first image, and recording the judgment result;
Step 1006: automatically adjusting the presented image according to the judgment result, so as to present a second image corresponding to the detection table, the second image having a first relationship with the first image; or, adjusting the presented image based on a user operation, so as to present a second image corresponding to the detection table.
In this embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or tablet computer, or a fixed terminal such as a computer.
In this embodiment, after the terminal opens the first application based on a first operation, it triggers the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object is used to characterize the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art will appreciate that the first operation may specifically be a click operation, such as a single-click or double-click operation; or, when the terminal supports touch display, the first operation may specifically be a touch operation applied to the icon corresponding to the first application.
In one embodiment, the first image acquisition unit is disposed at a first position of the terminal and the second image acquisition unit is disposed at a second position of the terminal; both the first position and the second position are located in the same plane of the terminal. For example, the first image acquisition unit and the second image acquisition unit are both disposed in the plane corresponding to the display unit of the terminal; specifically, the first image acquisition unit and the second image acquisition unit are front cameras, which makes it convenient for the first image acquisition unit and the second image acquisition unit to simultaneously acquire images of the target object.
In another embodiment, the terminal is provided with three image acquisition units. The first image acquisition unit and the second image acquisition unit among the three are both disposed in the plane of the display unit of the terminal, i.e. they are front-facing image acquisition units; the third image acquisition unit among the three is disposed in the plane opposite to the display unit, i.e. it is a rear-facing image acquisition unit. All the image acquisition units may be implemented by cameras.
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal performs distance measurement multiple times using the first image acquisition unit and the second image acquisition unit to determine multiple distances, such as a first distance, a second distance, up to an N-th distance, where N is a positive integer greater than or equal to 2; the terminal then determines an average value of the multiple distances and takes the average value as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
In this embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, through conversion, obtains a parameter correspondence corresponding to the distance information; the terminal then determines, according to the parameter correspondence, a detection table matching the distance value, in which the size and resolution of each image correspond to the distance information. For example, the terminal determines, according to the distance information between itself and the target object, an eyesight chart corresponding to the distance information, and thereby provides eyesight detection for the target object; as shown in Fig. 4, Fig. 4 is a standard eyesight chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents an image in the detection table, such as the first image, according to a preset rule; the preset rule may follow the detection rules of existing eyesight tests. When an image in the detection table is presented for the first time, such as a third image, the third image may or may not be identical to the first image; the third image may be any image in the detection table, that is, the third image may be determined by the terminal at random. Alternatively, the third image is any image in a particular row determined according to a first sub-rule in the preset rule; the first sub-rule specifies a particular row in the detection table but does not limit the column. That is, the third image is a random image in the particular row determined according to the first sub-rule; as shown in Fig. 5, Fig. 5 is the third image presented for the first time. Those skilled in the art will appreciate that "first" in the first image and "third" in the third image are used only to distinguish images, not to limit any order.
Those skilled in the art will appreciate that, in practical applications, the presented first image may also be determined by the terminal according to a second operation received from the user. For example, the terminal determines, according to an eyesight degree input into the terminal, the first row in the detection table corresponding to that degree, and then determines an image of the first row at random.
In addition, when the detection table is an eyesight chart, the images presented by the terminal should be black-and-white images; this facilitates eyesight detection and thereby improves detection accuracy.
In this embodiment, the terminal displays the first image in the detection table on its own display unit; the target object observes the first image and feeds back the information corresponding to the observed first image to the terminal by voice input, key input, touch input, or the like, so that the terminal can obtain the input data and then match the obtained input data against the image feature information corresponding to the first image, achieving the detection purpose.
In the present embodiment, the input data may be audio data, text data corresponding to a pressed key, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example by speaking "up", "down", "left", or "right", so that the terminal obtains input data characterizing audio. Alternatively, four keys on the terminal may correspond to the four directions (up, down, left, and right), so that a direction is selected by key press and the terminal obtains input data characterizing a direction; or four different regions on the terminal may each correspond to one of the four directions, so that input data characterizing a direction is determined from a touch operation applied to a particular region. It will be appreciated by those skilled in the art that, if the terminal is to obtain the input data by voice, the terminal needs to enable its voice function so that the user can speak the input. Here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or it may be enabled in response to an operation applied to the user interface; or it may be enabled after step 1003 is executed.
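As a minimal sketch of the key-press and touch-region input modes described above (the key codes and the screen partition are illustrative assumptions, not specified by the patent), the four directions could be derived from input events as follows:

```python
# Hypothetical mapping of key presses and touch coordinates to the four
# directions used as input data in the eyesight test. Key names and the
# screen layout are assumptions for illustration only.

KEY_TO_DIRECTION = {"w": "up", "s": "down", "a": "left", "d": "right"}

def direction_from_key(key: str):
    """Return the direction a key press characterizes, or None if unmapped."""
    return KEY_TO_DIRECTION.get(key)

def direction_from_touch(x: float, y: float, width: float, height: float) -> str:
    """Split the screen into four regions around the center and map a touch
    point to the direction of the nearest edge."""
    # Normalize coordinates to [-1, 1] with the origin at the screen center.
    nx, ny = 2 * x / width - 1, 2 * y / height - 1
    if abs(nx) >= abs(ny):
        return "right" if nx > 0 else "left"
    return "down" if ny > 0 else "up"  # screen y grows downward
```

Either function yields a direction string that can then be compared with the direction parameter of the presented image.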
In the present embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; the match may specifically mean being identical, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, both the input characteristic parameter and the image characteristic parameter may characterize a direction parameter, such as up, down, left, or right.
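Taking "match" as strict equality of direction parameters, as the embodiment suggests, the judgment and the recording of the result could be sketched as follows (function names are my own):

```python
def judge_and_record(input_param: str, image_param: str, results: list) -> bool:
    """Judge whether the input characteristic parameter is identical to the
    image characteristic parameter, and record the judging result."""
    matched = input_param.strip().lower() == image_param.strip().lower()
    results.append(matched)  # judging results accumulate for later output
    return matched
```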
In the present embodiment, after a second image of the detection table is presented, the terminal obtains input data based on the second image, judges whether that input data matches the second image, and records the judging result, until the judging results meet a preset requirement, at which point the judging results are output. Here, the preset requirement can be set arbitrarily according to the actual situation. For example, in one embodiment, the voice function is used to obtain voice information input by the target object, and it is then judged whether the voice information matches the first image; if they match, the judging result is recorded and, according to the judging result and the existing eyesight detection rules, the test proceeds to the next grade, i.e., any image in the row of the detection table below the first image is output; if they do not match, the judging result is likewise recorded, or the user is prompted to try again; when the judging results at two adjacent grades are both mismatches, the user is prompted to end the eyesight detection, and this detection process finishes.
In the present embodiment, when the terminal adjusts the presented image automatically, the second image is any image in the grade below the first image; for example, the second image is any image in the row of the detection table below the first image. Alternatively, the presented image may be adjusted arbitrarily according to user demand. For example, when performing eyesight detection with the method described in this embodiment of the present invention, the user may adjust the grade of the presented image by pressing the volume up/down keys, or adjust it by voice, for example by saying "add degrees" or "subtract degrees"; specifically, 25 degrees or 50 degrees may be set as one grade, and the grade of the presented image is then adjusted by voice or by the volume keys, thereby achieving the purpose of the eyesight test.
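Under the stated convention that one grade equals 25 (or 50) degrees, the manual adjustment could look like the following sketch (command names are assumptions):

```python
def adjust_grade(current_degree: int, command: str, step: int = 25) -> int:
    """Adjust the degree of the presented image by one grade: volume-up or
    'add' raises it, volume-down or 'subtract' lowers it. One grade equals
    `step` degrees (25 or 50 in the embodiment)."""
    if command in ("volume_up", "add"):
        return current_degree + step
    if command in ("volume_down", "subtract"):
        return current_degree - step
    return current_degree  # unrecognized commands leave the grade unchanged
```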
The detection method described in this embodiment of the present invention determines the range information corresponding to the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units provided in the terminal, parses the range information to obtain a parameter corresponding relationship, determines a detection table according to the parameter corresponding relationship, presents the first image corresponding to the detection table, obtains input data based on the first image, judges whether the input data matches the first image, and records the judging result. In this way, eyesight detection is performed with a terminal, which enriches and improves the user experience. In addition, the detection method described in this embodiment of the present invention detects the distance between the terminal and the target object with two image acquisition units; compared with the existing ways in which a terminal detects distance, the method described in this embodiment of the present invention is therefore more accurate and can further improve detection accuracy.
Embodiment five
Figure 11 is a first structural schematic diagram of the terminal according to an embodiment of the invention; the terminal is provided with at least two image acquisition units. As shown in Figure 11, the terminal further includes:
a detection unit 21, configured to determine the range information corresponding to a target object by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units;
a processing unit 22, configured to parse the range information to obtain a parameter corresponding relationship;
a determination unit 23, configured to determine a detection table according to the parameter corresponding relationship and present a first image corresponding to the detection table;
an acquiring unit 24, configured to obtain input data based on the first image; and
a first judging unit 25, configured to judge whether the input data matches the first image and record a judging result.
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of this embodiment of the present invention can be understood with reference to the foregoing description of the detection method. Specifically,
in the present embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In the present embodiment, after the terminal opens the first application based on a first operation, the detection unit 21 is triggered to determine the range information corresponding to the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units; the range information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art should understand that the first operation may specifically be a click operation, such as a single-click or double-click operation; or, when the terminal supports touch display, the first operation may specifically be a touch operation applied to the icon corresponding to the first application.
In one embodiment, the first image acquisition unit is arranged at a corresponding first position of the terminal and the second image acquisition unit at a second position of the terminal, the first position and the second position both lying in the same plane of the terminal. For example, the first image acquisition unit and the second image acquisition unit are both arranged in the plane corresponding to the display unit of the terminal; specifically, the first image acquisition unit and the second image acquisition unit are front cameras, which makes it convenient for the first image acquisition unit and the second image acquisition unit to simultaneously perform image acquisition on the target object.
In another embodiment, the terminal is provided with three image acquisition units; the first image acquisition unit and the second image acquisition unit among the three are both arranged in the plane of the display unit of the terminal, i.e., they are front image acquisition units, while the third image acquisition unit is arranged in the plane opposite the display unit, i.e., it is a rear image acquisition unit. All the image acquisition units can be realized by cameras.
In practical applications, to make the determined range information more accurate, the range information may also be average distance information. Specifically, the terminal triggers the detection unit 21 to perform range measurement multiple times with the first image acquisition unit and the second image acquisition unit, determining multiple distances, e.g., a first distance, a second distance, up to an Nth distance, N being a positive integer greater than or equal to 2; an average value is then determined from the multiple distances and taken as the final range information. Improving the accuracy of the distance value in this way lays the foundation for improving the accuracy of the detection result.
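The averaging step described above is straightforward; in the following sketch, `measure` stands in for one dual-camera range measurement and is an assumption of the sketch:

```python
def average_distance(measure, n: int) -> float:
    """Take N >= 2 range measurements and use their mean as the final
    range information, reducing the noise of a single measurement."""
    if n < 2:
        raise ValueError("N must be a positive integer greater than or equal to 2")
    samples = [measure() for _ in range(n)]  # first distance .. Nth distance
    return sum(samples) / n
```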
In the present embodiment, after the detection unit 21 determines the range information between the terminal and the target object, the processing unit 22 parses the range information; for example, the processing unit 22 compares the distance value corresponding to the range information with a standard test table and, by conversion, obtains the parameter corresponding relationship corresponding to the range information, and triggers the determination unit 23 to determine, according to the parameter corresponding relationship, the detection table matching the distance value corresponding to the range information. The size and resolution of each image in the detection table correspond to the range information; for example, the terminal determines, according to the range information between itself and the target object, the visual acuity chart corresponding to the range information and then provides eyesight detection for the target object, as shown in Figure 4, which is the standard visual acuity chart corresponding to the range information. Further, after the determination unit 23 determines the detection table corresponding to the current range information, an image in the detection table, such as the first image, is presented according to preset rules, which may correspond to the existing eyesight detection rules. However, when an image in the detection table, such as a third image, is presented for the first time, the third image may or may not be identical to the first image: the third image may be any image in the detection table, that is, the third image may be determined by the terminal at random; alternatively, the third image is any image in a particular row determined according to a first sub-rule among the preset rules. The first sub-rule specifies a particular row of the detection table but does not limit the column; that is, the third image is a random image in the particular row determined according to the first sub-rule, as shown in Figure 5, which is the third image presented for the first time. It will be appreciated by those skilled in the art that "first" in the first image and "third" in the third image are only used to distinguish images, not to impose an order.
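The embodiment leaves the distance-to-size conversion to "existing eyesight detection rules". As one hedged illustration — the 5-arcminute optotype convention is the standard chart rule and is not stated in the patent — the height of a row's optotype could be scaled from the measured distance as follows:

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians

def optotype_height_mm(distance_mm: float, decimal_acuity: float) -> float:
    """Height of an optotype that subtends 5 arcminutes for an eye of the
    given decimal acuity at the given distance; a chart generator could use
    this to size each row once the range information is known."""
    return distance_mm * math.tan(5 * ARCMIN / decimal_acuity)
```

At a measured distance of 5 m, the acuity-1.0 row would be roughly 7.3 mm tall, and lower-acuity rows proportionally larger.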
It will be appreciated by those skilled in the art that in practical applications, the first image of presentation is also possible to the end
End is determined according to the second operation for receiving user, for example, determining degree to eyesight degree according to terminal input
The corresponding the first row in the detection table, then, the terminal determine the corresponding image of the first row at random.
In addition, the image that the terminal is presented should be black white image when the detection table is characterization visual chart, in this way,
It is convenient for eyesight detection, to improve accuracy in detection.
In the present embodiment, the terminal displays the first image of the detection table on its own display unit; the target object observes the first image and feeds the information it sees in the first image back to the terminal by voice input, key-press input, touch input, or the like, so that the acquiring unit 24 can obtain the input data, which in turn triggers the first judging unit 25 to match the input data it has obtained against the image feature information corresponding to the first image, thereby achieving the purpose of the test.
In the present embodiment, the input data may be audio data, text data corresponding to a pressed key, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example by speaking "up", "down", "left", or "right", so that the terminal obtains input data characterizing audio. Alternatively, four keys on the terminal may correspond to the four directions (up, down, left, and right), so that a direction is selected by key press and the terminal obtains input data characterizing a direction; or four different regions on the terminal may each correspond to one of the four directions, so that input data characterizing a direction is determined from a touch operation applied to a particular region. It will be appreciated by those skilled in the art that, if the terminal is to obtain the input data by voice, the terminal needs to enable its voice function so that the user can speak the input. Here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or it may be enabled in response to an operation applied to the user interface; or the voice input function may be enabled after the determination unit 23 displays the first image.
In the present embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; the match may specifically mean being identical, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, both the input characteristic parameter and the image characteristic parameter may characterize a direction parameter, such as up, down, left, or right.
The terminal described in this embodiment of the present invention determines the range information corresponding to the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units provided in the terminal, parses the range information to obtain a parameter corresponding relationship, determines a detection table according to the parameter corresponding relationship, presents the first image corresponding to the detection table, obtains input data based on the first image, judges whether the input data matches the first image, and records the judging result. In this way, eyesight detection is performed with a terminal, which enriches and improves the user experience. In addition, the terminal described in this embodiment of the present invention detects the distance between itself and the target object with two image acquisition units; compared with the existing ways in which a terminal detects distance, the detection method adopted by the terminal described in this embodiment of the present invention is therefore more accurate and can further improve detection accuracy.
Embodiment six
Based on the terminal described in embodiment five, in this embodiment of the present invention, as shown in Figure 12, the detection unit 21 includes:
an image acquisition subunit 2101, configured to perform image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, obtaining a second image and a third image;
an obtaining subunit 2102, configured to obtain the positional relationship information between the first image acquisition unit and the second image acquisition unit; and
a processing subunit 2103, configured to determine the range information corresponding to the target object according to the positional relationship information, the second image, and the third image.
In one embodiment, the terminal is a smartphone; Figure 7 is a structural schematic diagram of a terminal provided with a first camera and a second camera. As shown in Figure 7, the first image acquisition unit arranged in the smartphone is realized by a first camera 71 and the second image acquisition unit by a second camera 72; the first camera 71 and the second camera 72 are both front cameras.
Further, this embodiment of the present invention realizes the ranging process by a dual-camera ranging method; Figure 8 is a schematic diagram of the dual-camera ranging principle. As shown in Figures 7 and 8, specifically, image acquisition is performed on the target object P with the first camera 71 and the second camera 72, obtaining a first image point P_l and a second image point P_r. Here, O_l shown in Figure 8 is the optical center of the first camera 71 and O_r is the optical center of the second camera 72; the line connecting O_l and O_r is taken as the X-axis, as shown in Figure 8, and the direction in the display plane perpendicular to the X-axis is taken as the Y-axis, as shown in Figure 8, thereby establishing a coordinate system. In this way, the coordinate x_l of the first image point P_l relative to the X-axis and the coordinate x_r of the second image point P_r relative to the X-axis are obtained; formula (2) is derived from the following formula (1), and the distance information Z between the terminal and the target object P is calculated according to formula (2). Formula (1) and formula (2) are as follows:

(T − (x_l − x_r)) / T = (Z − f) / Z (1)

Z = f · T / (x_l − x_r) (2)

where T is the center distance between the first camera 71 and the second camera 72, and f is the focal length.
In this way, the distance information between the terminal and the target object determined by the above method is more accurate, which in turn allows the parameter corresponding relationship to be adequately determined and lays a good foundation for determining the detection table.
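Once the disparity x_l − x_r is known, formula (2) is a one-line computation; the following sketch uses my own function and parameter names, and only requires the units to be mutually consistent (e.g., pixels for the image coordinates and focal length):

```python
def stereo_depth(x_l: float, x_r: float, baseline_t: float, focal_f: float) -> float:
    """Formula (2): Z = f * T / (x_l - x_r), where x_l and x_r are the
    target's image coordinates along the X-axis in the first and second
    camera, T the center distance between the cameras, and f the focal
    length."""
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_f * baseline_t / disparity
```

For example, with a 60 mm baseline, a 500-pixel focal length, and a 10-pixel disparity, the target is 3000 mm away.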
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of this embodiment of the present invention can be understood with reference to the foregoing description of the detection method.
Embodiment seven
Figure 13 is a third structural schematic diagram of the terminal according to an embodiment of the invention; the terminal is provided with at least two image acquisition units. As shown in Figure 13, the terminal further includes:
a detection unit 21, configured to determine the range information corresponding to a target object by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units;
a processing unit 22, configured to parse the range information to obtain a parameter corresponding relationship;
a determination unit 23, configured to determine a detection table according to the parameter corresponding relationship and present a first image corresponding to the detection table;
an acquiring unit 24, configured to obtain input data based on the first image; and
a first judging unit 25, configured to judge whether the input data matches the first image and record a judging result.
In the present embodiment, the terminal further includes:
a second judgment unit 26, configured to judge whether the range information is within a threshold range;
correspondingly, the processing unit 22 is further configured to parse the range information and obtain the parameter corresponding relationship when the range information is within the threshold range.
In the present embodiment, the processing unit 22 is further configured to generate prompt information when the range information is not within the threshold range, so as to prompt the target object to adjust its own position so that the adjusted range information corresponding to the target object falls within the threshold range.
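The threshold check and prompt generation could be sketched as follows; the numeric bounds and prompt wording are illustrative assumptions, since the patent does not fix a particular threshold range:

```python
def check_distance(distance_mm: float,
                   low_mm: float = 2500.0,
                   high_mm: float = 5500.0):
    """If the range information is within the threshold range, proceed to
    parse it; otherwise generate prompt information asking the target
    object to adjust its position."""
    if low_mm <= distance_mm <= high_mm:
        return ("parse", None)  # go on to obtain the parameter relationship
    hint = "move farther away" if distance_mm < low_mm else "move closer"
    return ("prompt", hint)
```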
In the present embodiment, the terminal further includes:
an adjustment unit 27, configured to automatically adjust the presented image according to the judging result and present a second image corresponding to the detection table, the second image having a first relationship with the first image; or to adjust the presented image based on a user operation and present the second image corresponding to the detection table.
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of this embodiment of the present invention can be understood with reference to the foregoing description of the detection method. In addition, the functions of the processing units in the present embodiment can refer to the description of embodiment five, which is not repeated here.
In one embodiment, when the range information between the target object and the terminal is within the threshold range and the first image acquisition unit and the second image acquisition unit are realized by front cameras, the terminal shows the picture collected by the front cameras on the display unit. At this point, to further improve the accuracy of eyesight detection with a mobile phone, a first display area is shown on the display unit, the first display area including two sub-regions, so that the collected eyes of the target object are shown in the two sub-regions, which ensures that the target object and the terminal are substantially on the same horizontal plane. To further ensure that the eyes of the target object and the presented image corresponding to the detection table are on the same horizontal plane, the image corresponding to the detection table is also presented in the first display area, which further improves the accuracy of eyesight detection with a terminal. Here, the two sub-regions do not need to be repartitioned when the terminal presents the image corresponding to the detection table.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by software plus the necessary general hardware platform, and naturally also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or in other words the part contributing over the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and including instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the methods described in the embodiments of the present invention.
The above is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention; all equivalent structures or equivalent flow transformations made by using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, are likewise included within the scope of protection of the present invention.
Claims (10)
1. A terminal, characterized in that the terminal is provided with at least two image acquisition units; the terminal further comprises:
a detection unit, configured to determine range information corresponding to a target object by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, wherein the first image acquisition unit and the second image acquisition unit are front cameras;
a processing unit, configured to parse the range information to obtain a parameter corresponding relationship;
a determination unit, configured to determine a detection table according to the parameter corresponding relationship and present a first image corresponding to the detection table;
an acquiring unit, configured to obtain input data based on the first image; and
a first judging unit, configured to judge whether the input data matches the first image and record a judging result;
wherein the terminal further comprises a display unit, configured to respectively display the eyes of the target object in two sub-regions of a first display area; the display unit is further configured to present the first image corresponding to the detection table in the first display area.
2. The terminal according to claim 1, characterized in that the detection unit comprises:
an image acquisition subunit, configured to perform image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, obtaining a second image and a third image;
an obtaining subunit, configured to obtain positional relationship information between the first image acquisition unit and the second image acquisition unit; and
a processing subunit, configured to determine the range information corresponding to the target object according to the positional relationship information, the second image, and the third image.
3. The terminal according to claim 1, characterized in that the terminal further comprises:
a second judgment unit, configured to judge whether the range information is within a threshold range;
correspondingly, the processing unit is further configured to parse the range information and obtain the parameter corresponding relationship when the range information is within the threshold range.
4. The terminal according to claim 3, characterized in that the processing unit is further configured to generate prompt information when the range information is not within the threshold range, so as to prompt the target object to adjust its own position so that the adjusted range information corresponding to the target object falls within the threshold range.
5. The terminal according to claim 1, characterized in that the terminal further comprises:
an adjustment unit, configured to automatically adjust the presented image according to the judging result and present a second image corresponding to the detection table, the second image having a first relationship with the first image; or
to adjust the presented image based on a user operation and present the second image corresponding to the detection table.
6. A detection method applied to a terminal, characterized in that the terminal is provided with at least two image acquisition units; the method comprises:
determining range information corresponding to a target object by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, wherein the first image acquisition unit and the second image acquisition unit are front cameras;
parsing the range information to obtain a parameter corresponding relationship;
determining a detection table according to the parameter corresponding relationship, and presenting a first image corresponding to the detection table;
obtaining input data based on the first image; and
judging whether the input data matches the first image, and recording a judging result;
wherein, after the determining of the range information corresponding to the target object and before the determining of the detection table according to the parameter corresponding relationship and the presenting of the first image corresponding to the detection table, the method further comprises:
determining that the eyes of the target object are respectively displayed in two sub-regions of a first display area;
correspondingly, the presenting of the first image corresponding to the detection table comprises:
presenting the first image corresponding to the detection table in the first display area.
7. The method according to claim 6, characterized in that the determining of the range information corresponding to the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units comprises:
performing image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, obtaining a second image and a third image;
obtaining positional relationship information between the first image acquisition unit and the second image acquisition unit; and
determining the range information corresponding to the target object according to the positional relationship information, the second image, and the third image.
8. The method according to claim 6, characterized in that the method further comprises:
judging whether the range information is within a threshold range;
correspondingly, the parsing of the range information to obtain the parameter corresponding relationship comprises:
parsing the range information and obtaining the parameter corresponding relationship when the range information is within the threshold range.
9. The method according to claim 8, characterized in that the method further comprises:
generating prompt information when the range information is not within the threshold range, so as to prompt the target object to adjust its own position so that the adjusted range information corresponding to the target object falls within the threshold range.
10. The method according to claim 6, characterized in that the method further comprises:
automatically adjusting the presented image according to the judging result, and presenting a second image corresponding to the detection table, the second image having a first relationship with the first image; or
adjusting the presented image based on a user operation, and presenting the second image corresponding to the detection table.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510355057.8A CN105120050B (en) | 2015-06-24 | 2015-06-24 | A kind of detection method and its terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105120050A CN105120050A (en) | 2015-12-02 |
CN105120050B true CN105120050B (en) | 2019-04-30 |
Family
ID=54667931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510355057.8A Active CN105120050B (en) | 2015-06-24 | 2015-06-24 | A kind of detection method and its terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105120050B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105867860B (en) * | 2016-03-28 | 2019-04-26 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1753078A (en) * | 2004-09-20 | 2006-03-29 | Lg电子株式会社 | Adjustable display of mobile communications terminal |
CN102980556A (en) * | 2012-11-29 | 2013-03-20 | 北京小米科技有限责任公司 | Distance measuring method and device |
CN102984344A (en) * | 2012-10-16 | 2013-03-20 | 广东欧珀移动通信有限公司 | Method for testing vision with mobile phone and mobile phone |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20060054625A (en) * | 2004-11-15 | 2006-05-23 | 엘지전자 주식회사 | Method for measuring eyesight using portable terminal |
Also Published As
Publication number | Publication date |
---|---|
CN105120050A (en) | 2015-12-02 |
Similar Documents
Publication | Title |
---|---|
CN104750420B (en) | Screenshot method and device |
CN104980588B (en) | A kind of method and apparatus for detecting mobile terminal state |
CN104850799B (en) | The method and mobile terminal of a kind of data in hiding mobile terminal |
CN105357367B (en) | Recognition by pressing keys device and method based on pressure sensor |
CN106130734A (en) | The control method of mobile terminal and control device |
CN104850343B (en) | Start the method and apparatus of one-hand operating format |
CN105430258B (en) | A kind of method and apparatus of self-timer group photo |
CN106131274B (en) | Mobile terminal control device and method |
CN105099701B (en) | A kind of method of terminal and terminal authentication |
CN105573916B (en) | Fault detection method and mobile terminal |
CN109240579A (en) | A kind of touch operation method, equipment and computer storage medium |
CN106303044B (en) | A kind of mobile terminal and obtain the method to coke number |
CN105260096B (en) | A kind of method and mobile terminal controlling user's operation |
CN105262953B (en) | A kind of mobile terminal and its method of control shooting |
CN105049610B (en) | A kind of input method and terminal |
CN106227454B (en) | A kind of touch trajectory detection system and method |
CN106161790A (en) | A kind of mobile terminal and control method thereof |
CN109542317A (en) | A kind of display control method, equipment and the storage medium of double-sided screen mobile terminal |
CN105785608B (en) | A kind of screen water ripples test system and method |
CN104915021B (en) | A kind of one-handed performance false-touch prevention input method, device and mobile terminal |
CN106650347A (en) | Synchronous unblocking method, device and terminals |
CN105120050B (en) | A kind of detection method and its terminal |
CN105721757A (en) | Device and method for adjusting photographing parameters |
CN106713645B (en) | A kind of method and mobile terminal of the broadcasting of control loudspeaker |
CN106506748B (en) | A kind of information processing method and mobile terminal |
Legal Events
Code | Title |
---|---|
C06 | Publication |
PB01 | Publication |
C10 | Entry into substantive examination |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |