CN105120050A - Detection method and terminal employing same - Google Patents


Publication number
CN105120050A
Authority
CN
China
Prior art keywords
image
acquisition units
terminal
image acquisition
range information
Prior art date
Legal status
Granted
Application number
CN201510355057.8A
Other languages
Chinese (zh)
Other versions
CN105120050B (en)
Inventor
赵欣
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201510355057.8A priority Critical patent/CN105120050B/en
Publication of CN105120050A publication Critical patent/CN105120050A/en
Application granted granted Critical
Publication of CN105120050B publication Critical patent/CN105120050B/en
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

An embodiment of the invention discloses a terminal provided with at least two image acquisition units. The terminal further comprises: a detection unit configured to determine the distance information corresponding to a target object using a first and a second of the image acquisition units; a processing unit configured to parse the distance information and obtain a parameter correspondence; a determining unit configured to determine a detection chart according to the parameter correspondence and present a first image corresponding to the detection chart; an acquiring unit configured to obtain input data based on the first image; and a first judging unit configured to judge whether the input data matches the first image and to record the judgment result. An embodiment of the invention also discloses a corresponding detection method.

Description

Detection method and terminal employing the same
Technical field
The present invention relates to terminal application technology, and in particular to a detection method and a terminal using the same.
Background technology
With the progress of science and technology, electronic devices have become indispensable articles in daily life; terminal devices, especially intelligent terminals, provide users with an ever-growing range of applications. However, existing terminal applications include no application mode for testing the eyesight of a target object. A detection method is therefore urgently needed to enrich the user experience.
Summary of the invention
To solve this existing technical problem, embodiments of the present invention provide a detection method and a terminal using the same.
The technical scheme of the embodiments of the present invention is realized as follows:
An embodiment of the present invention provides a terminal provided with at least two image acquisition units; the terminal further comprises:
a detection unit, configured to determine the distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units;
a processing unit, configured to parse the distance information and obtain a parameter correspondence;
a determining unit, configured to determine a detection chart according to the parameter correspondence, and to present a first image corresponding to the detection chart;
an acquiring unit, configured to obtain input data based on the first image;
a first judging unit, configured to judge whether the input data matches the first image, and to record the judgment result.
An embodiment of the present invention also provides a detection method, applied to a terminal provided with at least two image acquisition units; the method comprises:
determining the distance information corresponding to a target object using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units;
parsing the distance information to obtain a parameter correspondence;
determining a detection chart according to the parameter correspondence, and presenting a first image corresponding to the detection chart;
obtaining input data based on the first image;
judging whether the input data matches the first image, and recording the judgment result.
With the detection method and terminal of the embodiments of the present invention, a first image acquisition unit and a second image acquisition unit among at least two image acquisition units arranged on the terminal are used to determine the distance information corresponding to a target object; the distance information is parsed to obtain a parameter correspondence; a detection chart is determined according to that correspondence and a first image corresponding to the chart is presented; input data based on the first image is obtained; and whether the input data matches the first image is then judged and the result recorded. The terminal can thus be used to perform an eyesight test, enriching and improving the user experience.
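The five claimed steps can be sketched as a short control flow. All names below are hypothetical, the distance measurement is passed in as a callable, and the linear 5 m chart scaling is an assumed conversion (the patent leaves the parameter correspondence abstract):

```python
from dataclasses import dataclass

@dataclass
class ChartParams:
    distance_m: float     # measured terminal-to-user distance
    optotype_mm: float    # symbol size on the presented chart

def parse_distance(distance_m: float) -> ChartParams:
    # Assumed correspondence: a standard 5 m chart whose symbol size
    # is scaled linearly to the measured distance.
    BASE_DISTANCE_M, BASE_SIZE_MM = 5.0, 7.27
    return ChartParams(distance_m, BASE_SIZE_MM * distance_m / BASE_DISTANCE_M)

def run_detection(measure_distance, present_image, get_input) -> bool:
    distance_m = measure_distance()        # step 1: dual-camera ranging
    params = parse_distance(distance_m)    # step 2: parameter correspondence
    expected = present_image(params)       # step 3: present a first image
    answer = get_input()                   # step 4: obtain input data
    return answer == expected              # step 5: judge and record the result
```

A caller would supply the ranging, display and input routines of the actual terminal; `run_detection` only fixes the ordering of the claimed steps.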
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a first schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 4 is the visual acuity chart corresponding to the distance information in an embodiment of the present invention;
Fig. 5 is a schematic diagram of the third image first presented by the terminal in an embodiment of the present invention;
Fig. 6 is a second schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a terminal provided with a first camera and a second camera according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the dual-camera ranging principle of an embodiment of the present invention;
Fig. 9 is a third schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 10 is a fourth schematic flowchart of the detection method of an embodiment of the present invention;
Fig. 11 is a first schematic structural diagram of the terminal of an embodiment of the present invention;
Fig. 12 is a second schematic structural diagram of the terminal of an embodiment of the present invention;
Fig. 13 is a third schematic structural diagram of the terminal of an embodiment of the present invention.
The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Embodiment
It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.
A mobile terminal implementing embodiments of the present invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "component" or "unit" used to denote elements are used only to aid the description of the invention and have no specific meaning by themselves; "module" and "component" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal; however, those skilled in the art will appreciate that, apart from elements used specifically for mobile purposes, the construction according to embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to terminals. The broadcast signals may include TV broadcast signals, radio broadcast signals, data broadcast signals and the like, and may further include broadcast signals combined with TV or radio broadcast signals. Broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. Broadcast signals may exist in various forms, for example as an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H). The broadcast receiving module 111 may receive signal broadcasts using various types of broadcast systems; in particular, it may receive digital broadcasts using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO forward link media data broadcasting system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit the above digital broadcasting systems as well as various other broadcasting systems that provide broadcast signals. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal and may be internally or externally coupled to the terminal. The wireless internet access technologies involved may include WLAN (Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access) and the like.
The short-range communication module 114 is a module for supporting short-range communication. Examples of short-range communication technologies include Bluetooth™, radio-frequency identification (RFID), the Infrared Data Association (IrDA) standard, ultra-wideband (UWB) and ZigBee™.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal; a typical example is GPS (global positioning system). Using current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information to compute three-dimensional current location information from longitude, latitude and altitude with high accuracy. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information using one additional satellite. Furthermore, the GPS module 115 can calculate speed information by continuously computing the current location in real time.
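The triangulation performed by the GPS module can be illustrated with a simplified sketch: given four known anchor positions and measured distances, the sphere equations are linearized by subtracting the first one and the resulting 3x3 system is solved. Real receivers additionally solve for the receiver clock bias, which is omitted here; `trilaterate` and its anchors are illustrative, not part of the patent:

```python
def trilaterate(sats, dists):
    # Linearize |x - p_i|^2 = d_i^2 against the first anchor:
    # 2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 - d_i^2 + d_0^2
    p0, d0 = sats[0], dists[0]
    A, b = [], []
    for p, d in zip(sats[1:4], dists[1:4]):
        A.append([2 * (p[k] - p0[k]) for k in range(3)])
        b.append(sum(p[k] ** 2 - p0[k] ** 2 for k in range(3)) - d ** 2 + d0 ** 2)
    # Solve the 3x3 system A x = b by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for k in range(col, 3):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (b[r] - sum(A[r][k] * x[k] for k in range(r + 1, 3))) / A[r][r]
    return x
```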
The A/V input unit 120 is used to receive audio or video signals, and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. Image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the structure of the mobile terminal. The microphone 122 may receive sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and process it into audio data. In phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. It allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance and the like caused by touch), a jog wheel, a joystick and the like. In particular, when a touch pad is superimposed on the display unit 151 as a layer, a touch screen is formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration and direction of movement of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410, which will be described below in connection with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device having the identification module (hereinafter referred to as the "identification device") may take the form of a smart card, so that the identification device can be connected to the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data, information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is constructed to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio and/or tactile manner, and may include a display unit 151, an audio output module 152, an alarm unit 153 and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (such as text messaging or multimedia file downloading). When the mobile terminal 100 is in a video call mode or image capture mode, the display unit 151 may display captured and/or received images, or a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film-transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display and a three-dimensional (3D) display. Some of these displays may be constructed to be transparent so that the user can view through them from the outside; these may be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display means); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a mode such as a call signal receiving mode, a call mode, a recording mode, a voice recognition mode or a broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer and the like.
The alarm unit 153 may provide output to notify of the occurrence of an event of the mobile terminal 100. Typical events include call reception, message reception, key signal input, touch input and so on. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify of the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration: when a call, a message or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's phone is in the user's pocket. The alarm unit 153 may also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, or may temporarily store data that has been or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on the various patterns of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 may cooperate over a network connection with a network storage device that performs the storage function of the memory 160.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls and the like. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data; the multimedia module 1810 may be constructed within the controller 180 or may be constructed separately from it. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power and, under the control of the controller 180, provides the appropriate power required to operate each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal will be taken as an example among the various types of mobile terminals such as folding, bar, swing and slide types; the present invention can therefore be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems that transmit data via frames or packets, as well as with satellite-based communication systems.
A communication system in which a mobile terminal according to the present invention can operate will now be described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA) and the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the description below relates to a CDMA communication system, but such teaching applies equally to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is constructed to form an interface with the public switched telephone network (PSTN) 290, and also to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be appreciated that the system shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or an antenna pointed in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support multiple frequency assignments, each frequency assignment having a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site"; alternatively, the individual sectors of a particular BS 270 may be referred to as cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 transmits broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive broadcast signals transmitted by the BT 295. In Fig. 2, several global positioning system (GPS) satellites 300 are shown; the satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it will be understood that any number of satellites may be used to obtain useful positioning information. The GPS module 115 shown in Fig. 1 is typically constructed to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.
In a typical operation of the wireless communication system, the BSs 270 receive reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging and other types of communication. Each reverse link signal received by a particular BS 270 is processed by that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the above mobile terminal hardware structure and communication system, the embodiments of the method of the present invention are proposed.
Embodiment one
Fig. 3 is a first schematic flowchart of the detection method of an embodiment of the present invention. The terminal is provided with at least two image acquisition units; as shown in Fig. 3, the method comprises:
Step 301: determine the distance information corresponding to the target object using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units;
Step 302: parse the distance information to obtain a parameter correspondence;
Step 303: determine a detection chart according to the parameter correspondence, and present a first image corresponding to the detection chart;
Step 304: obtain input data based on the first image;
Step 305: judge whether the input data matches the first image, and record the judgment result.
In this embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In this embodiment, the terminal opens a first application based on a first operation before triggering the first and second image acquisition units among the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art should understand that the first operation may specifically be a click operation, such as a single-click or double-click operation, or, when the terminal supports a touch display, the first operation may be a touch operation applied to the icon corresponding to the first application.
In one specific embodiment, the first image acquisition unit is arranged at a first position on the terminal and the second image acquisition unit at a second position, the first position and the second position both lying in the same plane of the terminal; for example, the first and second image acquisition units are both arranged in the plane corresponding to the display unit of the terminal, i.e. they are both front-facing cameras. This makes it convenient for the first and second image acquisition units to capture images of the target object simultaneously.
In another specific embodiment, the terminal is provided with three image acquisition units. The first and second image acquisition units among the three are both arranged in the plane of the display unit of the terminal, i.e. they are front-facing image acquisition units; the third image acquisition unit is arranged in the plane opposite the display unit, i.e. it is a rear-facing image acquisition unit. All the image acquisition units may be realized by cameras.
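For calibrated, parallel front-facing cameras, the dual-camera ranging of step 301 (the principle illustrated by Fig. 8) reduces to standard stereo triangulation, Z = f·B/d. The following sketch uses assumed parameter values; the patent does not fix how the disparity is obtained:

```python
def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    # focal_px: camera focal length in pixels
    # baseline_m: spacing between the two front cameras
    # disparity_px: horizontal shift of the target between the two images
    if disparity_px <= 0:
        raise ValueError("target must produce a positive disparity")
    return focal_px * baseline_m / disparity_px
```

For instance, with a 1000 px focal length and a 6 cm baseline, a 30 px disparity corresponds to a target 2 m away.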
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal uses the first image acquisition unit and the second image acquisition unit to perform distance measurement multiple times, determining multiple distances, for example a first distance, a second distance, up to an Nth distance, where N is a positive integer greater than or equal to 2; it then determines the mean of these distances and uses the mean as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
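The averaging step above can be sketched as follows; the function name and the sample values are illustrative assumptions, not part of the patent:

```python
from statistics import mean

def average_distance(measurements):
    """Average N >= 2 individual distance measurements into one final value.

    `measurements` is a list of distances (e.g. in metres) returned by
    repeated dual-camera ranging passes.
    """
    if len(measurements) < 2:
        raise ValueError("the embodiment assumes N >= 2 measurements")
    return mean(measurements)

# Example: three ranging passes with slight measurement noise.
distances = [2.98, 3.02, 3.00]
print(average_distance(distances))
```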
In the present embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, by conversion, obtains the parameter correspondence corresponding to the distance information; according to this parameter correspondence, the terminal then determines the detection table matched to the distance value, the size and resolution of each image in the detection table corresponding to the distance information. For example, the terminal determines, according to the distance information of the target object, the visual chart corresponding to that distance information, and thereby provides an eyesight test for the target object; as shown in Figure 4, Figure 4 is the standard visual chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents one image of the detection table, the first image, according to a preset rule; the preset rule may follow the test rules of existing eyesight tests. However, when an image of the detection table is presented for the first time, for example a third image (which may or may not be identical to the first image), the third image may be any image in the detection table, that is, it may be determined at random by the terminal; or the third image is any image of a particular row determined according to a first sub-rule of the preset rule, the first sub-rule specifying a particular row of the detection table without limiting which row that is. In other words, the third image is a random image from the particular row determined according to the first sub-rule; as shown in Figure 5, Figure 5 is the third image presented first. Those skilled in the art will appreciate that the "first" in the first image and the "third" in the third image are only used to distinguish images, not to limit order.
Those skilled in the art should understand that, in practical applications, the presented first image may also be determined by the terminal according to a second operation received from the user; for example, according to an eyesight degree entered into the terminal, the terminal determines the row of the detection table corresponding to that degree, and then randomly selects an image from that row.
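As an illustration of selecting the first image from a user-specified row, here is a minimal sketch; the chart contents, the degree values, and the function name are all assumptions made for the example, not values given by the patent:

```python
import random

# Hypothetical detection table: each row holds the opening directions of
# the optotypes for one acuity level, keyed by an assumed eyesight degree.
CHART = {
    4.0: ["up", "left", "down", "right"],
    4.5: ["left", "right", "up", "down"],
    5.0: ["down", "up", "right", "left"],
}

def first_image_for_degree(degree, chart=CHART):
    """Pick a random image (optotype direction) from the row of the
    detection table matching the eyesight degree entered by the user."""
    row = chart[degree]
    return random.choice(row)

print(first_image_for_degree(4.5) in CHART[4.5])  # True
```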
In addition, when the detection table characterizes a visual chart, the images presented by the terminal should be black-and-white images; this facilitates eyesight detection and improves detection accuracy.
In the present embodiment, the terminal presents the first image of the detection table on its own display unit; the target object observes the first image and feeds the information corresponding to the first image it sees back to the terminal by voice input, key input, touch input or the like, so that the terminal can obtain input data; the terminal then matches the obtained input data against the image feature information corresponding to the first image, thereby achieving the detection purpose.
In the present embodiment, the input data may be audio data, text data corresponding to pressed keys, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example speaking "up", "down", "left" or "right", so that the terminal obtains input data characterizing the audio; or four keys of the terminal may be mapped to the four directions (up, down, left and right), so that a direction is determined by a key press and the terminal obtains input data characterizing a direction; or four different regions of the terminal may each be mapped to one of the four directions, so that a touch operation applied to a specific region determines input data characterizing a direction. Those skilled in the art will appreciate that, if the terminal obtains input data by voice, the terminal needs to enable its voice function so that the user can input by voice; here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function, or according to a user-interface operation for enabling the voice function, or after step 303 has been performed.
In the present embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; here, matching may specifically mean being identical, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, the input characteristic parameter may characterize a direction, such as up, down, left or right, and the image characteristic parameter may likewise characterize a direction, such as up, down, left or right.
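The matching judgment described above — comparing the direction characterized by the input data with the direction of the presented image — can be sketched as follows; the function name and the normalization are illustrative assumptions:

```python
def matches(input_param: str, image_param: str) -> bool:
    """Judge whether the direction characterized by the input data is
    identical to the direction of the presented optotype image."""
    return input_param.strip().lower() == image_param.strip().lower()

# Record the judgment result for each (user answer, presented image) pair.
results = []
for user_input, shown in [("up", "up"), ("left", "right")]:
    results.append(matches(user_input, shown))
print(results)  # [True, False]
```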
With the detection method described in the embodiment of the present invention, the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units arranged in the terminal are used to determine the distance information corresponding to the target object; the distance information is parsed to obtain a parameter correspondence; according to the parameter correspondence, a detection table is determined and a first image corresponding to the detection table is presented; input data based on the first image is obtained; it is then judged whether the input data matches the first image, and the judgment result is recorded. In this way, the purpose of performing eyesight detection with a terminal is achieved, enriching and improving the user experience.
In addition, the detection method described in the embodiment of the present invention uses two image acquisition units to detect the distance between the terminal and the target object; compared with existing ways for a terminal to detect distance, the method described in the embodiment of the present invention is therefore more accurate and can further improve detection accuracy.
Embodiment two
Based on the detection method described in Embodiment One, and in order to further clarify how the distance information is determined, the embodiment of the present invention refines step 301; specifically, as shown in Figure 6, step 301 comprises:
Step 301A: using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to perform image acquisition on the target object, obtaining a second image and a third image;
Step 301B: obtaining position relationship information between the first image acquisition unit and the second image acquisition unit;
Step 301C: determining the distance information corresponding to the target object according to the position relationship information, the second image, and the third image.
In a specific embodiment, the terminal is a smartphone; Figure 7 is a structural schematic diagram of a terminal provided with a first camera and a second camera. As shown in Figure 7, the first image acquisition unit arranged in the smartphone is implemented by a first camera 71 and the second image acquisition unit by a second camera 72; the first camera 71 and the second camera 72 are both front-facing cameras.
Further, the embodiment of the present invention adopts a dual-camera ranging method to realize the ranging process; Figure 8 is a schematic diagram of the dual-camera ranging principle. As shown in Figures 7 and 8, specifically, the first camera 71 and the second camera 72 capture images of the target object P, obtaining a first image P_l and a second image P_r. Here, O_l shown in Figure 8 is the optical center of the first camera 71 and O_r is the optical center of the second camera 72. A coordinate system is established by taking the line connecting O_l and O_r as the X-axis, as shown in Figure 8, and taking the direction in the display plane perpendicular to the X-axis as the Y-axis, as shown in Figure 8. In this way, the coordinate x_l of the first image P_l relative to the X-axis and the coordinate x_r of the second image P_r relative to the X-axis are obtained; formula (2) is derived from formula (1), and the distance information Z between the terminal and the target object P is calculated according to formula (2). Formula (1) and formula (2) are as follows:
\frac{T - (x_l - x_r)}{Z - f} = \frac{T}{Z} \quad (1)
Z = \frac{f \times T}{x_l - x_r} \quad (2)
where T is the centre-to-centre distance between the first camera 71 and the second camera 72, and f is the focal length.
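Formula (2) can be checked with a small numeric sketch; the camera values below are illustrative assumptions only (f and the image x-coordinates in pixels, T in millimetres, which makes Z come out in millimetres):

```python
def depth_from_disparity(x_l, x_r, f, T):
    """Distance Z between terminal and target object, from formula (2):
    Z = f * T / (x_l - x_r).  T is the camera centre-to-centre distance,
    f the focal length, and (x_l - x_r) the disparity between the two
    image coordinates of the same point."""
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("x_l must exceed x_r for a point in front of the cameras")
    return f * T / disparity

# Illustrative values: 10 px of disparity, f = 500 px, T = 60 mm.
Z = depth_from_disparity(x_l=650.0, x_r=640.0, f=500.0, T=60.0)
print(Z)  # 3000.0 (mm)
```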
In this way, the distance information between the terminal and the target object determined by the above method is more accurate, which makes it possible to accurately determine the parameter correspondence and lays a good foundation for determining the detection table.
Embodiment three
Figure 9 is a third schematic flowchart of the implementation of the detection method of the embodiment of the present invention; the terminal is provided with at least two image acquisition units. As shown in Figure 9, the method comprises:
Step 901: using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine distance information corresponding to a target object;
To further determine whether the distance information meets requirements, the embodiment of the present invention also needs to judge whether the distance information is within a threshold range. Specifically, after the terminal enters the first application and enables the distance measurement mode, and the target object remains stationary at a certain distance from the terminal, the terminal detects the distance between the target object and the terminal and compares the detected distance information with the threshold range. If the detected distance information does not fall within the threshold range, the terminal may prompt the target object, by displaying prompt information, playing prompt information through a loudspeaker, vibrating, or the like, to shorten or lengthen the distance to the terminal, until the distance information detected by the terminal falls within the preset range.
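The threshold comparison and prompting logic just described can be sketched as follows; the concrete threshold values and the prompt wording are assumptions, since the embodiment does not fix them:

```python
def check_distance(distance_mm, low_mm=2500.0, high_mm=3500.0):
    """Return (ok, prompt): ok is True when the detected distance falls
    within the assumed threshold range [low_mm, high_mm]; otherwise a
    prompt asks the target object to lengthen or shorten the distance."""
    if distance_mm < low_mm:
        return False, "please move farther from the terminal"
    if distance_mm > high_mm:
        return False, "please move closer to the terminal"
    return True, ""

ok, prompt = check_distance(2000.0)
print(ok, prompt)  # False please move farther from the terminal
```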
Step 902: judging whether the distance information is within a threshold range;
Step 903: when the distance information is within the threshold range, parsing the distance information to obtain a parameter correspondence, and then performing step 905;
Step 904: when the distance information is not within the threshold range, generating prompt information to prompt the target object to adjust its position, and continuing to perform step 901 until the distance information corresponding to the adjusted target object is within the threshold range;
Step 905: determining a detection table according to the parameter correspondence, and presenting a first image corresponding to the detection table;
Step 906: obtaining input data based on the first image;
Step 907: judging whether the input data matches the first image, and recording the judgment result.
In the present embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In the present embodiment, after the terminal opens the first application based on a first operation, it triggers the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art should understand that the first operation may specifically be a click operation, such as a single-click or double-click operation, or, when the terminal supports a touch display, a touch operation applied to the icon corresponding to the first application.
In a specific embodiment, the first image acquisition unit is arranged at a first position of the terminal and the second image acquisition unit at a second position of the terminal, the first position and the second position both lying in the same plane of the terminal; for example, the first image acquisition unit and the second image acquisition unit are both arranged in the plane of the display unit of the terminal. Specifically, the first image acquisition unit and the second image acquisition unit are front-facing cameras, which makes it convenient for them to capture images of the target object simultaneously.
In another specific embodiment, the terminal is provided with three image acquisition units; the first image acquisition unit and the second image acquisition unit among the three are arranged in the plane of the display unit of the terminal, that is, they are front-facing image acquisition units, while the third image acquisition unit is arranged in the plane opposite the display unit, that is, it is a rear-facing image acquisition unit. All of the image acquisition units may be implemented by cameras.
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal uses the first image acquisition unit and the second image acquisition unit to perform distance measurement multiple times, determining multiple distances, for example a first distance, a second distance, up to an Nth distance, where N is a positive integer greater than or equal to 2; it then determines the mean of these distances and uses the mean as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
In the present embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, by conversion, obtains the parameter correspondence corresponding to the distance information; according to this parameter correspondence, the terminal then determines the detection table matched to the distance value, the size and resolution of each image in the detection table corresponding to the distance information. For example, the terminal determines, according to the distance information of the target object, the visual chart corresponding to that distance information, and thereby provides an eyesight test for the target object; as shown in Figure 4, Figure 4 is the standard visual chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents one image of the detection table, the first image, according to a preset rule; the preset rule may follow the test rules of existing eyesight tests. However, when an image of the detection table is presented for the first time, for example a third image (which may or may not be identical to the first image), the third image may be any image in the detection table, that is, it may be determined at random by the terminal; or the third image is any image of a particular row determined according to a first sub-rule of the preset rule, the first sub-rule specifying a particular row of the detection table without limiting which row that is. In other words, the third image is a random image from the particular row determined according to the first sub-rule; as shown in Figure 5, Figure 5 is the third image presented first. Those skilled in the art will appreciate that the "first" in the first image and the "third" in the third image are only used to distinguish images, not to limit order.
Those skilled in the art should understand that, in practical applications, the presented first image may also be determined by the terminal according to a second operation received from the user; for example, according to an eyesight degree entered into the terminal, the terminal determines the row of the detection table corresponding to that degree, and then randomly selects an image from that row.
In addition, when the detection table characterizes a visual chart, the images presented by the terminal should be black-and-white images; this facilitates eyesight detection and improves detection accuracy.
In the present embodiment, the terminal presents the first image of the detection table on its own display unit; the target object observes the first image and feeds the information corresponding to the first image it sees back to the terminal by voice input, key input, touch input or the like, so that the terminal can obtain input data; the terminal then matches the obtained input data against the image feature information corresponding to the first image, thereby achieving the detection purpose.
In the present embodiment, the input data may be audio data, text data corresponding to pressed keys, or touch data corresponding to a touch operation. Specifically, the target object may input audio data by voice, for example speaking "up", "down", "left" or "right", so that the terminal obtains input data characterizing the audio; or four keys of the terminal may be mapped to the four directions (up, down, left and right), so that a direction is determined by a key press and the terminal obtains input data characterizing a direction; or four different regions of the terminal may each be mapped to one of the four directions, so that a touch operation applied to a specific region determines input data characterizing a direction. Those skilled in the art will appreciate that, if the terminal obtains input data by voice, the terminal needs to enable its voice function so that the user can input by voice; here, the voice function may be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function, or according to a user-interface operation for enabling the voice function, or after step 903 has been performed.
In the present embodiment, judging whether the input data matches the first image may specifically be judging whether the input characteristic parameter corresponding to the input data matches the image characteristic parameter corresponding to the first image; here, matching may specifically mean being identical, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image characteristic parameter corresponding to the first image. In practical applications, the input characteristic parameter may characterize a direction, such as up, down, left or right, and the image characteristic parameter may likewise characterize a direction, such as up, down, left or right.
In a specific embodiment, when the distance information between the target object and the terminal is within the threshold range, and the first image acquisition unit and the second image acquisition unit are both implemented by front-facing cameras, the picture captured by the front-facing cameras is presented on the display unit of the terminal. To further improve the accuracy of eyesight detection with a mobile phone, a first display region is presented on the display unit, the first display region containing two sub-regions, so that the two eyes of the captured target object are shown one in each sub-region, thereby ensuring that the target object and the terminal are roughly on the same horizontal plane. To further ensure that the eyes of the target object and the presented image of the detection table are at the same horizontal level, the image corresponding to the detection table is also presented in the first display region, which further improves the accuracy of eyesight detection with the terminal. Here, when the terminal presents the image corresponding to the detection table, the two sub-regions no longer need to be distinguished.
With the detection method described in the embodiment of the present invention, the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units arranged in the terminal are used to determine the distance information corresponding to the target object; the distance information is parsed to obtain a parameter correspondence; according to the parameter correspondence, a detection table is determined and a first image corresponding to the detection table is presented; input data based on the first image is obtained; it is then judged whether the input data matches the first image, and the judgment result is recorded. In this way, the purpose of performing eyesight detection with a terminal is achieved, enriching and improving the user experience.
In addition, the detection method described in the embodiment of the present invention uses two image acquisition units to detect the distance between the terminal and the target object; compared with existing ways for a terminal to detect distance, the method described in the embodiment of the present invention is therefore more accurate and can further improve detection accuracy.
Embodiment four
Figure 10 is a fourth schematic flowchart of the implementation of the detection method of the embodiment of the present invention; the terminal is provided with at least two image acquisition units. As shown in Figure 10, the method comprises:
Step 1001: using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine distance information corresponding to a target object;
Step 1002: parsing the distance information to obtain a parameter correspondence;
Step 1003: determining a detection table according to the parameter correspondence, and presenting a first image corresponding to the detection table;
Step 1004: obtaining input data based on the first image;
Step 1005: judging whether the input data matches the first image, and recording the judgment result;
Step 1006: automatically adjusting the presented image according to the judgment result, so as to present a second image corresponding to the detection table, the second image having a first relationship with the first image; or adjusting the presented image based on a user operation, so as to present a second image corresponding to the detection table.
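Step 1006's automatic adjustment can be sketched as a row progression through the detection table; the patent leaves the "first relationship" between the first and second images open, so the rule below — advance one row on a correct answer, step back one row otherwise — is purely an assumed example:

```python
def next_row(current_row, matched, last_row):
    """Choose the row of the second image from the judgment result on the
    first image.  Assumed relation: advance toward smaller optotypes
    (higher row index) on a match, otherwise step back one row;
    the index is clamped to [0, last_row]."""
    if matched:
        return min(current_row + 1, last_row)
    return max(current_row - 1, 0)

print(next_row(2, True, 5))   # 3
print(next_row(0, False, 5))  # 0
```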
In the present embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In the present embodiment, after the terminal opens the first application based on a first operation, it triggers the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units to determine the distance information corresponding to the target object; the distance information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. Those skilled in the art should understand that the first operation may specifically be a click operation, such as a single-click or double-click operation, or, when the terminal supports a touch display, a touch operation applied to the icon corresponding to the first application.
In a specific embodiment, the first image acquisition unit is arranged at a first position of the terminal and the second image acquisition unit at a second position of the terminal, the first position and the second position both lying in the same plane of the terminal; for example, the first image acquisition unit and the second image acquisition unit are both arranged in the plane of the display unit of the terminal. Specifically, the first image acquisition unit and the second image acquisition unit are front-facing cameras, which makes it convenient for them to capture images of the target object simultaneously.
In another specific embodiment, the terminal is provided with three image acquisition units; the first image acquisition unit and the second image acquisition unit among the three are arranged in the plane of the display unit of the terminal, that is, they are front-facing image acquisition units, while the third image acquisition unit is arranged in the plane opposite the display unit, that is, it is a rear-facing image acquisition unit. All of the image acquisition units may be implemented by cameras.
In practical applications, to make the determined distance information more accurate, the distance information may also be average distance information. Specifically, the terminal uses the first image acquisition unit and the second image acquisition unit to perform distance measurement multiple times, determining multiple distances, for example a first distance, a second distance, up to an Nth distance, where N is a positive integer greater than or equal to 2; it then determines the mean of these distances and uses the mean as the final distance information. In this way, the accuracy of the distance value is improved, laying a foundation for improving the accuracy of the detection result.
In the present embodiment, after the terminal determines the distance information between itself and the target object, the terminal parses the distance information. For example, the terminal compares the distance value corresponding to the distance information with a standard test table and, by conversion, obtains the parameter correspondence corresponding to the distance information; according to this parameter correspondence, the terminal then determines the detection table matched to the distance value, the size and resolution of each image in the detection table corresponding to the distance information. For example, the terminal determines, according to the distance information of the target object, the visual chart corresponding to that distance information, and thereby provides an eyesight test for the target object; as shown in Figure 4, Figure 4 is the standard visual chart corresponding to the distance information. Further, after the terminal determines the detection table corresponding to the current distance information, it presents one image of the detection table, the first image, according to a preset rule; the preset rule may follow the test rules of existing eyesight tests. However, when an image of the detection table is presented for the first time, for example a third image (which may or may not be identical to the first image), the third image may be any image in the detection table, that is, it may be determined at random by the terminal; or the third image is any image of a particular row determined according to a first sub-rule of the preset rule, the first sub-rule specifying a particular row of the detection table without limiting which row that is. In other words, the third image is a random image from the particular row determined according to the first sub-rule; as shown in Figure 5, Figure 5 is the third image presented first. Those skilled in the art will appreciate that the "first" in the first image and the "third" in the third image are only used to distinguish images, not to limit order.
Those skilled in the art are to be understood that, in actual applications, described first image presented also can described terminal be determined according to the second operation receiving user, such as, according to terminal input to the eyesight number of degrees, determine the first row that the number of degrees are corresponding in described detection table, subsequently, described terminal determines the image that described the first row is corresponding at random.
In addition, when the detection table is a visual chart, the images the terminal presents should be black-and-white; this facilitates the vision test and improves detection accuracy.
In the present embodiment, the terminal presents the first image of the detection table on its own display unit; the target object observes the first image and feeds the information corresponding to the first image it sees back to the terminal by voice input, key-press input, touch input or the like, so that the terminal obtains input data. The terminal then matches the obtained input data against the image feature information corresponding to the first image, thereby accomplishing the test.
In the present embodiment, the input data can be audio data, text data corresponding to pressed keys, or touch data corresponding to a touch operation. Specifically, the target object can input audio data by voice, for example the words up, down, left and right, so that the terminal obtains input data characterizing the audio. Alternatively, four keys of the terminal can be mapped to the four directions (up, down, left, right) so that pressing a key determines a direction and the terminal obtains input data characterizing a direction; or four different regions of the terminal can be mapped to the four directions, so that a touch operation on a specific region determines input data characterizing a direction. It will be appreciated by those skilled in the art that if the terminal obtains input data by voice, the terminal needs to enable a voice function to accept the user's voice input. Here, the voice function can be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or it can be enabled according to a user-interface operation for enabling the voice function; or the voice input function can be enabled after step 1003 is executed.
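The key-to-direction and region-to-direction mappings described above can be sketched as follows; the key names and the diagonal four-way screen split are illustrative assumptions, since the patent fixes neither:

```python
# Hypothetical key names; any four distinct keys would do.
KEY_TO_DIRECTION = {"vol_up": "up", "vol_down": "down",
                    "key_back": "left", "key_menu": "right"}

def region_to_direction(x: float, y: float, width: int, height: int) -> str:
    """Map a touch point to up/down/left/right by splitting the
    screen into four triangles along its diagonals."""
    nx, ny = x / width, y / height  # normalise to [0, 1]
    if ny < nx and ny < 1 - nx:
        return "up"
    if ny > nx and ny > 1 - nx:
        return "down"
    return "left" if nx < 0.5 else "right"
```

Either mapping yields the same direction strings, so the downstream match step need not know which input mode produced them.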
In the present embodiment, judging whether the input data match the first image can specifically be judging whether the input characteristic parameter corresponding to the input data matches the image feature parameter corresponding to the first image; the match can specifically be identity, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image feature parameter corresponding to the first image. In practice, both the input characteristic parameter and the image feature parameter can characterize a direction, for example up, down, left or right.
In the present embodiment, after a second image of the detection table is presented, the terminal obtains input data based on the second image, judges whether those input data match the second image, and records the judged result, until the judged results meet a preset requirement, whereupon the judged results are output. Here, the preset requirement can be set arbitrarily according to actual conditions. For example, in one embodiment, the voice function is used to obtain voice information input by the target object, and the terminal judges whether the voice information matches the first image. If they match, the judged result is recorded and the test advances to the next grade according to that result; that is, according to existing vision-test rules, any image of the row below the first image's row in the detection table is output. If they do not match, the judged result is likewise recorded, or the user is prompted to try again. When the judged results of two adjacent grades are both mismatches, the user is prompted to stop the vision test, and the test process is complete.
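The advance-on-match, stop-after-two-adjacent-misses logic of this paragraph can be sketched as below; the chart representation as rows of direction strings and the retry-one-row-up behaviour are assumptions of this sketch, not details fixed by the patent:

```python
import random

def run_vision_test(chart, get_input, start_row=0):
    """chart: rows of direction strings, coarse to fine.
    get_input(): direction reported by the user.
    Advance one row on a match; on a miss fall back one row;
    stop when two adjacent grades have both been missed."""
    results, row, misses = [], start_row, 0
    while 0 <= row < len(chart):
        target = random.choice(chart[row])
        matched = get_input() == target
        results.append((row, matched))
        if matched:
            row, misses = row + 1, 0
        else:
            misses += 1
            if misses >= 2:   # two adjacent grades both failed
                break
            row -= 1          # retry at the coarser grade above
    return results
```

With single-entry rows the run is deterministic, which makes the loop easy to exercise in isolation.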
In the present embodiment, when the terminal adjusts the presented image automatically, the second image is any image of the grade below the grade of the first image; for example, the second image is any image in the row of the detection table below the first image's row. Alternatively, the presented image can be adjusted arbitrarily according to the user's request: when performing a vision test with the method of the embodiment of the present invention, the user can adjust the grade of the presented image, for example by pressing the volume up/down keys, or by voice, such as adding or subtracting degrees. Specifically, one grade can be set to 25 degrees or to 50 degrees, and the grade of the presented image is then adjusted by voice or by the volume keys, thereby achieving the purpose of the vision test.
In the detection method of the embodiment of the present invention, a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units provided on the terminal are used to determine distance information corresponding to a target object; the distance information is parsed to obtain a parameter correspondence; a detection table is determined according to the parameter correspondence and a first image of the detection table is presented; input data based on the first image are obtained; it is then judged whether the input data match the first image, and the judged result is recorded. In this way, a vision test is performed with the terminal, which enriches and improves the user experience.
In addition, the detection method of the embodiment of the present invention uses two image acquisition units to detect the distance between the terminal and the target object; compared with existing ways of detecting distance with a terminal, the method of the embodiment of the present invention is therefore more accurate and further improves detection accuracy.
Embodiment five
Figure 11 is a first structural schematic diagram of a terminal according to an embodiment of the present invention. The terminal is provided with at least two image acquisition units; as shown in Figure 11, the terminal further comprises:
Detecting unit 21, configured to determine, by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, distance information corresponding to a target object;
Processing unit 22, configured to parse the distance information to obtain a parameter correspondence;
Determining unit 23, configured to determine a detection table according to the parameter correspondence and present a first image corresponding to the detection table;
Acquiring unit 24, configured to obtain input data based on the first image;
First judging unit 25, configured to judge whether the input data match the first image and record the judged result.
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the detection method. Specifically:
In the present embodiment, the terminal may specifically be a mobile terminal such as a mobile phone or a tablet computer, or a fixed terminal such as a computer.
In the present embodiment, after the terminal opens a first application based on a first operation, it triggers the detecting unit 21 to determine, using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, the distance information corresponding to the target object; the distance information corresponding to the target object characterizes the distance between the target object and the terminal. Here, the first application may specifically be an application for measuring eyesight. It will be appreciated by those skilled in the art that the first operation may specifically be a click operation, such as a single click or a double click, or, when the terminal supports touch display, a touch operation on the icon corresponding to the first application.
In one embodiment, the first image acquisition unit is arranged at a first position of the terminal and the second image acquisition unit at a second position of the terminal, the first position and the second position both lying in the same plane of the terminal; for example, the first image acquisition unit and the second image acquisition unit are both arranged in the plane corresponding to the display unit of the terminal. Specifically, the first image acquisition unit and the second image acquisition unit are front-facing cameras, which makes it convenient for them to perform image acquisition on the target object simultaneously.
In another specific embodiment, the terminal is provided with three image acquisition units. The first and second of the three image acquisition units are both arranged in the plane of the display unit of the terminal, that is, they are front-facing image acquisition units; the third is arranged in the plane opposite the display unit, that is, it is a rear-facing image acquisition unit. All the image acquisition units can be implemented by cameras.
In practice, to make the determined distance information more accurate, the distance information can also be an average. Specifically, the terminal triggers the detecting unit 21 to perform distance measurement several times with the first image acquisition unit and the second image acquisition unit, determining multiple distances, such as a first distance, a second distance, up to an Nth distance, N being a positive integer greater than or equal to 2; the mean of these distances is then determined and used as the final distance information. This improves the accuracy of the distance value and lays the foundation for improving the accuracy of the test result.
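The averaging step can be sketched as follows; `measure` stands in for one dual-camera ranging call and is this sketch's placeholder, not a name from the patent:

```python
def average_distance(measure, n=5):
    """Run the dual-camera distance measurement n times (n >= 2)
    and use the arithmetic mean as the final distance information."""
    if n < 2:
        raise ValueError("n must be >= 2")
    samples = [measure() for _ in range(n)]
    return sum(samples) / n
```

A mean of a handful of samples suppresses per-frame disparity noise without noticeably delaying the start of the test.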
In the present embodiment, after the detecting unit 21 determines the distance information between the terminal and the target object, the processing unit 22 parses the distance information. For example, the processing unit 22 compares the distance value corresponding to the distance information with a standard test table and, by conversion, obtains the parameter correspondence corresponding to the distance information, and triggers the determining unit 23 to determine, according to the parameter correspondence, the detection table that matches the distance value; the size and resolution of each image in the detection table correspond to the distance information. For instance, the terminal determines, according to the distance to the target object, the visual chart corresponding to that distance, and thereby provides a vision test for the target object; as shown in Figure 4, Figure 4 is the standard visual acuity chart corresponding to the distance information. Further, after the determining unit 23 determines the detection table corresponding to the current distance information, it presents one image of the detection table, the first image, according to a preset rule. The preset rule may follow the rules of existing vision tests. However, the image presented first, for example a third image, may or may not be identical to the first image. The third image can be any image in the detection table; that is, the third image can be determined by the terminal at random, or the third image can be any image of a particular row determined by a first sub-rule of the preset rule. The first sub-rule specifies a particular row of the detection table without limiting which row it is; in other words, the third image is a random image of the particular row determined by the first sub-rule. As shown in Figure 5, Figure 5 is the third image presented first. It will be appreciated by those skilled in the art that the "first" in the first image and the "third" in the third image are used only to distinguish images, not to limit their order.
It will be appreciated by those skilled in the art that, in practice, the first image presented can also be determined by the terminal according to a second operation received from the user. For example, according to a visual-acuity value entered on the terminal, the terminal determines the row of the detection table corresponding to that value and then randomly determines an image of that row.
In addition, when the detection table is a visual chart, the images the terminal presents should be black-and-white; this facilitates the vision test and improves detection accuracy.
In the present embodiment, the terminal presents the first image of the detection table on its own display unit; the target object observes the first image and feeds the information corresponding to the first image it sees back to the terminal by voice input, key-press input, touch input or the like, so that the acquiring unit 24 obtains input data. The first judging unit 25 is then triggered to match the obtained input data against the image feature information corresponding to the first image, thereby accomplishing the test.
In the present embodiment, the input data can be audio data, text data corresponding to pressed keys, or touch data corresponding to a touch operation. Specifically, the target object can input audio data by voice, for example the words up, down, left and right, so that the terminal obtains input data characterizing the audio. Alternatively, four keys of the terminal can be mapped to the four directions (up, down, left, right) so that pressing a key determines a direction and the terminal obtains input data characterizing a direction; or four different regions of the terminal can be mapped to the four directions, so that a touch operation on a specific region determines input data characterizing a direction. It will be appreciated by those skilled in the art that if the terminal obtains input data by voice, the terminal needs to enable a voice function to accept the user's voice input. Here, the voice function can be enabled at the moment the first application is opened, that is, opening the first application also enables the voice function; or it can be enabled according to a user-interface operation for enabling the voice function; or the voice input function can be enabled after the determining unit 23 presents the first image.
In the present embodiment, judging whether the input data match the first image can specifically be judging whether the input characteristic parameter corresponding to the input data matches the image feature parameter corresponding to the first image; the match can specifically be identity, that is, judging whether the input characteristic parameter corresponding to the input data is identical to the image feature parameter corresponding to the first image. In practice, both the input characteristic parameter and the image feature parameter can characterize a direction, for example up, down, left or right.
In the terminal of the embodiment of the present invention, a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units provided on the terminal are used to determine distance information corresponding to a target object; the distance information is parsed to obtain a parameter correspondence; a detection table is determined according to the parameter correspondence and a first image of the detection table is presented; input data based on the first image are obtained; it is then judged whether the input data match the first image, and the judged result is recorded. In this way, a vision test is performed with the terminal, which enriches and improves the user experience.
In addition, the terminal of the embodiment of the present invention uses two image acquisition units to detect the distance between the terminal and the target object; compared with existing ways of detecting distance with a terminal, the detection method adopted by the terminal of the embodiment of the present invention is therefore more accurate and further improves detection accuracy.
Embodiment six
Based on the terminal described in embodiment five, in the embodiment of the present invention, as shown in figure 12, described detecting unit 21 comprises:
Image acquisition subunit 2101, configured to perform image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, to obtain a second image and a third image;
Obtaining subunit 2102, configured to obtain position relationship information between the first image acquisition unit and the second image acquisition unit;
Processing subunit 2103, configured to determine, according to the position relationship information, the second image and the third image, the distance information corresponding to the target object.
In one embodiment, the terminal is a smartphone. Fig. 7 is a structural schematic diagram of a terminal provided with a first camera and a second camera; as shown in Fig. 7, the first image acquisition unit of the smartphone is implemented by a first camera 71 and the second image acquisition unit by a second camera 72, the first camera 71 and the second camera 72 both being front-facing cameras.
Further, the embodiment of the present invention adopts a dual-camera ranging method to realize the ranging process. Fig. 8 is a schematic diagram of the dual-camera ranging principle. As shown in Fig. 7 and Fig. 8, specifically, the first camera 71 and the second camera 72 perform image acquisition on the target object P, obtaining a first image P_l and a second image P_r. Here, O_l shown in Fig. 8 is the optical center of the first camera 71 and O_r is the optical center of the second camera 72. Taking the line connecting O_l and O_r as the X-axis, as shown in Fig. 8, and taking the direction in the display plane perpendicular to the X-axis as the Y-axis, as shown in Fig. 8, a coordinate system is established. In this way, the coordinate x_l of the first image P_l with respect to the X-axis and the coordinate x_r of the second image P_r with respect to the X-axis are obtained; formula (2) below is derived from formula (1), and the distance information Z between the terminal and the target object P is calculated according to formula (2). Formula (1) and formula (2) are as follows:
(T - (x_l - x_r)) / (Z - f) = T / Z    (1)

Z = f × T / (x_l - x_r)    (2)
Here, T is the centre-to-centre distance between the first camera 71 and the second camera 72, and f is the focal length.
In this way, the distance information between the terminal and the target object determined by the above method is more accurate, which in turn allows the parameter correspondence to be determined accurately and lays a good foundation for determining the detection table.
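Formula (2) translates directly into code. A minimal sketch; the only requirement beyond the formula is that the units be consistent, e.g. x_l, x_r and f in pixels and the baseline T in millimetres, giving Z in millimetres:

```python
def stereo_distance(x_l: float, x_r: float, f: float, T: float) -> float:
    """Formula (2): Z = f * T / (x_l - x_r), where x_l and x_r are the
    target's image x-coordinates in the two views, f the focal length
    and T the centre-to-centre distance (baseline) of the cameras."""
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("x_l must exceed x_r for a target in front of the cameras")
    return f * T / disparity

# e.g. f = 1000 px, T = 20 mm, disparity = 5 px  ->  Z = 4000 mm
```

Because Z is inversely proportional to the disparity, small disparity errors matter more at long range, which is one reason the averaging over N measurements described earlier helps.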
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the detection method.
Embodiment seven
Figure 13 is a third structural schematic diagram of a terminal according to an embodiment of the present invention. The terminal is provided with at least two image acquisition units; as shown in Figure 13, the terminal further comprises:
Detecting unit 21, configured to determine, by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, distance information corresponding to a target object;
Processing unit 22, configured to parse the distance information to obtain a parameter correspondence;
Determining unit 23, configured to determine a detection table according to the parameter correspondence and present a first image corresponding to the detection table;
Acquiring unit 24, configured to obtain input data based on the first image;
First judging unit 25, configured to judge whether the input data match the first image and record the judged result.
In the present embodiment, the terminal further comprises:
Second judging unit 26, configured to judge whether the distance information is within a threshold range;
Accordingly, the processing unit 22 is further configured to, when the distance information is within the threshold range, parse the distance information to obtain the parameter correspondence.
In the present embodiment, the processing unit 22 is further configured to, when the distance information is not within the threshold range, generate prompt information to prompt the target object to adjust its position, so that the distance information corresponding to the adjusted target object falls within the threshold range.
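The in-range check and out-of-range prompt can be sketched as below; the numeric bounds and the prompt wording are illustrative assumptions, since the patent leaves the threshold range unspecified:

```python
MIN_MM, MAX_MM = 2500, 5500  # assumed bounds, not fixed by the patent

def check_distance(distance_mm: float):
    """Return (ok, prompt): proceed when the distance is within the
    threshold range, otherwise tell the target object how to move."""
    if distance_mm < MIN_MM:
        return False, "Please move farther from the screen."
    if distance_mm > MAX_MM:
        return False, "Please move closer to the screen."
    return True, ""
```

The processing unit would only parse the distance information (and the determining unit select a chart) once this check returns ok.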
In the present embodiment, the terminal further comprises:
Adjustment unit 27, configured to automatically adjust the presented image according to the judged result, so as to present a second image corresponding to the detection table, the second image having a first relation with the first image; or to adjust the presented image based on a user operation, so as to present a second image corresponding to the detection table.
It will be appreciated by those skilled in the art that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the detection method. In addition, in the present embodiment, the functions of the processing units can be understood with reference to the description of embodiment five and are not repeated here.
In one embodiment, when the distance information between the target object and the terminal is within the threshold range and the first image acquisition unit and the second image acquisition unit are both implemented by front-facing cameras, the picture captured by the front-facing cameras is presented on the display unit of the terminal. To further improve the accuracy of the vision test performed with the mobile phone, a first display area is presented on the display unit, the first display area comprising two sub-regions, so that the two eyes of the captured target object are shown in the two sub-regions; this ensures that the target object and the terminal are roughly on the same horizontal plane. To further ensure that the eyes of the target object and the presented image of the detection table are at the same level, the image corresponding to the detection table is also presented in the first display area, which further improves the accuracy of the vision test performed with the terminal. Here, when the terminal presents the image corresponding to the detection table, the two sub-regions no longer need to be distinguished.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, or of course by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes beyond the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device or the like) to perform the methods of the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of its claims; any equivalent structural or flow transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect use in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (10)

1. A terminal, characterized in that the terminal is provided with at least two image acquisition units; the terminal further comprises:
A detecting unit, configured to determine, by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, distance information corresponding to a target object;
A processing unit, configured to parse the distance information to obtain a parameter correspondence;
A determining unit, configured to determine a detection table according to the parameter correspondence and present a first image corresponding to the detection table;
An acquiring unit, configured to obtain input data based on the first image;
A first judging unit, configured to judge whether the input data match the first image and record the judged result.
2. The terminal according to claim 1, characterized in that the detecting unit comprises:
An image acquisition subunit, configured to perform image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, to obtain a second image and a third image;
An obtaining subunit, configured to obtain position relationship information between the first image acquisition unit and the second image acquisition unit;
A processing subunit, configured to determine, according to the position relationship information, the second image and the third image, the distance information corresponding to the target object.
3. The terminal according to claim 1, characterized in that the terminal further comprises:
A second judging unit, configured to judge whether the distance information is within a threshold range;
Accordingly, the processing unit is further configured to, when the distance information is within the threshold range, parse the distance information to obtain the parameter correspondence.
4. The terminal according to claim 3, characterized in that the processing unit is further configured to, when the distance information is not within the threshold range, generate prompt information to prompt the target object to adjust its position, so that the distance information corresponding to the adjusted target object falls within the threshold range.
5. The terminal according to claim 1, characterized in that the terminal further comprises:
An adjustment unit, configured to automatically adjust the presented image according to the judged result, so as to present a second image corresponding to the detection table, the second image having a first relation with the first image; or
To adjust the presented image based on a user operation, so as to present a second image corresponding to the detection table.
6. A detection method applied to a terminal, characterized in that the terminal is provided with at least two image acquisition units; the method comprises:
Determining, by using a first image acquisition unit and a second image acquisition unit among the at least two image acquisition units, distance information corresponding to a target object;
Parsing the distance information to obtain a parameter correspondence;
Determining a detection table according to the parameter correspondence and presenting a first image corresponding to the detection table;
Obtaining input data based on the first image;
Judging whether the input data match the first image, and recording the judged result.
7. The method according to claim 6, characterized in that determining, by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, the distance information corresponding to the target object comprises:
Performing image acquisition on the target object by using the first image acquisition unit and the second image acquisition unit among the at least two image acquisition units, to obtain a second image and a third image;
Obtaining position relationship information between the first image acquisition unit and the second image acquisition unit;
Determining, according to the position relationship information, the second image and the third image, the distance information corresponding to the target object.
8. The method according to claim 6, characterized in that the method further comprises:
Judging whether the distance information is within a threshold range;
Accordingly, parsing the distance information to obtain the parameter correspondence comprises:
When the distance information is within the threshold range, parsing the distance information to obtain the parameter correspondence.
9. The method according to claim 8, characterized in that the method further comprises:
When the distance information is not within the threshold range, generating prompt information to prompt the target object to adjust its position, so that the distance information corresponding to the adjusted target object falls within the threshold range.
10. The method according to claim 6, characterized in that the method further comprises:
Automatically adjusting the presented image according to the judged result, so as to present a second image corresponding to the detection table, the second image having a first relation with the first image; or
Adjusting the presented image based on a user operation, so as to present a second image corresponding to the detection table.
CN201510355057.8A 2015-06-24 2015-06-24 Detection method and terminal employing same Active CN105120050B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510355057.8A CN105120050B (en) 2015-06-24 2015-06-24 Detection method and terminal employing same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510355057.8A CN105120050B (en) 2015-06-24 2015-06-24 Detection method and terminal employing same

Publications (2)

Publication Number Publication Date
CN105120050A true CN105120050A (en) 2015-12-02
CN105120050B CN105120050B (en) 2019-04-30

Family

ID=54667931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510355057.8A Active CN105120050B (en) 2015-06-24 2015-06-24 A kind of detection method and its terminal

Country Status (1)

Country Link
CN (1) CN105120050B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1753078A (en) * 2004-09-20 2006-03-29 Lg电子株式会社 Adjustable display of mobile communications terminal
KR20060054625A (en) * 2004-11-15 2006-05-23 엘지전자 주식회사 Method for measuring eyesight using portable terminal
CN102984344A (en) * 2012-10-16 2013-03-20 广东欧珀移动通信有限公司 Method for testing vision with mobile phone and mobile phone
CN102980556A (en) * 2012-11-29 2013-03-20 北京小米科技有限责任公司 Distance measuring method and device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105867860A (en) * 2016-03-28 2016-08-17 联想(北京)有限公司 Information processing method and electronic equipment
CN105867860B (en) * 2016-03-28 2019-04-26 联想(北京)有限公司 A kind of information processing method and electronic equipment

Also Published As

Publication number Publication date
CN105120050B (en) 2019-04-30

Similar Documents

Publication Publication Date Title
CN104731501B (en) Control chart calibration method and mobile terminal
CN104898926A (en) Screen capture method and device of mobile terminal
CN104660913A (en) Focus length adjusting method and device
CN104750420A (en) Screen capturing method and device
CN105430158A (en) Processing method of non-touch operation and terminal
CN104898961A (en) Application rapid starting method and apparatus
CN104902095A (en) Mobile terminal single hand mode recognition method and device
CN105357367B (en) Recognition by pressing keys device and method based on pressure sensor
CN105093016A (en) Automation testing method and device for mobile terminal
CN104735256A (en) Method and device for judging holding mode of mobile terminal
CN105138261A (en) Shooting parameter adjustment apparatus and method
CN104767889A (en) Screen state control method and device
CN104951549A (en) Mobile terminal and photo/video sort management method thereof
CN105141738A (en) Volume adjusting method and device
CN105100421A (en) Call control method and call control device
CN104991772A (en) Remote operation guide method and apparatus
CN104658535A (en) Voice control method and device
CN104881218A (en) Mobile terminal screen scrolling method and mobile terminal screen scrolling device
CN104917891A (en) Mobile terminal and page-turning control method thereof
CN104951229A (en) Screen capturing method and device
CN104777982A (en) Method and device for switching terminal input method
CN106131274A (en) Mobile terminal control device and method
CN104915122A (en) Quick application starting method and device
CN105573916A (en) Fault detection method and mobile terminal
CN105049612A (en) Method of realizing recording and device of realizing recording

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant