CN105180817A - Data processing method and electronic equipment - Google Patents


Info

Publication number: CN105180817A (granted as CN105180817B)
Application number: CN201510478487.9A
Authority: CN (China)
Prior art keywords: target area, target object, length information, image
Legal status: Granted; currently active
Other languages: Chinese (zh)
Other versions: CN105180817B (en)
Inventor: 郁凌
Original and current assignee: Lenovo Beijing Ltd
Priority and filing date: 2015-08-06
Publication of CN105180817A: 2015-12-23
Publication of CN105180817B (grant): 2018-08-10


Abstract

The invention discloses a data processing method and electronic equipment. The method comprises: performing image acquisition on a target area to obtain a target area image; determining, based on the target area image, first virtual length information of a target object in the target area; obtaining a first reference parameter of a reference object in the target area; and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.

Description

Data processing method and electronic equipment
Technical field
The present invention relates to information processing technology, and in particular to a data processing method and electronic equipment.
Background
In daily life, a user often encounters the following scenario: the size or length of an object needs to be measured, but no measuring tool is at hand, or the environment makes a conventional measurement inconvenient; the user can then only estimate, or resort to other time-consuming and laborious approaches.
Summary of the invention
Embodiments of the present invention provide a data processing method and electronic equipment.
The technical solution of the embodiments of the present invention is implemented as follows.
An embodiment of the present invention provides a data processing method applied to electronic equipment. The method comprises:
performing image acquisition on a target area to obtain a target area image;
determining first virtual length information of a target object in the target area based on the target area image; and
obtaining a first reference parameter of a reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
In the above solution, before the image acquisition is performed on the target area, the method further comprises:
generating and projecting two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
In the above solution, before the image acquisition is performed on the target area, the method further comprises:
projecting first content onto the target object to form the reference object.
In the above solution, performing image acquisition on the target area to obtain the target area image comprises:
adjusting and determining that the image capturing angle is a first acquisition angle, and performing image acquisition on the target area at the first acquisition angle to obtain the target area image.
In the above solution, determining the first virtual length information of the target object in the target area based on the target area image comprises:
parsing the target area image, determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area, and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
In the above solution, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining prestored actual length information of the reference object in the target area.
In the above solution, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining the prestored distance between the two visible light beams.
In the above solution, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining projection distance information used when performing the projection.
An embodiment of the present invention further provides electronic equipment comprising an image acquisition unit, a determining unit and a processing unit, wherein:
the image acquisition unit is configured to perform image acquisition on a target area to obtain a target area image;
the determining unit is configured to determine first virtual length information of a target object in the target area based on the target area image; and
the processing unit is configured to obtain a first reference parameter of a reference object in the target area, and to obtain actual length information of the target object based on the first virtual length information and the first reference parameter.
In the above solution, the electronic equipment further comprises a visible light beam generation unit configured to generate and project two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
In the above solution, the electronic equipment further comprises a projection unit configured to project first content onto the target object to form the reference object.
In the above solution, the electronic equipment further comprises an angle adjusting unit configured to adjust and determine that the image capturing angle is a first acquisition angle;
correspondingly, the image acquisition unit is further configured to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
In the above solution, the determining unit is further configured to parse the target area image, determine a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area, and determine the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
In the above solution, the processing unit is further configured to obtain prestored actual length information of the reference object in the target area.
In the above solution, the processing unit is further configured to obtain the prestored distance between the two visible light beams.
In the above solution, the processing unit is further configured to obtain projection distance information used when performing the projection.
With the data processing method and electronic equipment provided by the embodiments of the present invention, image acquisition is performed on a target area to obtain a target area image; first virtual length information of a target object in the target area is determined based on the target area image; a first reference parameter of a reference object in the target area is obtained; and actual length information of the target object is obtained based on the first virtual length information and the first reference parameter. In this way, the length of an object can be measured accurately; the solution is simple to implement and convenient to use, and improves the user experience.
Brief description of the drawings
Fig. 1 is a first schematic flowchart of the data processing method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the effect of using an actual reference object in an embodiment of the present invention;
Fig. 3 is a second schematic flowchart of the data processing method in an embodiment of the present invention;
Fig. 4 is a first schematic diagram of the effect of using a virtual reference object in an embodiment of the present invention;
Fig. 5 is a third schematic flowchart of the data processing method in an embodiment of the present invention;
Fig. 6 is a second schematic diagram of the effect of using a virtual reference object in an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of the electronic equipment in an embodiment of the present invention.
Detailed description of the embodiments
In embodiments of the present invention, electronic equipment performs image acquisition on a target area to obtain a target area image; determines first virtual length information of a target object in the target area based on the target area image; obtains a first reference parameter of a reference object in the target area; and obtains actual length information of the target object based on the first virtual length information and the first reference parameter.
In embodiments of the present invention the electronic equipment includes, but is not limited to, a notebook computer, a tablet computer, a mobile phone and the like; preferably, the electronic equipment is a mobile phone.
The present invention is described in further detail below with reference to the drawings and specific embodiments.
Embodiment one
Fig. 1 is a schematic flowchart of the data processing method of embodiment one of the present invention, which is applied to electronic equipment. As shown in Fig. 1, the data processing method of this embodiment comprises the following steps.
Step 101: performing image acquisition on a target area to obtain a target area image.
Here, the target area is a region that contains at least the target object and the reference object. The target object is the actual object to be measured. The reference object may be an actual reference object placed on the target object, or a virtual reference object formed on the target object, for example two reference light spots formed on the target object or a ruler projected onto the target object.
This step specifically comprises: using an angle adjusting unit to adjust and determine that the image capturing angle is a first acquisition angle, and using an image acquisition unit to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
The first acquisition angle may be the angle between the image acquisition direction of the image acquisition unit and the plane of the target area. Preferably, the first acquisition angle is 90 degrees, which ensures that the image acquisition unit is parallel to the plane of the target area. The first acquisition angle may also be any other angle, but when it is not 90 degrees the acquired target area image is an oblique view and subsequently has to be processed into an image with a top-view or front-view effect.
The angle adjusting unit may be implemented by a level meter, and the image acquisition unit may be implemented by a camera.
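When the first acquisition angle is not 90 degrees, the oblique target area image has to be rectified into a front-view image before lengths are compared, as noted above. The following is a minimal sketch of one way to do this with OpenCV, assuming the four image corners of a known rectangle in the scene (for example the actual reference object) have already been located; the corner coordinates, canvas size and file name are hypothetical.

    import cv2
    import numpy as np

    # Hypothetical pixel corners of a known rectangle in the oblique image
    # (top-left, top-right, bottom-right, bottom-left).
    src = np.float32([[412, 310], [980, 285], [1005, 640], [430, 668]])
    # Where those corners should sit in a front view, placed inside a larger
    # canvas so the surrounding target area is preserved after warping.
    dst = np.float32([[500, 400], [1356, 400], [1356, 940], [500, 940]])

    homography = cv2.getPerspectiveTransform(src, dst)   # 3x3 mapping
    oblique = cv2.imread("target_area.jpg")              # acquired oblique image
    front_view = cv2.warpPerspective(oblique, homography, (1920, 1440))

Lengths measured in the rectified front view can then be compared as described in step 102.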
Step 102: determining first virtual length information of the target object in the target area based on the target area image.
This step specifically comprises: parsing the target area image; determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area; and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
Here, the first virtual length information of the target object is the length information of the target object expressed in terms of the length of the reference object. For example, if the length of the reference object in the target area image is a, the first virtual length information of the target object may be determined, from the bottom and top positions of the target object and of the reference object, to be 15.6a.
The bottom position and the top position of the target object may be the two ends of the target object in its length direction or in its width direction; if the target object is circular, they may also be the two ends of a diameter of the circular target object. It should be noted that the bottom and top positions of the target object may differ according to the shape of the target object, and are set according to the actual measurement requirement. The bottom position and the top position of the reference object are the two ends of the reference object that are set as the reference.
Determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object comprises:
determining the length a of the reference object in the target area image based on the bottom and top positions of the reference object, and determining the length b of the target object, expressed in terms of the length a, based on the bottom and top positions of the target object; that is, b = xa, where x is a positive number.
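Determining b = xa amounts to comparing pixel distances between the detected endpoints. The following is a minimal sketch of that ratio computation in Python, assuming the bottom and top pixel coordinates of the target object and of the reference object have already been obtained by parsing the target area image; the coordinates shown are hypothetical.

    import math

    def pixel_length(bottom, top):
        # Euclidean pixel distance between a bottom position and a top position.
        return math.hypot(top[0] - bottom[0], top[1] - bottom[1])

    # Hypothetical endpoint coordinates (x, y) parsed from the target area image.
    target_bottom, target_top = (120, 900), (120, 120)   # target object endpoints
    ref_bottom, ref_top = (400, 520), (400, 470)          # reference object endpoints

    a = pixel_length(ref_bottom, ref_top)        # reference length in the image
    b = pixel_length(target_bottom, target_top)  # target length in the image
    x = b / a                                    # first virtual length: b = x * a
    print(f"first virtual length = {x:.1f}a")    # 15.6a with these coordinates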
Step 103: obtaining a first reference parameter of the reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
Here, because the reference object may be an actual reference object or a virtual reference object, the first reference parameter may differ for different reference objects. For example, when the reference object is an actual reference object placed on the target object, the first reference parameter is the actual length of that actual reference object; since the length of the target object is b = xa, the actual length information of the target object can be obtained from the proportional relationship between b and a. When the reference object consists of two reference light spots formed on the target object, the first reference parameter may be the actual distance c between the two reference light spots, and the actual length information of the target object is again obtained from the proportional relationship between b and a. When the reference object is a ruler projected onto the target object, the first reference parameter may be the projection distance information used when performing the projection, i.e. the vertical distance from the projection unit to the projection plane; the actual length information of the target object can then be obtained from the known projection parameters, the projection distance and the first virtual length.
Embodiment two
An embodiment of the present invention provides a data processing method applied to electronic equipment. Fig. 2 is a schematic diagram of the effect of using an actual reference object in this embodiment. As shown in Figs. 1 and 2, the data processing method of this embodiment comprises the following steps.
Step 101: performing image acquisition on a target area to obtain a target area image.
Here, the target area is a region that contains at least the target object and the reference object; the target object is the actual object to be measured; the reference object may be an actual reference object placed on the target object or a virtual reference object formed on the target object.
In this embodiment the reference object is an actual reference object placed on the target object. The actual reference object may be an easily carried object such as a mobile phone back cover or a business card. Accordingly, before the method of this embodiment is carried out, the actual reference object is placed on the target object in advance. The actual length of the actual reference object is known, and accordingly the electronic equipment has prestored the length information of the actual reference object.
It should be noted that, when the actual reference object is placed, the direction of the actual reference object that serves as the reference length must be parallel to the length direction to be measured of the target object.
This step specifically comprises: using an angle adjusting unit to adjust and determine that the image capturing angle is a first acquisition angle, and using an image acquisition unit to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
The first acquisition angle may be the angle between the image acquisition direction of the image acquisition unit and the plane of the target area. In this embodiment the first acquisition angle is 90 degrees, which ensures that the image acquisition unit is parallel to the plane of the target area. The angle adjusting unit may be implemented by a level meter, and the image acquisition unit may be implemented by a camera.
Step 102: determining first virtual length information of the target object in the target area based on the target area image.
This step specifically comprises: parsing the target area image; determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area; and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
Here, the first virtual length information of the target object is the length information of the target object expressed in terms of the length of the reference object. For example, if the length of the reference object in the target area image is a, the first virtual length information of the target object may be determined, from the bottom and top positions of the target object and of the reference object, to be 15.6a.
The bottom position and the top position of the target object may be the two ends of the target object in its length direction or in its width direction; if the target object is circular, they may also be the two ends of a diameter of the circular target object. It should be noted that the bottom and top positions of the target object may differ according to the shape of the target object, and are set according to the actual measurement requirement. The bottom position and the top position of the reference object are the two ends of the reference object that are set as the reference; in this embodiment they are the two ends of the reference length of the actual reference object.
Determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object comprises:
determining the length a of the actual reference object in the target area image based on the bottom and top positions of the reference object, and determining the length b of the target object, expressed in terms of the length a, based on the bottom and top positions of the target object; that is, b = xa, where x is a positive number. Note that what is determined here is the length a of the actual reference object in the target area image, not its actual length; a may remain unknown, i.e. the length of the actual reference object is simply taken to be a.
Step 103: obtaining a first reference parameter of the reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
Here, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining the prestored actual length information of the actual reference object in the target area.
Obtaining the actual length information of the target object based on the first virtual length information and the first reference parameter comprises:
since the first virtual length is the length of the target object expressed in terms of the length of the actual reference object, substituting the actual length of the actual reference object for the length of the actual reference object in the expression of the first virtual length, thereby calculating the actual length of the target object.
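Once the ratio x is known, substituting the prestored actual length of the reference object for a gives the result directly. A short illustrative calculation follows, assuming the reference is a card whose 85.6 mm long edge was stored in advance; all numbers are hypothetical.

    x = 15.6                     # first virtual length of the target object: b = 15.6a
    reference_length_mm = 85.6   # prestored actual length of the card edge used as a

    actual_length_mm = x * reference_length_mm    # substitute the actual length for a
    print(f"target object length = {actual_length_mm:.0f} mm")   # about 1335 mm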
Embodiment three
Fig. 3 is a schematic flowchart of the data processing method of embodiment three of the present invention, which is applied to electronic equipment; Fig. 4 is a first schematic diagram of the effect of using a virtual reference object in this embodiment. As shown in Figs. 3 and 4, the data processing method of this embodiment comprises the following steps.
Step 301: generating and projecting two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
Here, the visible light beams may be any visible light beams with little scattering, for example visible light beams with a divergence angle smaller than 6 mrad. The two visible light beams irradiate the target object and form two visible reference light spots, and these two visible reference light spots are the reference object. Preferably, the visible light beams are visible laser beams, for example visible red laser beams, which irradiate the target object and form two red reference light spots.
The spacing between the two visible light beams is known; accordingly, the electronic equipment has prestored the distance between the two visible light beams, which is also the centre-to-centre distance of the two reference light spots formed on the target object.
It should be noted that, when the visible light beams are projected, the line between the two visible reference light spots that are formed must be parallel to the length direction to be measured of the target object.
Step 302: performing image acquisition on the target area to obtain a target area image.
Here, the target area is a region that contains at least the target object and the reference object; the target object is the actual object to be measured, for example a wall, a desk or a car.
This step specifically comprises: using an angle adjusting unit to adjust and determine that the image capturing angle is a first acquisition angle, and using an image acquisition unit to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
The first acquisition angle may be the angle between the image acquisition direction of the image acquisition unit and the plane of the target area. In this embodiment the first acquisition angle is 90 degrees, which ensures that the image acquisition unit is parallel to the plane of the target area. The angle adjusting unit may be implemented by a level meter, and the image acquisition unit may be implemented by a camera.
Step 303: determining first virtual length information of the target object in the target area based on the target area image.
This step specifically comprises: parsing the target area image; determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area; and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
Here, the first virtual length information of the target object is the length information of the target object expressed in terms of the length of the reference object. In this embodiment the length of the reference object is the centre-to-centre distance of the two visible reference light spots formed on the target object, i.e. the distance between the two visible light beams. When the scattering of the visible light beams is small, the size of the visible reference light spots that are formed can be ignored.
The bottom position and the top position of the target object may be the two ends of the target object in its length direction or in its width direction; if the target object is circular, they may also be the two ends of a diameter of the circular target object. It should be noted that the bottom and top positions of the target object may differ according to the shape of the target object, and are set according to the actual measurement requirement. The bottom position and the top position of the reference object are the two ends of the reference object that are set as the reference; in this embodiment they are the positions of the two visible reference light spots.
Determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object comprises:
determining the spacing a of the two visible reference light spots in the target area image based on the positions of the two visible reference light spots, and determining the length b of the target object, expressed in terms of the spacing a, based on the bottom and top positions of the target object; that is, b = xa, where x is a positive number. Note that what is determined here is the spacing a of the two visible reference light spots in the target area image, not their actual spacing; a may remain unknown, i.e. the spacing is simply taken to be a.
Step 304: obtaining a first reference parameter of the reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
Here, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining the prestored actual distance between the two visible light beams.
Obtaining the actual length information of the target object based on the first virtual length information and the first reference parameter comprises:
since the first virtual length is the length of the target object expressed in terms of the spacing of the two visible reference light spots in the target area image, substituting the actual spacing of the two visible reference light spots, i.e. the actual distance between the two visible light beams, for that spacing in the expression of the first virtual length, thereby calculating the actual length of the target object. For example, if the first virtual length is b = xa and the actual distance between the two visible light beams is c, then b = xc.
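In this embodiment the spacing a must be recovered from the two red reference light spots in the target area image. The following is a minimal sketch of one way to locate the spot centres with SciPy, assuming the spots are the strongest red features in the frame; the threshold and the prestored beam spacing c are hypothetical, and pixel_length refers to the helper sketched under step 102.

    import numpy as np
    from scipy import ndimage

    def spot_spacing(image_rgb, threshold=120):
        # Centre-to-centre pixel distance of the two largest strongly red blobs.
        red_excess = image_rgb[:, :, 0].astype(float) - image_rgb[:, :, 2].astype(float)
        mask = red_excess > threshold
        labels, n = ndimage.label(mask)                          # connected components
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        largest_two = (np.argsort(sizes)[::-1][:2] + 1).tolist()
        (y1, x1), (y2, x2) = ndimage.center_of_mass(mask, labels, index=largest_two)
        return float(np.hypot(x1 - x2, y1 - y2))

    # a = spot_spacing(frame)                        # spot spacing in pixels
    # b = pixel_length(target_bottom, target_top)    # target length in pixels
    # c = 50.0                                       # prestored beam spacing in mm
    # actual_length_mm = (b / a) * c                 # b = xa and a corresponds to c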
Embodiment four
Fig. 5 is a schematic flowchart of the data processing method of embodiment four of the present invention, which is applied to electronic equipment; Fig. 6 is a second schematic diagram of the effect of using a virtual reference object in this embodiment. As shown in Figs. 5 and 6, the data processing method of this embodiment comprises the following steps.
Step 501: projecting first content onto the target object to form the reference object.
Here, the first content may be any content that can serve as a measuring tool, for example a ruler or an object bearing standard scale markings. In this embodiment the first content is a ruler: the ruler is projected onto the target object to form a ruler image on the target object, and the ruler formed on the target object serves as the reference object.
The projection may be a parallel projection, i.e. a projection whose projection lines are parallel to one another.
It should be noted that the direction in which the ruler is projected should be parallel to the length direction to be measured of the target object, and the length of the projected ruler should not be shorter than the length of the target object.
Step 502: performing image acquisition on the target area to obtain a target area image.
Here, the target area is a region that contains at least the target object and the reference object; the target object is the actual object to be measured, for example a wall, a desk or a car.
This step specifically comprises: using an angle adjusting unit to adjust and determine that the image capturing angle is a first acquisition angle, and using an image acquisition unit to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
The first acquisition angle may be the angle between the image acquisition direction of the image acquisition unit and the plane of the target area. In this embodiment the first acquisition angle is 90 degrees, which ensures that the image acquisition unit is parallel to the plane of the target area. The angle adjusting unit may be implemented by a level meter, and the image acquisition unit may be implemented by a camera.
Step 503: determining first virtual length information of the target object in the target area based on the target area image.
This step specifically comprises: parsing the target area image; determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area; and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
Here, the first virtual length information of the target object is the length information of the target object expressed in terms of the length of the reference object. In this embodiment, because the reference object can serve as a dimensional measuring tool, the length to be measured of the target object can be read directly from the target area image against the reference object; however, the length read in this way is not the actual length of the target object.
The bottom position and the top position of the target object may be the two ends of the target object in its length direction or in its width direction; if the target object is circular, they may also be the two ends of a diameter of the circular target object. It should be noted that the bottom and top positions of the target object may differ according to the shape of the target object, and are set according to the actual measurement requirement. The bottom position and the top position of the reference object are the two ends of the reference object that are set as the reference; in this embodiment they are the two ends of the projected ruler.
Determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object comprises:
reading, based on the bottom and top positions of the target object, the length of the target object measured against the projected ruler. It should be noted that, because the ruler has been projected over a certain distance, the length of the target object read with this ruler is not its actual length.
Step 504: obtaining a first reference parameter of the reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
Here, obtaining the first reference parameter of the reference object in the target area comprises:
obtaining the projection distance information used when performing the projection, i.e. the vertical distance from the projection unit to the projection plane.
The projection distance information may be preset, or may be obtained by an existing distance-measurement approach, for example by a distance-measurement app installed on the electronic equipment.
Obtaining the actual length information of the target object based on the first virtual length information and the first reference parameter comprises:
calculating the actual length information of the target object based on the first virtual length information, the projection distance and the relevant projection parameters, where the relevant projection parameters include the focal length, the chip size and the like. The detailed calculation is prior art and is not repeated here.
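Although the detailed calculation is left to known projection geometry, the following is a minimal sketch of one such calculation under a simple pinhole projector model, in which the projected pattern is magnified by the ratio of the projection distance to the focal length; the model and every number here are assumptions for illustration, not the formula claimed by the patent.

    ruler_reading_units = 42.0        # target length read off the projected ruler, in ruler units
    unit_size_on_chip_mm = 0.05       # physical size of one ruler unit on the projector chip
    focal_length_mm = 5.0             # projector focal length
    projection_distance_mm = 2400.0   # vertical distance from projection unit to the plane

    # Pinhole model: the pattern is magnified by (distance / focal length) on the target.
    magnification = projection_distance_mm / focal_length_mm
    actual_length_mm = ruler_reading_units * unit_size_on_chip_mm * magnification
    print(f"target object length = {actual_length_mm:.0f} mm")   # about 1008 mm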
Embodiment five
Fig. 7 is a schematic structural diagram of the electronic equipment of an embodiment of the present invention. As shown in Fig. 7, the electronic equipment comprises an image acquisition unit 71, a determining unit 72 and a processing unit 73, wherein:
the image acquisition unit 71 is configured to perform image acquisition on a target area to obtain a target area image;
the determining unit 72 is configured to determine first virtual length information of a target object in the target area based on the target area image; and
the processing unit 73 is configured to obtain a first reference parameter of a reference object in the target area, and to obtain actual length information of the target object based on the first virtual length information and the first reference parameter.
Here, the target area is a region that contains at least the target object and the reference object. The target object is the actual object to be measured. The reference object may be an actual reference object placed on the target object, such as an easily carried object like a mobile phone back cover or a business card, or a virtual reference object formed on the target object, such as two reference light spots formed on the target object or a ruler projected onto the target object.
Because the reference object may be an actual reference object or a virtual reference object, the first reference parameter may differ for different reference objects. For example, when the reference object is an actual reference object placed on the target object, the first reference parameter is the actual length of that actual reference object. When the reference object consists of two reference light spots formed on the target object, the first reference parameter may be the actual distance between the two reference light spots. When the reference object is a ruler projected onto the target object, the first reference parameter may be the projection distance information used when performing the projection, i.e. the vertical distance from the projection unit to the projection plane.
Further, the electronic equipment also comprises a visible light beam generation unit 74 configured to generate and project two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
Here, the visible light beams may be any visible light beams with little scattering, for example visible light beams with a divergence angle smaller than 6 mrad. The two visible light beams irradiate the target object and form two visible reference light spots, and these two visible reference light spots are the reference object. Preferably, the visible light beams are visible laser beams, for example visible red laser beams, which irradiate the target object and form two red reference light spots.
The spacing between the two visible light beams is known; accordingly, the electronic equipment also comprises a storage unit 75 for prestoring the distance between the two visible light beams, which is also the centre-to-centre distance of the two reference light spots formed on the target object.
Further, the electronic equipment also comprises a projection unit 76 configured to project first content onto the target object to form the reference object.
Here, the first content may be any content that can serve as a measuring tool, for example a ruler or an object bearing standard scale markings. In this embodiment the first content is a ruler: the ruler is projected onto the target object to form a ruler image on the target object, and the ruler formed on the target object serves as the reference object.
Further, the electronic equipment also comprises an angle adjusting unit 77 configured to adjust and determine that the image capturing angle is a first acquisition angle.
Correspondingly, the image acquisition unit 71 is further configured to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
The first acquisition angle may be the angle between the image acquisition direction of the image acquisition unit and the plane of the target area. Preferably, the first acquisition angle is 90 degrees, which ensures that the image acquisition unit is parallel to the plane of the target area. The first acquisition angle may also be any other angle, but when it is not 90 degrees the acquired target area image is an oblique view and subsequently has to be processed into an image with a top-view or front-view effect.
Further, the determining unit 72 is also configured to parse the target area image, determine a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area, and determine the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
Here, the first virtual length information of the target object is the length information of the target object expressed in terms of the length of the reference object. For example, if the length of the reference object in the target area image is a, the first virtual length information of the target object may be determined, from the bottom and top positions of the target object and of the reference object, to be 15.6a.
The bottom position and the top position of the target object may be the two ends of the target object in its length direction or in its width direction; if the target object is circular, they may also be the two ends of a diameter of the circular target object. It should be noted that the bottom and top positions of the target object may differ according to the shape of the target object, and are set according to the actual measurement requirement. The bottom position and the top position of the reference object are the two ends of the reference object that are set as the reference.
Further, the determining unit 72 is also configured to determine the length a of the reference object in the target area image based on the bottom and top positions of the reference object, and to determine the length b of the target object, expressed in terms of the length a, based on the bottom and top positions of the target object; that is, b = xa, where x is a positive number.
Further, the processing unit 73 is also configured to obtain the prestored actual length information of the reference object in the target area.
Correspondingly, the storage unit 75 is also configured to store the actual length information of the reference object in the target area.
The processing unit 73 is also configured to substitute the actual length of the actual reference object for the length of the actual reference object in the expression of the first virtual length, thereby calculating the actual length of the target object.
Further, the processing unit 73 is also configured to obtain the prestored distance between the two visible light beams.
The processing unit 73 is also configured to substitute the actual spacing of the two visible reference light spots, i.e. the actual distance between the two visible light beams, for the spacing of the two visible reference light spots in the expression of the first virtual length, thereby calculating the actual length of the target object; for example, if the first virtual length is b = xa and the actual distance between the two visible light beams is c, then b = xc.
Further, the processing unit 73 is also configured to obtain the projection distance information used when performing the projection, i.e. the vertical distance from the projection unit to the projection plane.
Correspondingly, the processing unit 73 is also configured to calculate the actual length information of the target object based on the first virtual length information, the projection distance and the relevant projection parameters, where the relevant projection parameters include the focal length, the chip size and the like.
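The units described above map naturally onto a small software structure. The following is a minimal sketch of such a composition; the class and method names are illustrative assumptions, and the hardware-facing calls are left unimplemented because they are device specific.

    from dataclasses import dataclass

    @dataclass
    class MeasuringEquipment:
        # Prestored reference parameters (storage unit 75); the values are hypothetical.
        reference_length_mm: float = 85.6    # actual length of the reference object
        beam_spacing_mm: float = 50.0        # distance between the two visible light beams

        def acquire_image(self, angle_deg: float = 90.0):
            # Image acquisition unit 71 together with angle adjusting unit 77.
            raise NotImplementedError("camera and level-meter access is device specific")

        def first_virtual_length(self, image) -> float:
            # Determining unit 72: parse the image and return x, where b = x * a.
            raise NotImplementedError

        def actual_length_mm(self, image) -> float:
            # Processing unit 73: combine the virtual length with the first reference parameter.
            return self.first_virtual_length(image) * self.reference_length_mm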
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be implemented by hardware controlled by program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The storage medium includes any medium capable of storing program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Alternatively, when the above integrated units of the present invention are implemented in the form of software function modules and sold or used as independent products, they may also be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the embodiments of the present invention, or in other words the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium and comprises a number of instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the methods described in the embodiments of the present invention. The storage medium includes any medium capable of storing program code, such as a removable storage device, a ROM, a RAM, a magnetic disk or an optical disc.
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily occur to a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the protection scope of the claims.

Claims (16)

1. A data processing method applied to electronic equipment, characterized in that the method comprises:
performing image acquisition on a target area to obtain a target area image;
determining first virtual length information of a target object in the target area based on the target area image; and
obtaining a first reference parameter of a reference object in the target area, and obtaining actual length information of the target object based on the first virtual length information and the first reference parameter.
2. The method according to claim 1, characterized in that, before the image acquisition is performed on the target area, the method further comprises:
generating and projecting two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
3. The method according to claim 1, characterized in that, before the image acquisition is performed on the target area, the method further comprises:
projecting first content onto the target object to form the reference object.
4. The method according to claim 1, characterized in that performing image acquisition on the target area to obtain the target area image comprises:
adjusting and determining that the image capturing angle is a first acquisition angle, and performing image acquisition on the target area at the first acquisition angle to obtain the target area image.
5. The method according to claim 1, characterized in that determining the first virtual length information of the target object in the target area based on the target area image comprises:
parsing the target area image, determining a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area, and determining the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
6. The method according to claim 1, characterized in that obtaining the first reference parameter of the reference object in the target area comprises:
obtaining prestored actual length information of the reference object in the target area.
7. The method according to claim 2, characterized in that obtaining the first reference parameter of the reference object in the target area comprises:
obtaining the prestored distance between the two visible light beams.
8. The method according to claim 3, characterized in that obtaining the first reference parameter of the reference object in the target area comprises:
obtaining projection distance information used when performing the projection.
9. Electronic equipment, characterized in that the electronic equipment comprises an image acquisition unit, a determining unit and a processing unit, wherein:
the image acquisition unit is configured to perform image acquisition on a target area to obtain a target area image;
the determining unit is configured to determine first virtual length information of a target object in the target area based on the target area image; and
the processing unit is configured to obtain a first reference parameter of a reference object in the target area, and to obtain actual length information of the target object based on the first virtual length information and the first reference parameter.
10. The electronic equipment according to claim 9, characterized in that it further comprises a visible light beam generation unit configured to generate and project two visible light beams, such that the visible light beams irradiate the target object and form the reference object.
11. The electronic equipment according to claim 9, characterized in that it further comprises a projection unit configured to project first content onto the target object to form the reference object.
12. The electronic equipment according to claim 9, characterized in that it further comprises an angle adjusting unit configured to adjust and determine that the image capturing angle is a first acquisition angle;
correspondingly, the image acquisition unit is further configured to perform image acquisition on the target area at the first acquisition angle to obtain the target area image.
13. The electronic equipment according to claim 9, characterized in that the determining unit is further configured to parse the target area image, determine a bottom position and a top position of the target object and a bottom position and a top position of the reference object in the target area, and determine the first virtual length information of the target object based on the bottom and top positions of the target object and of the reference object.
14. The electronic equipment according to claim 9, characterized in that the processing unit is further configured to obtain prestored actual length information of the reference object in the target area.
15. The electronic equipment according to claim 10, characterized in that the processing unit is further configured to obtain the prestored distance between the two visible light beams.
16. The electronic equipment according to claim 11, characterized in that the processing unit is further configured to obtain projection distance information used when performing the projection.
CN201510478487.9A 2015-08-06 2015-08-06 Data processing method and electronic equipment Active CN105180817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510478487.9A CN105180817B (en) 2015-08-06 2015-08-06 Data processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105180817A 2015-12-23
CN105180817B (en) 2018-08-10

Family

ID=54903076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510478487.9A Active CN105180817B (en) 2015-08-06 2015-08-06 Data processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105180817B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000205821A (en) * 1999-01-07 2000-07-28 Nec Corp Instrument and method for three-dimensional shape measurement
CN103185567A (en) * 2011-12-27 2013-07-03 联想(北京)有限公司 Electronic apparatus and method for measuring distance
CN103363916A (en) * 2012-03-26 2013-10-23 联想(北京)有限公司 Information processing method and processing device
CN104539926A (en) * 2014-12-19 2015-04-22 北京智谷睿拓技术服务有限公司 Distance determination method and equipment
CN104534992A (en) * 2014-12-23 2015-04-22 广东欧珀移动通信有限公司 Length measurement method and terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111480050A (en) * 2017-12-15 2020-07-31 麦普威有限公司 Machine vision system with computer-generated virtual reference
US11443418B2 (en) 2017-12-15 2022-09-13 Oy Mapvision Ltd Machine vision system with a computer generated virtual reference object

Also Published As

Publication number Publication date
CN105180817B (en) 2018-08-10


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant