CN107505619A - A kind of terminal imaging method, camera shooting terminal and computer-readable recording medium - Google Patents
- Publication number
- CN107505619A CN107505619A CN201710527599.8A CN201710527599A CN107505619A CN 107505619 A CN107505619 A CN 107505619A CN 201710527599 A CN201710527599 A CN 201710527599A CN 107505619 A CN107505619 A CN 107505619A
- Authority
- CN
- China
- Prior art keywords
- subject
- terminal
- distance data
- embossment
- datum plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
Abstract
The invention discloses a terminal imaging method, a camera terminal and a computer-readable storage medium. The method uses ultrasonic waves to measure the surface distance data between the surface of a photographed object and the camera terminal; on this basis, a relief (embossment) picture can be drawn from these data, presenting a picture with a stereoscopic effect, such as a 3D picture. This solves the problem that existing terminal imaging can only produce planar images, and enhances the user experience.
Description
Technical field
The present invention relates to the field of photographing, and more particularly to a terminal imaging method, a camera terminal and a computer-readable storage medium.
Background art
With the growing capabilities of terminal systems such as mobile phones, users carry out daily activities with their terminals, such as taking photos. However, existing photographs are all planar pictures, and the user experience is poor.
Summary of the invention
The primary object of the present invention is to provide a terminal imaging method, a camera terminal and a computer-readable storage medium, so as to solve the problem that existing terminal imaging can only produce planar images.
To achieve the above object, the present invention provides a terminal imaging method, comprising:
sending a first ultrasonic signal, and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal;
determining, according to the first and second ultrasonic signals, the surface distance data between the surface of a photographed object and the camera terminal;
generating a relief picture of the photographed object according to the surface distance data.
In certain embodiments, determining the surface distance data of the photographed object according to the first and second ultrasonic signals comprises:
sending the first ultrasonic signal non-directionally;
calculating, from the first and second ultrasonic signals, the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range;
determining the photographed object according to the shooting direction of the terminal's camera;
filtering the surface distance data of all objects to obtain the surface distance data of the photographed object.
In certain embodiments, determining the surface distance data between the surface of the photographed object and the camera terminal according to the first and second ultrasonic signals comprises:
determining the photographed object according to the shooting direction of the terminal's camera;
sending the first ultrasonic signal directionally toward the photographed object;
calculating the surface distance data of the photographed object from the first and second ultrasonic signals.
In certain embodiments, generating the relief picture of the photographed object according to the surface distance data comprises:
determining the datum plane of the relief picture according to the surface distance data;
taking the part of the object's surface lying between the datum plane and the camera terminal as the surface to be processed, i.e., the part of the surface that requires relief processing;
filtering the surface distance data of the photographed object to obtain the surface distance data of the surface to be processed;
building the relief picture according to the datum plane and the surface distance data of the surface to be processed.
In certain embodiments, determining the datum plane of the relief picture according to the surface distance data comprises:
determining the focus object among the photographed objects according to a selection operation of the user;
filtering the surface distance data to obtain the surface distance data of the focus object;
determining the datum plane of the relief picture according to the surface distance data of the focus object.
In certain embodiments, determining the datum plane of the relief picture according to the surface distance data of the focus object comprises:
judging whether there is more than one focus object;
if so, filtering the surface distance data of the photographed objects to obtain the surface distance data of each focus object;
determining, for each focus object, a matching datum plane according to its surface distance data;
determining the datum plane of the relief picture from the datum planes matched to the focus objects.
In certain embodiments, determining the datum plane of the relief picture from the datum planes matched to the focus objects comprises:
judging whether the matched datum planes all lie in the same plane;
if not, taking the matched datum plane nearest to the camera terminal as the datum plane of the relief picture.
In certain embodiments, building the relief picture according to the datum plane and the surface distance data of the surface to be processed comprises:
calculating, from the surface distance data, the relative distance of the surface to be processed from the datum plane;
scaling the relative distances in the same proportion as the photographed object is scaled in the relief photo, to obtain the relief thickness from the relief surface to the relief bottom surface;
generating the relief surface according to the relief thickness and the surface to be processed;
taking the part of the object's surface on the far side of the datum plane from the camera terminal as a planar picture on the relief bottom surface;
merging the relief surface and the relief bottom surface to generate the relief picture.
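The proportional scaling step above can be illustrated with a short Python sketch (the function and parameter names are assumptions for illustration, not from the patent):

```python
def build_relief_depths(surface_distances, datum_distance, photo_scale):
    """Compute the relief thickness (relief surface to relief bottom surface)
    for each sampled point of the surface to be processed.

    surface_distances: {sampling point: distance to the camera terminal}
    datum_distance:    distance from the camera terminal to the datum plane
    photo_scale:       scale of the photographed object in the relief photo
    """
    depths = {}
    for point, d in surface_distances.items():
        if d < datum_distance:                      # between datum plane and camera
            relative = datum_distance - d           # relative distance to datum plane
            depths[point] = relative * photo_scale  # equal-proportion scaling
    return depths
```

Points at or beyond the datum plane are excluded here; per the text, they belong to the planar picture on the relief bottom surface instead.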
The invention also provides a camera terminal, comprising an ultrasonic sensor, a camera, a memory, a processor, and a terminal imaging program stored in the memory and executable on the processor; when executed by the processor, the terminal imaging program implements the steps of the terminal imaging method provided by the invention.
The invention further provides a computer-readable storage medium storing a terminal imaging program; when executed by a processor, the terminal imaging program implements the steps of the terminal imaging method provided by the invention.
In the terminal imaging method, camera terminal and computer-readable storage medium proposed by the embodiments of the present invention, ultrasonic waves are used to measure the surface distance data between the surface of the photographed object and the camera terminal. On this basis, a relief picture can be drawn from these data, presenting a picture with a stereoscopic effect, such as a 3D picture. This solves the problem that existing terminal imaging can only produce planar images, and enhances the user experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing the embodiments of the present invention;
Fig. 2 is a schematic diagram of the surface distance between the surface of a photographed object and the camera terminal according to the present invention;
Fig. 3 is a flowchart of the first embodiment of the terminal imaging method of the present invention;
Fig. 4 is a flowchart of the second embodiment of the terminal imaging method of the present invention;
Fig. 5 is a schematic structural diagram of the first embodiment of the terminal of the present invention;
Fig. 6 is a schematic structural diagram of the second embodiment of the terminal of the present invention;
Fig. 7 is a first schematic diagram of photographed objects according to embodiments of the present invention;
Fig. 8 is a second schematic diagram of photographed objects according to embodiments of the present invention;
Fig. 9 is a third schematic diagram of photographed objects according to embodiments of the present invention;
Fig. 10 is a fourth schematic diagram of photographed objects according to embodiments of the present invention;
Fig. 11 is a schematic diagram of a relief picture according to embodiments of the present invention.
The realization, functional features and advantages of the present invention will be further described below with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
In the following description, suffixes such as "module", "part" or "unit" used to denote elements are adopted only to facilitate the description of the invention and carry no specific meaning of their own; therefore, "module", "part" and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will appreciate that, apart from elements used particularly for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of a fixed type.
Referring to Fig. 1, which is a schematic diagram of the hardware structure of a mobile terminal implementing the embodiments of the present invention, the mobile terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other parts. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not constitute a limitation on the mobile terminal; a mobile terminal may include more or fewer parts than those illustrated, combine some parts, or arrange the parts differently.
The parts of the mobile terminal are described in detail below with reference to Fig. 1:
The radio frequency unit 101 may be used for receiving and sending signals during messaging or a call; specifically, downlink information from a base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 may also communicate with networks and other devices by wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution) and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming video and so on, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it will be understood that it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
When the mobile terminal 100 is in a mode such as a call-signal reception mode, a call mode, a recording mode, a speech recognition mode or a broadcast reception mode, the audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (such as the camera of the present invention) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operational modes such as a call mode, a recording mode or a speech recognition mode, and can process such sound into audio data. In the case of the call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output. The microphone 1042 may implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as an optical sensor, a motion sensor, an ultrasonic sensor 1051 and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes), and can detect the magnitude and direction of gravity when static; it can be used for applications that recognize the posture of the phone (such as horizontal/vertical screen switching, related games, magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tapping). The ultrasonic sensor 1051 is a sensor developed using the characteristics of ultrasonic waves. Ultrasonic ranging generally uses the transit-time method: the distance of the measured object is calculated as s = v·t/2, where s is the distance between the transceiver and the measured object, v is the propagation speed of the ultrasonic wave in the medium (approximately v = 331.4·√(1 + T/273) m/s, T being the ambient temperature), and t is the round-trip time of the ultrasonic wave. The working principle is: the ultrasonic wave sent by the transmitter propagates in the air at speed v, is reflected back by the surface of the measured object when it reaches it, and is received by the receiver; its round-trip time is t, from which the distance s of the measured object is calculated. In applications requiring high measurement accuracy, the influence of the temperature T must be taken into account, but in general it can be compensated by software. Other sensors that can be configured on the phone, such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer and infrared sensor, will not be described here.
The display unit 106 is used to display information input by the user or provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information, and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects the user's touch operations on or near it (for example, operations performed on or near the touch panel 1071 with a finger, a stylus or any other suitable object or accessory) and drives the corresponding connecting devices according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as a volume control button or a power switch key), a trackball, a mouse and a joystick; no specific limitation is made here.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are two separate parts realizing the input and output functions of the mobile terminal, in certain embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions; no specific limitation is made here.
The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The interface unit 108 may be used to receive input (for example, data information or electric power) from an external device and transfer the received input to one or more elements in the mobile terminal 100, or may be used to transmit data between the mobile terminal 100 and an external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and applications required by at least one function (such as a sound playback function or an image playback function), and the data storage area may store data created according to the use of the phone (such as audio data or a phone book). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or another solid-state storage part.
The processor 110 is the control center of the mobile terminal. It connects the various parts of the whole mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and by calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications and so on, and the modem processor mainly handles wireless communication. It will be understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) for supplying power to the various parts. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging, discharging and power-consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which will not be described here.
Based on the above mobile terminal hardware structure, the embodiments of the present invention are proposed below.
As shown in Fig. 3, a first embodiment of the terminal imaging method of the present invention is proposed. In this embodiment, the terminal imaging method comprises the following steps:
S301: sending a first ultrasonic signal, and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal. In practice, this step is performed by the ultrasonic sensor 1051 in the camera terminal under the control of the processor 110; its specific implementation can use conventional techniques and is not described further here.
S302: determining, according to the first and second ultrasonic signals, the surface distance data between the surface of the photographed object and the camera terminal.
S303: generating a relief picture of the photographed object according to the surface distance data.
In practical applications, the shooting described in the present invention includes shooting video, taking photos, and so on.
For ease of understanding the definition of the surface distance of the present invention, Fig. 2 takes a simple cylinder as an example; in reality, the photographed object can be of any shape. In practical applications, the surface distance between the surface of the photographed object and the camera terminal refers to the distance from each point on the object's surface to the central plane of the camera of the camera terminal, as shown in Fig. 2:
In Fig. 2, the photographed object is a regular cylinder. The surface of the object A contains numerous points; the number of points actually sampled can be calculated from the definition of the relief picture set by the user: if the user requires a higher definition, many points need to be sampled, while if the required definition is lower, only a few points are sampled. This embodiment takes 10 sampling points as an example: the distance from sampling point A1 to the camera central plane l of the camera terminal (plane l passes through the camera center and is perpendicular to the optical axis) is d1, the distance from sampling point A2 to the camera central plane l is d2, and so on. The surface distance data between the surface of the photographed object and the camera terminal is then the set of distances from these sampling points to the camera central plane l, which can be stored in any form, for example the sample table shown in Table 1 below:
Sampled point | Distance | Sampled point | Distance |
1 | d1 | 6 | d6 |
2 | d2 | 7 | d7 |
3 | d3 | 8 | d8 |
4 | d4 | 9 | d9 |
5 | d5 | 10 | d10 |
Table 1
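A sample table like Table 1 can be represented directly as a mapping from sampling point to distance. A minimal Python sketch follows (the distance values are hypothetical, not from the patent):

```python
# Surface distance data in the form of Table 1: each sampling point on the
# object's surface maps to its distance to the camera central plane l
# (hypothetical values, in metres).
surface_distance_data = {
    1: 1.20, 2: 1.15, 3: 1.12, 4: 1.10, 5: 1.10,
    6: 1.11, 7: 1.13, 8: 1.18, 9: 1.25, 10: 1.30,
}

def nearest_sampling_point(data):
    """Return the sampling point closest to the camera central plane l."""
    return min(data, key=data.get)
```

Later steps (filtering to the photographed object, determining the datum plane) reduce to lookups and comparisons over such a mapping.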
In practical applications, there are two ways of determining the surface distance data of the photographed object from the first and second ultrasonic signals:
Mode one: send the first ultrasonic signal non-directionally; calculate, from the first and second ultrasonic signals, the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range; determine the photographed object according to the shooting direction of the terminal's camera; and filter the surface distance data of all objects to obtain the surface distance data of the photographed object. In this mode, the ultrasonic sensor can be turned on for ranging as soon as the user opens the shooting application. Since, at a fixed position, the perceivable range of the camera terminal is fixed, the photographed object can only be selected from the objects within this range; this mode obtains the surface distance data of all objects before the user selects the photographed object, so that subsequent shooting only requires a simple filtering based on the selected object. It is simple to realize; in particular, when the user frequently changes the photographed object, the camera terminal only needs to measure once with the ultrasonic sensor, avoiding the unnecessary power consumption of starting the ultrasonic sensor repeatedly.
Mode two: determine the photographed object according to the shooting direction of the terminal's camera; send the first ultrasonic signal directionally toward the photographed object; and calculate the surface distance data of the photographed object from the first and second ultrasonic signals. This mode measures the object selected by the user in a targeted way, so the measuring time is short; it is mainly used when the terminal's battery is low, or when the user will not frequently change the photographed object.
In practical applications, either of the two modes can be chosen actively by the user, chosen by the camera terminal by default, or chosen by the camera terminal according to the remaining battery: for example, mode one is selected when the remaining battery is above 60% to avoid repeated starts of the ultrasonic sensor, and mode two is selected when the remaining battery is below 60% to reduce the single working time of the ultrasonic sensor. The mode can also be selected according to the definition set by the user: when the user requires a higher definition, mode one is used to avoid the repeated starts of the ultrasonic sensor caused by frequent switching of the photographed object, and so on.
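The selection rules above can be sketched as follows (the 60% threshold comes from the example in the text; the function name and signature are illustrative assumptions, not part of the patent):

```python
def choose_ranging_mode(battery_percent, high_definition=False):
    """Choose between mode one (non-directional, measure once) and mode two
    (directional, targeted measurement), following the example rules above."""
    if high_definition:
        # High required definition: prefer mode one, so frequent switching of
        # the photographed object does not restart the ultrasonic sensor.
        return 1
    # Above 60% battery: mode one avoids repeated sensor starts;
    # otherwise: mode two shortens the sensor's single working time.
    return 1 if battery_percent > 60 else 2
```

A real terminal would also honor an explicit user choice or a terminal default before falling back to this rule.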
In practical applications, generating the relief picture of the photographed object according to its surface distance data comprises: determining the datum plane of the relief picture according to the surface distance data; taking the part of the object's surface lying between the datum plane and the camera terminal as the surface to be processed, i.e., the part of the surface requiring relief processing; filtering the surface distance data to obtain the surface distance data of the surface to be processed; and building the relief picture according to the datum plane and the surface distance data of the surface to be processed.
In practical applications, the datum plane of the present invention refers to the plane m parallel to plane l and passing through the critical point. The critical point is defined as follows: looking from the camera terminal toward the photographed object, all sampling points before the critical point are visible, while all sampling points after it are not. For example, in Fig. 2, looking from the camera terminal toward the photographed object, sampling points 1 to 7 can be seen while sampling points 8 to 10 cannot; this change begins at sampling point 1, so sampling point 1 is the critical point.
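Under the simplifying assumption that the critical point is the farthest visible sampling point (consistent with the datum plane of Fig. 7 passing through sampling point 1), the split into datum plane and surface to be processed can be sketched as:

```python
def split_surface(data, visible_points):
    """Determine the datum plane distance (through the critical point, taken
    here as the farthest visible sampling point) and the surface to be
    processed (points between the datum plane and the camera terminal).

    data:           {sampling point: distance to the camera central plane l}
    visible_points: sampling points visible from the camera terminal
    """
    datum_distance = max(data[p] for p in visible_points)
    to_process = {p: d for p, d in data.items() if d <= datum_distance}
    return datum_distance, to_process
```

This is an illustrative reading of the text, not the patent's own algorithm; visibility itself would come from the camera geometry.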
Specifically, as shown in Fig. 7, according to the surface distance data of object A, the datum plane is determined to be the plane a passing through sampling point 1, and the surface to be processed, i.e., the part requiring relief processing, is the part of the surface where sampling points 1 to 7 are located (the part drawn with a thick solid line in Fig. 7). The relief picture shown in Fig. 11 is then built according to the datum plane and the surface distance data of the surface to be processed.
Fig. 7 exemplarily shows the case of one photographed object; in practical applications there can be arbitrarily many. In that case, determining the datum plane of the relief picture according to the surface distance data of the photographed objects comprises: determining the focus object among the photographed objects according to a selection operation of the user; filtering the surface distance data of the photographed objects to obtain the surface distance data of the focus object; and determining the datum plane of the relief picture according to the surface distance data of the focus object.
In practical applications, as shown in Fig. 8, there are two photographed objects, denoted A and B respectively, and object B is selected by the user as the focus object (the object requiring treatment such as focused display). The surface distance data of the photographed objects is then filtered to obtain the surface distance data of the focus object B, from which the datum plane b of the focus object B is determined (the determination method is the same as in the embodiment of Fig. 7 and is not repeated); the datum plane b is taken as the datum plane of the relief picture.
Fig. 8 exemplarily shows the case of one focus object; in practical applications there can be arbitrarily many. In that case, determining the datum plane of the relief picture according to the surface distance data of the focus objects comprises: judging whether there is more than one focus object; if so, filtering the surface distance data of the photographed objects to obtain the surface distance data of each focus object, determining for each focus object a matching datum plane according to its surface distance data, and determining the datum plane of the relief picture from the matched datum planes and a preset rule; if not, directly taking the datum plane matched to the single focus object as the datum plane of the relief picture.
When there is no more than one focus object, the embodiment shown in Fig. 7 or Fig. 8 applies and is not repeated.
When the quantity of object of focus is more than for the moment, as shown in Fig. 9 or Figure 10:
There are four subjects, denoted subject A, subject B, subject C, and subject D, of which subject B and subject C are selected by the user as focus objects. The datum plane of the embossed picture must therefore be determined from the datum planes matching the focus objects and a preset rule.
When there are multiple focus objects, their datum planes fall into two cases: as shown in Fig. 9, the datum planes of all focus objects lie in the same plane; or, as shown in Fig. 10, the datum plane of at least one focus object does not lie in the same plane as the others. In practice, determining the datum plane of the embossed picture from the datum planes matching the focus objects therefore includes: judging whether the matched datum planes lie in the same plane; if not, taking the matched datum plane nearest to the camera terminal as the datum plane of the embossed picture.
Specifically, as shown in Fig. 9, datum plane b of subject B and datum plane c of subject C lie in the same plane, so either one may be selected as the datum plane of the embossed picture.
Specifically, as shown in Fig. 10, datum plane b of subject B and datum plane c of subject C do not lie in the same plane, so the matched datum plane nearest to the camera terminal must be taken as the datum plane of the embossed picture. The distance from datum plane b to plane l is L1 and the distance from datum plane c to plane l is L2; since L1 is greater than L2, datum plane c is the datum plane nearest to the camera terminal, and datum plane c is taken as the datum plane of the embossed picture.
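The nearest-plane rule can be sketched as follows, assuming each matched datum plane is represented by its distance from the camera terminal (the quantity L1 and L2 denote in Fig. 10):

```python
def picture_datum(focus_planes):
    """focus_planes: {focus_object: datum-plane distance from the camera terminal}.
    The plane with the smallest distance is nearest to the terminal and becomes
    the datum plane of the embossed picture; when all distances coincide, any
    entry gives the same result."""
    return min(focus_planes.values())
```

With the Fig. 10 values (L1 for plane b greater than L2 for plane c), plane c is selected, matching the text.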
In practice, building the embossed picture from the datum plane and the surface distance data of the surface to be processed, as described in the present invention, includes: calculating, from the surface distance data of the surface to be processed, its relative distance to the datum plane; scaling that relative distance proportionally, using the subject's scaling ratio in the embossed photo, to determine the relief thickness from the relief face to the relief base in the embossed picture; generating the relief face from the relief thickness and the surface to be processed; taking the part of the subject's surface at or beyond the datum plane, away from the camera terminal, as a flat picture on the relief base; and merging the relief face and the relief base to generate the embossed picture.
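The scaling and thickness computation can be sketched as follows. This is an illustrative fragment with hypothetical names; the scaling ratio is assumed to be known from the photo geometry, which the patent does not detail:

```python
def build_relief(datum, surface_distances, scale):
    """datum: datum-plane distance from the terminal; surface_distances:
    terminal-to-surface distance per sample of the surface to be processed;
    scale: the subject's scaling ratio in the embossed photo (assumed known).
    Returns per-sample relief heights above the base and the relief thickness."""
    relative = [datum - d for d in surface_distances]   # relative distance of each sample to the datum plane
    heights = [r * scale for r in relative]             # proportional scaling
    thickness = max(heights)                            # relief face to relief base
    return heights, thickness
```

Samples on the datum plane get height 0 and fall into the flat base; the maximum scaled height is the relief thickness.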
Specifically, as shown in Fig. 11, the embossed picture consists of a relief face m1 and a relief base m2. The relief face m1 is the proportionally scaled-down surface to be processed, and the relief base m2 is the part of the subjects' surfaces at or beyond the datum plane, away from the camera terminal. In Fig. 11, bold black lines indicate the imaging of subjects, black lines without bolding indicate the imaging of non-subjects, and the raised portion represents the stereoscopic image; the figure corresponds to the embodiment shown in Fig. 10.
In summary, the terminal imaging method proposed by the embodiment of the present invention measures, by ultrasound, the surface distance data between the surface of the subject and the camera terminal. On this basis an embossed picture can be drawn from these data to present a stereoscopic image, such as a 3D picture, solving the problem that existing terminal imaging can only produce flat images and enhancing the user experience.
As shown in Fig. 4, a second embodiment of the terminal imaging method of the present invention is proposed. In this embodiment, the terminal imaging method comprises the following steps:
S401: Set the operating mode of the ultrasonic sensor;
In this embodiment, the ultrasonic sensor operates in the second mode described above (determine the subject according to the shooting direction of the camera of the camera terminal, directionally send the first ultrasonic signal toward the subject, and calculate the surface distance data of the subject from the first and second ultrasonic signals).
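The distance calculation underlying both modes is standard pulse-echo ranging: the first signal is emitted, the second (reflected) signal is received, and the one-way distance is half the round-trip time multiplied by the speed of sound. A minimal sketch, assuming sound travels at roughly 343 m/s in air at about 20 °C (a constant the patent does not state):

```python
def surface_distance(round_trip_s, speed_of_sound=343.0):
    """round_trip_s: time from emitting the first ultrasonic signal to
    receiving the second (reflected) signal, in seconds.
    Returns the one-way terminal-to-surface distance in metres."""
    return speed_of_sound * round_trip_s / 2.0
```

For example, a 10 ms round trip corresponds to about 1.7 m between the terminal and the subject's surface.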
S402: The terminal user opens the photo application and selects the focus objects;
In this embodiment, the subjects are as shown in Fig. 10: there are four subjects, denoted subject A, subject B, subject C, and subject D, of which subject B and subject C are selected by the user as focus objects.
S403: Measure the surface distance data of the subjects with the ultrasonic sensor;
In this embodiment, the surface distance data of all subjects are measured simultaneously by the ultrasonic sensor;
S404: Generate the embossed picture from the surface distance data of the subjects;
In this embodiment, the specific processing has been described in detail above and is not repeated here; this step generates the embossed picture shown in Fig. 11.
This embodiment proposes a terminal imaging method that measures, by ultrasound, the surface distance data between the surface of the subject and the camera terminal. On this basis an embossed picture can be drawn from these data to present a stereoscopic image, such as a 3D picture, solving the problem that existing terminal imaging can only produce flat images and enhancing the user experience.
As shown in Fig. 5, based on the mobile terminal hardware structure and communication system described above, an embodiment of the terminal of the present invention is proposed. Specifically, the terminal provided by the present invention includes:
an acquisition module 51 for sending out a first ultrasonic signal through the ultrasonic sensor and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal; in practice this step is implemented by the ultrasonic sensor 1051 in the camera terminal device under the control of the processor 110, and the acquisition module 51 can be implemented with conventional techniques, which this application does not repeat;
a computing module 52 for determining, from the first and second ultrasonic signals, the surface distance data between the surface of the subject and the camera terminal;
a processing module 53 for generating the embossed picture of the subject from the surface distance data of the subject.
In practice, the acquisition module 51 can work as follows:
Mode one: send the first ultrasonic signal non-directionally; from the first and second ultrasonic signals, calculate the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range; determine the subject according to the shooting direction of the camera of the camera terminal; and screen the surface distance data of all objects to determine the surface distance data of the subject. In this mode the ultrasonic sensor can be turned on for ranging as soon as the user opens the shooting application. Because the sensing range of the camera terminal is fixed at a fixed position, the subject can only be selected from the objects within that range; this mode obtains the surface distance data of all objects before the terminal user selects a subject, so subsequent shooting only requires a simple screening based on the selected subject. It is simple to implement, and when the user frequently changes subjects the camera terminal needs to measure only once with the ultrasonic sensor, avoiding the unnecessary power consumption of starting the sensor repeatedly.
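The screening by shooting direction in mode one can be sketched as follows. This is a simplified planar geometry assumed for illustration, with hypothetical names; the patent does not specify how direction is represented:

```python
def in_camera_view(objects, camera_azimuth_deg, half_fov_deg):
    """objects: {name: bearing of the object from the terminal, in degrees}.
    Keep the objects whose bearing falls within the camera's field of view,
    i.e. within half_fov_deg of the camera's shooting azimuth."""
    selected = {}
    for name, bearing in objects.items():
        # smallest angular difference, handling wrap-around at 360 degrees
        diff = abs((bearing - camera_azimuth_deg + 180.0) % 360.0 - 180.0)
        if diff <= half_fov_deg:
            selected[name] = bearing
    return selected
```

Only the objects that survive this screening are treated as subjects whose ranging data feed the later steps.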
Mode two: determine the subject according to the shooting direction of the camera of the camera terminal, directionally send the first ultrasonic signal toward the subject, and calculate the surface distance data of the subject from the first and second ultrasonic signals. This mode measures the user-selected subject specifically, so the measurement time is short; it is mainly used when terminal power is low or when the user does not change subjects frequently.
In practice, the choice between the two modes can be made actively by the terminal user, by default by the camera terminal, or by the camera terminal according to battery level: for example, selecting mode one when the remaining battery exceeds 60% to avoid starting the ultrasonic sensor repeatedly, and mode two when it falls below 60% to shorten each working period of the sensor. The choice can also follow the definition set by the user: when the user requires high definition, mode one is used to avoid the repeated sensor starts caused by frequent subject switching, and so on.
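The battery-based selection can be sketched as follows. The 60% threshold mirrors the example in the text; the behaviour exactly at the threshold is left unspecified there, so the boundary below is an arbitrary choice:

```python
def choose_ranging_mode(battery_pct, threshold=60.0):
    """Mode 1: non-directional, measure everything once (ample power,
    subjects change often). Mode 2: directional, per-subject measurement
    with a short sensor on-time (low power)."""
    return 1 if battery_pct > threshold else 2
```

A terminal could evaluate this on each launch of the shooting application, alongside the user-override and definition-based policies mentioned above.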
In practice, the processing module 53 can work as follows: determine the datum plane of the embossed picture from the surface distance data of the subject; take the part of the subject's surface between the datum plane and the camera terminal as the surface to be processed for relief processing; screen the surface distance data of the subject to determine the surface distance data of the surface to be processed; and build the embossed picture from the datum plane and the surface distance data of the surface to be processed.
Specifically, as shown in Fig. 7, according to the surface distance data of subject A, the datum plane is determined to be the plane a passing through sampled point 1, and the surface to be processed for relief processing is the portion of the surface spanning sampled points 1 to 7 (the portion drawn with a bold solid line in Fig. 7). The embossed picture shown in Fig. 11 is then built from the datum plane and the surface distance data of the surface to be processed.
Fig. 7 illustrates the case of a single subject. In practice there may be any number of subjects. In that case, the processing module 53 can work as follows: determine the focus object among the subjects according to a selection operation of the terminal user; screen the surface distance data of the subjects to determine the surface distance data of the focus object; and determine the datum plane of the embossed picture from the surface distance data of the focus object.
In practice, as shown in Fig. 8, there are two subjects, denoted subject A and subject B, of which subject B is selected by the user as the focus object (the object to be emphasized, e.g. by focusing). The surface distance data of the subjects are screened to obtain the surface distance data of focus object B, the datum plane b of focus object B is determined (by the same method as in the embodiment of Fig. 7, not repeated here), and datum plane b is taken as the datum plane of the embossed picture.
Fig. 8 illustrates the case of a single focus object. In practice there may be any number of focus objects. In that case, the processing module 53 can work as follows: judge whether the number of focus objects exceeds one; if so, screen the surface distance data of the subjects to determine the surface distance data of each focus object separately, determine from those data the datum plane matching each focus object, and determine the datum plane of the embossed picture from the matched datum planes and a preset rule; if not, directly take the datum plane matching the single focus object as the datum plane of the embossed picture.
When the number of focus objects does not exceed one, the embodiments of Fig. 7 or Fig. 8 apply and are not repeated here.
When the number of focus objects exceeds one, refer to Fig. 9 or Fig. 10:
There are four subjects, denoted subject A, subject B, subject C, and subject D, of which subject B and subject C are selected by the user as focus objects. The datum plane of the embossed picture must therefore be determined from the datum planes matching the focus objects and a preset rule.
When there are multiple focus objects, their datum planes fall into two cases: as shown in Fig. 9, the datum planes of all focus objects lie in the same plane; or, as shown in Fig. 10, the datum plane of at least one focus object does not lie in the same plane as the others. In practice, the processing module 53 can therefore work as follows: judge whether the datum planes matching the focus objects lie in the same plane; if not, take the matched datum plane nearest to the camera terminal as the datum plane of the embossed picture.
Specifically, as shown in Fig. 9, datum plane b of subject B and datum plane c of subject C lie in the same plane, so either one may be selected as the datum plane of the embossed picture.
Specifically, as shown in Fig. 10, datum plane b of subject B and datum plane c of subject C do not lie in the same plane, so the matched datum plane nearest to the camera terminal must be taken as the datum plane of the embossed picture. The distance from datum plane b to plane l is L1 and the distance from datum plane c to plane l is L2; since L1 is greater than L2, datum plane c is the datum plane nearest to the camera terminal, and datum plane c is taken as the datum plane of the embossed picture.
In practice, the processing module 53 can work as follows: calculate, from the surface distance data of the surface to be processed, its relative distance to the datum plane; scale that relative distance proportionally, using the subject's scaling ratio in the embossed photo, to determine the relief thickness from the relief face to the relief base in the embossed picture; generate the relief face from the relief thickness and the surface to be processed; take the part of the subject's surface at or beyond the datum plane, away from the camera terminal, as a flat picture on the relief base; and merge the relief face and the relief base to generate the embossed picture.
Specifically, as shown in Fig. 11, the embossed picture consists of a relief face m1 and a relief base m2. The relief face m1 is the proportionally scaled-down surface to be processed, and the relief base m2 is the part of the subjects' surfaces at or beyond the datum plane, away from the camera terminal. In Fig. 11, bold black lines indicate the imaging of subjects, black lines without bolding indicate the imaging of non-subjects, and the raised portion represents the stereoscopic image; the figure corresponds to the embodiment shown in Fig. 10.
In summary, this embodiment proposes a terminal that measures, by ultrasound, the surface distance data between the surface of the subject and the camera terminal. On this basis an embossed picture can be drawn from these data to present a stereoscopic image, such as a 3D picture, solving the problem that existing terminal imaging can only produce flat images and enhancing the user experience.
In an embodiment of the present invention, the processor 110 in Fig. 1 can include the functions of all the modules of the embodiment shown in Fig. 5. In that case the above embodiment becomes:
First, the processor 110 sends out the first ultrasonic signal through the ultrasonic sensor 1051 and receives the second ultrasonic signal formed by reflection of the first ultrasonic signal;
Then, the processor 110 determines, from the first and second ultrasonic signals, the surface distance data between the surface of the subject and the camera terminal;
Finally, the processor 110 generates the embossed picture of the subject from the surface distance data of the subject.
In practice, the processor 110 is configured to: send the first ultrasonic signal non-directionally, calculate from the first and second ultrasonic signals the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range, determine the subject according to the shooting direction of the camera of the camera terminal, and screen the surface distance data of all objects to determine the surface distance data of the subject; or determine the subject according to the shooting direction of the camera of the camera terminal, directionally send the first ultrasonic signal toward the subject, and calculate the surface distance data of the subject from the first and second ultrasonic signals.
In practice, the processor 110 is configured to: determine the datum plane of the embossed picture from the surface distance data of the subject; take the part of the subject's surface between the datum plane and the camera terminal as the surface to be processed for relief processing; screen the surface distance data of the subject to determine the surface distance data of the surface to be processed; and build the embossed picture from the datum plane and the surface distance data of the surface to be processed.
In practice, the processor 110 is configured to: calculate, from the surface distance data of the surface to be processed, its relative distance to the datum plane; scale that relative distance proportionally, using the subject's scaling ratio in the embossed photo, to determine the relief thickness from the relief face to the relief base in the embossed picture; generate the relief face from the relief thickness and the surface to be processed; take the part of the subject's surface at or beyond the datum plane, away from the camera terminal, as a flat picture on the relief base; and merge the relief face and the relief base to generate the embossed picture.
In summary, this embodiment proposes a terminal that measures, by ultrasound, the surface distance data between the surface of the subject and the camera terminal. On this basis an embossed picture can be drawn from these data to present a stereoscopic image, such as a 3D picture, solving the problem that existing terminal imaging can only produce flat images and enhancing the user experience.
As shown in Fig. 6, a second embodiment of the terminal of the present invention is proposed. In this embodiment the terminal comprises at least: an input/output (IO) bus 61, a processor 62, a memory 63, an internal memory 64, a camera 65, an ultrasonic sensor 66, and a terminal imaging program stored on the memory 63 and runnable on the processor 62; when executed by the processor, the terminal imaging program implements the steps below. Here,
the input/output (IO) bus 61 is connected to the other parts of the terminal to which it belongs (the processor 62, memory 63, internal memory 64, camera 65, and ultrasonic sensor 66) and provides transmission lines for them.
The processor 62 generally controls the overall operation of the terminal to which it belongs; for example, it performs operations such as calculation and confirmation. The processor 62 may be a central processing unit (CPU).
The memory 63 stores processor-readable, processor-executable software code containing the instructions for controlling the processor 62 to perform the functions described herein (i.e., software-implemented functions). In this embodiment, the memory 63 at least stores the programs the processor 62 needs in order to perform the functions above.
In the terminal provided by the present invention, the software code implementing the functions of all the modules of Fig. 5 can be stored in the memory 63 and executed, or compiled and then executed, by the processor 62.
The internal memory 64 is typically implemented with semiconductor memory cells, including random access memory (RAM), read-only memory (ROM), and cache (CACHE), of which RAM is the most important. The internal memory 64 is one of the important parts of a computer and the bridge to the CPU: all programs in a computer run in internal memory. It temporarily stores the CPU's operational data and the data exchanged with external storage such as hard disks. Whenever the computer is running, the CPU transfers the data to be operated on into internal memory and sends out the results after the operation completes; the operation of internal memory thus also determines the stable operation of the computer.
The camera 65 is used for shooting and transfers the result to the processor 62.
The ultrasonic sensor 66 is used for measuring distance.
On the basis of the terminal components shown in Fig. 6, the terminal imaging program provided by this embodiment, when executed by the processor, implements the following steps:
sending out a first ultrasonic signal through the ultrasonic sensor 66 and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal;
determining, from the first and second ultrasonic signals, the surface distance data between the surface of the subject and the camera terminal;
generating the embossed picture of the subject from the surface distance data of the subject.
In practice, when executed by the processor, the terminal imaging program implements the steps of: sending the first ultrasonic signal non-directionally, calculating from the first and second ultrasonic signals the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range, determining the subject according to the shooting direction of the camera of the camera terminal, and screening the surface distance data of all objects to determine the surface distance data of the subject; or determining the subject according to the shooting direction of the camera of the camera terminal, directionally sending the first ultrasonic signal toward the subject, and calculating the surface distance data of the subject from the first and second ultrasonic signals.
In practice, when executed by the processor, the terminal imaging program implements the steps of: determining the datum plane of the embossed picture from the surface distance data of the subject, taking the part of the subject's surface between the datum plane and the camera terminal as the surface to be processed for relief processing, screening the surface distance data of the subject to determine the surface distance data of the surface to be processed, and building the embossed picture from the datum plane and the surface distance data of the surface to be processed.
In practice, when executed by the processor, the terminal imaging program implements the steps of: determining the focus object among the subjects according to a selection operation of the terminal user, screening the surface distance data of the subjects to determine the surface distance data of the focus object, and determining the datum plane of the embossed picture from the surface distance data of the focus object.
In practice, when executed by the processor, the terminal imaging program implements the steps of: judging whether the number of focus objects exceeds one; if so, screening the surface distance data of the subjects to determine the surface distance data of each focus object separately, determining from those data the datum plane matching each focus object, and determining the datum plane of the embossed picture from the matched datum planes.
In practice, when executed by the processor, the terminal imaging program implements the steps of: judging whether the datum planes matching the focus objects lie in the same plane; if not, taking the matched datum plane nearest to the camera terminal as the datum plane of the embossed picture.
In practice, when executed by the processor, the terminal imaging program implements the steps of: calculating, from the surface distance data of the surface to be processed, its relative distance to the datum plane; scaling that relative distance proportionally, using the subject's scaling ratio in the embossed photo, to determine the relief thickness from the relief face to the relief base in the embossed picture; generating the relief face from the relief thickness and the surface to be processed; taking the part of the subject's surface at or beyond the datum plane, away from the camera terminal, as a flat picture on the relief base; and merging the relief face and the relief base to generate the embossed picture.
In summary, this embodiment proposes a terminal that measures, by ultrasound, the surface distance data between the surface of the subject and the camera terminal. On this basis an embossed picture can be drawn from these data to present a stereoscopic image, such as a 3D picture, solving the problem that existing terminal imaging can only produce flat images and enhancing the user experience.
Meanwhile, the present invention provides a computer-readable storage medium storing a terminal imaging program that, when executed by a processor, implements the following steps:
sending out a first ultrasonic signal through the ultrasonic sensor and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal;
determining, from the first and second ultrasonic signals, the surface distance data between the surface of the subject and the camera terminal;
generating the embossed picture of the subject from the surface distance data of the subject.
In practice, when executed, the terminal imaging program implements the steps of:
sending the first ultrasonic signal non-directionally, calculating from the first and second ultrasonic signals the surface distance data between the camera terminal and the surfaces of all objects within the terminal's sensing range, determining the subject according to the shooting direction of the camera of the camera terminal, and screening the surface distance data of all objects to determine the surface distance data of the subject;
or determining the subject according to the shooting direction of the camera of the camera terminal, directionally sending the first ultrasonic signal toward the subject, and calculating the surface distance data of the subject from the first and second ultrasonic signals.
In practice, when executed, the terminal imaging program implements the steps of:
determining the datum plane of the embossed picture from the surface distance data of the subject, taking the part of the subject's surface between the datum plane and the camera terminal as the surface to be processed for relief processing, screening the surface distance data of the subject to determine the surface distance data of the surface to be processed, and building the embossed picture from the datum plane and the surface distance data of the surface to be processed.
In practice, when executed, the terminal imaging program implements the steps of:
determining the focus object among the subjects according to a selection operation of the terminal user, screening the surface distance data of the subjects to determine the surface distance data of the focus object, and determining the datum plane of the embossed picture from the surface distance data of the focus object.
In practice, when executed, the terminal imaging program implements the steps of:
judging whether the number of focus objects exceeds one; if so, screening the surface distance data of the subjects to determine the surface distance data of each focus object separately, determining from those data the datum plane matching each focus object, and determining the datum plane of the embossed picture from the matched datum planes.
In practice, when executed, the terminal imaging program implements the steps of:
judging whether the datum planes matching the focus objects lie in the same plane; if not, taking the matched datum plane nearest to the camera terminal as the datum plane of the embossed picture.
In practical applications, the following steps are carried out when the terminal imaging program is executed:
from the surface distance data of the surface to be processed, the relative distance from the surface to be processed to the datum plane is calculated; using the scaling ratio of the subject in the embossment photo, the relative distance of the surface to be processed is scaled in equal proportion to determine the embossment depth from the relief surface to the embossment bottom in the embossment picture; the relief surface is generated from the embossment depth and the surface to be processed; the part of the subject's surface located at the datum plane or on its far side from the camera terminal is taken as the plane image on the embossment bottom; and the relief surface and the embossment bottom are fused to generate the embossment picture.
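The relief-depth computation above can be sketched as follows. Each point of the surface to be processed lies between the camera terminal and the datum plane; its relative distance to the datum plane, scaled by the same ratio used to scale the subject into the embossment photo, gives the embossment depth. Function and parameter names are assumptions.

```python
# Hypothetical sketch of the equal-proportion scaling step. Points at or
# behind the datum plane are excluded: per the text, they belong to the
# plane image on the embossment bottom, not to the relief surface.

def relief_depths(surface_distances, datum_plane, photo_scale):
    depths = {}
    for point, d in surface_distances.items():
        relative = datum_plane - d       # point-to-datum-plane distance
        if relative <= 0:
            continue                     # at/behind the datum plane
        depths[point] = relative * photo_scale  # equal-proportion scaling
    return depths
```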
In summary, the storage medium proposed in this embodiment stores a program that, when run, causes the terminal to measure, by ultrasonic waves, the surface distance data between the surface of the subject and the camera terminal. On this basis, an embossment picture can be drawn from these data, presenting a picture with a stereoscopic effect, such as a 3D picture. This solves the problem that existing terminal imaging can only achieve planar imaging, and enhances the user experience.
The terminal imaging method, camera terminal and computer-readable storage medium proposed in the embodiments of the present invention measure, by ultrasonic waves, the surface distance data between the surface of the subject and the camera terminal. On this basis, an embossment picture can be drawn from these data, presenting a picture with a stereoscopic effect, such as a 3D picture. This solves the problem that existing terminal imaging can only achieve planar imaging, and enhances the user experience.
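The ultrasonic measurement summarized above reduces to the standard time-of-flight relation: the delay between emitting the first signal and receiving the reflected second signal covers the round trip, so the one-way surface distance is d = v·t/2. A minimal sketch, assuming propagation in air at roughly 20 °C (the speed value is an assumption, not from the patent):

```python
SPEED_OF_SOUND_AIR = 343.0  # m/s at ~20 degC; varies with temperature

def surface_distance(round_trip_seconds, speed=SPEED_OF_SOUND_AIR):
    """One-way distance from the camera terminal to the reflecting surface,
    given the delay between the first (emitted) and second (reflected)
    ultrasonic signals."""
    return speed * round_trip_seconds / 2.0
```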
It should be noted that, herein, the terms "comprising", "including" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements, but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes that element.
The numbering of the embodiments of the present invention is for description only and does not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, magnetic disk or optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to perform the methods of the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the scope of protection of the present invention.
Claims (10)
- 1. A terminal imaging method, characterized by comprising: emitting a first ultrasonic signal, and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal; determining, from the first ultrasonic signal and the second ultrasonic signal, surface distance data between the surface of a subject and a camera terminal; and generating an embossment picture of the subject from the surface distance data of the subject.
- 2. The terminal imaging method according to claim 1, wherein determining the surface distance data of the subject from the first ultrasonic signal and the second ultrasonic signal comprises: emitting the first ultrasonic signal non-directionally; calculating, from the first ultrasonic signal and the second ultrasonic signal, surface distance data between the camera terminal and the surfaces of all objects within the sensing range of the camera terminal; determining the subject according to the shooting direction of the camera of the camera terminal; and filtering the surface distance data of all the objects to determine the surface distance data of the subject.
- 3. The terminal imaging method according to claim 1, wherein determining the surface distance data between the surface of the subject and the camera terminal from the first ultrasonic signal and the second ultrasonic signal comprises: determining the subject according to the shooting direction of the camera of the camera terminal; emitting the first ultrasonic signal directionally toward the subject; and calculating the surface distance data of the subject from the first ultrasonic signal and the second ultrasonic signal.
- 4. The terminal imaging method according to any one of claims 1 to 3, wherein generating the embossment picture of the subject from the surface distance data of the subject comprises: determining a datum plane of the embossment picture from the surface distance data of the subject; taking the part of the surface of the subject located between the datum plane and the camera terminal as the surface to be processed, i.e. the part of the subject's surface requiring relief processing; filtering the surface distance data of the subject to determine the surface distance data of the surface to be processed; and constructing the embossment picture from the datum plane and the surface distance data of the surface to be processed.
- 5. The terminal imaging method according to claim 4, wherein determining the datum plane of the embossment picture from the surface distance data of the subject comprises: determining a focus object in the subject according to a selection operation by the terminal user; filtering the surface distance data of the subject to determine the surface distance data of the focus object; and determining the datum plane of the embossment picture from the surface distance data of the focus object.
- 6. The terminal imaging method according to claim 5, wherein determining the datum plane of the embossment picture from the surface distance data of the focus object comprises: judging whether the number of focus objects is greater than one; if so, filtering the surface distance data of the subject to determine the surface distance data of each focus object separately; determining, from the surface distance data of each focus object, a datum plane matching that focus object; and determining the datum plane of the embossment picture from the datum planes matched to the focus objects.
- 7. The terminal imaging method according to claim 6, wherein determining the datum plane of the embossment picture from the datum planes matched to the focus objects comprises: judging whether the datum planes matched to the focus objects lie in the same plane; and if not, taking the datum plane nearest the camera terminal among the matched datum planes as the datum plane of the embossment picture.
- 8. The terminal imaging method according to claim 4, wherein constructing the embossment picture from the datum plane and the surface distance data of the surface to be processed comprises: calculating, from the surface distance data of the surface to be processed, the relative distance from the surface to be processed to the datum plane; scaling the relative distance of the surface to be processed in equal proportion, using the scaling ratio of the subject in the embossment photo, to determine the embossment depth from the relief surface to the embossment bottom in the embossment picture; generating the relief surface from the embossment depth and the surface to be processed; taking the part of the surface of the subject located at the datum plane or on its far side from the camera terminal as the plane image on the embossment bottom; and fusing the relief surface and the embossment bottom to generate the embossment picture.
- 9. A camera terminal, characterized in that the camera terminal comprises: an ultrasonic sensor, a camera, a memory, a processor, and a terminal imaging program stored on the memory and runnable on the processor, wherein the terminal imaging program, when executed by the processor, implements the steps of the terminal imaging method according to any one of claims 1 to 8.
- 10. A computer-readable storage medium, characterized in that a terminal imaging program is stored on the computer-readable storage medium, and the terminal imaging program, when executed, implements the steps of the terminal imaging method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710527599.8A CN107505619A (en) | 2017-06-30 | 2017-06-30 | A kind of terminal imaging method, camera shooting terminal and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107505619A true CN107505619A (en) | 2017-12-22 |
Family
ID=60679607
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101008571A (en) * | 2007-01-29 | 2007-08-01 | 中南大学 | Three-dimensional environment perception method for mobile robot |
CN101806899A (en) * | 2010-05-26 | 2010-08-18 | 哈尔滨工业大学 | Striped tube UV laser imaging radar system for carrying out four-dimensional imaging on terrain and imaging method thereof |
CN101881830A (en) * | 2010-03-15 | 2010-11-10 | 中国电子科技集团公司第十研究所 | Method for reconstructing radar scanning data to generate three-dimensional visual terrain |
CN103047969A (en) * | 2012-12-07 | 2013-04-17 | 北京百度网讯科技有限公司 | Method for generating three-dimensional image through mobile terminal and mobile terminal |
CN103109539A (en) * | 2010-06-28 | 2013-05-15 | Pnf有限公司 | System and method for displaying 3d images |
US20140320609A1 (en) * | 2009-05-20 | 2014-10-30 | Roger Stettner | 3-dimensional hybrid camera and production system |
WO2015166711A1 (en) * | 2014-05-02 | 2015-11-05 | Fujifilm Corporation | Distance-measurement device, distance-measurement method, and distance-measurement program |
KR20170031656A (en) * | 2014-07-08 | 2017-03-21 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for processing three-dimensional information using image |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108596666A (en) * | 2018-04-24 | 2018-09-28 | 重庆凯务电子商务有限公司 | Sales promotion system for glasses |
CN108596666B (en) * | 2018-04-24 | 2021-11-30 | 重庆艾里芸信息科技(集团)有限公司 | Promotion and sale system for glasses |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20171222 |