CN106646442A - Distance measurement method and terminal - Google Patents

Distance measurement method and terminal Download PDF

Info

Publication number
CN106646442A
Authority
CN
China
Prior art keywords
depth
target
terminal
preview image
map information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611119955.4A
Other languages
Chinese (zh)
Inventor
郭启凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201611119955.4A priority Critical patent/CN106646442A/en
Publication of CN106646442A publication Critical patent/CN106646442A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12 Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves

Abstract

An embodiment of the invention discloses a distance measurement method that can reduce measurement errors. The method comprises the following steps: in a first measurement mode, a first shooting instruction is received, and depth-of-field map information of the current preview image is acquired according to the first shooting instruction; a first selection instruction is received, and a first target and a second target are determined on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image; a first position difference between the first target and the second target on the current preview image is calculated according to the depth-of-field map information and a first preset calculation rule; and a second position difference between the first target and the second target is calculated according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.

Description

Distance measurement method and terminal
Technical field
The present invention relates to the field of electronic applications, and more particularly to a distance measurement method and terminal.
Background technology
As the service functions of intelligent terminals become increasingly rich, intelligent terminals are used more and more frequently in everyday life. Users can transmit information, obtain information, share resources and so on through an intelligent terminal; the mobile terminal has thus become inseparable from the user's life. In particular, a user can measure the distance between two positions by means of an intelligent terminal.
At present, a user can measure the distance from a start position to an end position by inputting the start position and the end position on the intelligent terminal.
However, when the start position and the end position are close together, the intelligent terminal may ignore the distance between them, causing a large measurement error.
Summary of the invention
To solve the above technical problem, embodiments of the present invention are expected to provide a distance measurement method and terminal that can reduce measurement error.
The technical scheme of the present invention is achieved as follows:
An embodiment of the present invention provides a distance measurement method applied to a terminal, the terminal being provided with dual cameras, the method comprising:
in a first measurement mode, receiving a first shooting instruction, and obtaining depth-of-field map information of a current preview image according to the first shooting instruction, the depth-of-field map information being the depth-of-field information, within the current depth-of-field range, of each pixel in the current preview image, and the first measurement mode being a mode for measuring the actual distance between two target positions on the current preview image;
receiving a first selection instruction, and determining a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image;
calculating a first position difference between the first target and the second target on the current preview image according to the depth-of-field map information and a first preset calculation rule;
calculating a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
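The steps above can be sketched in code. This is a minimal, hypothetical Python sketch: the patent leaves both "preset calculation rules" abstract, so the depth-map representation, the pixel/depth combination rule, and the `magnification` parameter are all illustrative assumptions, not the claimed implementation.

```python
import math

def measure_distance(depth_map, p1, p2, magnification):
    """Sketch of the claimed pipeline.

    depth_map     -- mapping of (x, y) pixel -> depth-of-field value
    p1, p2        -- pixel coordinates of the first and second targets
    magnification -- preset factor mapping the on-image difference to
                     the actual position difference (second preset rule)
    """
    # Selecting the targets happens in the UI; here p1 and p2 are given.
    d1 = depth_map[p1]  # first depth-of-field information
    d2 = depth_map[p2]  # second depth-of-field information
    # First preset rule (assumed): combine the pixel offset and the
    # depth offset into a first, on-image position difference.
    first_diff = math.hypot(p2[0] - p1[0], p2[1] - p1[1], d2 - d1)
    # Second preset rule: amplify by the preset magnification factor to
    # obtain the second (actual) position difference.
    return first_diff * magnification
```

With two targets at the same depth, the sketch reduces to scaling the pixel distance by the magnification factor.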
In the above method, the calculating of the first position difference between the first target and the second target on the current preview image according to the depth-of-field map information and the first preset calculation rule comprises:
obtaining first depth-of-field information of the first target according to the depth-of-field map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth-of-field map information;
obtaining second depth-of-field information of the second target according to the depth-of-field map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth-of-field map information;
calculating the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information and the first preset calculation rule.
In the above method, the calculating of the second position difference between the first target and the second target according to the first position difference and the second preset calculation rule comprises:
amplifying the first position difference by a preset magnification factor to obtain the second position difference.
In the above method, the obtaining of the depth-of-field map information of the current preview image comprises:
obtaining the depth-of-field map information by using the dual cameras.
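The patent does not specify how the dual cameras produce the depth-of-field map. A common approach for dual-camera systems — an assumption here, not the claimed method — is stereo disparity, where depth follows the pinhole relation Z = f·B/d:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of the same scene point
                    between the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def depth_map_from_disparities(focal_px, baseline_m, disparities):
    """Build a per-pixel depth map from a {pixel: disparity} mapping."""
    return {p: depth_from_disparity(focal_px, baseline_m, d)
            for p, d in disparities.items()}
```

For example, with a 700-pixel focal length and a 10 cm baseline, a 35-pixel disparity corresponds to a depth of 2 m; nearer points produce larger disparities.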
In the above method, before the receiving of the first shooting instruction in the first measurement mode, the method further comprises:
entering the first measurement mode based on a first operation of the user.
An embodiment of the present invention provides a terminal, the terminal comprising:
an acquisition module, configured to receive a first shooting instruction and obtain depth-of-field map information of a current preview image according to the first shooting instruction, the depth-of-field map information being the depth-of-field information, within the current depth-of-field range, of each pixel in the current preview image;
a determining module, configured to receive a first selection instruction and determine a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image;
a calculation module, configured to calculate a first position difference between the first target and the second target on the current preview image according to the depth-of-field map information and a first preset calculation rule;
the calculation module being further configured to calculate a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
In the above terminal, the acquisition module is further configured to obtain first depth-of-field information of the first target according to the depth-of-field map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth-of-field map information; and to obtain second depth-of-field information of the second target according to the depth-of-field map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth-of-field map information;
the calculation module being specifically configured to calculate the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information and the first preset calculation rule.
In the above terminal, the calculation module is further configured to amplify the first position difference by a preset magnification factor to obtain the second position difference.
In the above terminal, the acquisition module is specifically configured to obtain the depth-of-field map information by using the dual cameras.
In the above terminal, the terminal further comprises an entering module;
the entering module being configured to enter the first measurement mode based on a first operation of the user.
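The module structure described above can be sketched as a class whose methods mirror the claimed modules. All names are hypothetical, and the calculation rules are the same illustrative assumptions as before; the patent defines the modules functionally, not as code.

```python
import math

class Terminal:
    """Hypothetical sketch of the claimed terminal: an entering module,
    an acquisition module, a determining module and a calculation module."""

    def __init__(self, depth_map_source, magnification):
        self._depth_map_source = depth_map_source  # e.g. fed by the dual cameras
        self._magnification = magnification        # second preset calculation rule
        self.in_first_measurement_mode = False

    def enter_first_mode(self):
        # Entering module: triggered by the user's first operation.
        self.in_first_measurement_mode = True

    def acquire(self):
        # Acquisition module: depth-of-field map of the current preview image.
        return self._depth_map_source()

    def determine(self, selection):
        # Determining module: first and second targets from the selection.
        first_target, second_target = selection
        return first_target, second_target

    def calculate(self, depth_map, t1, t2):
        # Calculation module: first position difference (assumed rule:
        # pixel offset combined with depth offset), then amplified into
        # the second, actual position difference.
        d1, d2 = depth_map[t1], depth_map[t2]
        first_diff = math.hypot(t2[0] - t1[0], t2[1] - t1[1], d2 - d1)
        return first_diff * self._magnification
```

The separation mirrors the claim structure: acquisition and determination are independent of the calculation rules, so either preset rule could be swapped without touching the other modules.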
Embodiments of the present invention provide a distance measurement method and terminal. In a first measurement mode, a first shooting instruction is received, and depth-of-field map information of the current preview image is obtained according to the first shooting instruction, the depth-of-field map information being the depth-of-field information, within the current depth-of-field range, of each pixel in the current preview image, and the first measurement mode being a mode for measuring the actual distance between two target positions on the current preview image. A first selection instruction is received, and a first target and a second target are determined on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image. A first position difference between the first target and the second target on the current preview image is calculated according to the depth-of-field map information and a first preset calculation rule; a second position difference between the first target and the second target is then calculated according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target. With the above technical scheme, the depth-of-field map information of the current preview image is obtained by using dual cameras so as to determine the actual distance between the first target and the second target on the current preview image, which reduces the measurement error when the terminal measures between two close positions.
Description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for realizing each embodiment of the present invention;
Fig. 2 is a diagram of a communication system in which the mobile terminal of the present invention is operable;
Fig. 3 is a first flowchart of a distance measurement method provided in an embodiment of the present invention;
Fig. 4 is a second flowchart of a distance measurement method provided in an embodiment of the present invention;
Fig. 5 is a schematic diagram of an exemplary user opening the first mode of a terminal, provided in an embodiment of the present invention;
Fig. 6 is a first schematic diagram of an exemplary user selecting a first target and a second target, provided in an embodiment of the present invention;
Fig. 7 is a second schematic diagram of an exemplary user selecting a first target and a second target, provided in an embodiment of the present invention;
Fig. 8 is a first structural diagram of a terminal 1 provided in an embodiment of the present invention;
Fig. 9 is a second structural diagram of a terminal 1 provided in an embodiment of the present invention;
Fig. 10 is a third structural diagram of a terminal 1 provided in an embodiment of the present invention.
Specific embodiment
It should be appreciated that the specific embodiments described herein merely explain the present invention and are not intended to limit it.
The mobile terminal of each embodiment of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part" or "unit" used to denote elements are given merely to facilitate the description of the invention and have no specific meaning in themselves; "module" and "part" may therefore be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminal described in the embodiments of the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a personal digital assistant (PDA), a tablet computer (PAD), a portable multimedia player (PMP) and a navigation device, as well as fixed terminals such as a digital TV and a desktop computer. In the following, the terminal is assumed to be a mobile terminal. However, those skilled in the art will understand that, apart from elements intended particularly for mobile use, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a hardware structure diagram of an optional mobile terminal for realizing each embodiment of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that permit radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal and so on, and may further include a broadcast signal combined with a TV or radio broadcast signal. Broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and so on. The broadcast receiving module 111 can receive signals broadcast by various types of broadcast systems. In particular, it can receive digital broadcasts by using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system of forward link media (MediaFLO), integrated services digital broadcasting-terrestrial (ISDB-T), and so on. The broadcast receiving module 111 may be constructed to suit various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcasting systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a node B, etc.), an external terminal and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include wireless LAN (WLAN, Wi-Fi), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), high-speed downlink packet access (HSDPA), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal. A typical example of the location information module is the global positioning system (GPS). According to current technology, the GPS location information module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information in terms of longitude, latitude and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location in real time.
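The triangulation described above can be illustrated with a simplified two-dimensional trilateration sketch: recovering a receiver position from distances to three anchors with known positions. This only shows the geometric principle; real GPS solves in three dimensions with an additional receiver clock-bias unknown and many corrections, so the function below is an illustrative assumption, not the module's actual algorithm.

```python
def trilaterate_2d(anchors, dists):
    """Recover (x, y) from distances to three known anchor points.

    anchors -- [(x1, y1), (x2, y2), (x3, y3)]
    dists   -- [d1, d2, d3], distances from the unknown point to each anchor
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting the circle equations pairwise eliminates the quadratic
    # terms and leaves a 2x2 linear system in x and y:
    # 2(x2-x1)x + 2(y2-y1)y = d1^2 - d2^2 + x2^2 - x1^2 + y2^2 - y1^2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

The extra satellite mentioned in the text plays the same role as adding a fourth equation here: it over-determines the system and lets the receiver correct errors (in GPS, chiefly the clock bias).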
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes the image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on a display unit 151, stored in the memory 160 (or another storage medium), or sent via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 can receive sound (audio data) in operating modes such as a telephone call mode, a recording mode and a voice recognition mode, and process such sound into audio data. In the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise-elimination (or suppression) algorithms to eliminate (or suppress) noise or interference produced while receiving and sending audio signals.
The user input unit 130 can generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance and so on caused by being touched), a jog wheel, a jog switch, etc. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of the user's contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies electric power and whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can be connected with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying a user's use of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as an "identifying device") may take the form of a smart card; the identifying device can therefore be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 can be used to receive input (e.g., data information, electric power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which electric power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or the electric power input from the cradle can serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audio and/or tactile manner (e.g., audio signals, video signals, alarm signals, vibration signals, etc.), and may include the display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a telephone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to the call or to other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 can display the captured and/or received image, and show the video or image together with related functions through a UI or GUI, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on one another in the form of a layer to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent to allow viewing from the outside; these may be referred to as transparent displays, and a typical transparent display may be, for example, a transparent organic light-emitting diode (TOLED) display. Depending on the particular intended embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in a mode such as a call signal reception mode, a call mode, a recording mode, a voice recognition mode or a broadcast reception mode, transduce audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 can provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 can provide output in different manners to notify of the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data regarding the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disc, and so on. Moreover, the mobile terminal 100 may cooperate, via a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180, or may be constructed separately from the controller 180. The controller 180 can also perform pattern recognition processing to recognize handwriting input or picture-drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external or internal power under the control of the controller 180 and provides the appropriate electric power required to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein can be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such an embodiment can be implemented in the controller 180. For a software implementation, an embodiment such as a process or a function can be implemented with a separate software module that allows at least one function or operation to be performed. Software code can be implemented by a software application (or program) written in any appropriate programming language; the software code can be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal, among the various types of mobile terminals such as folder-type, bar-type, swing-type and slide-type mobile terminals, will be taken as an example. The present invention can therefore be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in Fig. 1 may be constructed to operate with wired and wireless communication systems as well as satellite-based communication systems that transmit data via frames or packets.
A communication system in which the mobile terminal according to the present invention is operable will now be described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA) and universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275 and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which can be coupled to the base stations 270 via backhaul links. The backhaul links can be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL or xDSL. It will be understood that the system as shown in Fig. 2 can include a plurality of BSCs 275.
Each BS 270 can serve one or more sectors (or regions), with each sector covered by an omnidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector can be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, with each frequency assignment having a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment can be referred to as a CDMA channel. A BS 270 can also be referred to as a base transceiver subsystem (BTS) or another equivalent term. In such a case, the term "base station" can be used to refer broadly to a single BSC 275 and at least one BS 270. A base station can also be referred to as a "cell site". Alternatively, the individual sectors of a specific BS 270 can be referred to as a plurality of cell sites.
As shown in FIG. 2, a broadcasting transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In FIG. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In FIG. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in FIG. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking techniques, other techniques capable of tracking the position of the mobile terminal may be used. In addition, at least one of the GPS satellites 300 may alternatively or additionally handle satellite DMB transmissions.
As one typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communications. Each reverse link signal received by a given base station 270 is processed within that BS 270. The resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the above mobile terminal hardware configuration and communication system, various embodiments of the present invention are proposed.
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the accompanying drawings.
Embodiment one
An embodiment of the present invention provides a distance measurement method, applied to a terminal, the terminal being provided with dual cameras. As shown in FIG. 3, the method may include:
S101: In a first measurement mode, receive a first shooting instruction, and obtain depth map information of a current preview image according to the first shooting instruction, the depth map information being the depth-of-field information, within a current depth-of-field range, of each pixel in the current preview image, and the first measurement mode being a mode for measuring the actual distance between two target positions on the current preview image.
The distance measurement method provided in this embodiment of the present invention is applied to scenarios in which the depth map information of the current preview image can be obtained.
In this embodiment of the present invention, the user sends the first shooting instruction to the terminal by touching a designated shooting button on the terminal. After receiving the first shooting instruction, the terminal instructs the dual cameras to perform a shooting operation on the current preview image, and obtains the depth map information of the current preview image.
In this embodiment of the present invention, the depth map information is the depth-of-field information, within the current depth-of-field range, of each pixel in the current preview image.
In this embodiment of the present invention, the current depth-of-field range may be determined by preset shooting parameters in the terminal. Specifically, the preset shooting parameters include: the focal length of the camera, the aperture value, and the shooting distance.
S102: Receive a first selection instruction, and determine a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range.
After obtaining the current preview image and the depth map information of the current preview image, the terminal obtains the first target and the second target whose distance is to be calculated.
In this embodiment of the present invention, the terminal instructs the user to select the first target and the second target on the current preview image.
In this embodiment of the present invention, the first target and the second target need to be within the current depth-of-field range of the current preview image.
S103: Calculate a first position difference of the first target and the second target on the current preview image according to the depth map information and a first preset calculation rule.
After obtaining the first target and the second target, the terminal calculates the first position difference of the first target and the second target on the current preview image.
In this embodiment of the present invention, the terminal first obtains first depth-of-field information of the first target and second depth-of-field information of the second target according to the depth map information, and then calculates the first position difference between the first depth-of-field information and the second depth-of-field information according to the first preset calculation rule.
In this embodiment of the present invention, the first depth-of-field information and the second depth-of-field information are image information within the depth-of-field range of the current preview image; the terminal can obtain the first depth-of-field information and the second depth-of-field information only when the first target and the second target are within the depth-of-field range.
In this embodiment of the present invention, the depth-of-field range may be determined according to the preset shooting parameters; the preset shooting parameters may include: the focal length of the camera, the aperture value, and the shooting distance. When the user uses the terminal to shoot a current scene, at least one of these three shooting parameters may be changed manually by the user according to the actual conditions of the current scene to determine the depth-of-field range of the current shooting scene, or suitable shooting parameters may be automatically selected by the terminal.
Here, the depth-of-field range △L includes a front depth of field △L1 and a rear depth of field △L2, whose mathematical expressions (the standard thin-lens depth-of-field relations) are respectively:

△L1 = FσL² / (f² + FσL)

△L2 = FσL² / (f² − FσL)

Wherein, f is the focal length of the camera, F is the aperture value, L is the shooting distance (i.e., the distance between the imaging plane and the photographed object), and σ is the permissible circle-of-confusion diameter. It will be understood that σ is determined by the hardware configuration of the terminal itself, that the aperture value is directly proportional to the depth-of-field range, that the shooting distance is directly proportional to the depth-of-field range, and that the focal length is inversely proportional to the depth-of-field range. Therefore, increasing the aperture value, increasing the shooting distance, or reducing the focal length extends the depth-of-field range, while the opposite reduces it.
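These depth-of-field relations can be sketched numerically as follows; the function and parameter names are illustrative, and only the standard thin-lens formulas for the front and rear depth of field are assumed.

```python
def depth_of_field(focal_mm, aperture_f, distance_mm, coc_mm):
    """Return (front DoF, rear DoF, total DoF) in millimetres.

    focal_mm    -- focal length f
    aperture_f  -- aperture value F (f-number)
    distance_mm -- shooting distance L
    coc_mm      -- permissible circle-of-confusion diameter sigma
    """
    num = aperture_f * coc_mm * distance_mm ** 2          # F * sigma * L^2
    front = num / (focal_mm ** 2 + aperture_f * coc_mm * distance_mm)
    rear = num / (focal_mm ** 2 - aperture_f * coc_mm * distance_mm)
    return front, rear, front + rear
```

For example, with f = 50 mm, F = 8, L = 5000 mm, and σ = 0.03 mm, the rear depth of field comes out larger than the front depth of field, as the formulas imply.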
Here, the depth map information may refer to the depth-of-field information of each pixel, within the current depth-of-field range, in the current shooting scene. Exemplarily, the depth-of-field information within the depth-of-field range (i.e., from the near point to the far point) is represented by values from 00 to FF. Taking an image with a resolution of 640 × 480 as an example, when the depth-of-field information of a pixel is FF, that pixel stores object image information at the near-point position; when the depth-of-field information of a pixel is 00, that pixel stores object image information at the far-point position. If some objects in the current shooting scene are outside the depth-of-field range, the depth-of-field information of the pixels corresponding to these objects may be set to a default value, to distinguish them from objects within the depth-of-field range.
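The 00–FF encoding described above can be sketched as follows. The linear mapping between the near and far points and the use of `None` as the out-of-range default value are assumptions for illustration; the patent only states that out-of-range pixels receive some default value.

```python
def encode_depth(depth, near, far, default=None):
    """Map a depth inside [near, far] onto 0x00 (far point) .. 0xFF (near point).

    Depths outside the depth-of-field range get the default sentinel,
    distinguishing them from objects within the range.
    """
    if not (near <= depth <= far):
        return default                       # outside the depth-of-field range
    scale = (far - depth) / (far - near)     # near point -> 1.0, far point -> 0.0
    return round(scale * 0xFF)
```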
In this embodiment of the present invention, the terminal receives the user's selection operation on the current preview image and obtains the first target and the second target whose distance is to be measured. The terminal obtains the first depth-of-field information of the first target and the second depth-of-field information of the second target according to the depth map information, and then calculates the position difference information between the first depth-of-field information and the second depth-of-field information according to a preset algorithm.
In this embodiment of the present invention, the first depth-of-field information and the second depth-of-field information represent the position information of the first target and the second target, respectively, in the depth map information.
Optionally, the first depth-of-field information and the second depth-of-field information may each be a three-dimensional coordinate position.
Exemplarily, the terminal determines that the first depth-of-field information is (x1, y1, z1) and the second depth-of-field information is (x2, y2, z2), and calculates, according to the preset algorithm, the first position difference of the first depth-of-field information and the second depth-of-field information on the current preview image.
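A minimal sketch of this calculation, assuming (since the patent does not spell out the preset algorithm) that the first position difference is the Euclidean distance between the two three-dimensional depth coordinates:

```python
import math

def first_position_difference(p1, p2):
    """Euclidean distance between (x1, y1, z1) and (x2, y2, z2)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))
```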
S104: Calculate a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
After the terminal calculates the first position difference between the first depth-of-field information and the second depth-of-field information, it still needs to calculate the actual position difference between the first target and the second target.
In this embodiment of the present invention, after the terminal calculates the first position difference, the terminal amplifies the first position difference according to a preset magnification factor to obtain the actual position difference between the first target and the second target.
In this embodiment of the present invention, when the terminal obtains the depth map information of the current preview image using the dual cameras, it simultaneously obtains the magnification factor between the current preview image and the actual scene. The actual position difference between pixels in the depth map information can then be known, so the terminal can determine, through the magnification factor, the actual position difference (the second position difference) corresponding to the first position difference.
It will be understood that the terminal obtains the depth map information of the current preview image by using the dual cameras, so as to determine the actual distance between the first target and the second target on the current preview image, reducing the measurement error when the terminal measures between two relatively close positions.
Embodiment two
An embodiment of the present invention provides a distance measurement method, applied to a terminal, the terminal being provided with dual cameras. As shown in FIG. 4, the method may include:
S201: Based on a first operation of the user, the terminal enters a first measurement mode, the first measurement mode being a mode for measuring the actual distance between two target positions on the current preview image.
The distance measurement method provided in this embodiment of the present invention is applied to scenarios in which the depth map information of the current preview image can be obtained.
In this embodiment of the present invention, based on the first operation of the user, the terminal enters the first measurement mode. In this measurement mode, the terminal can obtain the depth map information of the current preview image to calculate the actual distance between the first target and the second target in the current preview image.
Exemplarily, as shown in FIG. 5, the user performs configuration on the settings interface of the terminal to enable the first measurement mode.
S202: The terminal receives a first shooting instruction in the first measurement mode, and obtains depth map information of the current preview image according to the first shooting instruction, the depth map information being the depth-of-field information, within a current depth-of-field range, of each pixel in the current preview image.
After entering the first measurement mode, the terminal receives the first shooting instruction sent by the user, and obtains the depth map information of the current preview image according to the first shooting instruction.
In this embodiment of the present invention, the user sends the first shooting instruction to the terminal by touching a designated shooting button on the terminal. After receiving the first shooting instruction, the terminal instructs the dual cameras to perform a shooting operation on the current preview image, and obtains the depth map information of the current preview image.
In this embodiment of the present invention, the depth map information is the depth-of-field information, within the current depth-of-field range, of each pixel in the current preview image.
In this embodiment of the present invention, the current depth-of-field range may be determined by preset shooting parameters in the terminal. Specifically, the preset shooting parameters include: the focal length of the camera, the aperture value, and the shooting distance.
S203: The terminal receives a first selection instruction, and determines a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image.
After obtaining the current preview image and the depth map information of the current preview image, the terminal obtains the first target and the second target whose distance is to be calculated.
In this embodiment of the present invention, the terminal instructs the user to select the first target and the second target on the current preview image.
In this embodiment of the present invention, the first target and the second target need to be within the current depth-of-field range of the current preview image.
Exemplarily, as shown in FIG. 6, the user may select the first target and the second target by tapping two points on the screen image. After the user has made the selection, the mobile phone screen shows a pop-up box asking for confirmation; once the user presses the confirmation key, the first target and the second target are selected.
Exemplarily, as shown in FIG. 7, the user may instruct the mobile phone to measure the physical length of a line segment by drawing that line segment on the screen image. After the user has made the selection, the mobile phone screen shows a pop-up box asking for confirmation; once the user presses the confirmation key, the line segment whose length is to be measured is selected.
S204: The terminal obtains first depth-of-field information of the first target according to the depth map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth map information.
After obtaining the first target and the second target, the terminal determines the first position difference of the first target and the second target on the current preview image; specifically, the first position difference is obtained from the difference between the first depth-of-field information of the first target and the second depth-of-field information of the second target.
In this embodiment of the present invention, after the terminal obtains the first target according to the first selection instruction, the terminal obtains the first depth-of-field information corresponding to the first target in the depth map information.
In this embodiment of the present invention, the first depth-of-field information is image information within the depth-of-field range of the current preview image; the terminal can obtain the first depth-of-field information only when the first target is within the depth-of-field range.
In this embodiment of the present invention, the first depth-of-field information represents the position information of the first target in the depth map information.
In this embodiment of the present invention, the depth-of-field range may be determined according to the preset shooting parameters; the preset shooting parameters may include: the focal length of the camera, the aperture value, and the shooting distance. When the user uses the terminal to shoot a current scene, at least one of these three shooting parameters may be changed manually by the user according to the actual conditions of the current scene to determine the depth-of-field range of the current shooting scene, or suitable shooting parameters may be automatically selected by the terminal.
Here, the depth-of-field range △L includes a front depth of field △L1 and a rear depth of field △L2, whose mathematical expressions (the standard thin-lens depth-of-field relations) are respectively:

△L1 = FσL² / (f² + FσL)

△L2 = FσL² / (f² − FσL)

Wherein, f is the focal length of the camera, F is the aperture value, L is the shooting distance (i.e., the distance between the imaging plane and the photographed object), and σ is the permissible circle-of-confusion diameter. It will be understood that σ is determined by the hardware configuration of the terminal itself, that the aperture value is directly proportional to the depth-of-field range, that the shooting distance is directly proportional to the depth-of-field range, and that the focal length is inversely proportional to the depth-of-field range. Therefore, increasing the aperture value, increasing the shooting distance, or reducing the focal length extends the depth-of-field range, while the opposite reduces it.
Here, the depth map information may refer to the depth-of-field information of each pixel, within the current depth-of-field range, in the current shooting scene. Exemplarily, the depth-of-field information within the depth-of-field range (i.e., from the near point to the far point) is represented by values from 00 to FF. Taking an image with a resolution of 640 × 480 as an example, when the depth-of-field information of a pixel is FF, that pixel stores object image information at the near-point position; when the depth-of-field information of a pixel is 00, that pixel stores object image information at the far-point position. If some objects in the current shooting scene are outside the depth-of-field range, the depth-of-field information of the pixels corresponding to these objects may be set to a default value, to distinguish them from objects within the depth-of-field range.
S205: The terminal obtains second depth-of-field information of the second target according to the depth map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth map information.
After obtaining the first depth-of-field information of the first target, the terminal also needs to obtain the second depth-of-field information of the second target, so that the first position difference can be calculated from the first depth-of-field information and the second depth-of-field information.
In this embodiment of the present invention, after the terminal obtains the second target according to the first selection instruction, the terminal obtains the second depth-of-field information corresponding to the second target in the depth map information.
In this embodiment of the present invention, the second depth-of-field information is image information within the depth-of-field range of the current preview image; the terminal can obtain the second depth-of-field information only when the second target is within the depth-of-field range.
In this embodiment of the present invention, the second depth-of-field information represents the position information of the second target in the depth map information.
S206: The terminal calculates a first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information, and a first preset calculation rule.
After the terminal obtains the first depth-of-field information and the second depth-of-field information, the terminal calculates the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information, and the first preset calculation rule.
In this embodiment of the present invention, the terminal calculates the position difference information between the first depth-of-field information and the second depth-of-field information according to a preset algorithm.
Optionally, the first depth-of-field information and the second depth-of-field information may each be a three-dimensional coordinate position.
Exemplarily, the terminal determines that the first depth-of-field information is (x1, y1, z1) and the second depth-of-field information is (x2, y2, z2), and calculates, according to the preset algorithm, the first position difference of the first depth-of-field information and the second depth-of-field information on the current preview image.
S207: The terminal amplifies the first position difference by a preset magnification factor to calculate a second position difference, the second position difference being the actual position difference between the first target and the second target.
After the terminal calculates the first position difference between the first depth-of-field information and the second depth-of-field information, it still needs to calculate the actual position difference between the first target and the second target.
In this embodiment of the present invention, after the terminal calculates the first position difference, the terminal amplifies the first position difference according to the preset magnification factor to obtain the actual position difference between the first target and the second target.
In this embodiment of the present invention, when the terminal obtains the depth map information of the current preview image using the dual cameras, it simultaneously obtains the magnification factor between the current preview image and the actual scene. The actual position difference between pixels in the depth map information can then be known, so the terminal can determine, through the magnification factor, the actual position difference (the second position difference) corresponding to the first position difference.
It will be understood that the terminal obtains the depth map information of the current preview image by using the dual cameras, so as to determine the actual distance between the first target and the second target on the current preview image, reducing the measurement error when the terminal measures between two relatively close positions.
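The steps S202–S207 above can be sketched end to end as follows: look up the two selected targets' three-dimensional depth coordinates in the depth map, compute the first position difference on the preview image, then amplify it by the preset magnification factor to obtain the second (actual) position difference. The dict-based depth map and the choice of Euclidean distance as the first preset calculation rule are assumptions for illustration, not the patent's implementation.

```python
import math

def measure_distance(depth_map, first_target, second_target, magnification):
    """depth_map maps an (x, y) pixel to its depth z; targets are (x, y) pixels."""
    x1, y1 = first_target
    x2, y2 = second_target
    z1 = depth_map[first_target]       # first depth-of-field information
    z2 = depth_map[second_target]      # second depth-of-field information
    first_diff = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2
                           + (z1 - z2) ** 2)   # first position difference
    return first_diff * magnification          # second position difference
```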
Embodiment three
An embodiment of the present invention provides a terminal 1. As shown in FIG. 8, the terminal 1 includes:
an acquisition module 10, configured to receive a first shooting instruction and obtain depth map information of a current preview image according to the first shooting instruction, the depth map information being the depth-of-field information, within a current depth-of-field range, of each pixel in the current preview image;
a determining module 11, configured to receive a first selection instruction and determine a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image;
a computing module 12, configured to calculate a first position difference of the first target and the second target on the current preview image according to the depth map information and a first preset calculation rule;
the computing module 12 being further configured to calculate a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
Optionally, the acquisition module 10 is further configured to obtain first depth-of-field information of the first target according to the depth map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth map information; and to obtain second depth-of-field information of the second target according to the depth map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth map information.
The computing module 12 is specifically configured to calculate the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information, and the first preset calculation rule.
Optionally, the computing module 12 is further configured to amplify the first position difference by a preset magnification factor to calculate the second position difference.
Optionally, the acquisition module 10 is specifically configured to obtain the depth map information using the dual cameras.
Optionally, based on FIG. 8 and as shown in FIG. 9, the terminal 1 further includes an entering module 13.
The entering module 13 is configured to enter the first measurement mode based on a first operation of the user.
As shown in FIG. 10, in practical applications, the above acquisition module 10, determining module 11, and computing module 12 may be realized by a processor 14 located in the terminal 1, specifically by a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), or the like. The terminal 1 may further include a memory 15, which may be connected to the processor 14. The memory 15 is used to store executable program code, the program code including computer operation instructions. The memory 15 may include high-speed RAM, and may also include non-volatile memory, for example, at least one disk memory.
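A hypothetical sketch of how the acquisition, determining, and computing modules of FIG. 8 and FIG. 10 could be realized by one processor-side class. The class and method names, the dict-based depth map, and the Euclidean first preset calculation rule are illustrative assumptions, not the patent's implementation.

```python
import math

class MeasuringTerminal:
    """Acquisition (10), determining (11), and computing (12) modules in one class."""

    def __init__(self):
        self.depth_map = None    # (x, y) pixel -> depth z, from the dual cameras
        self.targets = None      # the two selected targets

    def acquire(self, depth_map):
        """Acquisition module 10: store the depth map of the current preview image."""
        self.depth_map = depth_map

    def determine(self, first, second):
        """Determining module 11: record the first and second targets."""
        self.targets = (first, second)

    def compute(self, magnification):
        """Computing module 12: first position difference, then the second
        (actual) position difference via the preset magnification factor."""
        (x1, y1), (x2, y2) = self.targets
        z1 = self.depth_map[self.targets[0]]
        z2 = self.depth_map[self.targets[1]]
        first_diff = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2
                               + (z1 - z2) ** 2)
        return first_diff * magnification
```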
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk memory and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It will be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are merely preferred embodiments of the present invention and are not intended to limit the protection scope of the present invention.

Claims (10)

1. A distance measurement method, applied to a terminal, the terminal being provided with dual cameras, the method comprising:
in a first measurement mode, receiving a first shooting instruction, and obtaining depth map information of a current preview image according to the first shooting instruction, the depth map information being the depth-of-field information, within a current depth-of-field range, of each pixel in the current preview image, and the first measurement mode being a mode for measuring the actual distance between two target positions on the current preview image;
receiving a first selection instruction, and determining a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image;
calculating a first position difference of the first target and the second target on the current preview image according to the depth map information and a first preset calculation rule;
calculating a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
2. The method according to claim 1, wherein the calculating a first position difference of the first target and the second target on the current preview image according to the depth map information and the first preset calculation rule comprises:
obtaining first depth-of-field information of the first target according to the depth map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth map information;
obtaining second depth-of-field information of the second target according to the depth map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth map information;
calculating the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information, and the first preset calculation rule.
3. The method according to claim 1, wherein the calculating a second position difference between the first target and the second target according to the first position difference and the second preset calculation rule comprises:
amplifying the first position difference by a preset magnification factor to calculate the second position difference.
4. The method according to claim 1, wherein the obtaining depth map information of the current preview image comprises:
obtaining the depth map information using the dual cameras.
5. The method according to claim 1, wherein, before the receiving a first shooting instruction in the first measurement mode, the method further comprises:
entering the first measurement mode based on a first operation of a user.
6. A terminal, wherein the terminal comprises:
an acquisition module, configured to receive a first shooting instruction and obtain depth map information of a current preview image according to the first shooting instruction, the depth map information being the depth-of-field information, within a current depth-of-field range, of each pixel in the current preview image;
a determining module, configured to receive a first selection instruction and determine a first target and a second target on the current preview image according to the first selection instruction, the first target and the second target being targets within the current depth-of-field range of the current preview image;
a computing module, configured to calculate a first position difference of the first target and the second target on the current preview image according to the depth map information and a first preset calculation rule;
the computing module being further configured to calculate a second position difference between the first target and the second target according to the first position difference and a second preset calculation rule, the second position difference being the actual position difference between the first target and the second target.
7. The terminal according to claim 6, wherein
the acquisition module is further configured to obtain first depth-of-field information of the first target according to the depth map information and the first target, the first depth-of-field information being the depth-of-field information corresponding to the first target in the depth map information, and to obtain second depth-of-field information of the second target according to the depth map information and the second target, the second depth-of-field information being the depth-of-field information corresponding to the second target in the depth map information; and
the computing module is specifically configured to calculate the first position difference between the first target and the second target according to the first depth-of-field information, the second depth-of-field information, and the first preset calculation rule.
8. The terminal according to claim 6, wherein
the computing module is further configured to magnify the first position difference by a preset magnification factor to obtain the second position difference.
9. The terminal according to claim 6, wherein
the acquisition module is specifically configured to obtain the depth map information by using the dual camera.
10. The terminal according to claim 6, wherein the terminal further comprises an entering module; and
the entering module is configured to enter the first measurement mode based on a first operation of a user.
CN201611119955.4A 2016-12-08 2016-12-08 Distance measurement method and terminal Pending CN106646442A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611119955.4A CN106646442A (en) 2016-12-08 2016-12-08 Distance measurement method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611119955.4A CN106646442A (en) 2016-12-08 2016-12-08 Distance measurement method and terminal

Publications (1)

Publication Number Publication Date
CN106646442A true CN106646442A (en) 2017-05-10

Family

ID=58818796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611119955.4A Pending CN106646442A (en) 2016-12-08 2016-12-08 Distance measurement method and terminal

Country Status (1)

Country Link
CN (1) CN106646442A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109410262A (en) * 2018-12-21 2019-03-01 宁波迪比亿贸易有限公司 Autoclave depth of field resolution system
CN109579081A (en) * 2018-12-06 2019-04-05 邱迪 Scalable metallic support drives platform
CN110148167A (en) * 2019-04-17 2019-08-20 维沃移动通信有限公司 A kind of distance measurement method and terminal device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100075252A (en) * 2008-12-24 2010-07-02 삼성전자주식회사 Distance measuring device having component mounting space
JP2013228267A (en) * 2012-04-25 2013-11-07 Panasonic Corp Display device, display method, and program
CN104506842A (en) * 2015-01-15 2015-04-08 京东方科技集团股份有限公司 Three-dimensional camera module, terminal device and distance measurement method
CN105222717A (en) * 2015-08-28 2016-01-06 宇龙计算机通信科技(深圳)有限公司 A kind of subject matter length measurement method and device


Similar Documents

Publication Publication Date Title
CN106502693B (en) A kind of image display method and device
CN105163042B (en) A kind of apparatus and method for blurring processing depth image
CN106909274A (en) A kind of method for displaying image and device
CN107018331A (en) A kind of imaging method and mobile terminal based on dual camera
CN105430258B (en) A kind of method and apparatus of self-timer group photo
CN104991772B (en) Remote operation bootstrap technique and device
CN106610770A (en) Picture viewing method and device
CN106331499A (en) Focusing method and shooting equipment
CN104951229B (en) Screenshot method and device
CN106534552B (en) Mobile terminal and its photographic method
CN106534693A (en) Photo processing method, photo processing device and terminal
CN106851113A (en) A kind of photographic method and mobile terminal based on dual camera
CN106506965A (en) A kind of image pickup method and terminal
CN106100968A (en) A kind of information sharing method and terminal
CN106303044B (en) A kind of mobile terminal and obtain the method to coke number
CN106646442A (en) Distance measurement method and terminal
CN106657783A (en) Image shooting device and method
CN106791149A (en) A kind of method of mobile terminal and control screen
CN106790994A (en) The triggering method and mobile terminal of control
CN106791016A (en) A kind of photographic method and terminal
CN106843684A (en) A kind of device and method, the mobile terminal of editing screen word
CN104639761B (en) The method and mobile terminal of one-handed performance
CN106686235A (en) Method and apparatus for preventing mistaken touches on terminal
CN105721757A (en) Device and method for adjusting photographing parameters
CN104731460B (en) terminal control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170510