WO2018050014A1 - Focusing method, photographing device, and storage medium - Google Patents


Info

Publication number
WO2018050014A1
Authority
WO
WIPO (PCT)
Prior art keywords
edge information
detecting sensor
information detected
phase detecting
phase
Prior art date
Application number
PCT/CN2017/100852
Other languages
English (en)
Chinese (zh)
Inventor
姬向东 (Ji Xiangdong)
马亮 (Ma Liang)
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Publication of WO2018050014A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/672: Focus control based on electronic image sensor signals based on the phase difference signals
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32: Means for focusing
    • G03B 13/34: Power focusing
    • G03B 13/36: Autofocus systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils

Definitions

  • the present invention relates to the field of image processing technologies, and in particular, to a focusing method, a photographing device, and a storage medium.
  • PDAF: Phase-Detection Auto Focus
  • A camera with PDAF uses phase detection to focus; otherwise, it uses the Modulation Transfer Function (MTF) to focus.
  • The phase detection pixels are distributed in pairs, and the orientation of the pairs determines the direction of the edges that can be detected: pairs distributed left and right can detect vertical edges, while pairs distributed up and down can detect horizontal edges.
  • Typically, the phase detection pixels are distributed left and right.
  • A phase detecting sensor whose phase detection pixels are paired left and right can only detect vertical edges. If the edges in the image contain a sufficient vertical component, the accuracy of the phase detecting sensor meets the requirement; however, when the image contains only horizontal edges, the accuracy of the phase detecting sensor's detection cannot be guaranteed.
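Whether a given sensor's edge information should be trusted can be tied to how much edge energy the image has in the direction that sensor can see. The patent does not specify a concrete test; the sketch below is a hypothetical Python illustration, where a left/right-paired sensor's information is deemed credible only if the horizontal-difference (vertical-edge) energy clears an arbitrary threshold:

```python
def edge_energy(image, axis):
    """Sum of absolute neighbour differences along the given axis.

    axis=1 (differences along rows) measures vertical-edge content,
    which is what left/right-paired phase pixels can detect;
    axis=0 (differences down columns) measures horizontal-edge content.
    """
    h, w = len(image), len(image[0])
    total = 0
    if axis == 1:
        for r in range(h):
            for c in range(w - 1):
                total += abs(image[r][c + 1] - image[r][c])
    else:
        for r in range(h - 1):
            for c in range(w):
                total += abs(image[r + 1][c] - image[r][c])
    return total


def edge_info_credible(image, axis, threshold=100):
    """Treat a sensor's edge information as credible only when the
    matching edge energy exceeds a (hypothetical) threshold."""
    return edge_energy(image, axis) >= threshold
```

For the up/down-paired sensor, the same check would run with `axis=0`; the threshold would in practice depend on sensor noise and image size.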
  • In view of this, embodiments of the present invention provide a focusing method, a photographing device, and a storage medium to improve the accuracy of focusing.
  • a first aspect of the embodiments of the present invention provides a photographing apparatus, including:
  • a determining module configured to determine whether the edge information detected by the first phase detecting sensor is credible, and to determine whether the edge information detected by the second phase detecting sensor is credible; and
  • a determining processing module configured to determine a focus processing mode according to the determination result;
  • wherein the first phase detecting sensor and the second phase detecting sensor are arranged crosswise.
  • The determining processing module is further configured to:
  • if the reliability of the edge information detected by the first phase detecting sensor is higher than the reliability of the edge information detected by the second phase detecting sensor, perform phase autofocus using the edge information detected by the first phase detecting sensor.
  • The determining processing module is further configured to:
  • if the reliability of the edge information detected by the second phase detecting sensor is higher than the reliability of the edge information detected by the first phase detecting sensor, perform phase autofocus using the edge information detected by the second phase detecting sensor.
  • The determining processing module is further configured to:
  • perform focus processing using contrast autofocus.
  • the determining processing module is further configured to:
  • If the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not credible, perform phase autofocus using the edge information detected by the first phase detecting sensor.
  • the determining processing module is further configured to:
  • If the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is credible, perform phase autofocus using the edge information detected by the second phase detecting sensor.
  • The determining processing module is further configured to:
  • perform focus processing using contrast autofocus.
  • a second aspect of the embodiments of the present invention provides a focusing method, where the method includes:
  • The first phase detecting sensor and the second phase detecting sensor are arranged crosswise.
  • The determining of the focus processing mode according to the determination result includes:
  • if the reliability of the edge information detected by the first phase detecting sensor is higher than the reliability of the edge information detected by the second phase detecting sensor, performing phase autofocus using the edge information detected by the first phase detecting sensor.
  • The determining of the focus processing mode according to the determination result includes:
  • if the reliability of the edge information detected by the second phase detecting sensor is higher than the reliability of the edge information detected by the first phase detecting sensor, performing phase autofocus using the edge information detected by the second phase detecting sensor.
  • The method further includes:
  • performing focus processing using contrast autofocus.
  • The determining of the focus processing mode according to the determination result includes:
  • if the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not credible, performing phase autofocus using the edge information detected by the first phase detecting sensor.
  • The determining of the focus processing mode according to the determination result includes:
  • if the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is credible, performing phase autofocus using the edge information detected by the second phase detecting sensor.
  • The determining of the focus processing mode according to the determination result includes:
  • performing focus processing using contrast autofocus.
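Taken together, the optional cases above form a small decision table. The following Python sketch of that selection logic is illustrative only: the function name, the string labels, and the tie-breaking choice (preferring the first sensor when both reliabilities are equal) are assumptions, not taken from the patent.

```python
def choose_focus_mode(first_credible, second_credible,
                      first_reliability=0.0, second_reliability=0.0):
    """Map the two credibility judgements to a focus processing mode.

    Reliability scores are only consulted when the edge information of
    both sensors is credible; when neither is credible, fall back to
    contrast autofocus.
    """
    if first_credible and second_credible:
        # Assumed tie-break: prefer the first sensor on equal reliability.
        if first_reliability >= second_reliability:
            return "phase AF with first sensor"
        return "phase AF with second sensor"
    if first_credible:
        return "phase AF with first sensor"
    if second_credible:
        return "phase AF with second sensor"
    return "contrast AF"
```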
  • A third aspect of the embodiments of the present invention provides a photographing apparatus, including a processor and a memory for storing a computer program executable on the processor, wherein the processor is configured to execute the computer program to perform the steps of the method described above.
  • A fourth aspect of the embodiments of the present invention provides a computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the method described above.
  • the present invention provides a focusing method, a photographing device, and a storage medium.
  • When focusing starts, it is determined whether the edge information detected by the first phase detecting sensor is credible and whether the edge information detected by the second phase detecting sensor is credible; the focus processing mode is then determined according to the determination result. As a result, even when the image contains little edge information in one direction, the edge information in the other direction can still be used, and the accuracy of focusing is improved.
  • FIG. 1 is a schematic diagram of the hardware structure of an optional mobile terminal embodying various embodiments of the present invention;
  • FIG. 2 is a schematic diagram of a communication system in which a mobile terminal according to an embodiment of the present invention can operate;
  • FIG. 3 is a flowchart of a focusing method according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a focusing method according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a photographing apparatus according to an embodiment of the present invention.
  • the mobile terminal can be implemented in various forms.
  • The terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable multimedia players (PMPs), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • In the following description, it is assumed that the terminal is a mobile terminal.
  • Configurations in accordance with embodiments of the present invention can also be applied to fixed-type terminals, except for components intended specifically for mobile purposes.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal embodying various embodiments of the present invention.
  • The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 shows a mobile terminal having various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The components of the mobile terminal are described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast associated information, or a server that receives previously generated broadcast signals and/or broadcast associated information and transmits them to the terminal.
  • The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Moreover, the broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • The broadcast receiving module 111 can receive signals broadcast by various types of broadcast systems.
  • In particular, the broadcast receiving module 111 can receive digital broadcasts using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T).
  • The broadcast receiving module 111 can be constructed to suit various broadcast systems that provide broadcast signals, in addition to the above-described digital broadcast systems.
  • The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
  • The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received for text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • The wireless Internet access technologies involved in the module may include WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of the location information module 115 is a GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • A commonly used method calculates position and time information using three satellites and corrects errors in the calculated position and time information using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
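As a rough illustration of the triangulation idea described above, the sketch below is a simplified 2-D analogue written for this document, not GPS itself: real receivers solve in three dimensions and also estimate the receiver clock bias using the fourth satellite. Given three anchor positions and the measured distances to each, the position follows from linearising the circle equations:

```python
def trilaterate_2d(anchors, distances):
    """Solve for (x, y) given three anchor points and the distance to
    each, by subtracting the first circle equation from the other two
    to obtain a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    # Linearised equations: A . [x, y]^T = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # Cramer's rule for the 2x2 system
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

The anchors must not be collinear, otherwise `det` is zero and the position is not uniquely determined.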
  • the A/V input unit 120 is for receiving an audio or video signal.
  • The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 151.
  • The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the configuration of the mobile terminal.
  • The microphone 122 can receive sound (audio data) in a telephone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 for output.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • The user input unit 130 allows the user to input various types of information, and may include a keypad, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a jog wheel, a jog switch, and the like.
  • In particular, when a touch pad is layered on the display unit 151, a touch screen can be formed.
  • The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100, the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, and the acceleration or deceleration and direction of movement of the mobile terminal 100) and generates a command or signal for controlling the operation of the mobile terminal 100.
  • For example, the sensing unit 140 can sense whether a slide-type phone is opened or closed.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 141 which will be described below in connection with a touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • The external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • The identification module may store various kinds of information for authenticating the user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • The interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • In addition, the interface unit 170 may serve as a path through which power is supplied from a base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent so that the outside can be viewed through them; these may be referred to as transparent displays. A typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display.
  • The mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown).
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • The audio output module 152 may convert audio data received by the wireless communication unit 110, or stored in the memory 160, into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • The alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to an audio or video output, the alarm unit 153 can provide output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibration: when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (e.g., vibration) to notify the user. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like that performs processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, and the like) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
  • the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • The various embodiments described herein can be implemented, for example, in a computer readable medium using computer software, hardware, or any combination thereof.
  • For hardware implementation, the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • For software implementation, implementations such as procedures or functions may be implemented with separate software modules that each perform at least one function or operation.
  • The software code can be implemented by a software application (or program) written in any suitable programming language and may be stored in the memory 160.
  • the mobile terminal has been described in terms of its function.
  • In the following, a slide-type mobile terminal among various types of mobile terminals, such as folding, bar, swing, and slide types, will be described as an example. However, the present invention is not limited to slide-type mobile terminals and can be applied to any type of mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • Such communication systems may use different air interfaces and/or physical layers.
  • Air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • The MSC 280 is also configured to interface with the BSC 275, which can be coupled to the base stations 270 via backhaul lines. The backhaul lines can be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station.”
  • each partition of a particular BS 270 may be referred to as a plurality of cellular stations.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • A Global Positioning System (GPS) satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • The BSC 275 provides call resource allocation and mobility management functions, including the coordination of soft handover procedures between BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • The PSTN 290 interfaces with the MSC 280, and the MSC 280 interfaces with the BSCs 275.
  • BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • The phase detecting sensors are arranged in a cross: the pixels of one are distributed up and down, and those of the other are distributed left and right. Since the phase detecting pixels are cross-distributed, the accuracy of phase-detection focusing can be ensured as long as there is sufficient edge information in the image.
  • FIG. 3 is a flowchart of a focusing method according to an embodiment of the present invention. As shown in FIG. 3, the method provided in this embodiment may be specifically implemented by a camera device configured with a dual camera. Specifically, the method provided in this embodiment includes:
  • Step 301: Determine whether the edge information detected by the first phase detecting sensor is credible, and determine whether the edge information detected by the second phase detecting sensor is credible;
  • Step 302: Determine a focus processing mode according to the determination result.
  • Phase focusing technology is very mature in the field of digital cameras, but it is still in its infancy in the field of smart terminals.
  • A camera's autofocus modes are of two types: contrast focusing and phase focusing.
  • Contrast focusing finds the lens position, that is, the exact in-focus position, according to the change in contrast of the picture at the focus point.
  • Phase focusing is faster than contrast focusing.
  • The principle of phase focusing is to reserve some masked pixels on the photosensitive element specifically for phase detection; the focus offset is determined from the distance between these pixels and its changes, so that accurate focusing is achieved.
  • On one hand, phase focusing does not require repeated movement of the lens: the focus stroke is much shorter and the focusing process does not hesitate. On the other hand, because the masked pixels on the CMOS sensor are needed for phase detection, phase focusing places relatively high demands on light intensity. Its advantage is that focusing only needs to be completed once, the focusing speed is extremely fast, and the computational load on the processor is reduced; its disadvantage is that it is prone to focus failure in low-light environments.
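The offset-from-pixel-distance idea above can be sketched as a one-dimensional search for the shift that best aligns the two masked-pixel signals. The signal shape, search window, and shift sign convention below are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def phase_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Find the shift (in pixels) that best aligns the two masked-pixel
    signals, by minimising the mean absolute difference over a small
    circular search window (the window size is an assumption)."""
    best_s, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost = float(np.abs(left - np.roll(right, s)).mean())
        if cost < best_cost:
            best_s, best_cost = s, cost
    return best_s

# Toy example: the same edge profile seen through the two pixel groups,
# offset by 3 pixels; an in-focus scene would give a shift of 0.
signal = np.zeros(64)
signal[30:34] = 1.0
print(phase_shift(signal, np.roll(signal, -3)))  # → 3
```

The measured shift would then be mapped to a lens displacement through a calibration factor, which is why phase focusing can move the lens to the focus position in a single step.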
  • Contrast focusing takes too long.
  • The lens keeps moving from the start of focusing until focus is finally achieved.
  • The retraction after the lens "overshoots the station" lengthens the focus stroke, which the user experiences as slower focusing.
  • For example, the picture is initially out of focus; then, as the lens moves, the coins on the screen gradually become clear.
  • At this point the camera module itself does not yet know that focus has been achieved, so the lens keeps moving and the coins become blurred again.
  • Only then does the camera module realize that the lens has "overshot the station", and it retracts the lens to the in-focus position, completing focusing.
  • Its advantage is that it can focus accurately even in low-light conditions.
  • Its disadvantage is that there are many focusing steps, the processor has to calculate a large amount of data, and the focusing time is long.
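The move-overshoot-retract behaviour described above can be sketched as a simple hill climb over lens positions. The sharpness function and step size below are toy assumptions for illustration, not the patent's algorithm.

```python
def contrast_autofocus(sharpness, positions):
    """Step the lens through successive positions; once sharpness starts
    to fall, the lens has "overshot the station", so retract to the
    last (sharpest) position and stop there."""
    best_pos = positions[0]
    best_val = sharpness(best_pos)
    for pos in positions[1:]:
        val = sharpness(pos)
        if val < best_val:          # overshot the peak: retract
            return best_pos
        best_pos, best_val = pos, val
    return best_pos

# Toy sharpness curve peaked at lens position 42 (an assumed value).
sharp = lambda p: -abs(p - 42)
print(contrast_autofocus(sharp, list(range(0, 100, 7))))  # → 42
```

Each iteration requires capturing and analysing a frame, which is why the many steps and the final retraction make contrast focusing slow compared with a single phase measurement.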
  • A first phase detecting sensor and a second phase detecting sensor are provided and arranged to cross each other: the pixels of one are distributed up and down, and those of the other are distributed left and right.
  • The pixel points are thus cross-distributed, and as long as there is enough edge information in the image, the accuracy of phase-detection focusing can be ensured.
  • When the photographing device starts focusing, it is first determined whether the edge information detected by the first phase detecting sensor is credible (the edge information may include the sharpness and number of edges in the image), and then whether the edge information detected by the second phase detecting sensor is credible. According to the result, it is decided which sensor's edge information to use for focusing, or how to focus when the edge information detected by both phase detecting sensors is not credible.
  • The first phase detecting sensor is distributed up and down, and the second phase detecting sensor is distributed left and right.
  • When it is detected that the photographing device starts focusing, it is first determined whether the vertically distributed edge information detected by the first phase detecting sensor is credible, for example whether the image edges are clear and how many there are; it is then determined whether the horizontally distributed edge information detected by the second phase detecting sensor is credible, again by judging the number and sharpness of the image edges.
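One plausible way to judge edge-information credibility from the number and clarity of edges, as described above, is to threshold directional brightness gradients. The gradient measure and both thresholds here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def edge_credible(image: np.ndarray, axis: int,
                  strength_frac: float = 0.2, count_thr: int = 50) -> bool:
    """Count sufficiently strong brightness gradients along one axis;
    the edge information is deemed credible only if enough such edge
    pixels exist (both thresholds are assumptions)."""
    grad = np.abs(np.diff(image.astype(float), axis=axis))
    if grad.max() == 0:
        return False                      # no edges at all in this direction
    strong = grad > strength_frac * grad.max()
    return int(strong.sum()) >= count_thr

# An image with a single vertical boundary: many left-right gradients,
# but no up-down gradients.
img = np.zeros((64, 64))
img[:, 32:] = 1.0
print(edge_credible(img, axis=1), edge_credible(img, axis=0))  # → True False
```

In this toy scene only the sensor whose pixel pairs straddle the edge direction would report credible information, which is exactly the situation the cross arrangement is meant to handle.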
  • Focusing is then performed using the vertically distributed edge information detected by the first phase detecting sensor or the horizontally distributed edge information detected by the second phase detecting sensor.
  • When both the vertically distributed edge information detected by the first phase detecting sensor and the horizontally distributed edge information detected by the second phase detecting sensor are credible, the edge information with the higher reliability is used for focus processing.
  • When neither is credible, the existing contrast autofocus is used for focus processing.
  • Contrast autofocus is performed on the main CMOS sensor in the body; the focusing speed is slow, but the focus point can be set arbitrarily within a relatively wide range.
  • In this focusing method, the first phase detecting sensor and the second phase detecting sensor are set to cross each other. When focusing starts, it is determined whether the edge information detected by the first phase detecting sensor is credible and whether the edge information detected by the second phase detecting sensor is credible, and the focus processing mode is determined according to the judgment result, so that focusing accuracy is improved even when the image contains little edge information in one direction and much in the other.
  • FIG. 4 is a flowchart of a focusing method according to an embodiment of the present invention. As shown in FIG. 4, when the method provided in this embodiment is applied, the specific process may include:
  • Step 401: Start focusing.
  • Step 402: Determine whether the edge information detected by the first phase detecting sensor PD1 is credible. If yes, go to step 403; if no, go to step 407.
  • Step 403: Determine whether the edge information detected by the second phase detecting sensor PD2 is credible. If yes, go to step 404 or step 405; if no, go to step 406.
  • Step 404: Compare the reliability of the edge information detected by the first phase detecting sensor PD1 with that detected by the second phase detecting sensor PD2, and perform phase autofocus using the edge information detected by the sensor with the higher reliability.
  • Step 405: Compare the reliability of the edge information detected by PD1 with that detected by PD2; if the higher reliability is within a preset threshold, use that edge information for phase autofocus, otherwise fall back to contrast autofocus.
  • Step 406: Perform phase autofocus using the edge information detected by the first phase detecting sensor PD1.
  • That is, if the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not, the edge information detected by the first phase detecting sensor is used for phase autofocus.
  • Step 407: Determine whether the edge information detected by the second phase detecting sensor PD2 is credible. If yes, go to step 408; if no, go to step 409.
  • Step 408: Perform phase autofocus using the edge information detected by the second phase detecting sensor PD2.
  • That is, when the edge information detected by the first phase detecting sensor PD1 is not credible and the edge information detected by the second phase detecting sensor PD2 is credible, the edge information detected by PD2 is used for phase autofocus processing.
  • Step 409: Perform focus processing using contrast autofocus.
  • That is, when the edge information detected by neither phase detecting sensor is credible, focus processing is performed using contrast autofocus.
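The branch structure of steps 401-409 can be summarised in a small selector. Representing credibility as a reliability score (with None meaning "not credible") is a modelling assumption, and the preset-threshold refinement of step 405 is omitted for brevity.

```python
def choose_focus_mode(pd1, pd2):
    """Mirror steps 401-409: pd1/pd2 are reliability scores of the edge
    information from the two phase detecting sensors, or None when the
    edge information is not credible."""
    if pd1 is not None and pd2 is not None:    # steps 402-404: both credible
        return "phase_PD1" if pd1 >= pd2 else "phase_PD2"
    if pd1 is not None:                        # step 406: only PD1 credible
        return "phase_PD1"
    if pd2 is not None:                        # steps 407-408: only PD2 credible
        return "phase_PD2"
    return "contrast"                          # step 409: neither credible

print(choose_focus_mode(0.8, 0.5))   # → phase_PD1
print(choose_focus_mode(None, 0.5))  # → phase_PD2
print(choose_focus_mode(None, None)) # → contrast
```

The important design point is the ordering: phase information is always preferred when any of it is credible, and contrast autofocus serves only as the last-resort fallback.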
  • In this focusing method, the first phase detecting sensor and the second phase detecting sensor are set to cross each other. When focusing starts, it is determined whether the edge information detected by the first phase detecting sensor is credible and whether the edge information detected by the second phase detecting sensor is credible, and the focus processing mode is determined according to the judgment result, so that focusing accuracy is improved even when the image contains little edge information in one direction and much in the other.
  • FIG. 5 is a schematic structural diagram of a photographing apparatus according to an embodiment of the present invention.
  • the photographing device provided in this embodiment may specifically include: a determining module 51 and a determining processing module 52.
  • the determining module 51 is configured to determine whether the edge information detected by the first phase detecting sensor is authentic, and determine whether the edge information detected by the second phase detecting sensor is authentic;
  • The determining processing module 52 is configured to determine a focus processing mode according to the determination result.
  • The first phase detecting sensor and the second phase detecting sensor are arranged to cross each other.
  • the determining processing module 52 is further configured to:
  • if the reliability of the edge information detected by the first phase detecting sensor is higher than the reliability of the edge information detected by the second phase detecting sensor, perform phase autofocus using the edge information detected by the first phase detecting sensor; or,
  • the determining processing module 52 is further configured to:
  • if the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not credible, perform phase autofocus using the edge information detected by the first phase detecting sensor.
  • the determining processing module 52 is further configured to:
  • if the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is credible, perform phase autofocus processing using the edge information detected by the second phase detecting sensor.
  • the determining processing module 52 is further configured to:
  • perform focus processing using contrast autofocus when the edge information detected by neither phase detecting sensor is credible.
  • The determining module 51 and the determining processing module 52 may each be implemented by a Central Processing Unit (CPU), a Micro Processor Unit (MPU), a Digital Signal Processor (DSP), or a Field Programmable Gate Array (FPGA) located in the photographing device.
  • This embodiment further provides a photographing apparatus comprising a processor and a memory configured to store a computer program executable on the processor, wherein the processor is configured to perform the following steps when running the computer program:
  • The first phase detecting sensor and the second phase detecting sensor are arranged to cross each other.
  • the determining a focus processing manner according to the determination result includes:
  • If the reliability of the edge information detected by the first phase detecting sensor is higher than the reliability of the edge information detected by the second phase detecting sensor, phase autofocus is performed using the edge information detected by the first phase detecting sensor.
  • the determining a focus processing manner according to the determination result includes:
  • If the reliability of the edge information detected by the second phase detecting sensor is higher than the reliability of the edge information detected by the first phase detecting sensor, phase autofocus is performed using the edge information detected by the second phase detecting sensor.
  • the determining a focus processing manner according to the determination result includes:
  • the focus processing is performed using contrast autofocus.
  • the determining a focus processing manner according to the determination result includes:
  • If the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not credible, the edge information detected by the first phase detecting sensor is used for phase autofocus.
  • the determining a focus processing manner according to the determination result includes:
  • If the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is credible, the edge information detected by the second phase detecting sensor is used for phase autofocus processing.
  • the determining a focus processing manner according to the determination result includes:
  • If the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is not credible, contrast autofocus is used for focus processing.
  • the processor may be an integrated circuit chip with signal processing capabilities.
  • each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
  • the above processor may be a general purpose processor, a digital signal processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like.
  • the processor may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention.
  • a general purpose processor can be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in the embodiment of the present invention may be directly implemented as a hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a storage medium, the storage medium being located in the memory, the processor reading the information in the memory, and completing the steps of the foregoing methods in combination with the hardware thereof.
  • the present embodiment also provides a computer readable storage medium, such as a memory including a computer program, which can be executed by a processor of a camera device to perform the steps described in the foregoing methods.
  • The computer readable storage medium may be a Ferroelectric Random Access Memory (FRAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a flash memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); or a device including one of, or any combination of, the above memories.
  • the computer readable storage medium stores a computer program, wherein when the computer program is executed by the processor, the following steps are implemented:
  • The first phase detecting sensor and the second phase detecting sensor are arranged to cross each other.
  • the determining a focus processing manner according to the determination result includes:
  • If the reliability of the edge information detected by the first phase detecting sensor is higher than the reliability of the edge information detected by the second phase detecting sensor, phase autofocus is performed using the edge information detected by the first phase detecting sensor.
  • the determining a focus processing manner according to the determination result includes:
  • If the reliability of the edge information detected by the second phase detecting sensor is higher than the reliability of the edge information detected by the first phase detecting sensor, phase autofocus is performed using the edge information detected by the second phase detecting sensor.
  • the determining a focus processing manner according to the determination result includes:
  • the focus processing is performed using contrast autofocus.
  • the determining a focus processing manner according to the determination result includes:
  • If the edge information detected by the first phase detecting sensor is credible and the edge information detected by the second phase detecting sensor is not credible, the edge information detected by the first phase detecting sensor is used for phase autofocus.
  • the determining a focus processing manner according to the determination result includes:
  • If the edge information detected by the first phase detecting sensor is not credible and the edge information detected by the second phase detecting sensor is credible, the edge information detected by the second phase detecting sensor is used for phase autofocus processing.
  • the determining a focus processing manner according to the determination result includes:
  • the focus processing is performed using contrast autofocus.
  • The method of the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware, but in many cases the former is the better implementation.
  • Based on this understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) that includes a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the methods described in the various embodiments of the present invention.
  • The present invention is described with reference to flowcharts and/or block diagrams of the method, the apparatus (system), and the computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions.
  • These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • The computer program instructions can also be stored in a computer readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • By arranging the first phase detecting sensor and the second phase detecting sensor to cross each other, when focusing starts it is determined whether the edge information detected by the first phase detecting sensor is credible and whether the edge information detected by the second phase detecting sensor is credible, and the focus processing mode is determined according to the judgment result, so that focusing accuracy is improved even when the image contains little edge information in one direction and much in the other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a focusing method. The method comprises: determining whether edge information detected by a first phase detecting sensor is credible, and determining whether edge information detected by a second phase detecting sensor is credible; and determining a focus processing mode according to a determination result, the first phase detecting sensor and the second phase detecting sensor being arranged to cross each other. Also disclosed are a photographing device and a storage medium.
PCT/CN2017/100852 2016-09-13 2017-09-07 Procédé de mise au point, dispositif de prise de vues, et support de stockage WO2018050014A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610820821.9 2016-09-13
CN201610820821.9A CN106331499B (zh) 2016-09-13 2016-09-13 对焦方法及拍照设备

Publications (1)

Publication Number Publication Date
WO2018050014A1 true WO2018050014A1 (fr) 2018-03-22

Family

ID=57787820

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/100852 WO2018050014A1 (fr) 2016-09-13 2017-09-07 Procédé de mise au point, dispositif de prise de vues, et support de stockage

Country Status (2)

Country Link
CN (1) CN106331499B (fr)
WO (1) WO2018050014A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112866510A (zh) * 2019-11-12 2021-05-28 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
CN112866553A (zh) * 2019-11-12 2021-05-28 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
EP4057616A4 (fr) * 2019-11-28 2022-12-28 Vivo Mobile Communication Co., Ltd. Dispositif électronique et procédé de mise au point

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106331499B (zh) * 2016-09-13 2019-10-29 努比亚技术有限公司 对焦方法及拍照设备
CN108337424B (zh) * 2017-01-17 2021-04-16 中兴通讯股份有限公司 一种相位对焦方法及其装置
CN107809586B (zh) * 2017-10-31 2020-09-01 努比亚技术有限公司 移动终端对焦模式的切换方法、移动终端及存储介质
CN107888829B (zh) * 2017-11-23 2020-08-28 努比亚技术有限公司 移动终端的对焦方法、移动终端及存储介质
CN108198195B (zh) * 2017-12-28 2021-04-16 努比亚技术有限公司 一种对焦的方法、终端及计算机可读存储介质
CN109951645B (zh) * 2019-04-30 2020-10-30 努比亚技术有限公司 对焦调节方法、移动终端及计算机可读存储介质
CN112866548B (zh) * 2019-11-12 2022-06-14 Oppo广东移动通信有限公司 相位差的获取方法和装置、电子设备

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102164293A (zh) * 2010-02-16 2011-08-24 索尼公司 图像处理装置、图像处理方法、图像处理程序和成像装置
CN102741724A (zh) * 2010-01-15 2012-10-17 佳能株式会社 摄像设备
CN102844705A (zh) * 2010-02-15 2012-12-26 株式会社尼康 焦点调节装置和焦点调节程序
CN103842878A (zh) * 2011-09-28 2014-06-04 富士胶片株式会社 固态图像捕捉元件、图像捕捉设备和聚焦控制方法
US20150365584A1 (en) * 2014-06-13 2015-12-17 Vitali Samurov Reliability measurements for phase based autofocus
CN105306786A (zh) * 2014-06-30 2016-02-03 半导体元件工业有限责任公司 用于具有相位检测像素的图像传感器的图像处理方法
CN106331499A (zh) * 2016-09-13 2017-01-11 努比亚技术有限公司 对焦方法及拍照设备

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101773168B1 (ko) * 2011-07-21 2017-09-12 삼성전자주식회사 위상차이 촬상 소자에 의한 초점 조절 장치 및 방법
DE112013005142T5 (de) * 2012-10-26 2015-08-06 Fujifilm Corporation Bildaufnahmevorrichtung und zugehöriges Fokussiersteuerverfahren

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102741724A (zh) * 2010-01-15 2012-10-17 佳能株式会社 摄像设备
CN102844705A (zh) * 2010-02-15 2012-12-26 株式会社尼康 焦点调节装置和焦点调节程序
CN102164293A (zh) * 2010-02-16 2011-08-24 索尼公司 图像处理装置、图像处理方法、图像处理程序和成像装置
CN103842878A (zh) * 2011-09-28 2014-06-04 富士胶片株式会社 固态图像捕捉元件、图像捕捉设备和聚焦控制方法
US20150365584A1 (en) * 2014-06-13 2015-12-17 Vitali Samurov Reliability measurements for phase based autofocus
CN105306786A (zh) * 2014-06-30 2016-02-03 半导体元件工业有限责任公司 用于具有相位检测像素的图像传感器的图像处理方法
CN106331499A (zh) * 2016-09-13 2017-01-11 努比亚技术有限公司 对焦方法及拍照设备

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112866510A (zh) * 2019-11-12 2021-05-28 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
CN112866553A (zh) * 2019-11-12 2021-05-28 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
CN112866553B (zh) * 2019-11-12 2022-05-17 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
CN112866510B (zh) * 2019-11-12 2022-06-10 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
EP4057616A4 (fr) * 2019-11-28 2022-12-28 Vivo Mobile Communication Co., Ltd. Dispositif électronique et procédé de mise au point

Also Published As

Publication number Publication date
CN106331499B (zh) 2019-10-29
CN106331499A (zh) 2017-01-11

Similar Documents

Publication Publication Date Title
WO2018050014A1 (fr) Procédé de mise au point, dispositif de prise de vues, et support de stockage
WO2018019124A1 (fr) Procédé de traitement d'image et dispositif électronique et support d'informations
US8780258B2 (en) Mobile terminal and method for generating an out-of-focus image
WO2018019122A1 (fr) Procédé de mise au point, terminal, et support de stockage informatique
WO2017071481A1 (fr) Terminal mobile et procédé de mise en œuvre d'écran divisé
WO2017071424A1 (fr) Terminal mobile et procédé de partage de fichier
WO2016180325A1 (fr) Procédé et dispositif de traitement d'images
WO2017020836A1 (fr) Dispositif et procédé pour traiter une image de profondeur par estompage
WO2017071456A1 (fr) Procédé de traitement de terminal, terminal, et support de stockage
WO2016169483A1 (fr) Terminal mobile et procédé d'ajustement de fonction à l'aide de la région de trame virtuelle associée
CN106909274B (zh) 一种图像显示方法和装置
WO2016058458A1 (fr) Procédé de gestion de la quantité d'électricité d'une batterie, terminal mobile et support de stockage informatique
WO2016169524A1 (fr) Procédé et dispositif de réglage rapide de luminosité d'écran, terminal mobile et support d'informations
WO2018019128A1 (fr) Procédé de traitement d'image de scène de nuit et terminal mobile
WO2017020771A1 (fr) Dispositif et procédé de commande de terminal
WO2017143855A1 (fr) Dispositif doté d'une fonction de capture d'écran et procédé de capture d'écran
WO2017088631A1 (fr) Terminal mobile, procédé de réglage par augmentation/diminution et appareil à cet effet, et support d'informations
WO2016155434A1 (fr) Procédé et dispositif pour reconnaître la tenue d'un terminal mobile, support de stockage et terminal
CN106482641B (zh) 一种尺寸测量装置和方法
WO2017071532A1 (fr) Procédé et appareil de prise de selfie de groupe
CN110109528B (zh) 应用程序的管控方法、移动终端及计算机可读存储介质
WO2016034152A1 (fr) Procédé d'affichage d'interface, terminal de communication et support de stockage informatique
WO2016188495A1 (fr) Procédé, terminal, serveur, système, et support de stockage de commutation vocale
WO2018045961A1 (fr) Procédé de traitement d'image, terminal et support de stockage
CN106911881B (zh) 一种基于双摄像头的动态照片拍摄装置、方法和终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17850216

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/08/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17850216

Country of ref document: EP

Kind code of ref document: A1