WO2016063546A1 - Proximity sensor (近接覚センサ) - Google Patents


Info

Publication number
WO2016063546A1
WO2016063546A1 (application PCT/JP2015/005349, JP2015005349W)
Authority
WO
WIPO (PCT)
Prior art keywords
substrate
proximity sensor
signal
optical sensor
detected
Prior art date
Application number
PCT/JP2015/005349
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
学士 尾崎
圭 近藤
悠介 今井
宮田 慎司
誠一 勅使河原
和輝 飛田
絢子 田淵
佐藤 昇
Original Assignee
NSK Ltd. (日本精工株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NSK Ltd. (日本精工株式会社)
Publication of WO2016063546A1 publication Critical patent/WO2016063546A1/ja

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00 Prospecting or detecting by optical means
    • G01V8/10 Detecting, e.g. by using light barriers
    • G01V8/12 Detecting, e.g. by using light barriers using one transmitter and one receiver

Definitions

  • The present invention relates to a proximity sensor that detects a relatively approaching object.
  • Techniques for detecting an object relatively approaching a robot or the like (hereinafter referred to as a "detected object" in the present specification) and avoiding a collision with the detected object have been researched and developed.
  • Here, "relatively approaching" includes the case where the robot moves and approaches the detected object, the case where the detected object moves and approaches the robot, and the case where both the robot and the detected object move and eventually approach each other.
  • Although detection of an object is also performed by a vision sensor, proximity sensors are provided around the legs and arms of the robot, which are blind spots of the vision sensor.
  • the proximity sensor provided in the robot is described in, for example, Patent Document 1 and Patent Document 2.
  • Patent Document 1 describes a proximity sensor that simplifies a circuit network by using a light emitting element and a light receiving element interchangeably.
  • Patent Document 2 describes that node pairs including sensor elements are connected to a robot arm or the like in a mesh pattern, and the sensor elements are arranged along a side surface of a cylinder or the like.
  • Patent Document 2 describes, as a comparative example, covering a robot arm or the like with a sheet-like substrate provided with a photo reflector.
  • Patent Document 1: JP 2007-71564 A. Patent Document 2: Japanese Patent No. 5517039.
  • In both Patent Document 1 and Patent Document 2, a plurality of sensor elements are connected according to the shape of the robot or the like to which the proximity sensor is attached, and the connected sensors are integrally installed on the robot or the like.
  • In other words, the proximity sensors described in Patent Document 1 and Patent Document 2 have a so-called custom, single-product configuration whose specifications (number of sensor elements, arrangement, range, etc.) must be changed according to the member to which they are attached. Changing the specifications for each attached member is disadvantageous from the viewpoint of the development cost of the proximity sensor. That is, for a proximity sensor attached to a robot, the substrate on which the sensor elements are mounted must be redesigned for every robot whose legs and arms differ in size.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a proximity sensor that can reduce development costs and can be applied to attached members of various sizes and shapes.
  • a proximity sensor includes a plurality of optical sensors that detect light intensity, and a signal generation unit that generates a signal related to the light intensity detected by the optical sensor.
  • The signal transmission unit transmits the signal or the position information wirelessly or via a flexible cable.
  • The first substrate is preferably a flexible substrate.
  • The first substrate includes an adhesive layer on the surface opposite to the surface on which the optical sensor is provided.
  • The proximity sensor of one aspect of the present invention preferably includes a position information generation unit that generates position information of the detected object from the signal generated on the first substrate.
  • Preferably, the signal generation unit includes an AD converter that generates a digital signal from the light intensity received by the optical sensor, and the position information generation unit includes a microcomputer that generates position information of the detected object from the digital signal generated by the AD converter. In the proximity sensor of one embodiment of the present invention, it is preferable that the microcomputer correct variation in the output characteristics of the optical sensors when generating the position information.
  • Preferably, the optical sensor includes a light receiving element, the light receiving elements of the plurality of optical sensors are connected in parallel, and the position information generation unit detects the outputs of the light receiving elements connected in parallel.
  • the first substrate includes one or more optical sensor arrays in which the optical sensors are arranged in one direction.
  • the first substrate includes a plurality of optical sensor arrays in which the optical sensors are arranged in one direction
  • the signal generation unit includes the plurality of optical sensor arrays. It is desirable to include a comparator that generates a digital signal by comparing the light intensity received by the included optical sensor with a preset value.
  • the proximity sensor according to one embodiment of the present invention preferably further includes an OR circuit that inputs a plurality of digital signals and generates one signal.
  • the first substrate is attached such that the optical sensor row is along the longitudinal direction of the attached member to which the first substrate is attached.
  • It is desirable that the proximity sensor of one embodiment of the present invention include an optical filter that covers the light receiving surface on which the optical sensor receives light and selectively transmits only light in a specific wavelength range.
  • the present invention can provide a proximity sensor that can reduce development costs and can be applied to various attached members having different sizes and shapes.
  • FIG. 3 is a diagram illustrating a state in which the first substrate shown in FIG. 2 is attached to the side surface of a robot arm.
  • FIG. 1 is a schematic diagram of a proximity sensor 1 according to a first embodiment of the present invention as viewed from above.
  • the proximity sensor 1 includes a first substrate group 11 and a second substrate 12.
  • the first substrate group 11 includes n first substrates.
  • In FIG. 1, five of the n first substrates are illustrated, and the illustrated first substrates are referred to as first substrates 111, 112, 113, 114, and 115.
  • The first substrates 111, 112, 113, 114, and 115 all have the same configuration. For this reason, in the first embodiment, the first substrate 111 is described, and the description applies equally to the other first substrates.
  • The first substrate 111 includes optical sensors 23 that detect light intensity and a microcontroller unit (hereinafter referred to as "MCU") 25 that generates a signal related to the light intensity detected by the optical sensors 23.
  • the first substrate 111 has a substrate 21 on which the optical sensor 23 and the MCU 25 are mounted.
  • The substrate 21 is provided with a connector 27 for outputting the signal generated by the MCU 25 to the outside.
  • The second substrate 12 includes a microprocessor 31 that obtains information regarding the position of the detected object (hereinafter referred to as "position information") from the signals generated by the MCUs 25 of the plurality of first substrates 111, 112, 113, 114, and 115, and elements 32, 33, and 34 that operate together with the microprocessor 31.
  • the microprocessor 31 and the elements 32, 33 and 34 are mounted on the substrate 35.
  • the second substrate 12 is a substrate that controls position information obtained from the plurality of first substrates 111, 112, 113, 114, and 115.
  • the connector 27 of the first substrate 111 and the connector 37 of the second substrate 12 are connected by a flexible wiring cable 13.
  • The wiring cable 13 is connected to the first substrate 111 and the second substrate 12 by the connector 27 and the connector 37, and transmits the position information generated by the MCU 25 from the first substrate 111 to the second substrate 12.
  • FIG. 2 is a diagram for explaining the first substrate 111 shown in FIG.
  • FIG. 2A is a perspective view of the first substrate 111
  • FIG. 2B is a top view schematically showing the first substrate 111
  • FIG. 2C is a cross-sectional view taken along the line C-C shown in FIG. 2B.
  • the plurality of optical sensors 23 are arranged in a line on the substrate 21.
  • a column constituted by a plurality of photosensors 23 arranged in a row is also referred to as a photosensor column in the first embodiment.
  • The optical sensor 23 of the first embodiment is a reflective photointerrupter, in which a light emitting diode (hereinafter referred to as "LED") 23a and a phototransistor 23b form a pair.
  • the LED 23a of the first embodiment is a light emitting element that emits infrared rays
  • the phototransistor 23b is a light receiving element that receives infrared rays.
  • The MCU 25 has a clock signal generation function and a network interface in addition to an arithmetic function, and can operate with only a power supply.
  • the optical sensor 23 is set so that the LED 23a emits infrared light, and the infrared light emitted by the LED 23a is not received by the phototransistor 23b when there is no object to be detected.
  • the infrared rays emitted from the LED are reflected by the detected object.
  • the reflected infrared light is received by the phototransistor 23b.
  • the intensity of infrared light received by the phototransistor 23b varies depending on the distance between the phototransistor 23b and the object to be detected.
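The intensity-to-distance relationship described above can be sketched in Python. This is an illustrative model only: it assumes the received intensity falls off with the square of the distance, which the patent does not specify, and a real photointerrupter would be calibrated against measurements. The names `i_ref` and `d_ref` are hypothetical reference values.

```python
def estimate_distance(intensity, i_ref, d_ref=10.0):
    """Estimate distance (mm) from reflected-light intensity.

    Assumes an inverse-square falloff (a simplification):
    intensity ~ i_ref * (d_ref / d)**2, so d = d_ref * sqrt(i_ref / intensity).
    i_ref is the intensity measured at the reference distance d_ref.
    """
    if intensity <= 0:
        return float("inf")  # no reflected light: no object detected
    return d_ref * (i_ref / intensity) ** 0.5
```

Under this model, a quarter of the reference intensity corresponds to twice the reference distance.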
  • the wiring 22 is provided on the substrate 21.
  • the plurality of optical sensors 23, MCU 25, and connector 27 are electrically connected by wiring 22.
  • the plurality of phototransistors 23b each generate a signal corresponding to the intensity of the received infrared light by receiving the infrared light, and output the signal to the MCU 25.
  • the MCU 25 performs a preprogrammed calculation based on the signal sent from the phototransistor 23b, and obtains position information related to the position of the detected object.
  • The position information is transmitted from the connector 27 to the connector 37 of the second substrate 12 via the wiring cable 13. For this reason, in the first embodiment, the proximity sensor 1 can detect both that a detected object is relatively approaching the robot arm and the position of the detected object.
  • the MCU 25 transmits position information related to the position of the detected object to a higher-level control device.
  • the higher-level control device may be, for example, a drive control device that controls the drive of the robot.
  • the drive control device inputs position information related to the position of the detected object, moves the robot arm in a direction different from the approaching direction of the detected object based on the input position information, and moves the robot arm to the detected object. Avoid collisions.
  • the first embodiment is not limited to the configuration in which the MCU 25 transmits the position of the detected object to a higher-level drive control device.
  • The MCU 25 may convert the output from the phototransistor 23b into a digital signal and transmit it as it is from the connector 27 to the connector 37 of the second substrate 12 and the microprocessor 31 via the wiring cable 13.
  • In this case, the position information of the detected object is generated by the microprocessor 31.
  • the first embodiment is not limited to the one that transmits a digital signal or position information from the MCU 25 to the second substrate 12 by the wiring cable 13.
  • a digital signal or position information can be transmitted from the MCU 25 to the second substrate as a radio signal.
  • the MCU 25 and the microprocessor 31 are configured to include a communication unit for transmitting and receiving wireless signals.
  • When the proximity sensor 1 according to the first embodiment is attached to, for example, a side surface of a robot arm, it is preferable to use a flexible printed circuit board (hereinafter referred to as a "flexible substrate") as the substrate 21. Since the flexible substrate bends, it follows the installation target surface even when that surface (the side surface of the robot arm) is curved.
  • When the substrate 21 is a flexible substrate, the wiring 22 is printed in advance on the surface of a polyimide film that serves as the base material.
  • the optical sensor 23 and the MCU 25 are bare-chip mounted at predetermined positions on the wiring 22.
  • a protective film 29 is provided on the substrate 21 on which the optical sensor 23, the MCU 25, and the connector 27 are mounted.
  • a contact window 291 is formed in the protective film 29, and the connector 27 is disposed on the wiring 22 exposed from the contact window 291. Further, when there is a concern about the influence of refraction by the protective film 29, a window for the optical sensor 23 is formed. According to such a configuration, the thickness of the first substrate 111 can be made thinner than when resin is used for the sealing material. Further, if an adhesive layer is provided on the back surface of the substrate 21 on which the optical sensor 23 or the like is mounted, the first substrate 111 can be directly attached to the side surface 41 of the robot arm 71.
  • FIG. 3 illustrates a state where the first substrate 111 shown in FIGS. 2A, 2B, and 2C is attached to the side surface 41 of the robot arm together with the other first substrates 112 and 113.
  • the optical sensors 23 are attached so that the rows of the optical sensors 23 extend along the longitudinal direction of the attached member (robot arm) to which the first substrates 111, 112, 113 are attached.
  • the second substrate 12 is attached to the side surface together with the first substrates 111, 112, and 113.
  • The substrate 35 of the second substrate 12 is also a flexible substrate, and the microprocessor 31 and the elements 32, 33, and 34 are bare-chip mounted on the base material of the substrate 35 on which the wiring is printed. The mounted microprocessor 31 and elements 32, 33, and 34 are covered with a protective film.
  • the proximity sensor can be attached to the robot arm 71 by appropriately attaching the first substrates 111, 112, and 113 to the side surface 41 of the robot arm 71.
  • the first embodiment eliminates the need to design a proximity sensor in accordance with the size and shape of the robot arm, thereby reducing the manufacturing cost of the proximity sensor and shortening the development time.
  • the substrate 21 or the substrate 35 is not limited to a flexible substrate, and may be a substrate that does not have flexibility.
  • the second substrate 12 of the first embodiment is not limited to being attached to the robot arm, and may be provided in other parts together with other control devices of the robot. In such a case, the wiring cable 13 may be routed between the first substrate 111 and the like and the second substrate 12, or the MCU 25 may transmit a signal wirelessly to the second substrate 12 through a network interface. There may be.
  • FIGS. 4A, 4B, and 4C are diagrams illustrating how the first substrates 111, 112, 113, 114, and 115 are connected to the second substrate 12.
  • In FIG. 4A, the signals output from the optical sensors 23 of the first substrates 111, 112, 113, 114, and 115 are input to the second substrate 12 through a common signal line.
  • Such a connection is referred to as a "connection using a common signal line." In this case, communication between the first substrate 111 and the like and the second substrate 12 is performed by serial communication, which transmits data continuously one bit at a time.
  • Serial communication has the merit that the cost of the wire material of the wiring cable and the cost of the relay device can be suppressed.
  • serial communication standards include RS-422 and RS-485.
  • FIG. 4B shows an example in which n connectors 37, as many as the number of first substrates, are provided on the second substrate 12, and each connector 37 individually receives the signal output from the corresponding first substrate.
  • Such a connection is referred to as "individual signal capture."
  • In this case, communication between the first substrate 111 and the like and the second substrate 12 is performed in parallel, with the plurality of wiring cables 43 each transmitting data simultaneously.
  • With parallel communication, it is possible to identify which first substrate in the first substrate group 11 is closest to the detected object.
  • FIG. 4C shows an example in which connectors 27 are provided at both ends of each of the n first substrates including the first substrates 111, 112, 113, 114, and 115, and the first substrates are connected in series.
  • Such a connection is also called a daisy chain connection.
  • In this case, an interface capable of daisy chain connection, such as IEEE 1394, is used for the MCU 25.
  • The first substrate of the first embodiment is not limited to the configuration in which the optical sensors 23 are arranged in a single line as shown in FIGS. 2A to 2C.
  • FIG. 5A, FIG. 5B, and FIG. 5C are diagrams showing an example in which a plurality of rows of photosensors 23 are provided on one first substrate.
  • FIG. 5A shows the first substrate 111 for comparison.
  • FIG. 5B is a view showing a first substrate 611 provided with two rows of optical sensors 23.
  • FIG. 5C shows the first substrate 613 provided with three rows of optical sensors 23.
  • the first substrates 611 and 613 include the MCU 25 and the connector 27 on the flexible substrate 21.
  • FIG. 6 is a schematic cross-sectional view showing a state in which the first substrate 111 and the first substrate 613 are attached to the robot arm 71.
  • the first substrate 613 is provided on the relatively wide side surface 711 of the robot arm 71.
  • the first substrate 111 is provided on the side surface 713 narrower than the side surface 711.
  • By providing the first substrate 613, which has a plurality of optical sensor rows, on a wide side surface, the first embodiment can reduce the number of first substrates to be attached.
  • By providing the first substrate 111, which has the minimum configuration, on the narrow side surface, the approach of a detected object can be detected even on a narrow side surface.
  • FIG. 7A and FIG. 7B are schematic circuit diagrams for explaining the circuit configuration of the first substrate of the first embodiment.
  • the MCU 25 includes an AD converter 251 and a calculation unit 252.
  • The AD converter 251 has the same number of channels as there are phototransistors 23b (four in FIG. 7A), and digitizes each signal output by the four phototransistors 23b to generate digital signals.
  • the generated digital signal is a signal related to the light intensity detected by the optical sensor 23.
  • the calculation unit 252 generates position information related to the position of the detected object using the digital signal. Then, the generated position information is transmitted to a host drive control device (not shown).
  • the circuit shown in FIG. 7A can acquire the output of the phototransistor 23b as an analog signal before the proximity sensor is shipped, and calibrate the variation in characteristics of the phototransistor 23b in advance.
  • the calculation unit 252 can correct the digital signal so as to absorb the variation in characteristics of the phototransistor 23b obtained by the calibration when generating the position information.
  • Such processing can be performed by changing the program of the MCU 25. That is, the circuit shown in FIG. 7A can arbitrarily change the processing method of the signal output from the phototransistor 23b by the program of the MCU 25, and can increase the degree of freedom in designing the proximity sensor. Further, according to the circuit shown in FIG. 7A, a signal can be obtained for each phototransistor 23b.
  • The proximity sensor of the first embodiment using the circuit shown in FIG. 7A identifies the phototransistor 23b that is relatively close to the detected object, and can obtain position information regarding the position of and distance to the detected object relative to the robot arm. Furthermore, according to such a circuit, a plurality of detected objects can be detected simultaneously.
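As a rough sketch of the kind of processing the MCU 25's program might perform, the following Python fragment applies per-channel calibration values and then identifies the phototransistor with the strongest corrected signal. The linear gain/offset model and the names `gains` and `offsets` are assumptions for illustration; the patent only states that the variation in output characteristics is corrected by the program.

```python
def correct_and_locate(raw, gains, offsets):
    """Correct per-sensor output variation, then find the sensor
    closest to the detected object (highest corrected intensity).

    raw: digitized phototransistor outputs, one value per channel.
    gains, offsets: per-channel calibration values obtained before
    shipment (hypothetical; assumes a linear correction model).
    Returns (index of strongest channel, corrected values).
    """
    corrected = [g * r + o for r, g, o in zip(raw, gains, offsets)]
    idx = max(range(len(corrected)), key=corrected.__getitem__)
    return idx, corrected
```

With unity gains the strongest raw channel wins; with calibrated gains, a weak sensor's output can be boosted so that channels are comparable.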
  • FIG. 7B shows a circuit of the first substrate 52 in which the connection between the phototransistors 23b and the MCU 25 differs from the circuit of FIG. 7A.
  • In the first substrate 52, the phototransistors 23b and resistance elements are connected to form a network. That is, the plurality of phototransistors 23b are connected in parallel via resistance elements on the emitter side, and the calculation unit 252 detects the voltage values V1 and V2 output from the phototransistors 23b arranged at both ends of the parallel-connected phototransistors 23b.
  • the calculation unit 252 obtains the center of gravity of the detected object from the voltage value.
  • the AD converter 251 has two channels for inputting an analog signal output from the phototransistor 23b connected to both ends of the network.
  • the AD converter 251 converts the input analog signal into a digital signal and outputs the digital signal to the arithmetic unit 252.
  • the computing unit 252 identifies the position of the center of gravity of the detected object from the two digital signals.
  • the total current I flowing through the phototransistor 23b constituting the network shown in FIG. 7B is obtained by the following equation (1).
  • I = a (V1 + V2) (1)
  • the primary moment Ix around the x-axis of the distribution of the total current I is obtained by the following equation (2).
  • Ix = b (V1 - V2) (2)
  • a and b are constants that are appropriately set according to the characteristics of the proximity sensor.
  • The coordinate x_c1 indicating the center position of the distribution of the current flowing through each optical sensor 23 is obtained as follows.
  • the reason why the coordinates indicating the center of the current distribution are represented only by the x coordinate is that the first substrate 52 has a configuration in which the columns of the optical sensors 23 are arranged one-dimensionally.
  • x_c1 = Ix / I (3)
  • Also with the circuit shown in FIG. 7B, the processing method of the signal output from the phototransistors 23b can be arbitrarily changed by the program of the MCU 25, and the degree of freedom in designing the proximity sensor can be increased.
  • a simple microcomputer with a small number of channels of the AD converter 251 can be used as the MCU 25. Therefore, by using the circuit shown in FIG. 7B, the first embodiment can further reduce the manufacturing cost of the proximity sensor.
  • the circuit shown in FIG. 7B calculates the position of the object to be detected by Expressions (1) to (3), so that the object to be detected between the optical sensors 23 can be detected.
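Equations (1) to (3) can be checked with a short Python sketch. The constants a and b are device-specific, so the default values used here are placeholders, not values from the patent.

```python
def centroid_from_endpoints(v1, v2, a=1.0, b=1.0):
    """Center-of-gravity position of the detected object from the
    two end voltages of the parallel phototransistor network.

    Implements Equations (1)-(3): total current I = a(V1 + V2),
    first moment Ix = b(V1 - V2), centroid x_c1 = Ix / I.
    a, b are device-dependent constants (placeholder defaults).
    """
    total = a * (v1 + v2)    # Equation (1): total current I
    moment = b * (v1 - v2)   # Equation (2): first moment Ix
    if total == 0:
        return None          # no light detected, centroid undefined
    return moment / total    # Equation (3): x_c1 = Ix / I
```

Equal end voltages give a centroid at the center of the row (x_c1 = 0 in this coordinate system), and the centroid shifts toward the end with the larger voltage, which is how objects between adjacent sensors can still be located.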
  • the proximity sensor of the first embodiment described above can be attached regardless of the shape and size of the attached member. For this reason, it is not necessary to change the layout of the proximity sensor or to change the design according to the attachment target. Therefore, the first embodiment can provide a low-cost proximity sensor with a small development time and load.
  • FIGS. 8A and 8B are schematic circuit diagrams for explaining the circuit configuration of the first substrate of the second embodiment.
  • The circuit of the second embodiment is different from that of the first embodiment in that a comparator 81 is provided instead of the MCU 25.
  • In FIG. 8A and FIG. 8B, four of the plurality of photosensors 23 arranged on the first substrates 82 and 83 are illustrated for simplicity of description.
  • FIGS. 8A and 8B only the phototransistor 23b of the optical sensor 23 is shown, and the LED 23a on the light emitting side is not shown.
  • The number of comparators 81 equals the number of phototransistors 23b (four in FIG. 8A), and each comparator 81 digitizes the signal output by the corresponding phototransistor 23b to generate a digital signal.
  • the generated digital signal is a signal related to the light intensity detected by the optical sensor 23.
  • the digital signal is transmitted to a host drive control device (not shown).
  • the drive control device uses the digital signal to generate position information related to the position of the detected object. Then, the robot arm or the like is driven based on the generated position information.
  • The proximity sensor of the second embodiment using the circuit shown in FIG. 8A identifies the phototransistor 23b that is relatively close to the detected object, and can obtain position information regarding the position of and distance to the detected object relative to the robot arm. Furthermore, according to such a circuit, a plurality of detected objects can be detected simultaneously.
  • the second embodiment that generates a digital signal using the comparator 81 has a simpler configuration than the first embodiment that uses the MCU 25, and is advantageous in reducing the manufacturing cost.
  • Further, the interface with the microprocessor 31 on the second substrate can be simplified. However, the proximity sensor using the circuit shown in FIG. 8A cannot correct variation in the output characteristics of the phototransistors 23b by a program.
  • Instead, in the circuit shown in FIG. 8A, the value to be compared with the light intensity can be set for each comparator 81 to absorb the variation.
  • the circuit shown in FIG. 8B further includes an OR circuit 815 that inputs a plurality of digital signals output from the plurality of comparators 81.
  • the OR circuit 815 outputs a digital signal “1” when at least one of the plurality of input digital signals is “1”.
  • the circuit shown in FIG. 8B is advantageous in reducing the number of wiring cables between the first board 83 and the second board.
  • However, the circuit shown in FIG. 8B cannot identify which optical sensor 23 detected the reflected light. Therefore, the proximity sensor using the circuit shown in FIG. 8B is expected to be used in situations where the robot simply stops when a detected object approaches, regardless of the approach direction of the detected object.
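The comparator-plus-OR signal chain of FIG. 8B can be modeled in a few lines of Python. This is a behavioral sketch, not the circuit itself: each comparator outputs 1 when its sensor's intensity exceeds its own preset threshold (per-comparator thresholds absorb sensor variation), and the OR circuit collapses the results into a single bit, which is why the triggering sensor can no longer be identified from the output.

```python
def comparator_or(intensities, thresholds):
    """Model of the FIG. 8B signal chain.

    intensities: light intensity per phototransistor.
    thresholds: preset comparison value per comparator.
    Returns (per-comparator digital outputs, single OR-ed detect bit).
    """
    bits = [1 if i > t else 0 for i, t in zip(intensities, thresholds)]
    return bits, int(any(bits))  # OR circuit 815: 1 if any bit is 1
```

The OR-ed bit alone suffices for a stop-on-approach policy, while the per-bit outputs (FIG. 8A without the OR circuit) preserve which sensor fired.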
  • The proximity sensors according to the first and second embodiments described above are assumed to be attached to the arms and legs of a walking robot to avoid detected objects, or to the arms of an industrial robot to avoid collisions with operators.
  • The proximity sensors of the first and second embodiments can also be attached to a cart, a shopping cart, a suitcase, or the like, and used to warn of relatively approaching detected objects.
  • The proximity sensor described above is suitable for applications that prioritize low cost over high detection accuracy and a wide detection range.
  • FIG. 9 is a diagram for explaining the first substrate 121 of the proximity sensor according to the third embodiment.
  • FIG. 9A is a perspective view of the first substrate 121
  • FIG. 9B is a top view schematically showing the first substrate 121
  • FIG. 9C is a cross-sectional view taken along the line D-D shown in FIG. 9B.
  • In FIG. 9B, the light receiving surface of the phototransistor 23b is shown.
  • the first substrate 121 of the third embodiment includes a substrate 21, an optical sensor 23 arranged in a row on the substrate 21, an MCU 25, and a connector 27.
  • A protective material 39 is provided on the substrate 21.
  • The third embodiment is different from the first embodiment in that the protective material 39 has the function of an optical filter.
  • That is, the protective material 39 covers the light receiving surface of the phototransistor 23b shown in FIG. 9B and functions as an optical filter that selectively transmits only light in a specific wavelength range.
  • FIG. 10 shows an example of the relationship between the wavelength of light and the sensitivity (S/N) of the phototransistor 23b, normalized by the maximum sensitivity.
  • The phototransistor 23b has sensitivity over a range of wavelengths. For this reason, the signal detected by the phototransistor 23b is affected by ambient light other than the light emitted by the LED 23a.
  • For example, the sensitivity of the phototransistor 23b rises from 600 nm and peaks at 850 nm.
  • Outside this wavelength range, the phototransistor 23b does not function as a light receiver.
  • Because the wavelength range of light that the phototransistor 23b detects as a signal is wide, light other than that emitted by the LED 23a is also received by the phototransistor 23b.
  • light other than the light emitted by the LED 23 a becomes disturbance light, which causes a decrease in the S / N ratio of the optical sensor 23.
  • FIG. 11 is a diagram showing the relationship between the transmittance of the protective material 39 and its wavelength.
  • The protective material 39 functions as an optical filter that transmits only light in a predetermined wavelength range having a peak at the center wavelength λc nm and does not transmit (hereinafter also referred to as "cuts") other wavelength ranges.
  • The protective material 39 is designed so that the center wavelength λc nm is the wavelength at which the transmittance peaks.
  • Here, the center wavelength λc is 850 nm.
  • The maximum range of wavelengths of light that pass through the protective material 39 when the center wavelength is λc nm is referred to as the "transmission region."
  • The wavelength range obtained by removing the transmission region from the wavelength range in which the phototransistor 23b has sensitivity is referred to as the "cut region." Further, the wavelength range corresponding to a transmittance of 50% is defined as the full width at half maximum (FWHM) f.
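The full width at half maximum f defined above can be estimated numerically from a sampled transmittance curve. The sketch below assumes a single pass band and returns the span of sampled wavelengths whose transmittance is at least half the peak; a real filter datasheet would interpolate between sample points for a more precise value.

```python
def fwhm(wavelengths, transmittance):
    """Estimate the full width at half maximum of a filter's
    transmittance curve: the span of wavelengths whose transmittance
    is at least half the peak value.

    wavelengths: sampled wavelengths in nm, ascending.
    transmittance: transmittance at each sampled wavelength.
    Assumes a single pass band and no interpolation between samples.
    """
    half = max(transmittance) / 2.0
    above = [w for w, t in zip(wavelengths, transmittance) if t >= half]
    return max(above) - min(above)
```

For a pass band sampled at 25 nm steps with its peak at 850 nm, the estimate is quantized to the sampling step, so finer sampling (or interpolation) tightens the result.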
  • the protective material 39 include a filter glass in which a transition metal that absorbs light with wavelength selectivity is mixed in a glass substrate, and a multilayer filter that forms an optical thin film on the surface of the glass substrate.
  • the transition metal used in the filter glass include Fe, Ti, Cr, Cu, Co, Ni, and Mn.
  • the filter glass is more advantageous than the multilayer filter in that the optical characteristics do not depend on the incident angle.
  • FIG. 12 is a diagram for explaining a multilayer filter.
  • the multilayer filter of the third embodiment is formed by providing a dielectric thin film 93 on the surface S2 of the glass substrate 91.
  • the incident light r1 is reflected both at the interface between the air and the surface S1 of the dielectric thin film 93 and at the interface between the dielectric thin film 93 and the glass substrate 91; interference between the resulting reflected light r2 changes the light transmittance of the multilayer filter.
  • the dielectric thin film is not a single layer but a multilayer film (dielectric multilayer film), and filters having various properties can be created depending on the design.
  • the multilayer filter can make the rise and fall of the characteristic curve steeper than the filter glass can.
  • the third embodiment can provide a proximity sensor that can detect an object to be detected with high accuracy.
  • the present invention described above is suitable for a system that, when a device capable of autonomous operation, such as a robot, and an obstacle relatively approach each other, detects the obstacle and prevents the two from colliding.
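  • the transmission/cut behavior and the FWHM definition above can be sketched numerically. The following Python snippet models the transmittance of the protective material 39 as a Gaussian band-pass curve centered at λc = 850 nm; the Gaussian shape, the 20 nm width, and the function names are illustrative assumptions, not values taken from FIG. 11.

```python
import math

# Illustrative model parameters (assumptions, not values from FIG. 11).
CENTER_NM = 850.0   # center wavelength λc of the band-pass filter
SIGMA_NM = 20.0     # assumed spectral width of the pass band

def transmittance(wavelength_nm):
    """Gaussian model of the filter transmittance, normalized to 1 at λc."""
    return math.exp(-((wavelength_nm - CENTER_NM) ** 2) / (2.0 * SIGMA_NM ** 2))

def fwhm(step=0.01):
    """Full width at half maximum f: span where transmittance >= 50%.

    The scan covers 600-1100 nm, roughly the sensitivity range of the
    phototransistor 23b described above.
    """
    above = [600.0 + i * step
             for i in range(int(500.0 / step))
             if transmittance(600.0 + i * step) >= 0.5]
    return above[-1] - above[0]

def region(wavelength_nm, threshold=0.5):
    """Classify a wavelength as lying in the transmission or cut region."""
    return "transmission" if transmittance(wavelength_nm) >= threshold else "cut"

# For a Gaussian the analytic FWHM is 2*sigma*sqrt(2*ln 2), about 47.1 nm here.
print(round(fwhm(), 1))              # prints 47.1
print(region(850.0), region(650.0))  # prints: transmission cut
```

  • a real filter glass or dielectric multilayer filter has an asymmetric, design-dependent curve; the multilayer filter in particular can make the edges of the curve much steeper than the Gaussian used here.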

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Transform (AREA)
  • Switches Operated By Changes In Physical Conditions (AREA)
  • Manipulator (AREA)
  • Force Measurement Appropriate To Specific Purposes (AREA)
PCT/JP2015/005349 2014-10-24 2015-10-23 Proximity sensor WO2016063546A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-217080 2014-10-24
JP2014217080 2014-10-24

Publications (1)

Publication Number Publication Date
WO2016063546A1 true WO2016063546A1 (ja) 2016-04-28

Family

ID=55760604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/005349 WO2016063546A1 (ja) 2014-10-24 2015-10-23 Proximity sensor

Country Status (2)

Country Link
JP (1) JP6358227B2
WO (1) WO2016063546A1

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107063163A (zh) * 2017-06-02 2017-08-18 秀卓自动化设备(湖北)有限公司 Automobile sunroof glass curvature detection system
JP2018146456A (ja) * 2017-03-07 2018-09-20 旭光電機株式会社 Object detection device
JP2018179501A (ja) * 2017-04-03 2018-11-15 日本精工株式会社 Proximity sensor
JP2018206903A (ja) * 2017-06-02 2018-12-27 日本精工株式会社 Proximity sensor and method of manufacturing proximity sensor
WO2020036217A1 (ja) * 2018-08-17 2020-02-20 旭光電機株式会社 Object detection device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021038991A (ja) * 2019-09-03 2021-03-11 株式会社フューチャースタンダード Sensor module

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6166371A (en) * 1999-04-30 2000-12-26 Beckman Coulter, Inc. Diffuse reflective light curtain system
JP2011209077A (ja) * 2010-03-29 2011-10-20 Kyokko Denki Kk Object detection device
JP2013083615A (ja) * 2011-10-12 2013-05-09 Kyokko Denki Kk Object detection device with freely settable detection range
JP2013245987A (ja) * 2012-05-24 2013-12-09 Nsk Ltd Proximity sensor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6009873B2 (ja) * 2012-09-13 2016-10-19 アズビル株式会社 Photoelectric sensor


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018146456A (ja) * 2017-03-07 2018-09-20 旭光電機株式会社 Object detection device
JP2018179501A (ja) * 2017-04-03 2018-11-15 日本精工株式会社 Proximity sensor
CN107063163A (zh) * 2017-06-02 2017-08-18 秀卓自动化设备(湖北)有限公司 Automobile sunroof glass curvature detection system
JP2018206903A (ja) * 2017-06-02 2018-12-27 日本精工株式会社 Proximity sensor and method of manufacturing proximity sensor
WO2020036217A1 (ja) * 2018-08-17 2020-02-20 旭光電機株式会社 Object detection device
JPWO2020036217A1 (ja) * 2018-08-17 2021-08-26 旭光電機株式会社 Object detection device
JP7373177B2 (ja) 2018-08-17 2023-11-02 旭光電機株式会社 Object detection device

Also Published As

Publication number Publication date
JP6358227B2 (ja) 2018-07-18
JP2016085219A (ja) 2016-05-19

Similar Documents

Publication Publication Date Title
JP6358227B2 (ja) Proximity sensor
TWI460642B (zh) Input device and touch event processing method
JP5886158B2 (ja) Multi-directional proximity sensor
EP2511737B1 (en) Modular light curtain and plug-in module therefor
EP2930473A1 (en) Optical encoder and apparatus provided therewith
JP5517039B2 (ja) Ring-type sensor
US12078561B2 (en) Optical sensor and optical sensor module
WO2018230243A1 (ja) Position sensor
JP2016085219A5
JP5547531B2 (ja) Object detection device
JP6035467B2 (ja) Reflective encoder
CN101673147B (zh) Multi-dimensional optical control device and multi-dimensional optical control method
US20110069321A1 (en) Encoder
WO2022230410A1 (ja) Sensor device
KR20190036652A (ko) Lighting control system including a PIR sensor array, an illuminance sensor, and a dimming controller
JP2013083673A (ja) Detection unit and encoder
JP2014134540A5
JP6582542B2 (ja) Workpiece transfer hand provided with a proximity sensor
Sharma et al. Low cost sensors for general applications
JP6349978B2 (ja) Object detection device and object detection method
JP6020738B2 (ja) Tilt detection device
CN110174707A (zh) Infrared sensor for a mobile robot and method of using the same
JP2017207303A (ja) Proximity sensor
CN108226955B (zh) Infrared array proximity sensor
Sutaj Vision and electro-optic sensors for robots

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15852278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15852278

Country of ref document: EP

Kind code of ref document: A1