JP2014089777A - Optical pickup device - Google Patents

Optical pickup device

Info

Publication number
JP2014089777A
Authority
JP
Japan
Prior art keywords
light
regions
direction
spectroscopic element
region
Prior art date
Legal status
Withdrawn
Application number
JP2011038890A
Other languages
Japanese (ja)
Inventor
Kenji Nagatomi
謙司 永冨
Original Assignee
Sanyo Electric Co Ltd
三洋電機株式会社
Priority date
Filing date
Publication date
Application filed by Sanyo Electric Co Ltd (三洋電機株式会社)
Priority to JP2011038890A
Publication of JP2014089777A
Status: Withdrawn

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B7/12 Heads, e.g. forming of the optical beam spot or modulation of the optical beam
    • G11B7/135 Means for guiding the beam from the source to the record carrier or from the record carrier to the detector
    • G11B7/1353 Diffractive elements, e.g. holograms or gratings
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B7/12 Heads, e.g. forming of the optical beam spot or modulation of the optical beam
    • G11B7/135 Means for guiding the beam from the source to the record carrier or from the record carrier to the detector
    • G11B7/1395 Beam splitters or combiners
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B2007/0003 Recording, reproducing or erasing systems characterised by the structure or type of the carrier
    • G11B2007/0006 Recording, reproducing or erasing systems characterised by the structure or type of the carrier adapted for scanning different types of carrier, e.g. CD & DVD
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B2007/0003 Recording, reproducing or erasing systems characterised by the structure or type of the carrier
    • G11B2007/0009 Recording, reproducing or erasing systems characterised by the structure or type of the carrier for carriers having data stored in three dimensions, e.g. volume storage
    • G11B2007/0013 Recording, reproducing or erasing systems characterised by the structure or type of the carrier for carriers having data stored in three dimensions, e.g. volume storage for carriers having multiple discrete layers
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B7/12 Heads, e.g. forming of the optical beam spot or modulation of the optical beam
    • G11B7/135 Means for guiding the beam from the source to the record carrier or from the record carrier to the detector
    • G11B7/1372 Lenses
    • G11B7/1374 Objective lenses
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B7/00 Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
    • G11B7/12 Heads, e.g. forming of the optical beam spot or modulation of the optical beam
    • G11B7/135 Means for guiding the beam from the source to the record carrier or from the record carrier to the detector
    • G11B7/1372 Lenses
    • G11B7/1376 Collimator lenses

Abstract

Provided is an optical pickup device capable of smoothly suppressing the influence of stray light and suppressing deterioration of a detection signal due to a sensor position shift.
A spectroscopic element H3 has diffraction regions H3a to H3i. The laser light incident on the diffraction regions H3a to H3h is diffracted in the directions Va to Vh, respectively, while the laser light incident on the diffraction region H3i is diffracted so as not to be irradiated onto the sensor units. The diffraction regions H3a to H3h impart a lens effect in the direction of the arrows to the incident laser light. This lens effect is set so as to become stronger with distance from the center in the left-right direction in the diffraction regions H3b, H3c, H3f, and H3g, and to become stronger with distance from the center in the vertical direction in the diffraction regions H3a, H3d, H3e, and H3h. Thereby, even if the sensor layout is displaced, the irradiation regions are unlikely to protrude from the sensor units.
[Selection] Figure 13

Description

  The present invention relates to an optical pickup device, and is particularly suitable for use in irradiating, with laser light, a recording medium in which a plurality of recording layers are laminated.

  In recent years, with the increase in the capacity of optical discs, the number of recording layers has been increasing. By including a plurality of recording layers in one disc, the data capacity of the disc can be increased remarkably. Conventionally, two recording layers per side were common when recording layers were stacked, but recently, in order to further increase the capacity, discs having three or more recording layers on one side have been put to practical use. Increasing the number of recording layers can increase the capacity of the disc; on the other hand, however, the interval between the recording layers is narrowed, and signal deterioration due to interlayer crosstalk increases.

  When the recording layer is multilayered, the reflected light from the recording layer (target recording layer) to be recorded / reproduced becomes weak. For this reason, when unnecessary reflected light (stray light) is incident on the photodetector from the recording layers above and below the target recording layer, the detection signal may be deteriorated, which may adversely affect the focus servo and tracking servo. Therefore, when a large number of recording layers are arranged in this way, it is necessary to properly remove stray light and stabilize the signal from the photodetector.

  Patent Document 1 below discloses a new configuration of an optical pickup device that can appropriately remove stray light when a large number of recording layers are arranged. According to this configuration, a rectangular region (signal light region) where only signal light exists can be formed on the light receiving surface of the photodetector. The reflected light from the recording medium is irradiated near the apex angle of the signal light region. By arranging the sensor of the photodetector near the apex angle of the signal light region, the influence of stray light on the detection signal can be suppressed.

JP 2009-2111770 A

  In the optical pickup device having the above-described configuration, it is desirable that the amount of reflected light irradiated near one pair of opposing apex angles of the signal light region be set larger than the amount of reflected light irradiated near the other pair of apex angles. As a result, the tracking error signal can be optimized. Such an adjustment of the light amount can be realized by changing the area of the reflected light irradiated in the vicinity of each apex angle.

  However, if the area of the reflected light is changed in this way, the shape of the reflected light changes accordingly, so that the shape of each reflected light beam is less likely to match the shape of the sensor. For this reason, if a positional deviation occurs in the sensor, the detection signal may be degraded according to the amount of the deviation.

  The present invention has been made in view of the above points, and an object thereof is to provide an optical pickup device capable of suppressing the influence of stray light smoothly while also suppressing deterioration of the detection signal due to a positional deviation of the sensor.

An optical pickup device according to a main aspect of the present invention includes a laser light source; an objective lens that converges the laser light emitted from the laser light source on a recording medium; an astigmatism element that receives the laser light reflected by the recording medium, converges the laser light in a first direction to generate a first focal line, and converges the laser light in a second direction perpendicular to the first direction to generate a second focal line; a spectroscopic element on which the laser light reflected by the recording medium is incident and which has first to fourth regions that make the traveling directions of the light beams incident on them different from one another; and a photodetector that includes sensor units which receive the light beams dispersed by the spectroscopic element and output detection signals. Here, when first and second straight lines that are respectively parallel to the first direction and the second direction and cross each other have their intersection aligned with the center of the spectroscopic element, the first and second regions are arranged in the direction in which one pair of vertical angles formed by the first and second straight lines are aligned, and the third and fourth regions are arranged in the direction in which the other pair of vertical angles are aligned. The astigmatism element is arranged so that the direction in which the first and second regions are aligned is parallel to the direction of the track image of the recording medium projected onto the spectroscopic element. The first and second regions and the third and fourth regions have different areas, and each region spreads away from the center of the spectroscopic element. The spectroscopic element further imparts an optical action to the light beams such that, when the light beam passing through each of the regions is irradiated onto the sensor unit, the shape of the light beam approaches a fan shape having an apex angle of 90 degrees.

  According to the present invention, it is possible to provide an optical pickup device that can smoothly suppress the influence of stray light and can also suppress deterioration of the detection signal due to a positional shift of the sensor.

  The effects and significance of the present invention will become more apparent from the following description of embodiments. However, the following embodiment is merely an example for carrying out the present invention, and the present invention is not limited by the following embodiment.

FIG. 1 is a diagram explaining the technical principle according to the embodiment (convergence state of the light rays).
FIG. 2 is a diagram explaining the technical principle according to the embodiment (distribution state of the light beams).
FIG. 3 is a diagram explaining the technical principle according to the embodiment (distribution state of the signal light and the stray light).
FIG. 4 is a diagram explaining the technical principle according to the embodiment (method of separating the light beams).
FIG. 5 is a diagram showing a method of arranging the sensor units according to the embodiment.
FIG. 6 is a diagram showing the preferable application range of the technical principle according to the embodiment.
FIG. 7 is a diagram illustrating a spectroscopic element based on the technical principle according to the embodiment.
FIG. 8 is a diagram illustrating spectroscopic elements based on the technical principle according to the embodiment.
FIG. 9 is a diagram showing irradiation regions on the sensor units based on the technical principle according to the embodiment and an arithmetic circuit for generating a push-pull signal.
FIG. 10 is a diagram showing irradiation regions on the sensor units based on the technical principle according to the embodiment and explaining interference between the signal light and the stray light.
FIG. 11 is a diagram showing simulation results of the irradiation regions based on the technical principle according to the embodiment.
FIG. 12 is a diagram showing a state in which the sensor layout is displaced, based on the technical principle according to the embodiment.
FIG. 13 is a diagram illustrating a spectroscopic element based on the technical principle according to the embodiment, explaining its lens effect, and showing the irradiation regions on the sensor units.
FIG. 14 is a diagram showing simulation results of the irradiation regions based on the technical principle according to the embodiment.
FIG. 15 is a diagram showing a state in which the sensor layout is displaced, based on the technical principle according to the embodiment.
FIG. 16 is a diagram showing the optical system of an optical pickup device according to an example.
FIG. 17 is a diagram showing the sensor layout of the photodetector according to the example.
FIG. 18 is a diagram showing a modification of the spectroscopic element according to the example.
FIG. 19 is a diagram showing a modification of the spectroscopic element according to the example.

  Embodiments of the present invention will be described below with reference to the drawings.

<Technical principle>
First, the technical principle applied to the present embodiment will be described with reference to FIGS. 1 to 15.

  FIG. 1 is a diagram illustrating the convergence state of light rays. FIG. 1A shows the convergence state of the laser light reflected by the target recording layer (signal light), the laser light reflected by a layer deeper than the target recording layer (stray light 1), and the laser light reflected by a layer shallower than the target recording layer (stray light 2). FIG. 1B is a diagram showing the configuration of the anamorphic lens used in this principle.

  Referring to FIG. 1B, the anamorphic lens imparts a converging action in the curved surface direction and in the planar direction to laser light incident parallel to the lens optical axis. Here, the curved surface direction and the planar direction are orthogonal to each other. The curved surface direction has a smaller radius of curvature than the planar direction and therefore converges the laser light incident on the anamorphic lens more strongly.

  Here, the terms "curved surface direction" and "planar direction" are used for convenience to explain the astigmatism action of the anamorphic lens simply; in reality, it suffices that the anamorphic lens forms focal lines at different positions, and the shape of the anamorphic lens in the "planar direction" of FIG. 1B is not limited to a plane. When the laser light is incident on the anamorphic lens in a converging state, the shape of the anamorphic lens in the "planar direction" can be a straight line (radius of curvature = ∞).

  Referring to FIG. 1A, the signal light converged by the anamorphic lens forms focal lines at different positions by the convergence in the curved surface direction and in the planar direction. The focal line position (S1) due to the convergence in the curved surface direction is closer to the anamorphic lens than the focal line position (S2) due to the convergence in the planar direction, and the convergence position (S0) of the signal light is midway between the focal line positions (S1) and (S2).

  Similarly, for the stray light 1 converged by the anamorphic lens, the focal line position (M11) due to convergence in the curved surface direction is closer to the anamorphic lens than the focal line position (M12) due to convergence in the planar direction. The anamorphic lens is designed such that the focal line position (M12) due to the convergence of the stray light 1 in the plane direction is closer to the anamorphic lens than the focal line position (S1) due to the convergence of the signal light in the curved surface direction.

  Similarly, for the stray light 2 converged by the anamorphic lens, the focal line position (M21) due to convergence in the curved surface direction is closer to the anamorphic lens than the focal line position (M22) due to convergence in the plane direction. The anamorphic lens is designed so that the focal line position (M21) due to the convergence of the stray light 2 in the curved surface direction is farther from the anamorphic lens than the focal line position (S2) due to the convergence of the signal light in the planar direction.

  Further, at the convergence position (S0) between the focal line position (S1) and the focal line position (S2), the beam of signal light becomes a minimum circle of confusion.

  Considering the above, the relationship between the irradiation area of the signal light and the stray lights 1 and 2 on the surface S0 will be examined.

  Here, as shown in FIG. 2A, the anamorphic lens is divided into four regions A to D. In this case, the signal light incident on the regions A to D is distributed on the surface S0 as shown in FIG. 2B. Further, the stray light 1 incident on the regions A to D is distributed on the surface S0 as shown in FIG. 2C, and the stray light 2 incident on the regions A to D is distributed on the surface S0 as shown in FIG. 2D.

  Here, when the signal light and the stray lights 1 and 2 on the surface S0 are extracted for each light flux region, the distribution of each light becomes as shown in FIG. 3. In this case, within each light flux region, the signal light does not overlap with the stray light 1 or the stray light 2 of the same region. For this reason, if the light beams (signal light and stray lights 1 and 2) in the respective light flux regions are dispersed in different directions and only the signal light is received by sensor units, only the signal light enters the corresponding sensor units, and the incidence of stray light can be suppressed. Thereby, deterioration of the detection signal due to stray light can be avoided.

  Thus, only the signal light can be extracted by dispersing the light passing through the regions A to D and separating them on the surface S0. The present embodiment is based on this principle.

  FIG. 4 is a diagram showing the distribution state of the signal light and the stray lights 1 and 2 on the surface S0 when the traveling directions of the light beams (signal light and stray lights 1 and 2) passing through the four regions A to D shown in FIG. 2A are changed in different directions by the same angle. FIG. 4A is a view of the anamorphic lens as seen from its optical axis direction (the traveling direction of the laser light incident on the anamorphic lens), and FIG. 4B shows the distribution state of the signal light and the stray lights 1 and 2 on the surface S0.

  In FIG. 4A, the traveling directions of the light beams (signal light and stray lights 1 and 2) that have passed through the regions A to D are changed toward the directions Da, Db, Dc, and Dd, respectively, by the same angular amount α (not shown). The directions Da, Db, Dc, and Dd each have an inclination of 45 degrees with respect to the planar direction and the curved surface direction.

  In this case, by adjusting the angular amount α of the directions Da, Db, Dc, and Dd, the signal light and the stray lights 1 and 2 of the respective light flux regions can be distributed on the surface S0 as shown in FIG. 4B. As a result, as shown in the figure, a signal light region in which only the signal light exists can be set on the surface S0. By arranging the sensor units of the photodetector in this signal light region, only the signal light of each region can be received by the corresponding sensor unit.

  FIG. 5 is a diagram for explaining a method of arranging the sensor units. FIG. 5A is a diagram showing the light flux regions of the reflected light (signal light) from the disc, and FIG. 5B is a diagram showing the distribution state of the signal light on the photodetector when an anamorphic lens and a photodetector (four-divided sensor) are arranged, based on the conventional astigmatism method, at the positions of the anamorphic lens and the surface S0 in the configuration of FIG. 1A. FIGS. 5C and 5D are diagrams showing the signal light distribution state and the sensor layout on the surface S0 based on the above-described principle.

The direction of the diffraction image (track image) of the signal light produced by the track grooves has an inclination of 45 degrees with respect to the planar direction and the curved surface direction. If the direction of the track image is the left-right direction in FIG. 5A, the direction of the track image of the signal light is the vertical direction in FIGS. 5B to 5D. In FIG. 5A, for convenience of explanation, the light beam is divided into eight light flux regions a to h, and FIGS. 5B and 5D show the irradiation areas a to h on the sensor units corresponding to the light flux regions a to h, respectively. Further, the track image is indicated by solid lines, and the beam shape at the time of off-focus is indicated by dotted lines.

  It is known that the degree of overlap between the 0th-order diffraction image and the 1st-order diffraction images of the signal light produced by the track grooves is determined by wavelength / (track pitch × objective lens NA). As shown in FIGS. 5C and 5D, the condition for the 1st-order diffraction images to fall within the four light flux regions a, d, e, and h is wavelength / (track pitch × objective lens NA) > √2.
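
As a rough numerical check of this condition, the following sketch evaluates the ratio for typical BD parameters (a wavelength of about 405 nm, a track pitch of 0.32 µm, and an objective lens NA of 0.85 are assumed values used only for illustration; they are not specified in this document):

```python
import math

def diffraction_overlap_ratio(wavelength_nm: float, track_pitch_nm: float, na: float) -> float:
    """Return wavelength / (track pitch x objective lens NA)."""
    return wavelength_nm / (track_pitch_nm * na)

# Illustrative BD parameters (assumptions, not values from this document).
ratio = diffraction_overlap_ratio(wavelength_nm=405.0, track_pitch_nm=320.0, na=0.85)
print(f"ratio = {ratio:.3f}, ratio > sqrt(2): {ratio > math.sqrt(2)}")
# Prints roughly: ratio = 1.489, ratio > sqrt(2): True
```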

In the conventional astigmatism method, the sensor units P1 to P4 (four-divided sensor) of the photodetector are arranged as shown in FIG. 5B. In this case, when the detection signal components based on the light intensities of the light flux regions a to h are represented by A to H, the focus error signal FE and the push-pull signal PP are obtained by the following operations:

FE = (A + B + E + F) − (C + D + G + H) … (1)
PP = (A + B + G + H) − (C + D + E + F) … (2)
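
For reference, equations (1) and (2) can be written directly as code; a minimal sketch (the function names are illustrative, and A to H follow the detection signal components defined above):

```python
def focus_error(a, b, c, d, e, f, g, h):
    """Equation (1): FE = (A + B + E + F) - (C + D + G + H)."""
    return (a + b + e + f) - (c + d + g + h)

def push_pull(a, b, c, d, e, f, g, h):
    """Equation (2): PP = (A + B + G + H) - (C + D + E + F)."""
    return (a + b + g + h) - (c + d + e + f)

# Example with arbitrary detection signal levels for the light flux regions a to h.
signals = dict(a=1.0, b=0.9, c=1.1, d=1.0, e=0.8, f=1.2, g=1.0, h=0.9)
print(focus_error(**signals), push_pull(**signals))
```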

  On the other hand, in the distribution state of FIG. 4B, the signal light is distributed within the signal light region in the state of FIG. 5C, as described above. In this case, the signal light passing through the light flux regions a to h of FIG. 5A is guided, on the surface S0 on which the sensor units of the photodetector are placed, to the irradiation areas a to h shown in FIG. 5D.

  Therefore, if the sensor units P11 to P18 are arranged at the positions of the irradiation areas a to h as shown in FIG. 5D, the focus error signal and the push-pull signal can be generated by the same arithmetic processing as in FIG. 5B. That is, also in this case, when the detection signals from the sensor units that receive the light beams of the light flux regions a to h are represented by A to H, the focus error signal FE and the push-pull signal PP can be obtained by the calculations of the above equations (1) and (2), as in the case of FIG. 5B.

  As described above, according to the present principle, a focus error signal and a push-pull signal (tracking error signal) in which the influence of stray light is suppressed can be generated by the same arithmetic processing as in the conventional astigmatism method.

  As shown in FIG. 6, the effect of the above principle is achieved when the focal line position of the stray light 1 in the planar direction is closer to the astigmatism element than the surface S0 (the surface on which the spot of the signal light becomes the circle of least confusion) and the focal line position of the stray light 2 in the curved surface direction is farther from the astigmatism element than the surface S0. That is, if this relationship is satisfied, the distributions of the signal light and the stray lights 1 and 2 are in the state shown in FIG. 4, and the signal light and the stray lights 1 and 2 do not overlap on the surface S0. In other words, as long as this relationship is satisfied, the effect based on the above principle can be obtained even if the focal line position of the stray light 1 in the planar direction is closer to the surface S0 than the focal line position of the signal light in the curved surface direction, or the focal line position of the stray light 2 in the curved surface direction is closer to the surface S0 than the focal line position of the signal light in the planar direction.
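
The separation condition above reduces to a simple inequality on the focal line positions measured along the optical axis from the astigmatism element; the following is a minimal sketch (the helper function and the sample distances are illustrative assumptions, not values from this document):

```python
def stray_light_separated(m12: float, s0: float, m21: float) -> bool:
    """True if the planar-direction focal line of stray light 1 (M12) is closer to the
    astigmatism element than the surface S0, and the curved-surface-direction focal
    line of stray light 2 (M21) is farther from it than S0."""
    return m12 < s0 < m21

# Illustrative distances from the astigmatism element, in millimetres (assumed values).
print(stray_light_separated(m12=2.0, s0=3.0, m21=4.5))  # True: the condition is satisfied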

<Spectroscopic element H0>
FIG. 7A is a diagram showing the configuration of a spectroscopic element H0 for changing the traveling directions of the light beams incident on the regions A to D as shown in FIG. 4A. FIG. 7A is a plan view of the spectroscopic element H0 as viewed from the side of the anamorphic lens shown in FIGS. 1A and 1B. FIG. 7A also shows the planar direction and the curved surface direction of the anamorphic lens of FIG. 1B and the direction of the track image of the laser light incident on the spectroscopic element H0.

As shown in the figure, the spectroscopic element H0 is formed of a square transparent plate, and a diffraction pattern (diffraction hologram) is formed on its light incident surface. The light incident surface of the spectroscopic element H0 is divided into four diffraction regions H0a to H0d, and the boundary lines of the spectroscopic element H0 are inclined at 45 degrees with respect to the direction of the track image. The spectroscopic element H0 is arranged so that the laser light passing through the light flux regions A to D of FIG. 4A is incident on the diffraction regions H0a to H0d, respectively. The diffraction regions H0a to H0d diffract the incident laser light in the directions Da to Dd of FIG. 4A, respectively.

  FIGS. 7B and 7C are enlarged views of the left portion and the upper portion of the sensor layout in the signal light irradiation region on the light receiving surface when the spectroscopic element H0 is used. In the figure, the shapes of the sensor units P11, P12, P13, and P14 are slightly changed for convenience in comparison with the case of FIG. 5D.

  When the spectroscopic element H0 is used, as shown in the figure, the shape of the light beam applied to each sensor unit is a fan shape having an apex angle of 90 degrees. For this reason, even if the sensor units P11, P12, P14, and P16 are displaced to the positions indicated by the broken lines, the apex angle portions of the irradiation areas a and h are positioned, with substantially no gap, at the apex angle portions of the sensor units P11 and P12, and the apex angle portions of the irradiation areas b and c are positioned, with substantially no gap, at the apex angle portions of the sensor units P14 and P16. That is, even if the position of the sensor layout is displaced, the irradiation areas do not protrude from the sensor units, as shown in the figure. For this reason, a decrease in the accuracy of the detection signal can be suppressed.

<Spectroscopic element H1>
The spectroscopic element H0 can be changed or improved as follows. Details of the configuration, operation, and effect of such a spectroscopic element are described in Japanese Patent Application No. 2010-222422 filed earlier by the present applicant.

  FIG. 8A shows the configuration of the spectroscopic element H1. FIG. 8A is a plan view of the spectroscopic element H1 as viewed from the side of the anamorphic lens shown in FIGS. 1A and 1B. FIG. 8A also shows the planar direction and the curved surface direction of the anamorphic lens of FIG. 1B and the direction of the track image of the laser light incident on the spectroscopic element H1. FIG. 8B is a diagram showing light flux regions a1 to h1 obtained by dividing the laser light incident on the spectroscopic element H1 into eight regions corresponding to the boundary lines of the diffraction regions of the spectroscopic element H1.

  Referring to FIG. 8A, the light incident surface of the spectroscopic element H1 is divided into eight diffraction regions H1a to H1h as shown in the drawing. The diffraction regions H1a, H1d, H1e, and H1h have the same area, and the diffraction regions H1b, H1c, H1f, and H1g have the same area. The diffraction regions H1b, H1c, H1f, and H1g each have a larger area than the diffraction regions H1a, H1d, H1e, and H1h.

  The diffraction regions H1a to H1h diffract the incident laser light in the directions Va to Vh, respectively. The directions Va and Vh are obtained by slightly adding downward and upward components, respectively, to the direction Da of FIG. 4A. Similarly, the directions Vf and Vg are obtained by slightly adding leftward and rightward components, respectively, to the direction Db of FIG. 4A; the directions Vb and Vc are obtained by slightly adding rightward and leftward components, respectively, to the direction Dc of FIG. 4A; and the directions Vd and Ve are obtained by slightly adding downward and upward components, respectively, to the direction Dd of FIG. 4A.
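
As a loose numerical illustration of these directions (a sketch with made-up numbers: the base directions Da to Dd are taken as 45-degree unit vectors in a left-right / up-down plane, and the small added component δ is arbitrary):

```python
import math

# Base deflection directions Da to Dd at 45 degrees (illustrative unit vectors only;
# the actual orientations are those shown in FIG. 4A).
s = 1 / math.sqrt(2)
Da, Db, Dc, Dd = (-s, s), (s, s), (s, -s), (-s, -s)

delta = 0.05  # small additional component (arbitrary value)

# Va/Vh: Da plus small downward/upward components; Vf/Vg: Db plus small left/right components;
# Vb/Vc: Dc plus small right/left components; Vd/Ve: Dd plus small downward/upward components.
Va, Vh = (Da[0], Da[1] - delta), (Da[0], Da[1] + delta)
Vf, Vg = (Db[0] - delta, Db[1]), (Db[0] + delta, Db[1])
Vb, Vc = (Dc[0] + delta, Dc[1]), (Dc[0] - delta, Dc[1])
Vd, Ve = (Dd[0], Dd[1] - delta), (Dd[0], Dd[1] + delta)

# Each pair of slightly diverging directions produces the gap between the two irradiation
# regions at the corresponding apex angle of the signal light region.
print(Va, Vh)
```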

  The boundary lines between the diffraction regions H1a, H1d, H1e, H1h and the diffraction regions H1b, H1c, H1f, H1g of the spectroscopic element H1 have linear portions p1, p2, and p3 extending in the vertical direction, and the other boundary lines are straight lines having an angle of 45 degrees with respect to the vertical and horizontal directions.

The spectroscopic element H1 is arranged so that the optical axis of the laser light passes through its center and the light flux regions a1 to h1 shown in FIG. 8B are incident on the diffraction regions H1a to H1h, respectively. At this time, the boundary lines of the spectroscopic element H1 are set so that the track image contained in the light beam incident on the spectroscopic element H1 falls sufficiently on the linear portion p2 of the spectroscopic element H1. Thereby, the irradiation regions of the light flux regions a1 and d1 on the diffraction regions H1a and H1d are smaller than the irradiation regions of the light flux regions b1 and c1 on the diffraction regions H1b and H1c. Similarly, the irradiation regions of the light flux regions e1 and h1 on the diffraction regions H1e and H1h are smaller than the irradiation regions of the light flux regions f1 and g1 on the diffraction regions H1f and H1g.

  FIG. 9A is a schematic diagram showing the irradiation regions of the signal light when the laser light passing through the light flux regions a1 to h1 of FIG. 8B is directed onto the sensor units P11 to P18 by the spectroscopic element H1 of FIG. 8A. The irradiation regions of the signal light passing through the light flux regions a1 to h1 are shown as irradiation regions a1 to h1.

  As shown in FIG. 9A, the signal light passing through the light flux regions a1 to h1 is applied to the sensor units P11, P16, P14, P17, P18, P13, P15, and P12, respectively. At this time, the stray lights 1 and 2 passing through the light flux regions a1 to h1 are irradiated outside the signal light region in substantially the same manner as in FIG. 4B.

  Further, as shown in FIG. 9A, two irradiation areas (for example, irradiation areas a1 and h1) in the apex portion of the signal light area are separated from each other by a certain distance. On the other hand, there is a predetermined gap between two sensor portions (for example, P11 and P12) arranged at each apex angle portion. The gap between the two irradiation areas at the apex angle portion is larger than the gap between the two corresponding sensor parts. As described above, the gap between the irradiation regions is set by slightly including components in the vertical direction or the horizontal direction in the directions Va to Vh.

  Owing to such gaps, even when the sensor units P11 to P18 are displaced in the vertical and horizontal directions within the surface S0 (see FIG. 1A), the irradiation areas a1 to h1 are likely to remain positioned on the sensor units P11 to P18. Therefore, it is possible to suppress a decrease in the accuracy of the detection signals of the sensor units P11 to P18 due to such a positional deviation.

  When the spectroscopic element H1 is used, the push-pull signal is generated by the arithmetic circuit of FIG. 9B. The direction of the track image is also shown in the figure.

  In the arithmetic circuit, the signal obtained by adding the outputs of the sensor units P11 and P12 in the adder circuit 11 and the signal obtained by adding the outputs of the sensor units P17 and P18 in the adder circuit 12 are subtracted in the subtractor circuit 13 to generate a signal PP1. The signal obtained by adding the outputs of the sensor units P13 and P14 in the adder circuit 14 and the signal obtained by adding the outputs of the sensor units P15 and P16 in the adder circuit 15 are subtracted in the subtractor circuit 16 to generate a signal PP2. Further, the signal obtained by multiplying the signal PP2 by k in the multiplication circuit 17 is subtracted from the signal PP1 to generate the push-pull signal PP.

  In this arithmetic circuit, by adjusting the multiplier k of the multiplication circuit 17, the DC component superimposed on the push-pull signal PP by a lens shift can be removed. When the spectroscopic element H1 is used, the areas of the irradiation regions b1, c1, f1, and g1 are larger than those of the irradiation regions a1, d1, e1, and h1, so the magnitude of the signal PP2 approaches that of the signal PP1. For this reason, the multiplier k can be made small. Since the multiplier k can be made small, amplification of noise components by the multiplication circuit 17 is suppressed, and a good push-pull signal PP can be obtained.
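
A minimal code sketch of the computation performed by this arithmetic circuit (the signal names follow the sensor units in the text; the sample values and k are arbitrary illustrations):

```python
def push_pull_signal(p11, p12, p13, p14, p15, p16, p17, p18, k):
    """PP = PP1 - k * PP2, formed as in the arithmetic circuit of FIG. 9B."""
    pp1 = (p11 + p12) - (p17 + p18)   # adder circuits 11, 12 and subtractor circuit 13
    pp2 = (p13 + p14) - (p15 + p16)   # adder circuits 14, 15 and subtractor circuit 16
    return pp1 - k * pp2              # multiplication circuit 17 and final subtraction

# Example: the multiplier k is tuned so that the lens-shift DC component carried by PP2
# cancels the one carried by PP1 (all values below are arbitrary).
print(push_pull_signal(1.2, 1.1, 1.0, 1.05, 0.95, 1.0, 0.9, 1.0, k=0.8))
```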

When the spectroscopic element H1 is used, the upper and lower track images each fall on the linear portion p2, so that the upper and lower track images are incident evenly on the diffraction regions H1a and H1h and on the diffraction regions H1d and H1e. For this reason, even if a lens shift occurs, the amplitude of the signal PP1 based on detracking does not change, and the amplitude of the push-pull signal PP does not vary with the lens shift.

  The effect of using the spectroscopic element H1 is described in detail in the above Japanese Patent Application No. 2010-222422.

<Spectroscopic element H2>
The spectroscopic element H0 can be further changed and improved as follows.

  FIG. 8C is a diagram showing the configuration of the spectroscopic element H2. FIG. 8C is a plan view of the spectroscopic element H2 as viewed from the side of the anamorphic lens shown in FIGS. 1A and 1B. FIG. 8D is a diagram showing light flux regions a2 to i2 obtained by dividing the laser light incident on the spectroscopic element H2 into nine regions corresponding to the boundary lines of the diffraction regions of the spectroscopic element H2.

  Referring to FIG. 8C, a square diffraction region H2i is formed at the center of the spectroscopic element H2. The diffraction region H2i is set so that the laser beam incident on this region does not irradiate the sensor parts P11 to P18 but irradiates a place away from the sensor parts P11 to P18. The area of the diffraction region H2i is set so that interference between signal light and stray light, which will be described later, is effectively suppressed.

  The diffraction regions H2a, H2d, H2e, and H2h have the same area, and the diffraction regions H2b, H2c, H2f, and H2g have the same area. The diffraction regions H2b, H2c, H2f, and H2g each have a larger area than the diffraction regions H2a, H2d, H2e, and H2h. The diffraction action of the diffraction regions H2a to H2h is the same as that of the diffraction regions H1a to H1h of the spectroscopic element H1.

  The boundary lines between the diffraction regions H2a, H2d, H2e, H2h and the diffraction regions H2b, H2c, H2f, H2g of the spectroscopic element H2 have a linear portion p4, and the other boundary lines are straight lines having an angle of 45 degrees with respect to the vertical and horizontal directions.

  Further, the boundary lines of the spectroscopic element H2 are set so that the track image contained in the light beam incident on the spectroscopic element H2 falls sufficiently on the linear portion p4 of the spectroscopic element H2. Thereby, the irradiation regions of the light flux regions a2 and d2 on the diffraction regions H2a and H2d are smaller than the irradiation regions of the light flux regions b2 and c2 on the diffraction regions H2b and H2c. Similarly, the irradiation regions of the light flux regions e2 and h2 on the diffraction regions H2e and H2h are smaller than the irradiation regions of the light flux regions f2 and g2 on the diffraction regions H2f and H2g. The rest of the structure of the spectroscopic element H2 is the same as that of the spectroscopic element H1.

  FIG. 10A is a schematic diagram showing the irradiation regions of the signal light when the laser light passing through the light flux regions a2 to h2 of FIG. 8D is directed onto the sensor units P11 to P18 by the spectroscopic element H2 of FIG. 8C. Also in this case, the two irradiation regions at each apex angle portion of the signal light region (for example, the irradiation regions a2 and h2) are separated from each other by a certain distance, while there is a predetermined gap between the two sensor units (for example, P11 and P12) arranged at each apex angle portion. The gap between the two irradiation regions at each apex angle portion is larger than the gap between the two corresponding sensor units. Thereby, as with the spectroscopic element H1, even if a positional deviation occurs in the sensor units P11 to P18, a decrease in the accuracy of the detection signal can be suppressed.

Further, even when the spectroscopic element H2 is used, the offset (DC component) of the push-pull signal PP is effectively suppressed by using the arithmetic circuit of FIG. 9B, as in the case where the spectroscopic element H1 is used, and a good push-pull signal PP can be obtained.

  Further, when the spectroscopic element H2 is used, the following effects can be obtained.

  FIG. 10B is an enlarged schematic diagram showing an irradiation region in the vicinity of the sensor portions P11 and P12.

  As shown, the stray light 1 passing through the light flux regions a2 and h2 is irradiated, as indicated by broken lines, near the lower left of the sensor unit P11, and the stray light 2 passing through the light flux regions a2 and h2 is irradiated, as indicated by broken lines, near the upper left of the sensor unit P12.

  Here, if the boundary lines at the central portion of the spectroscopic element were formed in an X shape, as in the spectroscopic element H1 of FIG. 8A, a part of the laser light passing through the light flux region i2 would be irradiated onto the hatched portions of FIG. 10B. That is, the stray light 1 would also be irradiated onto the hatched portions above the irradiation regions a2 and h2 of the stray light 1, the stray light 2 would also be irradiated onto the hatched portions below the irradiation regions a2 and h2 of the stray light 2, and the signal light would also be irradiated onto the hatched triangular portion to the left of the signal light irradiation regions a2 and h2. In this case, since the signal light and the stray light are adjacent to each other, interference is likely to occur, and the detection signals of the sensor units P11 and P12 may be deteriorated. However, according to the spectroscopic element H2 shown in FIG. 8C, the stray light of the hatched portions is removed by the diffraction region H2i, so that the signal light and the stray light are less likely to interfere with each other, and deterioration of the detection signal can be suppressed.

  FIG. 10C is an enlarged schematic diagram showing an irradiation region in the vicinity of the sensor portions P14 and P16.

  As illustrated, the stray light 1 passing through the light flux regions b2 and c2 is irradiated, as indicated by broken lines, near the upper right of the sensor unit P16, the stray light 2 passing through the light flux regions b2 and c2 is irradiated, as indicated by broken lines, near the upper left of the sensor unit P14, and the signal light is also irradiated onto the hatched triangular portion above the signal light irradiation regions b2 and c2.

  Also in this case, as in the case of FIG. 10B, the stray light of the hatched portion is removed by the diffraction region H2i, so that the signal light and the stray light are less likely to interfere with each other, and deterioration of the detection signal can be suppressed. Similarly, the signal light and the stray light are less likely to interfere with each other in the irradiation regions in the vicinity of the sensor units P13 and P15 and of the sensor units P17 and P18, and deterioration of the detection signal can be suppressed.

  FIG. 11 is a diagram showing simulation results of the irradiation regions of the signal light on the sensor layout when the spectroscopic element H2 is used. FIGS. 11A to 11D are enlarged views of the left side portion, the upper side portion, the right side portion, and the lower side portion of the sensor layout, respectively. For convenience, the shapes of the sensor units P11 to P18 are the same as those of the sensor units B1 to B8 of the example described later.

  As shown in FIGS. 11A to 11D, the signal light irradiation regions a2 to h2 are positioned on the sensor units. In addition, the two irradiation regions in each figure are positioned so as to sandwich the gap between the sensor units, and the interval between the two irradiation regions is larger than the gap between the sensor units. As a result, as described above, even when the sensor units P11 to P18 are displaced in the vertical and horizontal directions within the surface S0 (see FIG. 1A), a decrease in the accuracy of the detection signal is suppressed.

  FIG. 12 is a diagram illustrating states in which the position of the sensor layout has shifted from the state illustrated in FIG. 11. FIGS. 12A to 12D show cases where the position of the sensor layout is shifted by a predetermined amount in the right direction, the downward direction, the left direction, and the upward direction, respectively. The position of the sensor layout in the case where no positional deviation has occurred is indicated by broken lines.

  As shown in FIG. 12A, when the position of the sensor layout is shifted by a predetermined amount in the right direction, the portions of the signal light irradiation regions a2 and h2 indicated by dotted lines protrude from the lower left of the sensor unit P11 and the upper left of the sensor unit P12, respectively. Similarly, as shown in FIGS. 12B to 12D, when the position of the sensor layout is shifted by a predetermined amount, the portions of the signal light irradiation regions indicated by dotted lines protrude from the sensor units. Thus, if the position of the sensor layout shifts by a predetermined amount, the irradiation regions of the signal light may protrude from the sensor units, and the accuracy of the detection signal may decrease.

<Spectroscopic element H3>
In order to solve the problem described with reference to FIG. 12, the following configuration can be used. This configuration is one embodiment of the present invention.

  FIG. 13A shows the configuration of the spectroscopic element H3. FIG. 13A is a plan view of the spectroscopic element H3 as viewed from the side of the anamorphic lens shown in FIGS. 1A and 1B. FIG. 13B is a diagram showing light flux regions a3 to i3 obtained by dividing the laser light incident on the spectroscopic element H3 into nine regions corresponding to the boundary lines of the diffraction regions of the spectroscopic element H3.

  The shapes and areas of the diffraction regions H3a to H3i of the spectroscopic element H3 are the same as those of the diffraction regions H2a to H2i of the spectroscopic element H2. The region composed of the diffraction regions H3b and H3c and the region composed of the diffraction regions H3f and H3g correspond to the "first region" and the "second region" in claim 1, respectively, and the region composed of the diffraction regions H3a and H3h and the region composed of the diffraction regions H3d and H3e correspond to the "third region" and the "fourth region" in claim 1, respectively. The diffraction region H3i corresponds to the "fifth region" in claim 3.

  The spectroscopic element H3 is different from the spectroscopic element H2 only in that a lens effect is given to the diffraction regions H3a to H3h. Hereinafter, for convenience, only the lens effect imparted to the diffraction regions H3a to H3h will be described.

  FIG. 13C conceptually shows the lens effect of the diffraction regions H3a to H3h of the spectroscopic element H3. The diffraction regions H3a and H3d impart a lens effect in the upward direction, the diffraction regions H3e and H3h in the downward direction, the diffraction regions H3b and H3g in the left direction, and the diffraction regions H3c and H3f in the right direction, to the incident laser light. The diffraction regions H3b, H3c, H3f, and H3g are set so that the lens effect increases with distance from the center of the spectroscopic element H3 in the left-right direction. The diffraction regions H3a, H3d, H3e, and H3h are set so that the lens effect decreases with distance from the center of the spectroscopic element H3 in the vertical direction. Such a lens effect is realized by adding a quadratic term to the phase function representing the diffraction action of the diffraction regions H3a to H3h.
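
As a rough illustration of adding a quadratic term to a region's phase function, the following sketch writes the phase as a linear (deflecting) term plus a quadratic (lens-like) term; the coordinates, coefficients, and wavelength are illustrative assumptions, not design values from this document:

```python
import math

def region_phase(x_mm: float, y_mm: float, wavelength_nm: float,
                 tilt_x: float, tilt_y: float, lens_coeff: float) -> float:
    """Phase (radians) of one diffraction region: the linear term sets the deflection
    direction (toward one of Va to Vh), and the quadratic term adds a lens-like power."""
    wavelength_mm = wavelength_nm * 1e-6
    linear = (tilt_x * x_mm + tilt_y * y_mm) / wavelength_mm    # deflection (grating) term
    quadratic = lens_coeff * (x_mm ** 2) / wavelength_mm        # lens effect in the x direction
    return 2 * math.pi * (linear + quadratic)

# Illustrative evaluation at one point of a region (all numbers are assumptions).
print(region_phase(x_mm=0.3, y_mm=0.1, wavelength_nm=405.0,
                   tilt_x=0.01, tilt_y=0.01, lens_coeff=0.001))
```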

  FIG. 13D is a schematic diagram showing the irradiation regions of the signal light when the laser light passing through the light flux regions a3 to h3 of FIG. 13B is directed onto the sensor units P11 to P18 by the spectroscopic element H3 of FIG. 13A. Also in this case, as with the spectroscopic element H2 shown in FIG. 10A, the signal light passing through the light flux regions a3 to h3 is irradiated onto the respective sensor units, and the stray lights 1 and 2 passing through the light flux regions a3 to h3 are irradiated outside the signal light region, as in FIG. 4B.

  As with the spectroscopic element H2, the irradiation regions a3, d3, e3, and h3 have a smaller area than the irradiation regions b3, c3, f3, and g3. By adjusting the areas in this way and providing the linear portion p4 on the boundary lines of the diffraction regions of the spectroscopic element H3, the offset (DC component) of the push-pull signal PP due to a lens shift can be effectively suppressed. Further, since the stray light incident on the diffraction region H3i is removed, the signal light and the stray light are less likely to interfere with each other, as with the spectroscopic element H2 (see FIGS. 10B and 10C), and deterioration of the detection signal can be suppressed.

  Furthermore, in the spectroscopic element H3, the following effects can be produced by the lens effect shown in FIG. 13C.

  FIG. 14 is a diagram showing simulation results of the irradiation regions of the signal light on the sensor layout when the spectroscopic element H3 is used. FIGS. 14A to 14D are enlarged views of the left side portion, the upper side portion, the right side portion, and the lower side portion of the sensor layout, respectively.

  As shown in FIGS. 14A to 14D, the signal light irradiation regions a3 to h3 are positioned on the sensor units. In addition, the two irradiation regions in each figure are positioned so as to sandwich the gap between the sensor units.

  Due to the lens effect, the two irradiation regions in each figure are drawn toward one end, as indicated by the arrows, compared with FIGS. 11A to 11D. That is, due to the lens effect, the irradiation regions a3 and h3 come closer to each other as they approach the apex angle of the signal light region from the center of the sensor layout, and the irradiation regions d3 and e3 likewise come closer to each other as they approach the apex angle of the signal light region from the center of the sensor layout. Further, the irradiation regions b3 and c3 come closer to each other as they approach the center of the sensor layout from the apex angle of the signal light region, and the irradiation regions f3 and g3 likewise come closer to each other as they approach the center of the sensor layout from the apex angle of the signal light region.

  Due to the lens effect, the shape of the region surrounding both of the irradiation regions a3 and h3, the shape of the region surrounding both of the irradiation regions d3 and e3, the shape of the region surrounding both of the irradiation regions b3 and c3, and the shape of the region surrounding both of the irradiation regions f3 and g3 are each closer to a fan shape having an apex angle of 90 degrees than in the case where the spectroscopic element H2 is used.

  FIG. 15 is a diagram illustrating states in which the position of the sensor layout has shifted from the state illustrated in FIG. 14. FIGS. 15A to 15D show cases where the position of the sensor layout is shifted by a predetermined amount in the right direction, the downward direction, the left direction, and the upward direction, respectively. The positional deviation amounts in FIGS. 15A to 15D are the same as those in FIGS. 12A to 12D, respectively.

  As shown in FIG. 15A, even if the position of the sensor layout is shifted to the same extent as in FIG. 12A, the irradiation regions a3 and h3 do not protrude from the sensor units P11 and P12. Likewise, as shown in FIG. 15C, even if the position of the sensor layout is shifted to the same extent as in FIG. 12C, the irradiation regions d3 and e3 do not protrude from the sensor units P17 and P18. Further, as shown in FIG. 15B, even if the position of the sensor layout is shifted to the same extent as in FIG. 12B, the amount by which the irradiation regions b3 and c3 protrude is reduced compared with FIG. 12B, as indicated by the dotted regions. Similarly, as shown in FIG. 15D, even if the position of the sensor layout is shifted to the same extent as in FIG. 12D, the amount by which the irradiation regions f3 and g3 protrude is reduced compared with FIG. 12D, as indicated by the dotted regions. Thereby, even when the position of the sensor layout shifts, a decrease in the accuracy of the detection signal caused by the signal light irradiation regions protruding from the sensor layout can be suppressed, compared with the spectroscopic element H2.

  As described above, when the spectroscopic element H3 is used, the lens effect shown in FIG. 13C brings the shape of the region containing the two signal light beams irradiated at each apex angle position of the signal light region closer to a fan shape having an apex angle of 90 degrees. Therefore, even if a positional deviation occurs in the sensor units P11 to P18 as shown in FIG. 15, the irradiation regions do not easily protrude from the sensor units P11 to P18 due to the deviation, and a decrease in the accuracy of the detection signal is suppressed.

Note that, when the spectroscopic element H3 is used, adjacent irradiation regions approach each other due to the lens effect. For this reason, if the sensor units P11 and P12 or the sensor units P17 and P18 are slightly displaced in the vertical direction from the states of FIGS. 15A and 15C, the detection signals of the sensor units P11, P12, P17, and P18 change. Similarly, if the sensor units P14 and P16 or the sensor units P13 and P15 are slightly displaced in the left-right direction from the states of FIGS. 15B and 15D, the detection signals of the sensor units P13 to P16 change. Therefore, the position of the photodetector can be adjusted smoothly and appropriately by referring to the detection signals of the sensor units P11 to P18.

  In the following example, a specific configuration example of an optical pickup device using the spectroscopic element H3 is shown.

<Example>
In this example, the present invention is applied to a compatible optical pickup device that can handle BD, DVD, and CD. The above principle is applied only to the optical system for BD; the focus adjustment technique based on the conventional astigmatism method and the tracking adjustment technique based on the three-beam method (in-line method) are applied to the optical systems for CD and DVD.

  FIGS. 16A and 16B are diagrams illustrating the optical system of the optical pickup device according to the present example. FIG. 16A is a plan view of the optical system in which the configuration on the disc side of the rising mirrors 114 and 115 is omitted, and FIG. 16B is a perspective view, seen from the side, of the portion of the optical system beyond the rising mirrors 114 and 115.

  As illustrated, the optical pickup device includes a semiconductor laser 101, a half-wave plate 102, a diverging lens 103, a two-wavelength laser 104, a diffraction grating 105, a diverging lens 106, a composite prism 107, a front monitor 108, a collimator lens 109, a drive mechanism 110, reflection mirrors 111 and 112, a quarter-wave plate 113, rising mirrors 114 and 115, a two-wavelength objective lens 116, a BD objective lens 117, a spectroscopic element H3, an anamorphic lens 118, and a photodetector 119.

  The semiconductor laser 101 emits BD laser light (hereinafter referred to as “BD light”) having a wavelength of about 405 nm. The half-wave plate 102 adjusts the polarization direction of the BD light. The diverging lens 103 adjusts the focal length of the BD light so as to shorten the distance between the semiconductor laser 101 and the composite prism 107.

  In the two-wavelength laser 104, two laser elements that respectively emit a laser beam for CD having a wavelength of about 785 nm (hereinafter, "CD light") and a laser beam for DVD having a wavelength of about 660 nm (hereinafter, "DVD light") are accommodated in the same CAN.

  FIG. 16C is a diagram showing the arrangement pattern of the laser elements (laser light sources) in the two-wavelength laser 104, as viewed from the beam emission side. In FIG. 16C, CE and DE indicate the emission points of the CD light and the DVD light, respectively. The gap between the emission points of the CD light and the DVD light is G.

  The gap G between the emission point CE of the CD light and the emission point DE of the DVD light is set so that the DVD light is appropriately applied to the four-divided sensor for DVD light, as will be described later. By accommodating the two light sources in the same CAN in this way, the optical system can be simplified as compared with a configuration using a plurality of CANs.

Returning to FIG. 16A, the diffraction grating 105 divides the CD light and the DVD light each into a main beam and two sub beams. The diffraction grating 105 is a two-step, step-type diffraction grating and is integrated with a half-wave plate; the integrated half-wave plate adjusts the polarization direction of the CD light and the DVD light. The diverging lens 106 adjusts the focal lengths of the CD light and the DVD light so as to shorten the distance between the two-wavelength laser 104 and the composite prism 107.

The composite prism 107 has a dichroic surface 107a and a PBS (Polarizing Beam Splitter) surface 107b inside. The dichroic surface 107a reflects BD light and transmits CD light and DVD light. The semiconductor laser 101, the two-wavelength laser 104, and the composite prism 107 are arranged so that the optical axis of the BD light reflected by the dichroic surface 107a and the optical axis of the CD light transmitted through the dichroic surface 107a are aligned with each other. The optical axis of the DVD light transmitted through the dichroic surface 107a is shifted from the optical axes of the BD light and the CD light by the gap G shown in FIG. 16C.

  A part of each of the BD light, the CD light, and the DVD light is reflected by the PBS surface 107b, and most of the light passes through the PBS surface 107b. The half-wave plate 102 and the diffraction grating 105 (with its integrated half-wave plate) are arranged so that such a part of the BD light, CD light, and DVD light is reflected by the PBS surface 107b.
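
  The fraction of each beam diverted to the front monitor in this way is determined by the angle of the corresponding half-wave plate. A minimal sketch of the underlying Malus-type relation follows, assuming an ideal PBS; the 5% monitor fraction used in the example is hypothetical and is not a value from this patent.

    # Assumed ideal PBS: p-polarized light transmits, s-polarized light is
    # reflected toward the front monitor. A half-wave plate rotated by theta
    # turns the linear polarization by 2*theta, so the reflected fraction is
    # sin^2(2*theta). The 5% target below is a hypothetical example.
    import math

    def monitor_fraction(hwp_angle_deg):
        return math.sin(math.radians(2 * hwp_angle_deg)) ** 2

    target = 0.05
    theta = 0.5 * math.degrees(math.asin(math.sqrt(target)))
    print(f"HWP angle ~ {theta:.2f} deg, monitored fraction = {monitor_fraction(theta):.3f}")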

  When the diffraction grating 105 is arranged in this way, the main beam and two sub beams of the CD light and the main beam and two sub beams of the DVD light are aligned along the tracks of the CD and the DVD, respectively. The main beam and two sub beams of the CD light reflected by the CD are applied to the four-divided sensors for CD on the photodetector 119 described later. The main beam and two sub beams of the DVD light reflected by the DVD are applied to the four-divided sensors for DVD on the photodetector 119 described later.

  The front monitor 108 is irradiated with the BD light, CD light, and DVD light reflected by the PBS surface 107b. The front monitor 108 outputs a signal corresponding to the amount of received light. A signal from the front monitor 108 is used for output power control of the semiconductor laser 101 and the two-wavelength laser 104.

  The collimator lens 109 converts the BD light, CD light, and DVD light incident from the composite prism 107 side into parallel light. The drive mechanism 110 moves the collimator lens 109 in the optical axis direction in accordance with a control signal when aberration is corrected. The drive mechanism 110 includes a holder 110a that holds the collimator lens 109 and a gear 110b that feeds the holder 110a in the optical axis direction of the collimator lens 109. The gear 110b is connected to the drive shaft of a motor 110c.

  The BD light, CD light, and DVD light converted into parallel light by the collimator lens 109 are reflected by the two reflection mirrors 111 and 112 and enter the quarter-wave plate 113. The quarter-wave plate 113 converts the BD light, CD light, and DVD light incident from the reflection mirror 112 side into circularly polarized light, and converts the BD light, CD light, and DVD light incident from the rising mirror 114 side into linearly polarized light whose polarization direction is orthogonal to that of the light entering from the reflection mirror 112 side. As a result, the reflected light from the disk is reflected by the PBS surface 107b.
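
  That the forward and return passes through the quarter-wave plate 113 rotate the linear polarization by 90 degrees can be checked with a short Jones-calculus sketch. The fast-axis angle of 45 degrees and the horizontal input polarization below are assumed for illustration, and the disk and mirror reflections are ignored except for sending the light back through the plate.

    # Jones-calculus check: a double pass through a quarter-wave plate with its
    # fast axis at 45 degrees acts as a half-wave plate and swaps x and y
    # polarization (geometry assumed for illustration).
    import numpy as np

    def rot(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])

    def qwp(theta):
        # Quarter-wave plate with fast axis at angle theta (global phase dropped).
        return rot(theta) @ np.diag([1, 1j]) @ rot(-theta)

    E_in = np.array([1, 0])                          # linear polarization along x
    E_out = qwp(np.pi / 4) @ qwp(np.pi / 4) @ E_in   # forward and return pass
    print(np.round(np.abs(E_out), 6))                # -> [0. 1.]: orthogonal output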

  The rising mirror 114 is a dichroic mirror that transmits BD light and reflects CD light and DVD light in a direction toward the two-wavelength objective lens 116. The rising mirror 115 reflects BD light in a direction toward the BD objective lens 117.

The two-wavelength objective lens 116 is configured to properly converge the CD light and the DVD light onto a CD and a DVD, respectively. The BD objective lens 117 is configured to properly converge the BD light onto a BD. The two-wavelength objective lens 116 and the BD objective lens 117 are held by a holder 131 and are driven in the focus and tracking directions by an objective lens actuator 132.

  The spectroscopic element H3 is the spectroscopic element shown in FIG. Of the BD light, CD light, and DVD light incident on the spectroscopic element H3, the BD light is divided into eight light fluxes, and the traveling direction of each light flux is changed by the diffractive action of the spectroscopic element H3. Most of the CD light and the DVD light pass through the spectroscopic element H3 without being diffracted by the spectroscopic element H3.

  The spectroscopic element H3 is formed of a square-shaped transparent plate, and a step-type diffraction pattern (diffraction hologram) is formed on the light incident surface. The number of steps and the step height of the diffraction pattern are set so that the + 1st order diffraction efficiency with respect to the wavelength of the BD light is increased and the 0th order diffraction efficiency with respect to the wavelengths of the CD light and the DVD light is increased. The diffraction angle is adjusted by the pitch of the diffraction pattern.

  The diffraction regions H3a to H3i of the spectroscopic element H3 are, for example, eight-step diffraction patterns. In this case, the step height per step is set to 7.35 μm. Thereby, the diffraction efficiency of the 0th-order diffracted light of the CD light and the DVD light can be set to 99% and 92%, respectively, while the diffraction efficiency of the +1st-order diffracted light of the BD light is 81%. In this case, the 0th-order diffraction efficiency of the BD light is 7%. The CD light and the DVD light are applied to the quadrant sensors, described later, on the photodetector 119 without being substantially diffracted by the diffraction regions H3a to H3i.
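
  How the number of levels and the per-level step height translate into order efficiencies can be estimated with the usual staircase phase-grating model, sketched below in Python. The 7.35 μm per-level step is the value quoted above, but the refractive indices and the grating pitch are placeholder assumptions (the element material, its dispersion, and the pitch are not given here), so the printed numbers are only indicative and are not expected to reproduce the quoted percentages exactly.

    # Staircase phase-grating model with N levels per period. Refractive-index
    # and pitch values are assumed placeholders; only the 7.35 um per-level
    # step height comes from the text above.
    import numpy as np

    def order_efficiency(m, n_levels, step_height, wavelength, n_index):
        phi = 2 * np.pi * (n_index - 1) * step_height / wavelength   # phase per level
        delta = phi - 2 * np.pi * m / n_levels
        amp = np.exp(1j * delta * np.arange(n_levels)).sum() / n_levels
        return abs(amp) ** 2 * np.sinc(m / n_levels) ** 2

    def diffraction_angle_deg(m, wavelength, pitch):
        # Grating equation at normal incidence: sin(theta) = m * wavelength / pitch.
        return np.degrees(np.arcsin(m * wavelength / pitch))

    for label, wl, n_idx, m in [("BD  405 nm, +1st order", 405e-9, 1.62, 1),
                                ("DVD 660 nm,  0th order", 660e-9, 1.58, 0),
                                ("CD  785 nm,  0th order", 785e-9, 1.57, 0)]:
        print(label, round(order_efficiency(m, 8, 7.35e-6, wl, n_idx), 3))
    print("+1st-order angle for a hypothetical 100 um pitch:",
          round(diffraction_angle_deg(1, 405e-9, 100e-6), 3), "deg")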

  Note that the number of steps of the diffraction pattern arranged in the diffraction regions H3a to H3i can be set to a different number. In addition, the diffraction regions H3a to H3i can be configured using, for example, the technique described in JP-A-2006-73042. If this technique is used, the diffraction efficiencies with respect to the BD light, CD light, and DVD light can be adjusted more finely.

  The anamorphic lens 118 introduces astigmatism into the BD light, CD light, and DVD light incident from the spectroscopic element H3 side. The anamorphic lens 118 corresponds to the anamorphic lens shown in FIGS. The BD light, CD light, and DVD light transmitted through the anamorphic lens 118 enter the photodetector 119. The photodetector 119 has a sensor layout for receiving each light.

  FIG. 17 is a diagram illustrating a sensor layout of the photodetector 119.

  The photodetector 119 includes sensor units B1 to B8 for BD that receive the BD light separated by the spectroscopic element H3, four-divided sensors C01 to C03 for CD that receive the CD light transmitted through the spectroscopic element H3 without being separated, and four-divided sensors D01 to D03 for DVD that receive the DVD light transmitted through the spectroscopic element H3 without being separated. The signal light of the BD light separated by the spectroscopic element H3 is applied to the apex portions of the signal light region.

  As shown in the figure, sensor portions B1 and B2, sensor portions B3 and B5, sensor portions B4 and B6, and sensor portions B7 and B8 are respectively arranged near the four apex angles of the signal light region so that the signal light of the BD light passing through the light flux regions a to h can be received. The sensor parts B1 to B8 are arranged so as to sufficiently include the irradiation areas of the BD light irradiated on the inner side of the four apex portions of the signal light region. Accordingly, even when the sensor parts B1 to B8 are misaligned due to aging or the like, they can sufficiently receive the signal light separated by the spectroscopic element H3. The irradiation areas of the signal light of the BD light on the sensor parts B1 to B8 are substantially the same as the irradiation areas on the sensor parts P11 to P18 shown in FIG.

Since the optical axes of the BD light and the CD light are aligned by the dichroic surface 107a as described above, the main beam (0th-order diffracted light) of the CD light is applied to the center of the signal light region of the BD light on the light receiving surface of the photodetector 119. The quadrant sensor C01 is arranged at the center position of the main beam of the CD light. The quadrant sensors C02 and C03 are arranged on the light receiving surface of the photodetector 119, in the direction of the track image with respect to the main beam, so as to receive the sub beams of the CD light.

  Since the optical axis of the DVD light is deviated from the optical axis of the CD light as described above, the main beam and two sub beams of the DVD light are applied, on the light receiving surface of the photodetector 119, to positions deviated from the main beam and two sub beams of the CD light. The four-divided sensors D01 to D03 are respectively arranged at the irradiation positions of the main beam and two sub beams of the DVD light. The distance between the main beam of the CD light and the main beam of the DVD light is determined by the gap G between the light emission points of the CD light and the DVD light shown in FIG. 16C.
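
  The separation between the CD and DVD beams on the photodetector follows from the gap G and the lateral magnification of the detection optics in the usual way. The sketch below is only a rough illustration: the gap and focal-length values are assumed placeholders, and the diverging lens 106 and the anamorphic lens 118 are ignored.

    # Rough estimate of the CD/DVD main-beam separation on the photodetector.
    # All numerical values here are assumed placeholders, not figures from
    # this patent.
    def spot_separation(gap, f_collimator, f_detection):
        # A lateral source offset tilts the collimated beam by gap/f_collimator;
        # the detection lens converts that tilt back into a spot offset.
        return gap * (f_detection / f_collimator)

    print(spot_separation(gap=110e-6, f_collimator=15e-3, f_detection=30e-3))  # ~0.22 mm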

  As described above, according to the present embodiment, the irradiation regions of the signal light of the BD light are distributed inside the four apex portions of the signal light region as shown in FIG. The irradiation regions of the stray light are distributed outside the signal light region in substantially the same manner as in the state shown in FIG. Therefore, only the signal light of the BD light can be received by the sensor parts B1 to B8 shown in FIG. 17. Thereby, deterioration of the detection signals due to stray light can be suppressed.

  Further, according to the present embodiment, as shown in FIG. 13(d) or FIGS. 14(a) to (d), the outer portions or the inner portions of the two irradiation regions distributed in each of the four apex portions of the signal light region are separated from each other across the gap between the corresponding two sensor portions. Thereby, even if a position shift arises in the sensor parts B1 to B8, the detection signals of the sensor parts B1 to B8 are unlikely to deteriorate. In addition, the outer portions or the inner portions of the two irradiation regions distributed in each of the four apex portions of the signal light region are close to each other across the gap between the corresponding two sensor portions. Thereby, by referring to the detection signals of the sensor parts B1 to B8, the position of the sensor parts B1 to B8 in the surface S0 can be adjusted, and the sensor parts B1 to B8 can be properly installed.

  Further, according to the present embodiment, as shown in FIG. 13D, the diffraction regions H3b, H3c, H3f, and H3g are set to have larger areas than the diffraction regions H3a, H3d, H3e, and H3h, and the boundary line between the diffraction regions H3a and H3b, the boundary line between the diffraction regions H3c and H3d, the boundary line between the diffraction regions H3e and H3f, and the boundary line between the diffraction regions H3g and H3h each include a straight line portion p4. Thereby, the offset (DC component) of the push-pull signal PP due to the lens shift can be effectively suppressed. Further, since the laser light incident on the diffraction region H3i is not irradiated on the sensor portions B1 to B8, the signal light and the stray light are less likely to interfere with each other, and the detection signals can be prevented from deteriorating.

  In addition, according to the present embodiment, even when the position of the sensor layout is shifted as shown in FIGS. 15A to 15D, the irradiation regions are unlikely to protrude from the sensor units, so that a decrease in the accuracy of the detection signals can be suppressed.

  In this example, the lens effect imparted to the diffraction regions H3a to H3h is set as shown in FIG. 13(c). The lens effect is set so that the shapes of the irradiation regions on the sensor parts B1 to B8 can be efficiently accommodated on the sensor parts B1 to B8. That is, the lens effect may be set as appropriate so that the shapes of the irradiation regions a3 to h3 approach a sector shape having an apex angle of 90 degrees in accordance with the apex angles of the sensor units B1 to B8.

  Although an example of the present invention has been described above, the present invention is not limited to the above example in any way, and various modifications other than those described above can be made to the embodiment of the present invention.

For example, in the above embodiment, the BD light is dispersed using the spectroscopic element H3, which has a diffraction pattern formed on its light incident surface. Instead, the BD light may be dispersed using a spectroscopic element composed of a prism having a plurality of surfaces. On the incident surface of such a prism, eight curved surfaces corresponding to the diffraction regions H3a to H3h of the spectroscopic element H3 and one surface corresponding to the diffraction region H3i are formed. The light incident on the curved surfaces corresponding to the diffraction regions H3a to H3h is refracted in the directions Va to Vh in FIG. 13A and receives the lens effect shown in FIG. 13(c). Light incident on the surface corresponding to the diffraction region H3i does not enter the sensor portions B1 to B8. Thereby, as in the case of using the spectroscopic element H3, the signal light of the BD light is irradiated on the light receiving surface as shown in FIG.

  In the case of using the spectroscopic element composed of the prism, the optical system for receiving BD light and the optical system for receiving CD light and DVD light can be configured separately. That is, the BD light is guided to the BD objective lens 117 shown in FIG. 16B by an optical system for BD, and the CD light and the DVD light are guided to the two-wavelength objective lens 116 by an optical system for CD/DVD that is different from the optical system for BD. The optical system for BD has a laser light source that emits BD light and a photodetector that receives the BD light reflected by the BD, and the optical system for CD/DVD has a laser light source that emits CD light and DVD light and a photodetector, different from the photodetector for BD light, that receives the CD light and DVD light reflected by the CD and the DVD. The photodetector for CD/DVD has two sensor groups that individually receive the CD light and the DVD light. The optical system for BD includes an anamorphic lens that introduces astigmatism into the BD light reflected by the BD, as in the above embodiment. The spectroscopic element composed of the prism is disposed, for example, at the front stage of the anamorphic lens.

  In the above-described embodiment, the spectroscopic element H3 is disposed at the front stage of the anamorphic lens 118. However, the spectroscopic element H3 may instead be disposed at the rear stage of the anamorphic lens 118, or a diffraction pattern that imparts the same diffractive action as that of the spectroscopic element H3 to the laser light may be formed integrally on the incident surface or the exit surface of the anamorphic lens 118.

  In the above embodiment, the diffractive region H3i is formed at the center of the spectroscopic element H3. However, instead of this, a light blocking region for blocking incident laser light may be formed. In this case, the optical system for receiving BD light and the optical system for receiving CD light and DVD light can be configured separately.

  Moreover, instead of the spectroscopic element H3, a spectroscopic element obtained by adding the lens effect of FIG. 13(c) to the spectroscopic element H1 of FIG. 8(a) can also be used.

  Further, in place of the spectroscopic element H3 of the above embodiment, the spectroscopic elements H4 to H9 shown in FIGS. 18A to 18C and FIGS. 19A to 19D may be used. The plane direction and curved-surface direction of the anamorphic lens in each figure and the direction of the track image of the BD light incident on each spectroscopic element are the same as those shown in FIG. Moreover, a lens effect similar to that of the corresponding diffraction region in FIG. 13(c) is imparted to each diffraction region of the spectroscopic elements H4 to H9.

  Referring to FIG. 18A, the spectroscopic element H4 is a spectroscopic element in which the diffraction region H3i formed at the center of the spectroscopic element H3 is expanded in the lateral direction. As shown in the figure, the boundary lines near the center and the outer edge of the spectroscopic element H4 form angles of 45 degrees with the vertical and horizontal directions, as in the spectroscopic element H3. Also in this case, the same effect as with the spectroscopic element H3 is achieved.

Referring to FIG. 18B, the spectroscopic element H5 is a spectroscopic element in which the diffraction region H4i formed at the center of the spectroscopic element H4 is modified. As shown in the figure, the diffraction region H5i has a shape in which regions having a predetermined width are added to a square region in the 45-degree directions toward the upper right, upper left, lower left, and lower right. Also in this case, the same effect as with the spectroscopic element H3 is achieved. Furthermore, the regions of the diffraction region H5i extending in the 45-degree directions make it difficult for signal light and stray light to be superimposed on the surface S0, so that the target recording layer can easily be distinguished from among the plurality of recording layers and the focal position of the laser beam can be quickly adjusted with respect to the target recording layer.

  Referring to FIG. 18C, the spectroscopic element H6 is a spectroscopic element in which, instead of the diffraction region H3i of the spectroscopic element H3, boundary lines passing through the center are formed in an X shape and the diffraction regions H3a to H3h are expanded to the center. In this case, unlike the spectroscopic element H3, no diffraction region is formed near the center of the spectroscopic element H6, so that it is more difficult to suppress interference between signal light and stray light. However, even if the sensor layout is misaligned, the signal light irradiation regions are easily positioned on the sensor units B1 to B8.

  Referring to FIG. 19A, the spectroscopic element H7 is a spectroscopic element in which the number of diffraction regions of the spectroscopic element H6 is reduced to four. The diffraction regions H7a to H7d of the spectroscopic element H7 diffract the incident laser light in the directions Da to Dd in FIG. The lens effects of the diffraction regions H7a to H7d are set in the same manner as in the spectroscopic element H3, as shown in FIG. However, since the spectroscopic element H7 has no boundary lines of diffraction regions at the broken-line portions in the drawing, the diffraction regions H7a to H7d are configured so that the lens effects of the two regions adjacent to each other across each broken-line portion change smoothly within the broken-line region.

  In this case, since the signal light passing through the diffraction regions H7a to H7d forms four irradiation regions on the sensor units B1 to B8, when the position of the sensor layout is shifted in the vertical direction or in the horizontal direction, the accuracy of the detection signals of the sensor units B1, B2, B7, and B8 or of the sensor units B3 to B6, respectively, is likely to be lowered. However, in each of these cases, a decrease in the accuracy of the detection signals of the other group, that is, of the sensor units B3 to B6 or of the sensor units B1, B2, B7, and B8, respectively, can be suppressed.

  Referring to FIG. 19C, the spectroscopic element H8 is a spectroscopic element in which the boundary line of the spectroscopic element H7 is further simplified. The boundary lines of the diffraction regions H8a to H8d of the spectroscopic element H8 form an angle larger than 45 degrees with respect to the horizontal straight line. Referring to FIG. 19D, the spectroscopic element H9 is a spectroscopic element in which the boundary line of the spectroscopic element H8 is curved. In the spectroscopic elements H8 and H9, the lens effect is set similarly to the spectroscopic element H7, and the same effect as the spectroscopic element H7 can be obtained.

  In addition, the embodiment of the present invention can be variously modified as appropriate within the scope of the technical idea shown in the claims.

101 ... Semiconductor laser (laser light source)
117 ... BD objective lens (objective lens)
118 ... Anamorphic lens (astigmatism element)
119 ... Photodetector
B1 to B8 ... Sensor unit
H3 to H9 ... Spectroscopic element
H3b, H3c, H3f, H3g ... Diffraction region (first and second regions)
H3a, H3d, H3e, H3h ... Diffraction region (third and fourth regions)
H3i: Diffraction region (fifth region)
H4b, H4c, H4f, H4g ... Diffraction region (first and second regions)
H4a, H4d, H4e, H4h ... Diffraction region (third and fourth regions)
H4i: Diffraction region (fifth region)
H5b, H5c, H5f, H5g ... Diffraction region (first and second regions)
H5a, H5d, H5e, H5h ... Diffraction region (third and fourth regions)
H5i: Diffraction region (fifth region)
H6b, H6c, H6f, H6g ... Diffraction region (first and second regions)
H6a, H6d, H6e, H6h ... Diffraction region (third and fourth regions)
H7b, H7c ... Diffraction region (first and second regions)
H7a, H7d: Diffraction region (third and fourth regions)
H8b, H8c ... Diffraction region (first and second regions)
H8a, H8d ... Diffraction region (third and fourth regions)
H9b, H9c ... Diffraction region (first and second regions)
H9a, H9d: Diffraction region (third and fourth regions)

Claims (6)

  1. A laser light source;
    An objective lens for converging the laser light emitted from the laser light source onto a recording medium;
    An astigmatism element on which the laser light reflected by the recording medium is incident, the astigmatism element converging the laser light in a first direction to generate a first focal line and converging the laser light in a second direction perpendicular to the first direction to generate a second focal line;
    A spectroscopic element on which the laser light reflected by the recording medium is incident, the spectroscopic element having first to fourth regions and making the traveling directions of the light fluxes incident on the first to fourth regions different from one another so as to separate the four light fluxes from one another;
    A photodetector that includes a sensor unit, receives each of the light fluxes separated by the spectroscopic element with the sensor unit, and outputs a detection signal;
    When an intersecting point of first and second straight lines, which are respectively parallel to the first direction and the second direction and cross each other, is aligned with the center of the spectroscopic element, the first and second regions are arranged in the direction in which one set of vertical angles formed by the first and second straight lines are arranged, and the third and fourth regions are arranged in the direction in which the other set of vertical angles are arranged,
    The astigmatism element is arranged so that the direction in which the first and second regions are arranged is parallel to the direction of the track image of the recording medium projected onto the spectroscopic element,
    The first and second regions and the third and fourth regions have different areas, and each region spreads away from the center of the spectroscopic element,
    The spectroscopic element further imparts an optical action to the light fluxes so that, when the light flux passing through each of the regions is irradiated onto the sensor unit, the shape of the light flux approaches a fan shape having an apex angle of 90 degrees,
    An optical pickup device characterized by that.
  2. The optical pickup device according to claim 1,
    The first and second regions have a larger area than the third and fourth regions,
    The optical action imparted to the light fluxes passing through the first and second regions includes a converging action in two directions parallel to the direction in which the first and second regions are arranged, the converging action in the two directions by the first and second regions being set so as to increase with increasing distance from the center of the spectroscopic element, and the optical action imparted to the light fluxes passing through the third and fourth regions includes a converging action in two directions parallel to the direction in which the third and fourth regions are arranged, the converging action in the two directions by the third and fourth regions being set so as to increase as the center of the spectroscopic element is approached.
    An optical pickup device characterized by that.
  3. The optical pickup device according to claim 1 or 2,
    A fifth region is further arranged in the central portion of the spectroscopic element,
    The light flux passing through the fifth region is not irradiated on the sensor unit.
    An optical pickup device characterized by that.
  4. The optical pickup device according to any one of claims 1 to 3,
    The first and second regions are each divided into two in a direction perpendicular to the track image to form four divided regions,
    The third and fourth regions are each divided into two in a direction parallel to the track image to form four divided regions,
    The spectroscopic element changes the traveling directions of the light flux portions passing through the respective divided regions so that the light flux portions passing through the two divided regions of the first region are spaced apart from each other on the photodetector, the light flux portions passing through the two divided regions of the second region are spaced apart from each other on the photodetector, the light flux portions passing through the two divided regions of the third region are spaced apart from each other on the photodetector, and the light flux portions passing through the two divided regions of the fourth region are spaced apart from each other on the photodetector;
    An optical pickup device characterized by that.
  5. The optical pickup device according to any one of claims 1 to 4,
    A boundary between the first and second regions and the third and fourth regions includes a straight line portion parallel to an arrangement direction of the third and fourth regions;
    An optical pickup device characterized by that.
  6. The optical pickup device according to any one of claims 1 to 5,
    The spectroscopic element changes the traveling directions of the four light fluxes in directions at 45 degrees with respect to the first and second directions and by a predetermined angle, so that the light fluxes that have passed through the first to fourth regions are respectively guided to the positions of four different apex angles of a rectangle on the light receiving surface of the photodetector;
    An optical pickup device characterized by that.