WO2024084606A1 - Illumination control system - Google Patents

Illumination control system

Info

Publication number
WO2024084606A1
WO2024084606A1 (PCT/JP2022/038855)
Authority
WO
WIPO (PCT)
Prior art keywords
image
light
unit
image light
area
Prior art date
Application number
PCT/JP2022/038855
Other languages
French (fr)
Japanese (ja)
Inventor
旭洋 山田
孝介 八木
冰 鄭
遥 寺島
直紀 古畑
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to PCT/JP2022/038855 (WO2024084606A1)
Priority to JP2023511916A (JP7475539B1)
Priority to JP2024063099A (JP2024074970A)
Publication of WO2024084606A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60Intended control result
    • G05D1/617Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards

Definitions

  • This disclosure relates to a lighting control system.
  • JP 2005-313291 A (paragraph 0011, FIG. 1)
  • the purpose of this disclosure is to solve the above problems and provide a lighting control system that improves visibility for users.
  • a lighting control system, provided in a mobile robot equipped with a lighting device that illuminates an illuminated surface in an oblique direction, includes: an illumination device including a light source, an image light forming unit that converts light emitted from the light source into image light, and an optical element that irradiates the image light emitted from the image light forming unit onto the illuminated surface; a memory unit in which optical data is stored that associates characteristics of the irradiated surface, a specific position, and the light intensity and wavelength of the image light reflected by the irradiated surface detected at the specific position; an identification unit that identifies the characteristics of the illuminated surface illuminated by the lighting device; an estimation unit that estimates the position of a user's eyes; and a control unit that controls the light intensity or wavelength of the image light emitted from the lighting device based on the characteristics of the illuminated surface and the result estimated by the estimation unit.
  • a lighting control system includes: an illumination device that irradiates a first image light indicating a symbol, a character, or a pattern onto an illuminated surface; and a control unit. The system acquires a first image, formed by the first image light and background light while the first image light is irradiated, and a second image, formed by background light while the first image light is not irradiated. The control unit performs image processing so that a separation area, which separates a first image area corresponding to the first image light from a second image area formed around the first image area, is formed between the first image area and the second image area.
  • This disclosure provides a lighting control system that improves visibility for users.
  • FIG. 1 is a block diagram illustrating a schematic configuration of a lighting control system according to a first embodiment.
  • FIG. 2 is a diagram illustrating an example of the operation of the lighting control system.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration of a main control unit.
  • FIG. 4 is a diagram illustrating another example of the hardware configuration of the main control unit.
  • FIG. 5 is a cross-sectional view illustrating a schematic configuration of a lighting device.
  • FIG. 6 is a front view illustrating a schematic structure of an image light forming unit.
  • FIG. 7 is a diagram showing the incident angle of incident light from the lighting device and the exit angle of exiting light.
  • FIG. 8 is a diagram showing an example of the operation of the lighting control system in a third embodiment.
  • FIG. 9 is a flowchart illustrating an example of a method for generating mask data.
  • FIG. 10 is a diagram showing one step of mask processing.
  • FIGS. 11 to 14 are diagrams illustrating other steps of the mask processing.
  • FIG. 15 is a diagram showing an image obtained by the mask processing.
  • FIG. 16 is a flowchart showing an example of a process for controlling image light in the third embodiment.
  • FIG. 17 is a diagram illustrating an example of a movable range map.
  • FIG. 18 is a diagram showing an example of a floor surface map.
  • FIG. 19 is a diagram illustrating an example of an illumination map.
  • FIG. 20 is a diagram illustrating an example of an external light map.
  • XYZ coordinates are used in the figures shown below.
  • the +Y direction indicates the height direction
  • the Z axis direction indicates the direction in which, for example, a mobile robot or a user moves.
  • Clockwise rotation around the X axis is indicated by +RX
  • clockwise rotation around the Y axis is indicated by +RY
  • clockwise rotation around the Z axis is indicated by +RZ.
  • FIG. 1 is a block diagram illustrating a schematic configuration of a lighting control system 100 according to the first embodiment.
  • FIG. 2 is a diagram showing an example of the operation of the lighting control system 100.
  • the lighting control system 100 includes a memory unit 10a, an identification unit 10b, an estimation unit 10c, a control unit 10d, and a lighting device 10e.
  • the lighting control system 100 is mounted on, for example, a mobile robot 11.
  • the lighting control system 100 may further include an imaging device 41a.
  • the storage unit 10a is, for example, a volatile memory or a non-volatile memory, and may be configured with both a volatile memory and a non-volatile memory.
  • a lighting device 10e and an imaging device 41a are mounted on the mobile robot 11, and data captured by the imaging device 41a is stored in the memory unit 10a. This creates a database in the memory unit 10a.
  • the storage unit 10a prestores the optical characteristics of the image light incident from various irradiated surfaces 51, which are acquired by the imaging device 41b.
  • the imaging device 41b acquires data such as the amount of light (specifically, light intensity) and color (specifically, wavelength) of the image light reflected from the irradiated surface 51 and incident on the imaging device 41b.
  • Data acquired by an imaging device 41b (also called a second imaging device) different from the imaging device 41a mounted on the mobile robot 11 may be stored in the storage unit 10a.
  • the imaging device 41b is provided, for example, at a position corresponding to the eye height of the user 21 who is positioned opposite the mobile robot 11.
  • the identification unit 10b identifies characteristics of the irradiated surface 51 illuminated by the lighting device 10e. For example, the identification unit 10b identifies the characteristics of the irradiated surface 51 by using an image of the irradiated surface 51 acquired by the imaging device 41a mounted on the mobile robot 11.
  • the identification unit 10b may identify the characteristics of the irradiated surface 51 using position information indicating the position of the irradiated surface 51, may identify the characteristics of the irradiated surface 51 using information input to the identification unit 10b from outside the lighting control system 100, or may identify the characteristics of the irradiated surface 51 using data that has been calculated based on data on multiple irradiated surfaces 51.
  • the estimation unit 10c estimates the position of the eyes of the user 21.
  • the position of the eyes of the user 21 is, for example, the distance from the floor to the eyes of the user 21.
  • the position of the eyes of the user 21 may be estimated using an image acquired by an imaging device 41a mounted on the mobile robot 11. In this case, the image acquired by the imaging device 41a is sent to the estimation unit 10c, and the estimation unit 10c estimates the position of the eyes of the user 21.
  • the eye position of user 21 may be estimated using an image captured by another imaging device installed indoors where user 21 is present.
  • the estimation unit 10c may estimate the light intensity and wavelength, at the position of the eyes of the user 21, of the image light reflected by the irradiated surface 51.
  • the estimation unit 10c may estimate the light intensity and wavelength at the position of the eyes of the user 21 based on the emission angle of the image light emitted from the lighting device 10e, the height of the lighting device 10e, the light intensity of the image light, the wavelength of the image light, and the characteristics of the irradiated surface 51.
  • a distance measuring device mounted on the mobile robot 11 may be used to measure the distance from the mobile robot 11 to the user 21, and the position of the eyes of the user 21 and the reflection angle of the image light reflected by the irradiated surface 51 may be estimated, thereby estimating the light intensity and wavelength at the position of the eyes of the user 21. According to this method, the light intensity and wavelength at the position of the eyes of the user 21 can be estimated with high accuracy.
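  • As an illustration of the geometric estimation described above, the following Python sketch computes a rough intensity value at the eye position from the emission angle, the mounting height of the lighting device, the robot-to-user distance, and an angle-indexed reflectance table. The function names, the 5-degree angle grid, and the inverse-square fall-off model are illustrative assumptions and are not taken from the patent.

      import math

      def estimate_intensity_at_eye(lamp_height_m, emit_angle_deg, eye_height_m,
                                    robot_to_user_m, source_intensity,
                                    reflectance_table):
          """Rough estimate of the image-light intensity reaching the user's eyes.

          Assumed geometry: the lighting device at height lamp_height_m emits the
          image light at emit_angle_deg from vertical; the illuminated spot on the
          floor is observed by a user standing robot_to_user_m away whose eyes are
          at eye_height_m.  reflectance_table maps (incident_deg, exit_deg), both
          snapped to a 5-degree grid, onto a reflectance factor (0..1).
          """
          # Horizontal offset of the illuminated spot from the robot.
          spot_offset = lamp_height_m * math.tan(math.radians(emit_angle_deg))
          # Exit angle from the spot toward the eyes, measured from vertical.
          exit_angle = math.degrees(math.atan2(robot_to_user_m - spot_offset, eye_height_m))
          key = (5 * round(emit_angle_deg / 5), 5 * round(exit_angle / 5))
          reflectance = reflectance_table.get(key, 0.0)
          # Path length from the spot to the eyes, used for a simple fall-off model.
          path = math.hypot(robot_to_user_m - spot_offset, eye_height_m)
          return source_intensity * reflectance / max(path, 0.1) ** 2

      # Illustrative values: 0.5 m lamp height, 60-degree oblique emission, user 2 m away.
      table = {(60, 35): 0.35}
      print(estimate_intensity_at_eye(0.5, 60.0, 1.5, 2.0, 100.0, table))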
  • Control unit 10d controls the light intensity or wavelength of the image light 61 emitted from the illumination device 10e based on the characteristics of the illuminated surface 51 and the result estimated by the estimation unit 10c.
  • the control unit 10d may control the light intensity or wavelength of the image light 61 emitted from the lighting device 10e based on the data of the image light 61 and the data of the background light (external light).
  • the control unit 10d may stop the irradiation of the image light from the lighting device 10e. For example, when it detects that another user is approaching the mobile robot 11 from a direction unrelated to the direction connecting the mobile robot 11 and the user 21 (the Z-axis direction in FIG. 2), such as the ±X-axis direction, the control unit 10d may turn off the irradiation of the image light onto the irradiated surface 51. This makes it possible to prevent a user other than the user 21, who is the intended viewer of the image light, from recognizing the image light. To detect the other user, an indoor imaging device may be used, or a human sensor may be installed on the mobile robot 11.
  • By having the identification unit 10b identify the optical characteristics of the irradiated surface 51, even when another user is detected approaching the mobile robot 11 from the ±X direction, the lighting device 10e may be kept on if the estimation unit 10c can estimate that the other user (second user) cannot recognize the reflected light of the image light 61 in the ±X direction.
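  • A minimal decision sketch for this behavior follows, assuming the detection result, the approach direction, and the set of directions in which the reflection is estimated to be noticeable are already available; all names and the direction encoding are hypothetical.

      def should_keep_image_light_on(second_user_detected, second_user_direction,
                                     reflection_visible_in):
          """Keep the image light on unless a second user approaching from a side
          direction (e.g. "+X" or "-X") could recognize the reflected image light.

          reflection_visible_in: set of directions in which the estimation unit
          predicts the reflected image light is noticeable, e.g. {"+Z", "-X"}.
          """
          if not second_user_detected:
              return True
          if second_user_direction in ("+X", "-X"):
              # Keep the light on only if the reflection is estimated to be
              # invisible from that direction.
              return second_user_direction not in reflection_visible_in
          return False

      print(should_keep_image_light_on(True, "-X", {"+Z"}))  # True: reflection not visible in -X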
  • FIG. 3 is a diagram showing an example of a hardware configuration of the main control unit 42.
  • FIG. 4 is a diagram showing another example of the hardware configuration of the main control unit 42.
  • the identification unit 10b, the estimation unit 10c, and the control unit 10d are configured, for example, by the main control unit 42.
  • the main control unit 42 is configured, for example, by at least one processor 42a and at least one memory 42b.
  • the storage unit 10a shown in FIG. 1 may be the memory 42b.
  • the processor 42a is, for example, a CPU (Central Processing Unit) that executes a program stored in the memory 42b.
  • the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by software, firmware, or a combination of software and firmware.
  • the software and firmware can be stored as programs in the memory 42b. With this configuration, the program for realizing the functions of the main control unit 42 is executed by the computer.
  • Memory 42b is a computer-readable recording medium, for example, a volatile memory such as RAM (Random Access Memory) and ROM (Read Only Memory), a non-volatile memory, or a combination of volatile and non-volatile memory.
  • the main control unit 42 may have multiple processors 42a and multiple memories 42b. In this case, the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by these multiple processors 42a and multiple memories 42b.
  • the main control unit 42 may be configured with a processing circuit 42c as dedicated hardware such as a single circuit or a composite circuit.
  • the processing circuit 42c is, for example, a system LSI.
  • the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by the processing circuit 42c.
  • FIG. 5 is a cross-sectional view that illustrates a schematic configuration of the lighting device 10e.
  • the illumination device 10e includes an illumination optical system 31b.
  • the illumination device 10e may be, for example, a projection display device capable of displaying moving images, such as a projector.
  • the illumination optical system 31b enables the illumination device 10e to be compact and energy-efficient.
  • the illumination optical system 31b includes a light source 1, an image light forming section 3, and a projection optical section 4.
  • the light source 1 is a solid-state light source that emits light L1.
  • the light source 1 is, for example, an LED (light-emitting diode).
  • the light L1 emitted from the light-emitting surface of the light source 1 is incident on the image light forming unit 3 as diffused light.
  • the light L1 emitted from the light-emitting surface of the light source 1 may be directional light whose divergence angle in the ±RX directions is less than ±60 degrees (1/2 beam angle: 120 degrees).
  • the light source 1 may be, for example, a light source that emits a mixture of excited fluorescent light and laser light that has passed through a phosphor applied to a flat surface by irradiating the phosphor with an excitation laser light having a central wavelength of 448 nm.
  • Light source 1 may be a light source that emits only fluorescent light.
  • the light source 1 may be composed of only a laser diode. In this case, the size of the light-emitting surface of the light source 1 is small. Therefore, a rectangular rod lens or light pipe that increases the size of the light-emitting surface may be disposed between the light source 1 and the image light forming unit 3.
  • a rod lens or light pipe may also be disposed between the light source 1 and the image light forming section 3 in order to shape the image light and to make the light intensity uniform.
  • the size of the exit surface of the rod lens or light pipe is larger than the image light forming areas 3a2, 3b2, and 3c2 shown in FIG. 6.
  • FIG. 6 is a front view showing a schematic structure of the image light forming unit 3.
  • the image light forming section 3 is disposed between the light source 1 and the projection optical section 4.
  • the image light forming section 3 changes the light L1 emitted from the light source 1 into image light and emits it toward the projection optical section 4.
  • the image light forming section 3 is rotatably supported.
  • the image light forming section 3 rotates, for example, by a rotational driving force transmitted from a rotation drive section (for example, a motor).
  • the image light forming section 3 rotates, for example, in the +RZ direction.
  • the image light forming section 3 may also rotate in the -RZ direction.
  • FIG. 6 shows the structure of the image light forming unit 3 as viewed in the +Z axis direction.
  • the rotation center axis Ra of the image light forming unit 3 is not positioned coaxially with the optical axis C1.
  • the image light forming unit 3 has multiple segments (segment 3a, segment 3b, and segment 3c in FIG. 6) divided in the +RZ direction. In the example shown in FIG. 6, the image light forming unit 3 is divided into three segments.
  • the three segments 3a, 3b, and 3c each include an image light forming area 3a2, 3b2, and 3c2.
  • the image light forming unit 3 can form image light that forms one type of image out of multiple types of images (three types in the example shown in FIG. 6) as it rotates.
  • Image light forming unit 3 has light-shielding regions 3a1, 3b1, and 3c1, and light L1 that reaches light-shielding regions 3a1, 3b1, and 3c1 does not travel in the +Z direction.
  • the light L1 passes through the corresponding image light forming area 3a2, 3b2, or 3c2, travels in the +Z direction, and reaches the projection optical unit 4.
  • In order to make the lighting device 10e compact, it is preferable to configure the projection optical unit 4 with a small number of optical elements (projection lenses). Considering the imaging performance of the projection optical unit 4 on the illuminated surface 51, the image formed by the image light 61 is preferably a simple pattern such as letters or symbols, although the image formed by the image light 61 is not limited to letters or symbols. Note that FIG. 6 shows a configuration in which the projection optical unit 4 has one optical element.
  • the shape of the image light forming areas 3a2, 3b2, and 3c2 may be a shape that corrects distortion (e.g., trapezoidal distortion) that occurs during oblique projection.
  • Each of the central angles θ1, θ2, and θ3 shown in FIG. 6 is, for example, 120 degrees.
  • the central angles can be changed according to the number of segments. As long as it is possible to detect the timing at which the image light forming areas 3a2, 3b2, and 3c2 pass the optical axis C1, the central angles θ1, θ2, and θ3 may be different angles.
  • the rotation angle steps may be determined in advance.
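  • As a simple illustration of how the pattern currently on the optical axis could be determined from the rotation angle and the central angles of the segments, consider the following Python sketch; the equal 120-degree split and the function name are assumptions for this example only.

      def active_segment(rotation_deg, central_angles=(120.0, 120.0, 120.0)):
          """Return the index of the segment whose image light forming area is
          currently on the optical axis, given the rotation angle of the image
          light forming unit and the central angle of each segment."""
          angle = rotation_deg % sum(central_angles)
          for index, span in enumerate(central_angles):
              if angle < span:
                  return index
              angle -= span
          return len(central_angles) - 1

      # With three equal 120-degree segments, a rotation of 250 degrees selects segment 2.
      print(active_segment(250.0))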
  • the projection optical unit 4 is, for example, an optical element such as a projection lens.
  • the projection optical unit 4 irradiates the image light emitted from the image light forming unit 3 in an oblique direction onto the irradiated surface 51.
  • the projection optical unit 4 forms, for example, a projection pattern of the image light 61 formed by the image light forming unit 3.
  • the projection optical unit 4 has a function of forming an image of the image light 61 formed by the image light forming unit 3 on the irradiated surface 51.
  • the projection optical unit 4 may project a blurred projection pattern onto the illuminated surface 51.
  • the projection optical unit 4 may be configured with a reflecting mirror or a combination of a reflecting mirror and a lens.
  • the optical data which associates the characteristics of the irradiated surface 51, a specific position, and the light intensity and wavelength of the image light reflected by the irradiated surface 51 detected at the specific position, is stored in the storage unit 10a. In this way, a database is constructed in the storage unit 10a.
  • the emission conditions of the illumination device 10e onto the illuminated surface 51 (e.g., the incident angle of the image light 61, the irradiation distance, and the installation height) may also be associated with the optical data.
  • the light intensity and wavelength acquired by the imaging device 41a and the imaging device 41b may be stored in the memory unit 10a as optical data.
  • the height of the imaging device 41b in the Y-axis direction may be changed and height data acquired by the imaging device 41b may be stored in the memory unit 10a. This makes it possible to take into account the height of the user 21 and the distance between the mobile robot 11 and the user 21. In this way, the light intensity and wavelength acquired by the pair of imaging devices 41a, 41b may be stored in the memory unit 10a.
  • Optical data may be created by appropriately changing conditions such as the emission conditions of the lighting device 10e onto the irradiated surface 51 (incident angle, irradiation distance, installation height, etc.) and the placement conditions of the imaging devices 41a and 41b.
  • FIG. 7 is a diagram showing the incident angle θi of the incident light Li (i.e., the image light 61) from the illumination device 10e and the exit angle θo of the exit light Lo.
  • In order to create a database taking into consideration the emission conditions (incident angle, irradiation distance, installation height, etc.) of the lighting device 10e mounted on the mobile robot 11 and the eye position of the user 21, for example, the light intensity and the reflectance for each wavelength may be acquired as optical data for each exit angle θo (-80 degrees to 80 degrees) of the emitted light Lo relative to the incident light Li, for each incident angle θi (-80 degrees to -5 degrees).
  • the incident light Li does not necessarily have to be light from the lighting device 10e.
  • For the imaging device 41a mounted on the mobile robot 11, it is desirable to acquire data on the exit angle θo over the range of -80 degrees to 0 degrees, for example.
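  • The angle-resolved optical data described above could be organized, for example, as a lookup table keyed by surface type, incident angle, and exit angle, with a spectrum stored per key. The sketch below is only one possible layout; the surface names, wavelength bands, and 5-degree grid are illustrative assumptions, not details from the patent.

      from collections import defaultdict

      # Hypothetical layout of the optical database: for each surface type,
      # incident angle, and exit angle (degrees), store the measured relative
      # intensity per wavelength band acquired by the imaging devices.
      optical_data = defaultdict(dict)

      def store_measurement(surface, incident_deg, exit_deg, band_nm, intensity):
          optical_data[(surface, incident_deg, exit_deg)][band_nm] = intensity

      def lookup(surface, incident_deg, exit_deg):
          """Return the stored spectrum, snapping both angles to a 5-degree grid."""
          key = (surface, 5 * round(incident_deg / 5), 5 * round(exit_deg / 5))
          return optical_data.get(key, {})

      # Populate illustrative entries over the angle ranges mentioned in the text:
      # incident angles -80 to -5 degrees, exit angles -80 to 80 degrees.
      for inc in range(-80, 0, 5):
          for ext in range(-80, 85, 5):
              for band_nm in (450, 550, 650):
                  store_measurement("carpet_C1", inc, ext, band_nm, 0.2)

      print(lookup("carpet_C1", -43, 37))  # snapped to the (-45, 35) grid point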
  • the characteristics of the irradiated surface 51 are, for example, reflectance, scattering degree, glossiness, or color.
  • the material of the irradiated surface 51 may be tile or cloth. If the material of the irradiated surface 51 is known, the characteristics of the irradiated surface 51 can be easily identified by storing multiple known materials in the memory unit 10a. In other words, calculation processing based on the data acquired by the imaging device 41a is not required, and the characteristics of the irradiated surface 51 can be easily identified.
  • the light intensity of each wavelength of the reflected light is the light intensity of each wavelength obtained by multiplying the light intensity of each wavelength of the light source 1 by the reflectance of each wavelength of the irradiated surface 51.
  • the light obtained by multiplying the spectral distribution (light intensity of each wavelength) of the light source 1 by the reflectance of each wavelength of the irradiated surface 51 enters the eye of the user 21. Therefore, by normalizing the light intensity of each wavelength using the data of the reference surface, it is possible to calculate the relative reflectance of each wavelength of the irradiated surface 51. Here, if the reference surface has a reflectance of 100%, it is possible to calculate the absolute reflectance of each wavelength of the irradiated surface 51.
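  • The normalization described above can be written as a short per-wavelength calculation; the sketch below assumes the measured and reference spectra are available as dictionaries, which is an implementation choice rather than something stated in the patent.

      def relative_reflectance(measured, reference, reference_reflectance=1.0):
          """Per-wavelength reflectance of the irradiated surface, normalized by a
          reference surface measured under the same illumination.  measured and
          reference are dicts of {wavelength_nm: intensity}; with a 100%-reflective
          reference (reference_reflectance = 1.0) the result is absolute reflectance."""
          return {wl: reference_reflectance * measured[wl] / reference[wl]
                  for wl in measured if reference.get(wl)}

      # Illustrative spectra only.
      print(relative_reflectance({450: 12.0, 550: 30.0, 650: 18.0},
                                 {450: 40.0, 550: 60.0, 650: 60.0}))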
  • control unit 10d controls the light intensity or wavelength of the image light 61 emitted from the lighting device 10e based on the characteristics of the irradiated surface 51 and the result estimated by the estimation unit 10c. This makes it possible to provide the lighting control system 100 that improves visibility for the user.
  • Embodiment 2. In the second embodiment, the data stored in the storage unit 10a and the control unit 10d are different from those in the first embodiment.
  • the other configurations in the second embodiment are the same as those in the first embodiment.
  • the memory unit 10a may store information regarding the shape of the image light formed by symbols, characters, or simple patterns (including data such as mask data for each pattern).
  • FIG. 8 is a diagram showing an example of the operation of lighting control system 100 in the second embodiment.
  • FIG. 9 is a flowchart showing an example of a method for generating mask data.
  • mask data of the image light 61 irradiated from the lighting device 10e is generated when the lighting device 10e is started up.
  • In step S21, while the lighting device 10e is not emitting light, the imaging device 41a captures an image of the background light (specifically, external light) on the illuminated surface 51.
  • In step S22, the lighting device 10e irradiates the irradiated surface 51 with image light 61, and the imaging device 41a captures the image (an image combining background light and image light 61) projected onto the irradiated surface 51 by the lighting device 10e. That is, the imaging device 41a captures the image formed on the irradiated surface 51 by the image light 61.
  • the image formed on the irradiated surface 51 by the image light 61 is, for example, an image showing a symbol, character, or pattern.
  • In step S23, the control unit 10d acquires the images acquired by the imaging device 41a (an image of the background light and an image combining the background light and the image light 61), and performs image processing such as difference calculation, binarization, and boundary extraction (e.g., mask processing).
  • In step S24, the mask data (image area, background area, boundary area) for each pattern is stored in the memory unit 10a.
  • the patterns are, for example, the image light formation areas 3a2, 3b2, and 3c2 shown in FIG. 6.
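  • A possible implementation of steps S21 to S24, using OpenCV for the difference calculation, binarization, and boundary extraction, is sketched below. Otsu thresholding and dilation are one reasonable choice for the binarization and boundary steps; the patent does not specify particular algorithms, so these details are assumptions.

      import cv2
      import numpy as np

      def generate_mask_data(image_with_light, image_background, border_px=5):
          """Sketch of steps S21-S24: difference, binarization, boundary extraction.

          image_with_light : frame captured while the image light is projected (61A).
          image_background : frame captured with the image light off (61B).
          Returns boolean masks for the image area, background area, and separation area.
          """
          # Difference image (61C): where the projected pattern changed the scene.
          diff = cv2.absdiff(image_with_light, image_background)
          gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
          # Binarization (Otsu) to isolate the projected pattern (image area 61D).
          _, pattern = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          # Boundary extraction: dilate the pattern and treat the surrounding ring
          # as the separation area (61F); the rest is the background area (61E).
          kernel = np.ones((2 * border_px + 1, 2 * border_px + 1), np.uint8)
          dilated = cv2.dilate(pattern, kernel)
          image_area = pattern > 0
          separation_area = (dilated > 0) & ~image_area
          background_area = ~(dilated > 0)
          return image_area, background_area, separation_area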
  • FIGS. 10 to 15 are diagrams showing the steps of the masking process.
  • the illumination device 10e irradiates the illuminated surface 51 with image light 61, and the imaging device 41a captures an image 61A (FIG. 10).
  • the imaging device 41a captures an image 61B formed by background light (specifically, external light) on the illuminated surface 51 (FIG. 11). As shown in FIG. 12, a difference image 61C is calculated by taking the difference between the image 61A and the image 61B.
  • the area between the image area 61D (also referred to as the first image area) and the background area 61E (also referred to as the second image area) is extracted as shown in Fig. 14, and divided into three areas, the image area 61D, the background area 61E, and the separation area 61F (also referred to as the boundary area), as shown in Fig. 15.
  • the mask processing is completed.
  • binarization is not essential, and weighting may be applied to the boundary region in stages.
  • the masking method is not limited as long as the image region 61D, the background region 61E, and the separation region 61F are divided into three regions.
  • the area is divided into three regions in order to perform control with high accuracy; if lower accuracy is acceptable, it is also possible to divide the image into only two areas, the image area and the background area. This has the advantage of making control easier.
  • control unit 10d performs mask processing using images acquired in advance (e.g., an image 61A including a pattern and an image 61B formed by background light), and controls the light intensity of the light emitted from the illumination device 10e using data obtained by the mask processing. Specifically, the control unit 10d compares the light intensity of a first image region formed by an image region 61D with the light intensity of a background region 61E that forms a second image region different from the first image region around the first image region, and controls the light intensity of the image light 61 irradiated from the illumination device 10e so that the light intensity of the image region 61D is greater than the light intensity of the background region 61E.
  • FIG. 16 is a flowchart showing an example of a process in the control unit 10d according to the second embodiment. The process shown in FIG. 16 is performed, for example, while the mobile robot 11 is moving. In step S25, the control unit 10d acquires the image formed on the illuminated surface 51.
  • In step S26, the control unit 10d separates the acquired image into three regions, an image region 61D, a background region 61E, and a separation region 61F, using mask data corresponding to the pattern of the image light 61.
  • the separation region 61F is provided to avoid the effects of the imaging performance of the image region 61D and image blurring caused by vibrations of the lighting device 10e. This makes it possible to clearly separate the image region 61D and the background region 61E.
  • In step S27, the control unit 10d calculates the light intensity of the image area 61D and the light intensity of the background area 61E using the data of the image area 61D, the data of the background area 61E, and the data acquired by the imaging device 41a.
  • In step S28, the control unit 10d controls the illumination device 10e so that the light intensity of the image region 61D forming the first image region is greater than the light intensity of the background region 61E forming the second image region.
  • the control unit 10d controls the light intensity of the image light 61 irradiated from the illumination device 10e so that the relationship between the light intensity of the image region 61D and the light intensity of the background region 61E approaches a preset target value.
  • the control unit 10d controls the light intensity of the image light 61 so that the light intensity of the image region 61D is more than twice the light intensity of the background region 61E.
  • step S25 to step S28 may be repeated.
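  • One way to realize the repeated steps S25 to S28 is a simple proportional feedback loop over the measured contrast ratio, as sketched below; the masks are assumed to come from a mask-generation step such as the one above, and the gain and output limits are illustrative values, not figures from the patent.

      import numpy as np

      def control_step(frame_gray, image_area, background_area, current_output,
                       target_ratio=2.0, gain=0.2, output_min=0.05, output_max=1.0):
          """One iteration of steps S25-S28: compare the mean intensity of the image
          area with that of the background area and adjust the drive level of the
          lighting device so the image area stays at least target_ratio times brighter."""
          image_level = float(np.mean(frame_gray[image_area]))
          background_level = float(np.mean(frame_gray[background_area]))
          ratio = image_level / max(background_level, 1e-6)
          # Proportional correction toward the target contrast ratio.
          correction = gain * (target_ratio - ratio)
          new_output = float(np.clip(current_output * (1.0 + correction),
                                     output_min, output_max))
          return new_output, ratio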
  • the area of the background region 61E is 0.5 times or more the area of the image region 61D irradiated on the irradiated surface 51. This makes it possible to narrow the calculation area and increase the calculation speed. As a result, the contrast can be increased and the image light 61 can be emphasized.
  • control unit 10d compares the light intensity of the first image area formed by the image area 61D with the light intensity of the background area 61E that forms a second image area different from the first image area around the first image area, and controls the light intensity of the image light 61 irradiated from the illumination device 10e so that the light intensity of the image area 61D is greater than the light intensity of the background area 61E.
  • By separating the image into three areas, namely the image area 61D, the background area 61E, and the separation area 61F, and excluding the separation area 61F from the calculation, the image light 61 can be emphasized, and the visibility of the first image for the user 21 can be improved.
  • the control unit 10d preferably controls the lighting device 10e so that the light intensity of the image area 61D forming the first image area is greater than the light intensity of the background area 61E forming the second image area. This can further increase the visibility of the first image area for the user 21.
  • control unit 10d controls the light intensity of the image light 61 so that the light intensity of the image region 61D is at least twice the light intensity of the background region 61E. This can further increase the visibility of the first image region for the user 21.
  • masking is performed using images acquired in advance (e.g., image 61A including a pattern and image 61B formed by background light), but image processing may also be performed in real time.
  • an imaging device 41a mounted on the mobile robot 11 or an imaging device installed on a ceiling or wall surface of a room where the mobile robot 11 moves may acquire a first image area when image light is irradiated, and acquire a second image area formed around the first image area when image light is not irradiated, and the control unit 10d may perform image processing so that a separation area is formed between the first image area and the second image area, separating the first image area from the second image area. This makes it possible to calculate the contrast ratio with high accuracy.
  • This separation function is not limited to the configuration shown in FIG. 1. It may be applied to a system that does not include a storage unit, an identification unit, and an estimation unit, and may be applied to a system that does not use oblique projection. There are no particular limitations on the system to which it is applied; it can be used wherever a contrast ratio needs to be calculated with high accuracy. In that case, the imaging device is not limited to the ones described above, and it is sufficient if it can input a captured image of the target area to the system.
  • Embodiment 3. In the third embodiment, the data stored in the storage unit 10a is different from that in the first embodiment. Other configurations in the third embodiment are the same as those in the first embodiment.
  • an indoor information database is constructed regarding information about the indoor area in which the mobile robot 11 moves.
  • the indoor information database is composed of, for example, a movable range map 81, a floor surface map 82, a lighting map 83, and an external light map 84.
  • the control unit 10d uses the indoor information database to appropriately control the image light (specifically, the light intensity and wavelength of the reflected light from the irradiated surface 51 at the position of the user's 21 eyes).
  • FIG. 16 is a flowchart showing an example of a process for controlling image light in the third embodiment.
  • FIG. 17 is a diagram showing an example of the movable range map 81.
  • the movable range map 81 is data showing the movable range of the mobile robot 11.
  • the movable range map 81 is data showing the movable range of the mobile robot 11, taking into consideration indoor conditions such as the indoor floor surface, obstacles, etc.
  • the movable range map 81 is classified into a movable area 81a and an unmovable area 81b, and is composed of data showing these areas.
  • FIG. 18 is a diagram showing an example of the floor surface map 82.
  • the floor surface map 82 is data indicating the optical characteristics of the floor surface corresponding to the movable area 81a or the type of material of the floor surface (i.e., the irradiated surface 51) corresponding to the movable area 81a. Therefore, the floor surface map 82 is associated with the movable range map 81.
  • data "C1” indicates a carpet (first carpet)
  • data "C2” indicates another carpet (a second carpet different from the first carpet)
  • data "PT” indicates a P tile.
  • the optical characteristics of these data “C1”, “C2”, and “PT” are stored in advance in the memory unit 10a.
  • the optical characteristics of these data "C1", “C2”, and “PT” are identified by the identification unit 10b, for example, based on the position information or movement path information of the mobile robot 11.
  • the movement path information of the mobile robot 11 is, for example, information indicating a movement path set to travel through a specific location during a specific time period, and is stored in advance in the memory unit 10a. Note that the movement path information of the mobile robot 11 may also be information indicating a movement path set arbitrarily before operation.
  • FIG. 19 is a diagram showing an example of the illumination map 83.
  • the illumination map 83 is data indicating the positions (e.g., the position of the ceiling) of the lighting fixtures corresponding to the movable area 81a. Therefore, the illumination map 83 is associated with the movable range map 81.
  • the illumination map 83 indicating the type of lighting fixture for each movable area 81a is stored in the storage unit 10a.
  • data "FL” indicates a fluorescent lamp (daylight white)
  • data "DL” indicates a downlight (warm white).
  • Data such as the type of lighting fixture and light source color, such as the data "FL” and "DL”, are pre-stored in the memory unit 10a.
  • the light intensity at the eye position of the user 21 may be estimated using the illuminance on the floor surface, the positions of the lighting fixtures, and the characteristics of the irradiated surface 51 that are stored in advance in the memory unit 10a. This eliminates the need to estimate the effect of background light using the imaging device 41a mounted on the mobile robot 11 or the imaging device 41b installed indoors.
  • the color distribution of reflected light on the floor surface corresponding to the movable range map 81 may be stored in the storage unit 10a from multiple viewpoints.
  • the color distribution of reflected light may be stored in the storage unit 10a taking into account the eye position and line of sight of the user 21.
  • the chromaticity or color information of the floor surface may be stored in advance in the memory unit 10a, and the color at the eye position of the user 21 may be estimated using the positions of the lighting fixtures in the illumination map 83, the light source color, and the characteristics of the irradiated surface 51.
  • FIG. 20 is a diagram showing an example of the external light map 84.
  • the external light map 84 is data showing the intensity distribution of reflected light in the movable area 81a where external light enters. Therefore, the external light map 84 is associated with the movable range map 81.
  • the intensity distribution of reflected light corresponds to, for example, a value estimated from the solar altitude in the area where external light enters, according to conditions such as weather information and the season.
  • the intensity distribution of reflected light may be stored in the storage unit 10a, for example, taking into consideration the position of the eyes of the user 21 and the line of sight of the user 21.
  • the light intensity at the eye position of the user 21 may be estimated using the illuminance on the floor surface, the incident angle of external light, and the characteristics of the irradiated surface 51 that are pre-stored in the memory unit 10a.
  • the color distribution of reflected light on the floor surface corresponding to the movable range map 81 may be stored in the storage unit 10a from multiple viewpoints.
  • the color distribution of reflected light may be stored in the storage unit 10a taking into account the eye position and line of sight of the user 21.
  • the chromaticity or color information of the floor surface may be stored in advance in the memory unit 10a, and the color at the eye position of the user 21 may be estimated using the positions of the lighting fixtures in the illumination map 83, the light source color, and the characteristics of the irradiated surface 51.
  • the light intensity of the area indicated by data "N1" and the light intensity of the area indicated by data "N2" when external light enters the room from the upper side of FIG. 20 are stored in advance in the storage unit 10a.
  • the open/closed state of the blinds may also be stored in advance in the storage unit 10a. This makes it possible to estimate the effect of external light using the open/closed information of the blinds.
  • the movable range map 81, the floor surface map 82, the illumination map 83, and the external light map 84 are stored in advance in the storage unit 10a. Therefore, by using these data, it is possible to identify the characteristics of the irradiated surface 51 and estimate the light intensity and wavelength at the position of the user 21's eyes based on information such as the position and movement path of the mobile robot 11.
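  • As an illustration of how the four maps could be combined, the sketch below models the indoor information database as a grid of cells, each holding a floor material, a lighting fixture type, and an external light label; the numeric values and labels are placeholders, not values from the patent.

      # Hypothetical grid-based indoor information database combining the movable
      # range map 81, floor surface map 82, illumination map 83, and external light map 84.
      FLOOR_REFLECTANCE = {"C1": 0.15, "C2": 0.25, "PT": 0.55}    # illustrative values
      LIGHTING_ILLUMINANCE = {"FL": 300.0, "DL": 150.0}           # lux on the floor, illustrative
      EXTERNAL_ILLUMINANCE = {"N1": 120.0, "N2": 40.0}            # lux, illustrative

      indoor_db = {
          (3, 7): {"movable": True, "floor": "C1", "lighting": "FL", "external": "N1"},
          (4, 7): {"movable": True, "floor": "PT", "lighting": "DL", "external": "N2"},
      }

      def background_light_at(cell):
          """Estimate the background light reflected from the floor at a map cell,
          without using the on-board imaging device, as described in the text."""
          info = indoor_db[cell]
          ambient = LIGHTING_ILLUMINANCE.get(info["lighting"], 0.0)
          external = EXTERNAL_ILLUMINANCE.get(info.get("external"), 0.0)
          # Scale by the floor reflectance taken from the floor surface map.
          return (ambient + external) * FLOOR_REFLECTANCE[info["floor"]]

      print(background_light_at((3, 7)))  # (300 + 120) * 0.15 = 63.0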
  • Even without using the imaging device 41a, for example by estimating the position of the eyes of the user 21 using an image acquired by another imaging device installed indoors where the user 21 is present, it is possible to appropriately control the light intensity and wavelength of the image light (i.e., the light reflected on the irradiated surface 51 at the position of the eyes of the user 21). As a result, it is possible to improve visibility for the user 21.
  • the characteristics of the background light and the irradiated surface 51 acquired by the imaging device 41a may be stored in the memory unit 10a.
  • the data stored in the memory unit 10a may be data generated using artificial intelligence.
  • 1 light source 3 image light forming section, 4 projection optical section, 10a memory section, 10b identification section, 10c estimation section, 10d control section, 10e lighting device, 11 mobile robot, 21 user, 31b lighting optical system, 41a, 41b imaging device, 42 main control section, 42a processor, 42b memory, 51 irradiated surface, 100 lighting control system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

An illumination control system (100) comprises an illumination device (10e) that emits image light (61) diagonally with respect to an illuminated surface (51), a storage unit (10a) that stores optical data associating the light intensity and wavelength of the image light (61) reflected by the illuminated surface (51), an identification unit (10b) that identifies characteristics of the illuminated surface (51) illuminated by the illumination device (10e), an estimation unit (10c) that estimates the position of the eyes of a user (21), and a control unit (10d) that controls the light intensity or wavelength of the image light (61) emitted from the illumination device (10e) on the basis of the characteristics of the illuminated surface (51) and the result estimated by the estimation unit (10c).

Description

Lighting Control System
 This disclosure relates to a lighting control system.
 In recent years, with the spread of mobile robots, it has become important to use mobile robots to alert users in spaces where mobile robots and people (users) coexist. For example, when alerting users using a lighting device mounted on a mobile robot, there is an issue that the contrast of the light emitted by the lighting device decreases due to the influence of external light. For this reason, a method has been proposed in which the brightness or contrast of an image projected onto an illuminated surface by a video projection means mounted on the robot is adjusted (see, for example, Patent Document 1).
JP 2005-313291 A (paragraph 0011, FIG. 1)
 However, conventional technology does not take into account the position of the user's line of sight when the user observes the light that is projected obliquely onto the illuminated surface from the image projection means mounted on the mobile robot, making it difficult to illuminate the image in a way that enhances visibility for the user.
 The purpose of this disclosure is to solve the above problems and provide a lighting control system that improves visibility for users.
A lighting control system according to an embodiment of the present disclosure includes:
A mobile robot equipped with a lighting device that illuminates an illuminated surface in an oblique direction,
an illumination device including a light source, an image light forming unit that converts light emitted from the light source into image light, and an optical element that irradiates the image light emitted from the image light forming unit onto the illuminated surface;
a memory unit in which optical data is stored that associates characteristics of the irradiated surface, a specific position, and the light intensity and wavelength of the image light reflected by the irradiated surface detected at the specific position;
An identification unit that identifies the characteristics of the illuminated surface illuminated by the lighting device;
An estimation unit that estimates a position of a user's eyes;
and a control unit that controls the light intensity or wavelength of the image light emitted from the lighting device based on the characteristic of the illuminated surface and the result estimated by the estimation unit.
A lighting control system according to another aspect of the present disclosure includes:
An illumination device that illuminates a first image light indicating a symbol, a character, or a pattern onto an illuminated surface;
and a control unit.
The lighting control system acquires a first image, formed by the first image light and background light in a state where the first image light is irradiated, and a second image, formed by background light in a state where the first image light is not irradiated.
The control unit performs image processing so that a separation area separating the first image area and the second image area is formed between a first image area corresponding to the first image light and a second image area formed around the first image area.
This disclosure provides a lighting control system that improves visibility for users.
FIG. 1 is a block diagram illustrating a schematic configuration of a lighting control system according to a first embodiment.
FIG. 2 is a diagram illustrating an example of the operation of the lighting control system.
FIG. 3 is a diagram illustrating an example of a hardware configuration of a main control unit.
FIG. 4 is a diagram illustrating another example of the hardware configuration of the main control unit.
FIG. 5 is a cross-sectional view illustrating a schematic configuration of a lighting device.
FIG. 6 is a front view illustrating a schematic structure of an image light forming unit.
FIG. 7 is a diagram showing the incident angle of incident light from the lighting device and the exit angle of exiting light.
FIG. 8 is a diagram showing an example of the operation of the lighting control system in a third embodiment.
FIG. 9 is a flowchart illustrating an example of a method for generating mask data.
FIG. 10 is a diagram showing one step of mask processing.
FIGS. 11 to 14 are diagrams illustrating other steps of the mask processing.
FIG. 15 is a diagram showing an image obtained by the mask processing.
FIG. 16 is a flowchart showing an example of a process for controlling image light in the third embodiment.
FIG. 17 is a diagram illustrating an example of a movable range map.
FIG. 18 is a diagram showing an example of a floor surface map.
FIG. 19 is a diagram illustrating an example of an illumination map.
FIG. 20 is a diagram illustrating an example of an external light map.
<Setting coordinates>
In this embodiment, for ease of explanation, XYZ coordinates are used in the figures shown below. The +Y direction indicates the height direction, and the Z axis direction indicates the direction in which, for example, a mobile robot or a user moves. Clockwise rotation around the X axis is indicated by +RX, clockwise rotation around the Y axis is indicated by +RY, and clockwise rotation around the Z axis is indicated by +RZ.
Embodiment 1.
FIG. 1 is a block diagram illustrating a schematic configuration of a lighting control system 100 according to the first embodiment.
FIG. 2 is a diagram showing an example of the operation of the lighting control system 100.
The lighting control system 100 includes a memory unit 10a, an identification unit 10b, an estimation unit 10c, a control unit 10d, and a lighting device 10e. The lighting control system 100 is mounted on, for example, a mobile robot 11. The lighting control system 100 may further include an imaging device 41a.
<Storage unit 10a>
The storage unit 10a is, for example, a volatile memory or a non-volatile memory, and may be configured with both a volatile memory and a non-volatile memory.
 In the example shown in FIG. 2, a lighting device 10e and an imaging device 41a (also referred to as a first imaging device) are mounted on the mobile robot 11, and data captured by the imaging device 41a is stored in the memory unit 10a. This creates a database in the memory unit 10a.
 The storage unit 10a prestores the optical characteristics of the image light incident from various irradiated surfaces 51, which are acquired by the imaging device 41b. The imaging device 41b acquires data such as the amount of light (specifically, light intensity) and color (specifically, wavelength) of the image light reflected from the irradiated surface 51 and incident on the imaging device 41b.
 Data acquired by an imaging device 41b (also called a second imaging device) different from the imaging device 41a mounted on the mobile robot 11 may be stored in the storage unit 10a. The imaging device 41b is provided, for example, at a position corresponding to the eye height of the user 21 who is positioned opposite the mobile robot 11.
<Specifying unit 10b>
The identification unit 10b identifies characteristics of the irradiated surface 51 illuminated by the lighting device 10e. For example, the identification unit 10b identifies the characteristics of the irradiated surface 51 by using an image of the irradiated surface 51 acquired by the imaging device 41a mounted on the mobile robot 11.
 The identification unit 10b may identify the characteristics of the irradiated surface 51 using position information indicating the position of the irradiated surface 51, may identify the characteristics of the irradiated surface 51 using information input to the identification unit 10b from outside the lighting control system 100, or may identify the characteristics of the irradiated surface 51 using data that has been calculated based on data on multiple irradiated surfaces 51.
<Estimation unit 10c>
The estimation unit 10c estimates the position of the eyes of the user 21. The position of the eyes of the user 21 is, for example, the distance from the floor to the eyes of the user 21. The position of the eyes of the user 21 may be estimated using an image acquired by an imaging device 41a mounted on the mobile robot 11. In this case, the image acquired by the imaging device 41a is sent to the estimation unit 10c, and the estimation unit 10c estimates the position of the eyes of the user 21.
 The eye position of the user 21 may be estimated using an image captured by another imaging device installed indoors where the user 21 is present.
 The estimation unit 10c may estimate the light intensity and wavelength, at the position of the eyes of the user 21, of the image light reflected by the irradiated surface 51. The estimation unit 10c may estimate the light intensity and wavelength at the position of the eyes of the user 21 based on the emission angle of the image light emitted from the lighting device 10e, the height of the lighting device 10e, the light intensity of the image light, the wavelength of the image light, and the characteristics of the irradiated surface 51.
 Furthermore, a distance measuring device mounted on the mobile robot 11 may be used to measure the distance from the mobile robot 11 to the user 21, and the position of the eyes of the user 21 and the reflection angle of the image light reflected by the irradiated surface 51 may be estimated, thereby estimating the light intensity and wavelength at the position of the eyes of the user 21. According to this method, the light intensity and wavelength at the position of the eyes of the user 21 can be estimated with high accuracy.
<Control unit 10d>
The control unit 10d controls the light intensity or wavelength of the image light 61 emitted from the illumination device 10e based on the characteristics of the illuminated surface 51 and the result estimated by the estimation unit 10c.
 制御部10dは、画像光61のデータ及び背景光(外光)のデータに基づいて照明装置10eから出る画像光61の光強度又は波長を制御してもよい。 The control unit 10d may control the light intensity or wavelength of the image light 61 emitted from the lighting device 10e based on the data of the image light 61 and the data of the background light (external light).
 制御部10dは、ユーザ21(第1のユーザとも称する)とは異なるユーザ(第2のユーザとも称する)を検知した場合、照明装置10eからの照射を停止してもよい。例えば、移動型ロボット11とユーザ21の方向、図2ではZ軸方向とは無関係な方向、例えば±X軸方向から別のユーザが移動型ロボット11方向に近づくことを検知した際に、被照射面51への画像光の照射を消灯してもよい。これにより、画像光の表示対象であるユーザ21とは異なるユーザが画像光を認識することを避けることが可能となる。別のユーザの検知には屋内の撮像装置を用いてもよいし、移動型ロボット11に人感センサを設置してもよい。 When the control unit 10d detects a user (also referred to as a second user) other than the user 21 (also referred to as a first user), the control unit 10d may stop the irradiation of the image light from the lighting device 10e. For example, when it detects that another user is approaching the mobile robot 11 from a direction unrelated to the direction of the mobile robot 11 and the user 21, for example the Z-axis direction in FIG. 2, such as the ±X-axis direction, the control unit 10d may turn off the irradiation of the image light to the irradiated surface 51. This makes it possible to prevent a user other than the user 21, who is the target of the image light, from recognizing the image light. To detect the other user, an indoor imaging device may be used, or a human sensor may be installed on the mobile robot 11.
Because the identification unit 10b identifies the optical characteristics of the irradiated surface 51, the lighting device 10e may be kept on even when another user is detected approaching the mobile robot 11 from the ±X direction, provided that the estimation unit 10c can estimate that the other user (second user) cannot recognize the light of the image light 61 reflected in the ±X direction.
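A minimal sketch of this on/off decision is shown below; the detection flag, the estimated side-reflection intensity, and the perception threshold are hypothetical inputs supplied by the detection process, the estimation unit, and the designer, respectively.

def decide_projection(second_user_detected, estimated_side_intensity,
                      perception_threshold):
    # No second user: keep projecting the image light.
    if not second_user_detected:
        return "keep_on"
    # Second user present, but the reflection toward the +/-X direction is
    # estimated to be too weak to be recognized: keep projecting.
    if estimated_side_intensity < perception_threshold:
        return "keep_on"
    # Otherwise stop the irradiation from the lighting device.
    return "turn_off"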
FIG. 3 is a diagram showing an example of the hardware configuration of the main control unit 42.
FIG. 4 is a diagram showing another example of the hardware configuration of the main control unit 42.
The identification unit 10b, the estimation unit 10c, and the control unit 10d are configured, for example, by the main control unit 42. In this case, the main control unit 42 is configured, for example, by at least one processor 42a and at least one memory 42b. The storage unit 10a shown in FIG. 1 may be the memory 42b. The processor 42a is, for example, a CPU (Central Processing Unit) that executes a program stored in the memory 42b. In this case, the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by software, firmware, or a combination of software and firmware. The software and the firmware can be stored as programs in the memory 42b. With this configuration, the program for realizing the functions of the main control unit 42 is executed by a computer.
The memory 42b is a computer-readable recording medium, for example, a volatile memory such as a RAM (Random Access Memory), a non-volatile memory such as a ROM (Read Only Memory), or a combination of a volatile memory and a non-volatile memory.
The main control unit 42 may have a plurality of processors 42a and a plurality of memories 42b. In this case, the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by the plurality of processors 42a and the plurality of memories 42b.
The main control unit 42 may be configured with a processing circuit 42c as dedicated hardware, such as a single circuit or a composite circuit. The processing circuit 42c is, for example, a system LSI. In this case, the functions of the identification unit 10b, the estimation unit 10c, and the control unit 10d are realized by the processing circuit 42c.
<Lighting device 10e>
FIG. 5 is a cross-sectional view schematically showing the configuration of the lighting device 10e.
The lighting device 10e includes an illumination optical system 31b. The lighting device 10e may be, for example, a projection display device capable of displaying moving images, such as a projector. The illumination optical system 31b enables the lighting device 10e to be compact and energy-efficient.
<Illumination optical system 31b>
The illumination optical system 31b includes a light source 1, an image light forming unit 3, and a projection optical unit 4.
<Light source 1>
The light source 1 is a solid-state light source that emits light L1. The light source 1 is, for example, an LED (light-emitting diode). The light L1 emitted from the light-emitting surface of the light source 1 is incident on the image light forming unit 3 as diffused light. Note that the light L1 emitted from the light-emitting surface of the light source 1 may be directional light whose divergence angle in the ±RX directions is smaller than ±60 degrees (1/2 beam angle: 120 degrees).
The light source 1 may be, for example, a light source that irradiates a phosphor applied to a flat surface with excitation laser light having a central wavelength of 448 nm and emits a mixture of the laser light transmitted through the phosphor and the excited fluorescent light.
The light source 1 may be a light source that emits only fluorescent light.
The light source 1 may be composed of only a laser diode. In this case, the light-emitting surface of the light source 1 is small. Therefore, a quadrangular prism-shaped rod lens or a light pipe that enlarges the size of the light-emitting surface may be disposed between the light source 1 and the image light forming unit 3.
Similarly, when the area of the laser light incident on the phosphor is narrow, a rod lens or a light pipe may be disposed between the light source 1 and the image light forming unit 3.
Even in the case of an LED light source, a rod lens or a light pipe may be disposed between the light source 1 and the image light forming unit 3 in order to shape the image light and to make the light intensity uniform. In that case, the size of the exit surface of the rod lens or the light pipe is preferably larger than the image light forming areas 3a2, 3b2, and 3c2 shown in FIG. 6.
<Image light forming unit 3>
FIG. 6 is a front view schematically showing the structure of the image light forming unit 3.
The image light forming unit 3 is disposed between the light source 1 and the projection optical unit 4. The image light forming unit 3 converts the light L1 emitted from the light source 1 into image light and emits it toward the projection optical unit 4. The image light forming unit 3 is rotatably supported. The image light forming unit 3 rotates, for example, by a rotational driving force transmitted from a rotation drive unit (for example, a motor). The image light forming unit 3 rotates, for example, in the +RZ direction. The image light forming unit 3 may also rotate in the -RZ direction.
FIG. 6 shows the structure of the image light forming unit 3 as viewed in the +Z-axis direction. The rotation center axis Ra of the image light forming unit 3 is not located coaxially with the optical axis C1.
As shown in FIG. 6, the image light forming unit 3 has a plurality of segments divided in the +RZ direction (segment 3a, segment 3b, and segment 3c in FIG. 6). In the example shown in FIG. 6, the image light forming unit 3 is divided into three segments.
The three segments 3a, 3b, and 3c include image light forming areas 3a2, 3b2, and 3c2, respectively. As it rotates, the image light forming unit 3 can form image light that forms one of a plurality of types of images (three types in the example shown in FIG. 6).
Specifically, the light L1 that passes through the segment 3a, 3b, or 3c overlapping the optical axis C1 is converted into the image light 61. The image light forming unit 3 has light-shielding areas 3a1, 3b1, and 3c1, and the light L1 that reaches the light-shielding areas 3a1, 3b1, and 3c1 does not travel in the +Z direction.
When any one of the image light forming areas 3a2, 3b2, and 3c2 faces the optical axis C1, the light L1 passes through the corresponding image light forming area 3a2, 3b2, or 3c2, travels in the +Z direction, and reaches the projection optical unit 4.
In order to make the lighting device 10e compact, it is preferable to configure the projection optical unit 4 with a small number of optical elements (projection lenses). Considering the imaging performance of the projection optical unit 4 on the irradiated surface 51, the image formed by the image light 61 is preferably a simple pattern such as a character or a symbol; however, the image formed by the image light 61 is not limited to characters or symbols. Note that FIG. 6 shows a configuration in which the projection optical unit 4 has one optical element.
The shapes of the image light forming areas 3a2, 3b2, and 3c2 may be shapes that correct distortion (for example, trapezoidal distortion) caused by oblique projection.
Each of the central angles β1, β2, and β3 shown in FIG. 6 is, for example, 120 degrees. The central angles can be changed according to the number of segments. The central angles β1, β2, and β3 may be different from one another as long as the timing at which the image light forming areas 3a2, 3b2, and 3c2 pass the optical axis C1 can be detected. The steps of the rotation angle may be determined in advance.
<Projection optical unit 4>
The projection optical unit 4 is, for example, an optical element such as a projection lens. The projection optical unit 4 irradiates the image light emitted from the image light forming unit 3 onto the irradiated surface 51 in an oblique direction. The projection optical unit 4 forms, for example, the projection pattern of the image light 61 formed by the image light forming unit 3. The projection optical unit 4 has the function of forming an image of the image light 61 formed by the image light forming unit 3 on the irradiated surface 51.
The projection optical unit 4 may irradiate the irradiated surface 51 with a blurred projection pattern. The projection optical unit 4 may be configured with a reflecting mirror or a combination of a reflecting mirror and a lens.
<Database>
The optical data, which associates the characteristics of the irradiated surface 51, a specific position, and the light intensity and wavelength of the image light reflected by the irradiated surface 51 and detected at the specific position, is stored in the storage unit 10a. In this way, a database is constructed in the storage unit 10a.
In FIG. 2, when the conditions under which the lighting device 10e emits light onto the irradiated surface 51 (for example, the incident angle of the image light 61, the irradiation distance, and the installation height) are fixed, the light intensity and wavelength acquired by the imaging devices 41a and 41b may be stored in the storage unit 10a as optical data.
Height-direction data acquired by the imaging device 41b while changing the height of the imaging device 41b in the Y-axis direction may also be stored in the storage unit 10a. This makes it possible to take into account the height of the user 21 and the distance between the mobile robot 11 and the user 21. In this way, the light intensity and wavelength acquired by the pair of imaging devices 41a and 41b may be stored in the storage unit 10a.
Optical data may be created by appropriately changing conditions such as the conditions under which the lighting device 10e emits light onto the irradiated surface 51 (incident angle, irradiation distance, installation height, etc.) and the arrangement conditions of the imaging devices 41a and 41b.
FIG. 7 is a diagram showing the incident angle αi of the incident light Li (that is, the image light 61) from the lighting device 10e and the exit angle αo of the exit light Lo.
As shown in FIG. 7, in order to create a database that takes into account the emission conditions (incident angle, irradiation distance, installation height, etc.) of the lighting device 10e mounted on the mobile robot 11 and the position of the eyes of the user 21, the light intensity and the reflectance for each wavelength may be acquired as optical data for each exit angle αo (from -80 degrees to 80 degrees) of the exit light Lo with respect to the incident light Li at each incident angle αi (from -80 degrees to -5 degrees), for example.
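One possible way to organize such optical data is a table keyed by surface type, incident angle, and exit angle, as sketched below with 5-degree steps; measure() is a hypothetical measurement routine (for example, based on images from the imaging devices 41a and 41b) and is not taken from the publication.

import itertools

def build_optical_database(surface_types, measure):
    # Lookup of per-wavelength reflectance indexed by (surface, incident, exit).
    database = {}
    incident_angles = range(-80, 0, 5)   # -80 deg to -5 deg
    exit_angles = range(-80, 85, 5)      # -80 deg to +80 deg
    for surface, ai, ao in itertools.product(surface_types,
                                             incident_angles, exit_angles):
        # measure() returns, e.g., a dict {wavelength_nm: reflectance}.
        database[(surface, ai, ao)] = measure(surface, ai, ao)
    return database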
When acquiring the characteristics of the irradiated surface 51, the incident light Li does not necessarily have to be light from the lighting device 10e. When the characteristics of the irradiated surface 51 are identified based on data acquired by the imaging device 41a mounted on the mobile robot 11, it is desirable to acquire data for the exit angles αo from -80 degrees to 0 degrees, for example.
The characteristics of the irradiated surface 51 are, for example, reflectance, degree of scattering, glossiness, or color. The material of the irradiated surface 51 may be tile or cloth. When the material of the irradiated surface 51 is known, the characteristics of the irradiated surface 51 can be easily identified by storing a plurality of known materials in the storage unit 10a. That is, calculation processing based on the data acquired by the imaging device 41a becomes unnecessary, and the characteristics of the irradiated surface 51 can be easily identified.
By acquiring data on a reference surface (for example, a perfectly diffusing white surface), it is possible to exclude the influence of the light source color of the incident light Li (the white color depends on the light source, for example 3000 K or 5000 K).
The light intensity of each wavelength of the reflected light is the light intensity obtained by multiplying the light intensity of each wavelength of the light source 1 by the reflectance of the irradiated surface 51 at that wavelength. The light obtained by multiplying the spectral distribution of the light source 1 (the light intensity at each wavelength) by the reflectance of the irradiated surface 51 at each wavelength enters the eyes of the user 21. Therefore, by normalizing the light intensity at each wavelength with the data of the reference surface, the relative reflectance of the irradiated surface 51 at each wavelength can be calculated. If the reference surface has a reflectance of 100%, the absolute reflectance of the irradiated surface 51 at each wavelength can be calculated. Accordingly, by storing the reference-surface data of the lighting device 10e in the storage unit 10a, a database created with a different light source color can be converted into a database for the lighting device 10e. In other words, by normalizing the reference-surface data of the lighting device 10e with the reference-surface data of a different light source color, a correction coefficient of the reflectance for each wavelength can be calculated.
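The normalization described above can be sketched as follows, assuming per-wavelength NumPy arrays: dividing a surface reading by the reference-surface reading removes the source color, and the ratio of two reference readings gives the per-wavelength correction coefficients. The function names are illustrative only.

import numpy as np

def relative_reflectance(surface_reading, reference_reading):
    # Per-wavelength reflectance of the irradiated surface relative to the
    # reference surface (absolute if the reference reflects 100%).
    return (np.asarray(surface_reading, dtype=float) /
            np.asarray(reference_reading, dtype=float))

def correction_coefficients(reference_under_lamp, reference_under_other_source):
    # Coefficients that convert a database built with a different light
    # source color into one valid for the lighting device 10e.
    return (np.asarray(reference_under_lamp, dtype=float) /
            np.asarray(reference_under_other_source, dtype=float))

def light_reaching_eye(source_spectrum, surface_reflectance):
    # Product of the source spectral distribution and the per-wavelength
    # reflectance of the irradiated surface.
    return (np.asarray(source_spectrum, dtype=float) *
            np.asarray(surface_reflectance, dtype=float))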
<Advantages of the first embodiment>
According to the first embodiment, the control unit 10d controls the light intensity or wavelength of the image light 61 emitted from the lighting device 10e based on the characteristics of the irradiated surface 51 and the result estimated by the estimation unit 10c. This makes it possible to provide a lighting control system 100 that improves visibility for the user.
Embodiment 2.
In the second embodiment, the data stored in the storage unit 10a and the control unit 10d differ from those in the first embodiment. The other configurations in the second embodiment are the same as those in the first embodiment.
<Database>
In addition to the data described in the first embodiment, the storage unit 10a may store information on the shape of the image light formed as a symbol, a character, or a simple pattern (including data such as mask data for each pattern).
<Creation of mask data>
FIG. 8 is a diagram showing an example of the operation of the lighting control system 100 in the second embodiment.
FIG. 9 is a flowchart showing an example of a method for creating mask data.
In the second embodiment, mask data of the image light 61 irradiated from the lighting device 10e is created, for example, when the lighting device 10e is started up.
In step S21, while the lighting device 10e is not emitting light, the imaging device 41a acquires an image of the background light (specifically, external light) on the irradiated surface 51.
In step S22, the lighting device 10e irradiates the irradiated surface 51 with the image light 61, and the imaging device 41a acquires the image projected onto the irradiated surface 51 by the lighting device 10e (an image combining the background light and the image light 61). That is, the imaging device 41a acquires the image formed on the irradiated surface 51 by the image light 61. In this case, the image formed on the irradiated surface 51 by the image light 61 is, for example, an image showing a symbol, a character, or a pattern.
In step S23, the control unit 10d acquires the images acquired by the imaging device 41a (the image of the background light and the image combining the background light and the image light 61) and performs image processing (for example, mask processing) such as difference calculation, binarization, and boundary extraction.
In step S24, the mask data (image area, background area, and boundary area) for each pattern is stored in the storage unit 10a. The patterns correspond, for example, to the image light forming areas 3a2, 3b2, and 3c2 shown in FIG. 6.
An example of the mask processing will be described.
FIGS. 10 to 15 are diagrams showing the steps of the mask processing.
When the difference calculation is performed, the lighting device 10e irradiates the irradiated surface 51 with the image light 61, and the imaging device 41a acquires an image 61A (FIG. 10).
Next, while the lighting device 10e is not irradiating the image light 61, the imaging device 41a acquires an image 61B formed by the background light (specifically, external light) on the irradiated surface 51 (FIG. 11). As shown in FIG. 12, a difference image 61C is calculated by taking the difference between the image 61A and the image 61B.
After the difference image 61C is binarized as shown in FIG. 13, the area between an image area 61D (also referred to as a first image area) and a background area 61E (also referred to as a second image area) is extracted as shown in FIG. 14, and the image is divided into three areas, namely the image area 61D, the background area 61E, and a separation area 61F (also referred to as a boundary area), as shown in FIG. 15. With the above processing, the mask processing is completed.
Note that binarization is not essential for creating the mask; the boundary area may instead be weighted in stages. In other words, the masking method is not limited as long as the image is divided into the three areas of the image area 61D, the background area 61E, and the separation area 61F.
The image is divided into three areas in order to perform the control with high accuracy; however, if the accuracy is acceptable, the image may be divided into only two areas, namely the image area and the background area. This has the advantage of making the control easier.
<Control unit 10d>
In the second embodiment, the control unit 10d performs mask processing using images acquired in advance (for example, the image 61A including the pattern and the image 61B formed by the background light), and controls the light intensity of the light emitted from the lighting device 10e using the data obtained by the mask processing. Specifically, the control unit 10d compares the light intensity of the first image area formed by the image area 61D with the light intensity of the background area 61E that forms, around the first image area, a second image area different from the first image area, and controls the light intensity of the image light 61 irradiated from the lighting device 10e so that the light intensity of the image area 61D is greater than the light intensity of the background area 61E.
FIG. 16 is a flowchart showing an example of the processing in the control unit 10d according to the second embodiment.
The processing shown in FIG. 16 is performed, for example, while the mobile robot 11 is traveling.
In step S25, the control unit 10d acquires the image formed on the irradiated surface 51.
In step S26, the control unit 10d separates the acquired image into the three areas of the image area 61D, the background area 61E, and the separation area 61F using the mask data corresponding to the pattern of the image light 61. The separation area 61F is provided to avoid the effects of the imaging performance for the image area 61D and of image blur caused by vibration of the lighting device 10e. This makes it possible to clearly separate the image area 61D and the background area 61E.
In step S27, the control unit 10d calculates the light intensity of the image area 61D and the light intensity of the background area 61E using the data of the image area 61D, the data of the background area 61E, and the data acquired by the imaging device 41a.
In step S28, the control unit 10d controls the lighting device 10e so that the light intensity of the image area 61D forming the first image area becomes greater than the light intensity of the background area 61E forming the second image area. For example, the control unit 10d controls the light intensity of the image light 61 irradiated from the lighting device 10e so that the relationship between the light intensity of the image area 61D and the light intensity of the background area 61E approaches a preset target value. When the target value is set so that the light intensity of the image area 61D is at least twice the light intensity of the background area 61E, the control unit 10d controls the light intensity of the image light 61 so that the light intensity of the image area 61D becomes at least twice the light intensity of the background area 61E.
The processing from step S25 to step S28 may be repeated.
The area of the background area 61E is preferably 0.5 times or more the area of the image area 61D irradiated onto the irradiated surface 51. This makes it possible to limit the calculation area and to increase the calculation speed. As a result, the contrast can be increased and the image light 61 can be emphasized.
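Steps S25 to S28 can be sketched as the feedback loop below; grab_frame and set_lamp_power are hypothetical interfaces to the imaging device and the lighting device, and the masks are the ones produced by the earlier mask-processing sketch.

import numpy as np

def control_contrast(grab_frame, set_lamp_power, image_area, background_area,
                     target_ratio=2.0, power=0.5, step=0.05):
    # Raise the lamp output until the image area is at least target_ratio
    # times brighter than the surrounding background area.
    for _ in range(20):                              # bounded iteration count
        frame = grab_frame().astype(np.float64)
        image_level = frame[image_area].mean()
        background_level = frame[background_area].mean()
        if background_level > 0 and image_level / background_level >= target_ratio:
            break                                    # contrast target reached
        power = min(1.0, power + step)
        set_lamp_power(power)
    return power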
As described above, according to the second embodiment, the control unit 10d compares the light intensity of the first image area formed by the image area 61D with the light intensity of the background area 61E that forms, around the first image area, a second image area different from the first image area, and controls the light intensity of the image light 61 irradiated from the lighting device 10e so that the light intensity of the image area 61D is greater than the light intensity of the background area 61E.
For example, by dividing the image into the three areas of the image area 61D, the background area 61E, and the separation area 61F and excluding the separation area 61F from the calculation, the contrast ratio can be calculated with high accuracy. As a result, the image light 61 can be emphasized, and the visibility of the first image for the user 21 can be improved. In addition, this suppresses a decrease in the accuracy of the calculated contrast ratio caused by the imaging performance for the image area 61D, image blur due to vibration of the lighting device 10e, and the like.
The control unit 10d preferably controls the lighting device 10e so that the light intensity of the image area 61D forming the first image area is greater than the light intensity of the background area 61E forming the second image area. This can further improve the visibility of the first image area for the user 21.
For example, the control unit 10d preferably controls the light intensity of the image light 61 so that the light intensity of the image area 61D is at least twice the light intensity of the background area 61E. This can further improve the visibility of the first image area for the user 21.
In this embodiment, in order to increase the control speed, the mask processing is performed using images acquired in advance (for example, the image 61A including the pattern and the image 61B formed by the background light); however, the image processing may also be performed in real time.
For example, the imaging device 41a mounted on the mobile robot 11, or an imaging device installed on the ceiling or a wall surface of the indoor space in which the mobile robot 11 moves, may acquire the first image area in a state where the image light is irradiated and the second image area formed around the first image area in a state where the image light is not irradiated, and the control unit 10d may perform image processing so that a separation area separating the first image area and the second image area is formed between the first image area and the second image area.
This makes it possible to calculate the contrast ratio with high accuracy.
Note that this separation function is not limited to the configuration shown in FIG. 1. It may be applied to a system that does not include the storage unit, the identification unit, or the estimation unit, and may also be applied to cases other than oblique projection. The system to which it is applied is not particularly limited as long as the purpose is to calculate the contrast ratio with high accuracy. In that case, the imaging device is not limited to those described above; it is sufficient that a captured image of the target area can be input to the system.
Embodiment 3.
In the third embodiment, the data stored in the storage unit 10a differ from those in the first embodiment. The other configurations in the third embodiment are the same as those in the first embodiment.
In the storage unit 10a, an indoor information database is constructed concerning the indoor space in which the mobile robot 11 moves. The indoor information database is composed of, for example, a movable range map 81, a floor surface map 82, an illumination map 83, and an external light map 84. The control unit 10d uses the indoor information database to appropriately control the image light (specifically, the light intensity and wavelength, at the position of the eyes of the user 21, of the light reflected by the irradiated surface 51).
FIG. 16 is a flowchart showing an example of a process for controlling the image light in the third embodiment.
<Movable range map 81>
FIG. 17 is a diagram showing an example of the movable range map 81.
The movable range map 81 is data showing the range in which the mobile robot 11 can move. Specifically, the movable range map 81 is data showing the range in which the mobile robot 11 can move, taking into consideration indoor conditions such as the indoor floor surface and obstacles. The movable range map 81 is classified into a movable area 81a and an unmovable area 81b, and is composed of data indicating these areas.
<Floor surface map 82>
FIG. 18 is a diagram showing an example of the floor surface map 82.
The floor surface map 82 is data indicating the optical characteristics of the floor surface corresponding to the movable area 81a or the type of material of the floor surface (that is, the irradiated surface 51) corresponding to the movable area 81a. Therefore, the floor surface map 82 is associated with the movable range map 81.
In the example shown in FIG. 18, the data "C1" indicates a carpet (first carpet), the data "C2" indicates another carpet (a second carpet different from the first carpet), and the data "PT" indicates a P tile. The optical characteristics of the data "C1", "C2", and "PT" are stored in advance in the storage unit 10a. The optical characteristics of the data "C1", "C2", and "PT" are identified by the identification unit 10b based on, for example, the position information or movement path information of the mobile robot 11. The movement path information of the mobile robot 11 is, for example, information indicating a movement path set so that the robot travels through a specific location during a specific time period, and is stored in advance in the storage unit 10a. Note that the movement path information of the mobile robot 11 may also be information indicating a movement path set arbitrarily before operation.
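As an illustration of how the floor surface map can be tied to the movable range map, the sketch below maps per-cell material codes ("C1", "C2", "PT") to stored optical characteristics; the grid resolution and the property values are assumptions made for the example, not values from the publication.

FLOOR_PROPERTIES = {
    "C1": {"reflectance": 0.15, "gloss": 0.05},   # first carpet (assumed values)
    "C2": {"reflectance": 0.25, "gloss": 0.05},   # second carpet (assumed values)
    "PT": {"reflectance": 0.55, "gloss": 0.40},   # P tile (assumed values)
}

def floor_properties_at(x_m, y_m, floor_map, cell_size_m=0.5):
    # floor_map: 2-D list of material codes aligned with the movable range map.
    col = int(x_m / cell_size_m)
    row = int(y_m / cell_size_m)
    material = floor_map[row][col]
    return FLOOR_PROPERTIES[material]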
<Illumination map 83>
FIG. 19 is a diagram showing an example of the illumination map 83.
The illumination map 83 is data indicating the positions of the lighting fixtures (for example, their positions on the ceiling) corresponding to the movable area 81a. Therefore, the illumination map 83 is associated with the movable range map 81. For example, the illumination map 83 indicating the type of lighting fixture for each movable area 81a is stored in the storage unit 10a.
In the example shown in FIG. 19, the data "FL" indicates a fluorescent lamp (daylight white), and the data "DL" indicates a downlight (warm white). As with the data "FL" and "DL", data such as the type of lighting fixture and the light source color are stored in advance in the storage unit 10a.
The light intensity at the position of the eyes of the user 21 may be estimated using the illuminance on the floor surface stored in advance in the storage unit 10a, the positions of the lighting fixtures, and the characteristics of the irradiated surface 51. This eliminates the need to estimate the influence of the background light using the imaging device 41a mounted on the mobile robot 11 or the imaging device 41b installed indoors.
As with the light intensity, the color distribution of the light reflected by the floor surface corresponding to the movable range map 81 may be stored in the storage unit 10a for a plurality of viewpoints. For example, the color distribution of the reflected light may be stored in the storage unit 10a in consideration of the position of the eyes of the user 21 and the line of sight of the user 21.
The chromaticity or color information of the floor surface (that is, the irradiated surface 51) may be stored in advance in the storage unit 10a, and the color at the position of the eyes of the user 21 may be estimated using the positions of the lighting fixtures in the illumination map 83, the light source color, and the characteristics of the irradiated surface 51.
<External light map 84>
FIG. 20 is a diagram showing an example of the external light map 84.
The external light map 84 is data showing the intensity distribution of reflected light in the movable areas 81a into which external light enters. Therefore, the external light map 84 is associated with the movable range map 81. The intensity distribution of the reflected light corresponds, for example, to an estimate of the solar altitude in the area into which external light enters, according to conditions such as weather information and the season. The intensity distribution of the reflected light may be stored in the storage unit 10a in consideration of, for example, the position of the eyes of the user 21 and the line of sight of the user 21.
The light intensity at the position of the eyes of the user 21 may be estimated using the illuminance on the floor surface stored in advance in the storage unit 10a, the incident angle of the external light, and the characteristics of the irradiated surface 51.
As with the light intensity, the color distribution of the light reflected by the floor surface corresponding to the movable range map 81 may be stored in the storage unit 10a for a plurality of viewpoints. For example, the color distribution of the reflected light may be stored in the storage unit 10a in consideration of the position of the eyes of the user 21 and the line of sight of the user 21.
The chromaticity or color information of the floor surface (that is, the irradiated surface 51) may be stored in advance in the storage unit 10a, and the color at the position of the eyes of the user 21 may be estimated using the positions of the lighting fixtures in the illumination map 83, the light source color, and the characteristics of the irradiated surface 51.
In the example shown in FIG. 20, the light intensity of the area indicated by the data "N1" and the light intensity of the area indicated by the data "N2" when external light enters the indoor space from the upper side of FIG. 20 (for example, the north side) are stored in advance in the storage unit 10a. The open or closed state of the blinds may also be stored in advance in the storage unit 10a. This makes it possible to estimate the influence of the external light using the open/closed information of the blinds.
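A sketch of combining the illumination map and the external light map into a single background-light estimate for the robot's current cell is given below; the data layout and the blind attenuation factor are assumptions made for the example.

def estimate_background_light(cell, illumination_map, external_light_map,
                              blinds_closed, blind_attenuation=0.1):
    # illumination_map[cell]: illuminance contributed by indoor fixtures.
    # external_light_map[cell]: illuminance contributed by daylight (e.g. N1, N2).
    indoor = illumination_map.get(cell, 0.0)
    outdoor = external_light_map.get(cell, 0.0)
    if blinds_closed:
        outdoor *= blind_attenuation   # closed blinds strongly reduce daylight
    return indoor + outdoor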
As described above, according to the third embodiment, the movable range map 81, the floor surface map 82, the illumination map 83, and the external light map 84 are stored in advance in the storage unit 10a. Therefore, by using these data, the characteristics of the irradiated surface 51 can be identified and the light intensity and wavelength at the position of the eyes of the user 21 can be estimated based on information such as the position and the movement path of the mobile robot 11. That is, without using the imaging device 41a, the light intensity and wavelength of the image light (that is, of the light reflected by the irradiated surface 51 at the position of the eyes of the user 21) can be appropriately controlled by, for example, estimating the position of the eyes of the user 21 using an image acquired by another imaging device installed in the indoor space where the user 21 is present. As a result, the visibility for the user 21 can be improved.
While the mobile robot 11 is traveling in the movable area 81a of the movable range map 81, the background light acquired by the imaging device 41a and the characteristics of the irradiated surface 51 may be stored in the storage unit 10a.
The data stored in the storage unit 10a may be data generated using artificial intelligence.
The features of the embodiments described above can be combined with one another.
1 light source, 3 image light forming unit, 4 projection optical unit, 10a storage unit, 10b identification unit, 10c estimation unit, 10d control unit, 10e lighting device, 11 mobile robot, 21 user, 31b illumination optical system, 41a, 41b imaging device, 42 main control unit, 42a processor, 42b memory, 51 irradiated surface, 100 lighting control system.

Claims (8)

1. A lighting control system for a mobile robot equipped with a lighting device that illuminates an illuminated surface in an oblique direction, the lighting control system comprising:
a lighting device including a light source, an image light forming unit that converts light emitted from the light source into image light, and an optical element that irradiates the image light emitted from the image light forming unit onto the illuminated surface;
a memory unit in which optical data is stored, the optical data associating a characteristic of the illuminated surface, a specific position, and the light intensity and wavelength of the image light reflected by the illuminated surface and detected at the specific position;
an identification unit that identifies the characteristic of the illuminated surface illuminated by the lighting device;
an estimation unit that estimates a position of a user's eyes; and
a control unit that controls the light intensity or wavelength of the image light emitted from the lighting device based on the characteristic of the illuminated surface and a result estimated by the estimation unit.
2. The lighting control system according to claim 1, further comprising an imaging device,
wherein the estimation unit estimates the position of the user's eyes using an image acquired by the imaging device.
3. The lighting control system according to claim 1 or 2,
wherein the image light is a first image light indicating a symbol, a character, or a pattern and is irradiated onto the illuminated surface,
wherein the control unit acquires a first image formed by the first image light and background light in a state where the image light is irradiated, and a second image formed by the background light in a state where the image light is not irradiated, and
wherein the control unit performs image processing so that a separation area separating a first image area and a second image area is formed between the first image area, which corresponds to the first image light, and the second image area, which is formed around the first image area.
4. The lighting control system according to claim 3, wherein the control unit controls the lighting device so that the light intensity of the image light forming the first image is greater than the light intensity of the background light forming the second image.
5. The lighting control system according to claim 3 or 4, wherein the control unit controls the lighting device so that the light intensity of the image light forming the first image is at least twice the light intensity of the background light forming the second image.
6. The lighting control system according to any one of claims 1 to 5, wherein the control unit stops the irradiation from the lighting device when it detects a user different from the user.
7. The lighting control system according to any one of claims 1 to 6, wherein an indoor information database relating to indoor information is constructed in the memory unit, and the control unit controls the image light using the indoor information database.
8. A lighting control system comprising:
an illumination device that irradiates a first image light indicating a symbol, a character, or a pattern onto an illuminated surface; and
a control unit,
wherein the control unit acquires a first image formed by the first image light and background light in a state where the first image light is irradiated, and a second image formed by the background light in a state where the first image light is not irradiated, and
the control unit performs image processing so that a separation area separating a first image area and a second image area is formed between the first image area, which corresponds to the first image light, and the second image area, which is formed around the first image area.
PCT/JP2022/038855 2022-10-19 2022-10-19 Illumination control system WO2024084606A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2022/038855 WO2024084606A1 (en) 2022-10-19 2022-10-19 Illumination control system
JP2023511916A JP7475539B1 (en) 2022-10-19 2022-10-19 Lighting Control System
JP2024063099A JP2024074970A (en) 2022-10-19 2024-04-10 Lighting Control System

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/038855 WO2024084606A1 (en) 2022-10-19 2022-10-19 Illumination control system

Publications (1)

Publication Number Publication Date
WO2024084606A1 true WO2024084606A1 (en) 2024-04-25

Family

ID=90737074

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/038855 WO2024084606A1 (en) 2022-10-19 2022-10-19 Illumination control system

Country Status (2)

Country Link
JP (2) JP7475539B1 (en)
WO (1) WO2024084606A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005015466A1 (en) * 2003-08-07 2005-02-17 Matsushita Electric Industrial Co., Ltd. Life assisting system and its control program
JP2005313291A (en) * 2004-04-30 2005-11-10 Mitsubishi Heavy Ind Ltd Image display method linked with robot action, and device thereof
JP2011204145A (en) * 2010-03-26 2011-10-13 Sony Corp Moving device, moving method and program
JP2015149522A (en) * 2014-02-04 2015-08-20 シャープ株式会社 Projection control apparatus
JP2017170982A (en) * 2016-03-22 2017-09-28 日本電気株式会社 System for controlling unmanned flight device, method for controlling unmanned flight device and image projecting device
JP2018000383A (en) * 2016-06-29 2018-01-11 パナソニックIpマネジメント株式会社 Walking support robot and walking support method
JP2021157203A (en) * 2018-06-19 2021-10-07 ソニーグループ株式会社 Mobile control device, mobile control method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5465523B2 (en) 2009-01-29 2014-04-09 三洋電機株式会社 Stereoscopic image display system
JP7112903B2 (en) 2018-07-19 2022-08-04 シャープ株式会社 vacuum cleaner
CN114022543A (en) 2021-10-28 2022-02-08 北京乐驾科技有限公司 AR/VR glasses, system and method for measuring height, electronic device and storage medium


Also Published As

Publication number Publication date
JP2024074970A (en) 2024-05-31
JP7475539B1 (en) 2024-04-26
