EP1884798B1 - Method for measuring the distance to an object - Google Patents

Method for measuring the distance to an object

Info

Publication number
EP1884798B1
Authority
EP
European Patent Office
Prior art keywords
light receiving
section
information
data
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
EP07015216A
Other languages
German (de)
English (en)
Other versions
EP1884798A3 (fr)
EP1884798A2 (fr)
Inventor
Nobuo Iizuka
Current Assignee
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date
Filing date
Publication date
Priority claimed from JP2006211589A external-priority patent/JP4637066B2/ja
Priority claimed from JP2006261988A external-priority patent/JP4210955B2/ja
Priority claimed from JP2006262078A external-priority patent/JP4178419B2/ja
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of EP1884798A2 publication Critical patent/EP1884798A2/fr
Publication of EP1884798A3 publication Critical patent/EP1884798A3/fr
Application granted granted Critical
Publication of EP1884798B1 publication Critical patent/EP1884798B1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves

Definitions

  • the present invention relates to a method for measuring a distance to an object.
  • a distance measurement apparatus has been known as disclosed in Japanese Laid-Open Publication No. S63-266382 that measures a distance to an arbitrary target (measurement object) based on a difference between a time at which laser light is irradiated to the target and a time at which the reflected light is received.
  • a direction measurement apparatus can be structured that measures a direction of the target seen from a measurement point.
  • a position measurement apparatus is also known as disclosed in Japanese Laid-Open Publication No. 2005-77291 that measures, based on electric waves from a plurality of GPS satellites on a geosynchronous orbit, a position on earth (latitude, longitude, and altitude).
  • the distance measurement apparatus uses laser light that is hazardous to a human body (because laser light may damage optic nerves) and thus has a problem in safety.
  • the position measurement apparatus using GPS satellites cannot be used in a place to which an electric wave cannot reach (e.g., indoor).
  • Patent document US 2002/0188388 relates to a distance measuring system to be used in cars moving through traffic.
  • One of the two cars has modulating means controlling two lights, so as to transmit information associated with the travelling state of the car, such as vehicle speed, acceleration, etc. (cf. paragraphs [0036] and [0037]).
  • a second car includes imaging means so as to acquire light emitted by the first car and retrieve the information transmitted through the modulation of the light.
  • the second car can compute its own distance from the first car by triangulation, since the first car has two modulated lights placed a predetermined distance apart (cf. paragraph [0054]).
  • Patent document JP 2001245253 relates to an imaging device capable of storing a plurality of images taken in a time series and extracting information from the time-series luminance change of the images.
  • a distance to an object is measured without using laser light.
  • a distance to an object can be measured without causing a safety problem.
  • a distance to an object is measured without using an electric wave from a GPS satellite.
  • a distance to an object can be measured without being influenced by ambient environment.
  • an imaging apparatus, for example an electronic camera, according to the first embodiment of the present invention will be described with regard to its structure.
  • an imaging apparatus 10 includes a main control section 14 including a CPU11, a ROM 12, and a RAM 13 as well as various peripheral circuits (not shown) or the like, and at least the respective sections as described below that are appropriately arranged to surround the main control section 14 and that are required for the operation of the imaging apparatus 10.
  • the main control section 14 typically consists of a one-chip microprocessor.
  • An imaging section 15 is composed of an optical system 16 and an imaging device 17.
  • the optical system 16 includes a photographing lens, an aperture mechanism, a focusing mechanism, and a zoom mechanism for example.
  • the imaging device 17 consists of a two-dimensional image sensor such as a CCD sensor or a CMOS sensor.
  • An operation of an image processor 21 is controlled by an imaging controller 19.
  • the imaging controller 19 operates in accordance with photographing operation instructions from the main control section 14 and an automatic focus controller 18 including a step motor 181.
  • a photographing operation instruction from the main control section 14 may be, for example, a frame image reading operation instruction for reading a frame image at a predetermined frame rate (e.g., a few dozen to a few hundred frames per second) for checking a photographing picture composition (for a so-called through image) or for video photographing, a photographing operation instruction for photographing a high resolution still image, or a preliminary operation instruction for setting an aperture value or a zoom multiplication factor required for these operations.
  • a photographing operation instruction from the automatic focus controller 18 is an operation instruction for the focusing of an optical system 16.
  • In response to a photographing operation instruction from the main control section 14, the image processor 21 periodically reads, at the above frame rate, a frame image for checking a photographing composition or for video, or reads a high resolution frame image of a still image. These frame images are converted by the image processor 21 to digital signals, subjected to a predetermined image processing (e.g., gamma compensation processing), and subsequently inputted to the main control section 14 via a FIFO buffer 22.
  • the operation section 23 includes, for example, various controllers required for an input interface for operating the imaging apparatus 10 (e.g., a power source switch, a switch for switching between an image photographing mode and an image reproduction mode, a shutter button for performing still image and video photographing, a menu button for displaying various setting menus, and a selection button for selecting a menu item or for selecting an image to be reproduced in the image reproduction mode).
  • a display driver 24 converts various pieces of display data outputted from the main control section 14 (e.g., through image display data, menu screen display data, image reproduction screen display data) to have a predetermined display format and outputs converted data to a display section 25 constituted by a flat display device (e.g., liquid crystal display).
  • This display section 25 includes a touch panel 26.
  • the touch detector 27 detects a position coordinate where a contact between the touch panel 26 and a finger or a pen for example is detected, and outputs the detection result to the main control section 14.
  • An image memorization section 28 is constituted by a nonvolatile high-capacity memorization apparatus (e.g., flash memory, hard disk, or optical disk).
  • the term "nonvolatile" herein means that contents memorized in the apparatus are not lost even when the power source is turned OFF.
  • the image memorization section 28 is mainly used to accumulate and store images photographed by this imaging apparatus 10.
  • the respective accumulated and stored images are, for example, a compressed file based on a JPEG format or an uncompressed raw data file (a so-called RAW file).
  • a region in which these images are stored may be positioned just below the root of a file system or may be positioned at a folder in a single layer or a plurality of layers appropriately prepared just below the root.
  • This image memorization section 28 may be provided as a fixed component or may also be a general-purpose memory device that is detachable from the imaging apparatus 10 to be attached to a personal computer (not shown).
  • An external interface section 29 is a data input/output section corresponding to a general-purpose protocol (e.g., USB, IEEE1394) for example.
  • a photographed image can be optionally transferred to a personal computer (not shown) for example (e.g., an image accumulated and stored in the image memorization section 28 can be transferred to the personal computer) or can be read from a personal computer (e.g., the image can be read to the image memorization section 28 from the personal computer).
  • a power source section 30 includes a rechargeable secondary battery or a disposable primary battery and supplies a power source voltage required for the operations of the respective sections of the imaging apparatus 10 (e.g., main control section 14).
  • An orientation sensor 31 and an elevation angle sensor 32 both detect a photographing direction of this imaging apparatus 10 (direction of a light axis of an optical system 16).
  • the orientation sensor 31 detects the orientation assuming that a magnetic north is 0 degree.
  • the elevation angle sensor 32 detects an elevation angle (or a depression angle) based on an assumption that a horizontal direction is 0 degree.
  • the imaging apparatus 10 uses the imaging device 17 to receive an image including this light emitting object 33 in a time-series manner (or to photograph the image continuously) and demodulates data included in a luminance modulation region of the light emitting object 33 as a measurement target included in the image. Then, the imaging apparatus 10 calculates a distance D to the light emitting object 33 based on the received image and the demodulated data.
  • FIG. 2 shows the structure of the light emitting object 33.
  • the light emitting object 33 includes a light source 34 for emitting light in a visible light region; a data memory 35 for storing data to be transmitted; a luminous control section 36 for modulating data stored in this data memory 35 and controlling the luminance degree of the light source 34 according to modulation information; and a luminance window 37 having a predetermined shape and a predetermined size.
  • the data memory 35 includes a guide data memory 351 for retaining arbitrary information and a self size data memory 352 for retaining shape data of the luminance window 37 ("circular shape" in this case), size data of the luminance window 37 (diameter "R" of the circular shape), and position data of the light emitting object 33 (latitude, longitude, and altitude).
  • the luminous control section 36 includes a pattern data memory 361, a timing generator 362, and a control section 363.
  • the pattern data memory 361 retains two types of preamble data, namely preamble data for detection and acquisition (for measurement) and preamble data for detection and acquisition (for data body), as well as two different types of luminance change patterns (hereinafter referred to as the first pattern sequence SA and the second pattern sequence SB).
  • the timing generator 362 generates a stable clock signal having a predetermined cycle. This clock signal is synchronized with a clock signal of the imaging device 17 of the imaging apparatus 10.
  • the control section 363 repeats an operation as described below in synchronization with a clock signal from the timing generator 362.
  • the control section 363 sequentially reads bit data stored in the pattern data memory 361, the guide data memory 351, and the self size data memory 352 to determine the bit value (whether the bit value is data "1" or data "0"); reads the first pattern sequence SA from the pattern data memory 361 when the bit value is data "1"; reads the second pattern sequence SB from the pattern data memory 361 when the bit value is data "0"; and outputs the first pattern sequence SA or the second pattern sequence SB to the light source 34.
  • the control section 363 repeats this operation in an amount of bit count of data to be transmitted.
  • the light source 34 emits light at a timing corresponding to "1" in the first pattern sequence SA and the second pattern sequence SB and blacks out (or reduces the luminance) at a timing corresponding to "0". By such a blinking operation, the light source 34 outputs, via the luminance window 37, light P for which the luminance changes in a time-series manner.
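As a concrete illustration, the bit-to-pattern mapping performed by the control section 363 can be sketched as follows. The contents of the pattern sequences SA and SB used here are hypothetical examples, since the patent does not disclose their actual bit layouts; a minimal sketch under that assumption:

```python
# Hypothetical pattern sequences; the actual SA/SB are not disclosed.
# 1 = emit light, 0 = black out (or reduce the luminance).
SA = [1, 1, 0, 1, 0, 0, 1, 0]  # allocated to data bit "1"
SB = [1, 0, 0, 1, 1, 0, 0, 1]  # allocated to data bit "0"

def encode_bits(bits):
    """Mimic the control section 363: for each data bit, output the
    first pattern sequence SA for "1" or the second pattern sequence SB
    for "0", producing the drive signal sent to the light source 34."""
    out = []
    for b in bits:
        out.extend(SA if b == 1 else SB)
    return out

drive_signal = encode_bits([1, 0, 1])  # 24 luminance states for 3 data bits
```

Each transmitted data bit thus occupies one full pattern cycle, which is what lets the receiver distinguish the modulated light from steady sources.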
  • Although FIG. 1 shows that the light emitting object 33 is carried by a person as the photographic subject 20, the photographic subject 20 may also be a fixed structure (e.g., billboard, guide plate) so that the light emitting object 33 can be provided on each fixed structure.
  • data memorized in the data memories 35 of the respective light emitting objects 33 may also be downloaded from a server provided in a building or at the exterior of the building via a network (e.g., LAN).
  • the light source 34 is designed to perform a blinking operation according to the pattern sequences of "1" and "0".
  • the light source 34 may also be designed, when a pattern sequence includes data of multiple values, to emit light at a plurality of levels in addition to "lighting" and "black out".
  • the data memory 35 stores therein, as data to be transmitted, at least shape data showing the shape of this light emitting object 33 and size data for the light emitting object 33. This will be described with reference to FIG. 2 for example.
  • the data memory 35 stores therein the shape data showing that the luminance window 37 has a "circular shape” and the size data showing that the luminance window 37 has a diameter "R", respectively. These pieces of data are modulated by the luminous control section 36.
  • the luminous control section 36 desirably modulates the shape data and the size data to be transmitted by, for example, treating the above shape data and size data as binary digital data consisting of a logic 0 and a logic 1, allocating a luminance change pattern having a corresponding time series (the first pattern sequence SA) to the data "1", and allocating a luminance change pattern having a different time series (the second pattern sequence SB) to the data "0".
  • These two luminance change patterns desirably change with an identical cycle, and at a cycle different from cycles existing in the natural world, such as the cycle of a commercial power source or of disturbance light.
  • the imaging apparatus 10 has the structure as described above (see FIG. 1 ) and can appropriately perform an "imaging function” to photograph a still image or video to accumulate and store the image file in the image memorization section 28 and optionally a "reproduction function” to read an arbitrary image file accumulated and stored in the image memorization section 28 to cause the image to be reproduced and displayed on the display section 25.
  • This imaging apparatus 10 also can perform a "distance measurement function" according to the present invention.
  • the imaging apparatus 10 acquires in a time-series manner an image including light from the light emitting object 33 at the position of the photographic subject 20; demodulates, as a measurement target included in the image, data included in a luminance modulation region of the light emitting object 33; and measures, based on the acquired image and the demodulated data, a distance D to the light emitting object 33.
  • This "distance measurement function" is mainly provided by the function of the main control section 14.
  • the main control section 14 controls the respective parts of the imaging apparatus 10.
  • the main control section 14 particularly controls: the acquisition cycle of the imaging device 17; the reading of data memorized in the FIFO buffer 22; the measurement of the distance D to the light emitting object 33 based on the size of the image of the light emitting object 33 formed on the light receiving face 17a of the imaging device 17 and on the shape data and size data demodulated by a signal demodulation section 14c (which will be described later); and various processings using the measurement result.
  • FIGS. 3A and 3B are a conceptual diagram illustrating a part of a memorization space of the RAM 13 of the main control section 14 and functional blocks realized by the main control section 14.
  • FIG. 3A illustrates a part of the memorization space of the RAM 13.
  • FIG. 3B shows some functional blocks realized by the main control section 14.
  • the memorization space of the RAM 13 includes the respective regions such as an imaging length data storage section 13a, an image formation distortion correction data storage section 13b, a distance calculation data table storage section 13c, and a detection data list storage section 13e.
  • the main control section 14 includes the respective functions such as a pattern data memory 14a, a signal region detection section 14b, a signal demodulation section 14c, and a work memory 14d.
  • a photographing lens included in the optical system 16 is composed of one convex lens for example, and is provided so that an image including the light emitting object 33 is formed on the imaging device 17 at a later stage.
  • the image has the light axis A (see FIG. 2) at its center and has the angle of view φ.
  • the imaging device 17 is composed by an image sensor (e.g., CCD, CMOS) in which a plurality of imaging elements are arranged in a regular manner.
  • the imaging device 17 converts the status of luminosity of the light emitting object 33, acquired in a two-dimensional manner, to an electric signal by treating the status of luminosity as the ratio of the area of the light receiving part (i.e., the image of the imaged light emitting object 33) to the area of the light receiving face 17a.
  • the imaging device 17 outputs the electric signal with a predetermined frame rate (e.g., 30FPS) based on the control by the main control section 14.
  • the imaging device 17 may be any device so long as the device can acquire the status of luminosity of the light emitting object 33 in the two-dimensional manner.
  • the imaging device 17 may be a device in which a plurality of light receiving elements such as photo diodes are arranged.
  • When the imaging device 17 is composed of an image sensor such as a CCD or a CMOS sensor, the reference numeral 17a is properly called an imaging face. However, the principle of the imaging device 17 will be described based on an assumption that the face 17a is a light receiving face.
  • the ROM 12 memorizes various control programs executed by the CPU 11.
  • the RAM 13 is used as an execution area of these control programs and includes a storage section of various pieces of data shown in FIG. 3 (e.g., an imaging length data storage section 13a for memorizing the imaging length d between the light receiving face 17a of the imaging device 17 and an imaging lens included in the optical system 16).
  • the optical system 16 includes one convex photographing lens provided in a fixed manner.
  • an imaging apparatus including an optical zoom may have a different imaging length depending on the displacement of the lens.
  • the imaging length d is desirably an imaging length obtained by the adjustment of the position of a lens or of a focal length.
  • the signal demodulation section 14c is controlled by the main control section 14 to sequentially acquire, with a cycle of 30 FPS, the status of luminosity of the light emitting object 33 that is outputted in a time-series manner when the light emitting object 33 is continuously imaged by the imaging device 17, and to demodulate, based on these periodically obtained statuses of luminosity of the light emitting object 33, the data subjected to luminance modulation back to the data stored in the data memory 35.
  • the signal demodulation section 14c demodulates the data in an opposite method to obtain the shape data and size data.
  • the pattern data memory 14a retains, as in the pattern data memory 361 of the luminous control section 36, two types of pieces of preamble data of preamble data for detection and acquisition (for measurement) and preamble data for detection and acquisition (for data body) as well as two different types of luminance change patterns (the first pattern sequence SA and the second pattern sequence SB).
  • the signal region detection section 14b has a function to identify, when a pixel for which the luminance changes in a time-series manner is detected from an image signal of a plurality of frames retained in the FIFO buffer 22, a pixel region consisting of a pixel group for which the luminance changes at the same timing as this luminance change.
  • When a pixel for which the luminance changes in a time-series manner is detected from an image signal of a plurality of frames retained by the FIFO buffer 22, the signal demodulation section 14c outputs bit data of "1" and "0", depending on the luminance change of the detected pixel, from the frame data corresponding to the bit length of the data format 38 sequentially and subsequently buffered to the FIFO buffer 22; determines whether these pieces of bit data correspond to either the first pattern sequence SA or the second pattern sequence SB; outputs, when they do, the bits corresponding to this pattern; and demodulates the outputted bits to obtain size data, position data, and guide data.
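A minimal sketch of this pattern-matching demodulation, assuming the same hypothetical SA/SB sequences as in the encoding sketch and an already-thresholded per-frame luminance series for one pixel region:

```python
SA = [1, 1, 0, 1, 0, 0, 1, 0]  # hypothetical first pattern sequence
SB = [1, 0, 0, 1, 1, 0, 0, 1]  # hypothetical second pattern sequence

def demodulate(samples):
    """Slice the thresholded luminance series into pattern-length chunks
    and map each chunk back to a data bit, as the signal demodulation
    section 14c does; return None when a chunk matches neither SA nor SB
    (verification failure)."""
    bits = []
    for i in range(0, len(samples), len(SA)):
        chunk = samples[i:i + len(SA)]
        if chunk == SA:
            bits.append(1)
        elif chunk == SB:
            bits.append(0)
        else:
            return None  # neither pattern matched: discard this region
    return bits
```

Because the clock of the timing generator 362 is synchronized with the imaging device 17, each buffered frame samples exactly one luminance state, so simple chunk comparison suffices in this idealized setting.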
  • the work memory 14d retains an image in the imaging face of the above identified pixel region (light receiving face 17a).
  • the main control section 14 includes the RAM 13 for temporarily memorizing the respective pieces of data obtained by a processing (which will be described later) .
  • the main control section 14 acquires data demodulated by the signal demodulation section 14c to execute, when the data is set to a measurement mode, the measurement of a distance to the light emitting object 33 and the measurement of a current position of the imaging apparatus 10 to output the result of the measurements to the display section 25.
  • the main control section 14 outputs, to the display section 25, guide data that is acquired by light reception (imaging) and that is memorized in the data record of the detection data list storage section 13e.
  • the imaging length data storage section 13a stores therein the imaging length d as described in the above description for the principle.
  • the image formation distortion correction data storage section 13b stores therein data for correcting distortion of an image formed by the imaging device 17 due to a characteristic of the photographing lens of the optical system 16.
  • the distance calculation data table storage section 13c stores therein formulae (1) and (2) as described later for the principle.
  • a current position data storage section 13d retains the self position information calculated by the main control section 14.
  • The detection data list storage section 13e retains the distance D to the light emitting object 33 obtained by a calculation processing by the main control section 14, the self position (e.g., coordinates, altitude), and guide data acquired by light reception.
  • the detection data list storage section 13e is a section that stores, when the light emitting object 33 is detected from an imaging face (light receiving face 17a) outputted from the imaging device 17 in a time-series manner, a distance to the light emitting object 33, the position (e.g., coordinate, altitude), and guide data.
  • the detection data list storage section 13e in this embodiment stores such data as data record. The reason is to allow the detection data list storage section 13e to store, when a plurality of light emitting objects are detected from the imaging face (light receiving face 17a), distances, positions, and pieces of guide data separately with regards to the respective light emitting objects for example.
  • the diameter R is the size data for the diameter of the light emitting object 33 obtained by receiving data subjected to luminance modulation as described above.
  • the reference mark "d" represents an imaging length.
  • the maximum angle at which light can be received by an imaging lens included in the optical system 16 is represented by "φ".
  • FIG. 4 is a schematic diagram illustrating the concept of an image formed on the light receiving face 17a of the imaging device 17. As shown in FIG. 4, the horizontal length of the light receiving face 17a is defined as "H", the vertical length is defined as "V", an image of the light emitting object 33 is defined as "33a", and the diameter of the image 33a is defined as "r".
  • an angle θ can be calculated based on the diameter "r" of the image 33a and the imaging length d.
  • FIG. 5 is a conceptual diagram illustrating the calculation of the distance D.
  • a scaling relation is established between the triangle formed by the angle θ, the distance D, and R/2 and the triangle formed by the angle θ, the imaging length d, and r/2.
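From this scaling relation, tan θ = (R/2)/D = (r/2)/d, so the distance follows directly as D = d * R / r. A minimal numeric sketch (the unit values are illustrative, not taken from the patent):

```python
def distance_to_object(R, r, d):
    """Similar triangles: (R/2)/D = (r/2)/d, hence D = d * R / r.
    R: real diameter of the luminance window 37,
    r: diameter of its image 33a on the light receiving face 17a,
    d: imaging length; all three in the same unit."""
    return d * R / r

# A luminance window of diameter R = 100 mm whose image measures
# r = 0.5 mm at an imaging length d = 10 mm lies at D = 2000 mm.
D = distance_to_object(100.0, 0.5, 10.0)
```

In practice r would be obtained as a pixel count on the light receiving face multiplied by the pixel pitch, with the distortion correction data of storage section 13b applied first.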
  • FIG. 6 shows an example of a format of data that is outputted from the control section 363 to the light source 34 and that is transmitted as the light P.
  • the data format 38 consists of a preamble data for detection and acquisition section (for measurement) 38a, a size data section 38b, a position data section 38c, a preamble data for detection and acquisition section (for guide data) 38d, and a guide data section 38e. Data is outputted in a cyclic manner with the data format 38 as a unit.
  • Data stored in the preamble data for detection and acquisition section (for measurement) 38a is data whose detection causes the imaging apparatus 10 to set the measurement mode when the imaging apparatus 10 receives the above data format 38.
  • When this data section is received, a distance to the light emitting object 33 or the position thereof is calculated by a predetermined processing operation.
  • the imaging apparatus 10 measures the distance to the light emitting object 33.
  • Data showing a position of the light emitting object 33 (e.g., latitude, longitude, altitude) is stored in the position data section 38c. Based on this data, the imaging apparatus 10 measures the direction of the light emitting object 33 seen from the imaging apparatus 10 and the self position.
  • Data stored in the preamble data for detection and acquisition section (for guide data) 38d is data whose detection causes the imaging apparatus 10 to set the guide mode when the imaging apparatus 10 receives the above data format 38.
  • a processing operation is executed in which the data set in the guide data section 38e is demodulated, reproduced, and outputted.
  • Data stored in the guide data section 38e is data stored in the guide data memory 351. Based on this data, the imaging apparatus 10 performs optional processing such as a route guide, a sightseeing guide, and auxiliary information related to an imaging operation.
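The field order of the cyclic data format 38 can be summarized as a simple structure. The payload types and widths here are assumptions for illustration only; the patent specifies the order and role of the sections but not their encoding:

```python
from dataclasses import dataclass

@dataclass
class DataFormat38:
    """Sections of data format 38 in transmission order.
    All field types are illustrative assumptions."""
    preamble_measurement: bytes  # 38a: switches the receiver to measurement mode
    size_data: bytes             # 38b: shape ("circular") and diameter R
    position_data: bytes         # 38c: latitude, longitude, altitude
    preamble_guide: bytes        # 38d: switches the receiver to guide mode
    guide_data: bytes            # 38e: route or sightseeing guide information
```

Transmitting the two preambles within one cycle lets a receiver lock onto either the measurement payload or the guide payload without demodulating the entire frame.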
  • FIG. 7 is a flowchart illustrating the flow of a processing for calculating a distance to an object. This flowchart is mainly composed of a signal region detection processing block S1, a signal demodulation processing block S2, and a measurement processing block S3.
  • the signal region detection processing block S1 firstly uses an image formed on the imaging face (light receiving face 17a) of the imaging device 17 as frame data to allow the main control section 14 to sequentially buffer, in the FIFO buffer 22, the number of frames corresponding to the bit count of the preamble pattern (Step S11). Then, the main control section 14 determines whether the plurality of pieces of buffered frame data include a pixel for which the luminance is changed.
  • the main control section 14 determines whether the plurality of pieces of buffered frame data include a pixel for which the maximum luminance exceeds a predetermined peak and a periodic change is caused, to thereby determine, based on the pixel, whether or not data exists for which the luminance is modulated in a time-series manner (Step S12).
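A simplified stand-in for the Step S12 test: flag pixels whose buffered luminance series both exceeds a peak threshold and actually alternates. This is only a sketch; a real implementation would additionally verify that the change is periodic at the expected pattern cycle:

```python
def candidate_pixels(frames, peak):
    """frames: list of per-frame luminance lists, one value per pixel.
    Return the indices of pixels whose maximum luminance exceeds `peak`
    and whose luminance also drops below it, i.e. pixels that blink
    rather than stay uniformly bright or dark."""
    candidates = []
    for p in range(len(frames[0])):
        series = [frame[p] for frame in frames]
        if max(series) > peak and min(series) < peak:
            candidates.append(p)
    return candidates
```

A steadily bright pixel (e.g., an ordinary lamp) passes the peak test but fails the alternation test, which is why it is rejected as a data source.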
  • When the main control section 14 cannot detect a pixel for which the luminance is changed, the main control section 14 performs a processing of Step S14 (which will be described later) to return to Step S11.
  • When the main control section 14 detects a pixel for which the luminance is changed, on the other hand, the main control section 14 reads preamble pattern data (for measurement) and preamble pattern data (for guide data) from the pattern data memory 14a to verify these preamble patterns against the above detected time-series luminance change of the pixel (Step S13).
  • When neither preamble pattern matches, the main control section 14 determines that data cannot be obtained from this detected pixel, discards the frame data buffered in the FIFO buffer 22 (Step S14), and returns to the processing of Step S11.
  • the main control section 14 drives the orientation sensor 31 to acquire the imaging direction (Step S15) and drives the elevation angle sensor 32 to acquire the imaging elevation angle (Step S16).
  • When Step S13 shows that either of the preamble patterns corresponds to the above detected time-series luminance change,
  • the main control section 14 determines that this detected pixel is a pixel transmitting the data, and controls the signal region detection section 14b to identify a pixel region consisting of a pixel group for which the luminance is changed with an identical timing (Step S17). Then, the imaging direction acquired in Step S15 and the imaging elevation angle acquired in Step S16 are once retained in the RAM 13 of the main control section 14. With regard to the identified pixel region, the image is stored in the work memory 14d of the signal region detection section 14b (Step S18).
  • the main control section 14 first sequentially acquires frame data corresponding to the bit count of the data format 38 from the pixel region identified in Step S17 of the signal region detection processing block S1 (Step S21) and stores the frame data in the FIFO buffer 22.
  • the main control section 14 also causes the signal demodulation section 14c to perform a processing for converting the region having the changed luminance into bit data of "1" and "0", a verification of the bit data obtained through this processing against the first pattern sequence SA and the second pattern sequence SB, a bit output processing, and a processing for demodulating the outputted bits into size data, position data, and guide data (Step S22). Then, among these pieces of demodulated data, the size data and position data are temporarily stored in the RAM 13 of the main control section 14 and the guide data is stored in the data record of the detection data list storage section 13e (Step S23).
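The Step S22 demodulation can be sketched as follows. The concrete pattern sequences SA and SB are placeholders of this sketch, not values defined in this description; the point is only that each chunk of luminance samples is matched against the two sequences to recover one bit, with a mismatch treated as a verification failure.

```python
# Hedged sketch of the Step S22 demodulation: match each received chunk of
# luminance samples against the first pattern sequence SA and the second
# pattern sequence SB to recover one bit. SA/SB below are placeholders.
SA = (1, 0, 1, 0)  # placeholder pattern sequence for bit "0"
SB = (1, 1, 0, 0)  # placeholder pattern sequence for bit "1"

def demodulate(chunks):
    """Return the demodulated bit list, or None on a verification failure."""
    bits = []
    for chunk in chunks:
        if chunk == SA:
            bits.append(0)
        elif chunk == SB:
            bits.append(1)
        else:
            return None  # neither sequence matches: discard the frame data
    return bits

demodulate([SA, SB, SB, SA])  # [0, 1, 1, 0]
```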
  • FIG. 8 is a flowchart showing the flow of the processing of the measurement processing block S3.
  • the measurement processing block S3 first causes the main control section 14 to read the image of the pixel region stored in the work memory 14d and to identify, for this pixel region, the shape of the light emitting object 33 so as to set a measurement axis (Step S31). Next, the main control section 14 performs a weighting of the respective pixels on this measurement axis to determine an area of the "image" of the light emitting object included in the pixel region (Step S32).
  • FIGS. 9A and 9B are diagrams for specifically explaining the method for determining the measurement axis, the weighting, and the area of the "image".
  • In FIG. 9A, it is assumed that the frame data sequentially obtained from the imaging face (light receiving face 17a) has a maximum luminance exceeding a predetermined peak and that the image 33a of the light emitting object 33 is set as a candidate pixel region having a periodic change.
  • FIGS. 9A and 9B show the image 33a having an elliptical shape because the light emitting object 33 is seen in an oblique direction of 45 degrees from the upper side or the lower side. When the light emitting object 33 is seen from the front side, the light emitting object 33 is seen as the image 33a having a circular shape.
  • the main control section 14 determines, for the image 33a having a height H (9 dots) and a width W (5 dots), the row or column of dots containing the most pixels whose maximum luminance exceeds the predetermined peak as the measurement axis. Then, the main control section 14 subjects the pixels surrounding this measurement axis to a weighting corresponding to their luminance.
  • the main control section 14 sets the maximum weight value "1" for the pixel range W1 having the highest luminance, a weight value "0.6" for the peripheral range W2, and a weight value "0.3" for the outermost edge range W3, and sets the height H (9 dots) as the measurement axis, for example. Based on these weighting calculations, the main control section 14 obtains "20" for the pixel range W1, "7.2" for the pixel range W2, and "2.4" for the pixel range W3, giving an area of "29.6" for the "image" 33a.
  • a method for determining a measurement axis and a weighting and an area calculation method are not limited to the above ones. Any other methods also can be used so long as they can determine an area accurately.
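The weighted-area calculation above can be sketched as follows. The per-range weights (1, 0.6, 0.3) come from the description; the pixel counts (20, 12, 8) are hypothetical values chosen so the totals match the worked example (20 × 1.0 = 20, 12 × 0.6 = 7.2, 8 × 0.3 = 2.4, summing to 29.6).

```python
# Sketch of the weighted area calculation for the "image" 33a.
# Weights per pixel range are from the text; pixel counts are hypothetical.
WEIGHTS = {"W1": 1.0, "W2": 0.6, "W3": 0.3}

def weighted_area(pixel_counts):
    """Sum the pixel count of each range scaled by its luminance weight."""
    return sum(WEIGHTS[rng] * count for rng, count in pixel_counts.items())

area = weighted_area({"W1": 20, "W2": 12, "W3": 8})  # 29.6
```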
  • the main control section 14 reads, with regard to the shape of this image, the distortion correction data based on the characteristic of the imaging lens included in the optical system 16 from the image formation distortion correction data storage section 13b and uses the data to correct the distortion (Step S33).
  • the main control section 14 reads the imaging length d from the imaging length data storage section 13a to calculate a value of "θ" based on the measurement axis and the imaging length d (Step S34).
  • When the value of "θ" is calculated, the main control section 14 uses this θ to read the formula (1) from the distance calculation data table storage section 13c and calculates the distance D in FIG. 5 based on the size data stored in the RAM 13 of the main control section 14 (Step S35). Since the shape of the "image" 33a is set as the "elliptical shape" obtained when the circular shape is seen from a direction of about 45°, the size data obtained in Step S24 as a "circular shape having a diameter R" is corrected, based on the above set shape, to an "elliptical shape having a diagonal √2L".
  • the main control section 14 uses, based on this distance D and the photographing elevation angle θ acquired in Step S16, the following formula (3) to calculate the distance D' (horizontal direction distance) to a position just below the light emitting object 33 (Step S36).
  • D' = D cos θ … (3)
  • the main control section 14 calculates the position of the imaging apparatus 10 based on the above calculated distance D', the photographing direction of the imaging apparatus 10 acquired by the orientation sensor 31, and the position data acquired in Step S24 (Step S37), and stores these pieces of distance data D' and position data in the data record of the detection data list storage section 13e (Step S38).
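The two-step calculation of Steps S35 and S36 can be sketched as follows. The pinhole-style relation D = L·d/s is an assumption standing in for "formula (1)", which is not reproduced in this section (L is the actual size of the light emitting object from the size data, d the imaging length, and s the measured size of the image on the light receiving face); the numeric arguments are illustrative only.

```python
import math

# Hedged sketch of the distance calculation (Steps S35-S36).
def distance_to_object(actual_size_L, imaging_length_d, image_size_s):
    """Assumed stand-in for formula (1): D = L * d / s (pinhole relation)."""
    return actual_size_L * imaging_length_d / image_size_s

def horizontal_distance(D, elevation_angle_deg):
    """Formula (3): D' = D * cos(theta)."""
    return D * math.cos(math.radians(elevation_angle_deg))

D = distance_to_object(actual_size_L=2.0, imaging_length_d=0.05,
                       image_size_s=0.001)   # illustrative values: 100 m
D_prime = horizontal_distance(D, 60.0)        # 50 m to the point just below
```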
  • FIGS. 10A and 10B are conceptual diagrams illustrating examples of the use of registered data.
  • FIG. 10A shows an example of a display when the imaging apparatus 10 is in a measurement mode.
  • FIG. 10B shows an example of a display when the imaging apparatus 10 is in a guide mode.
  • the screen of the display section 25 shows information for a distance to the light emitting object 33 (e.g., "distance to target: 3m") and information for the current position of the imaging apparatus 10 (e.g., "your position: latitude 35°46'25.75" north and longitude 139°18'43.69" east") displayed while being surrounded by a speech bubble at a screen corner.
  • the screen of the display section 25 displays predetermined guide information (e.g., "front entrance 150m ahead") sent from the light emitting object 33, surrounded by a speech bubble at a screen corner.
  • the combination of the imaging apparatus 10 and the light emitting object 33 of this embodiment can be used to provide a measured distance, a current position, or a route guide for example by the transmission of information by luminance-modulated light from the light emitting object 33.
  • the distance measurement technique based on the principle described above can be provided so that imaging operation-related processings and an information display mode are controlled as in the application examples listed below.
  • FIG. 11 is a conceptual diagram illustrating an application example of a system consisting of an advertisement exhibit and an imaging apparatus.
  • FIG. 11 shows two persons A and B. These persons A and B carry the above-described imaging apparatuses 10 and read the advertisement exhibit by holding the imaging apparatus 10 opposed to the advertisement exhibit 39 provided at a predetermined place.
  • When D1 is a distance from the advertisement exhibit 39 to the person A and D2 is a distance from the advertisement exhibit 39 to the person B, the relation D1 < D2 is established.
  • the person A is at a position at which the person A can read information printed on the advertisement exhibit 39 while the person B is at a position at which the person B cannot read the above printed information.
  • the person A may also be interpreted as typically representing people at a position within a distance at which the information printed on the advertisement exhibit 39 can be read, and the person B as typically representing people at a position beyond the distance at which the printed information can be read.
  • the advertisement exhibit 39 shows predetermined advertisement information.
  • this advertisement information shows a printed image of an article (personal computer) and the description thereof and also shows a message of "coupons are distributed!".
  • This advertisement exhibit 39 is irradiated by illuminating light 40a of visible light from a lamp 40.
  • This irradiation range corresponds to the light source for the visible light communication of the light emitting object 33 as described above.
  • this irradiation range will be referred to as light emitting object 40b for convenience and for consistency with the above description.
  • the main function of the lamp 40 is to use the illuminating light 40a to brightly illuminate the advertisement exhibit 39 so that more people can notice the advertisement exhibit 39.
  • the second function of the lamp 40 is to change the luminance of the illuminating light 40a with a short time interval that cannot be recognized by human eyes so that desired data can be transmitted through visible light communication based on the luminance change pattern of the light emitting object 40b (reflected light from the illuminating light 40a).
  • the display section 25 of the imaging apparatus 10 held by the person A existing close to the advertisement exhibit 39 displays, as shown by the reference numeral 41, an imaged image 42 showing a large image of the advertisement exhibit and an information frame 43.
  • This information frame 43 displays therein a message of "attached information (discount coupon) was received and stored".
  • Although a discount coupon is shown in this example, this is a mere example.
  • Any sales promotion information may be displayed such as information for a link on the Internet (e.g., URL information) for providing a special service.
  • the display section 25 of the imaging apparatus 10 held by the person B existing away from the advertisement exhibit 39 displays, as shown by the reference numeral 44 in FIG. 11 , an imaged image 45 showing a small image of the advertisement exhibit corresponding to the distance from the advertisement exhibit 39 to the person B and an information frame 46.
  • This information frame 46 displays therein, as auxiliary information regarding an imaging of this advertisement exhibit 39, a message of "Attached information (discount coupon) is distributed. However, your position is too far away and thus the information cannot be acquired. Please move closer to a position within 30m from the advertisement".
  • the imaging apparatus 10 is allowed to photograph the imaged image to acquire the coupon.
  • this person can be guided to a position closer to the advertisement exhibit 39.
  • Although this example uses the advertisement exhibit 39, the present invention is not limited to this. Any other exhibition media used for advertisement or the like can be used, such as a street poster or a display. An exhibition medium is not always required to provide visible information and may also provide only information by visible light communication.
  • the illuminating light 40a from the lamp 40 is reflected from the advertisement exhibit 39 in this example, other visible light communication styles also may be used.
  • a backlight-type display panel, a large backlight display, or a self-luminous display such as an LED display may also be used.
  • any visible light communication may be used so long as the communication can finally provide a modulation signal to the imaging apparatus 10.
  • FIG. 12 illustrates information sent from the light source (lamp 40).
  • information sent from the light source (lamp 40) includes a light source size information storage section 47, a distance storage section 48, and a distributed information storage section 49.
  • the light source size information represents a floodlighting size of the illuminating light 40a irradiated from the light source (lamp 40) to the advertisement exhibit 39 and corresponds to the size data 38b of FIG. 6 in the above description for the principle.
  • This light source size information is given as an initial value when the lamp 40 is placed.
  • Distance information stored in the above distance storage section 48 is determined for every piece of distributed information.
  • the above distance information is information including the maximum distance at which a person holding the imaging apparatus 10 can read information printed on the advertisement exhibit 39. This distance information is updated when the advertisement exhibit 39 is exchanged with a new one or when the advertisement exhibit 39 is repainted.
  • the preamble for detection and acquisition included in the data format is a format required for a preamble data body.
  • various signal detection and acquisition methods may be used as described above such as a method for dispersing bit 0/1 to the first pattern sequence SA and the second pattern sequence SB to modulate a light source.
  • a data format of a data protocol layer used in the present invention has no relation to the essence of the present invention.
  • a protocol component for detecting or complementing a signal (e.g., preamble) will not be shown or described.
  • FIG. 13 shows a flow of a processing in an application example of a system consisting of the advertisement exhibit 39 and the imaging apparatus 10.
  • When the imaging apparatus 10 detects and receives information sent from the illuminating light 40a of the lamp 40 (see FIG. 13 ), the imaging apparatus 10 first extracts distance information (information stored in the distance storage section 48) from the received information (Step S41) and calculates a distance to the information transmission position (the illuminating light 40a on the advertisement exhibit 39) (Step S42). This distance calculation is performed by the method described above.
  • the imaging apparatus 10 determines whether the above calculated distance is shorter than a distance represented by the distance information or not (Step S43).
  • When the calculated distance is shorter, information for permitting acquisition of the coupon is displayed (and stored and used), for example as shown by the reference numeral 41 of FIG. 11 (Step S44).
  • When the calculated distance is not shorter, information as shown by the reference numeral 44 of FIG. 11 is displayed that does not permit acquisition of the coupon, together with information asking the person to move closer to the advertisement exhibit 39, for example (Step S45).
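The Step S41-S45 decision can be sketched as follows; the function name and message strings are illustrative, not from the embodiment.

```python
# Hypothetical sketch of the Step S43-S45 branch: compare the calculated
# distance with the received distance condition and decide whether the
# coupon may be acquired.
def coupon_decision(calculated_distance_m, max_distance_m):
    if calculated_distance_m < max_distance_m:       # Step S43: within range
        return "coupon acquired and stored"          # Step S44
    return (f"too far away; move within {max_distance_m:.0f}m "
            "of the advertisement")                  # Step S45

coupon_decision(10.0, 30.0)  # person A: coupon acquired and stored
coupon_decision(50.0, 30.0)  # person B: asked to move closer
```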
  • distance information in the information sent from the light source (lamp 40) can thus be referred to in one processing of the entire imaging process to determine the display or operation of the imaging apparatus 10.
  • information is transmitted through irradiation light (indirect light) whose luminance is modulated, via the exhibition medium (advertisement exhibit 39), so that the information includes conditions regarding acquisition of the information printed on the advertisement exhibit 39 (distance information).
  • the information can be provided in a very fine manner by which coupon information for example is distributed only to a person who can read the above printed information.
  • the application example of this advertisement exhibit can show "types of information that can be acquired" and "a distance required for the person to move closer to obtain the coupon". Thus, the person can be guided to obtain the information and to read the detailed advertisement, and an enhanced advertising effect can be expected.
  • an opposite rule may also be used in which the person cannot acquire information when a distance between the advertisement exhibit and the person is too short and the person is guided to move away from the advertisement exhibit.
  • This opposite rule can be used when a specific photographic subject such as a new product is desirably prevented from being photographed at a short distance at which the details can be seen or when a copyright or portrait rights should be protected.
  • the above distance information also may include an upper limit distance and a lower limit distance.
  • FIGS. 14A and 14B illustrate information including an upper limit distance and a lower limit distance.
  • FIG. 14A shows the structure thereof.
  • FIG. 14B shows an example of stored information.
  • the structures of FIGS. 14A and 14B are different from that of FIG. 12 in that the distance storage section 48 includes a lower limit distance storage section 48a and an upper limit distance storage section 48b.
  • when the lower limit distance storage section 48a is set to "0" (hereinafter, unit: "m") and the upper limit distance storage section 48b is set to "20", for example, this means a range within 20m.
  • when the lower limit distance storage section 48a is set to "10" and the upper limit distance storage section 48b is set to "0", on the other hand, this means a range of 10m or more.
  • the lower limit distance storage section 48a set to "5" and the upper limit distance storage section 48b set to "30" mean a distribution distance range from 5m to 30m.
  • the upper limit and the lower limit both set to "0" mean no limitation in distance.
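The range test implied by FIGS. 14A and 14B can be sketched as follows, assuming the convention stated above that a stored value of "0" means no bound on that side.

```python
# Sketch of the distribution-range test for the lower/upper limit distance
# storage sections 48a/48b; a value of 0 means "no bound" on that side.
def within_distribution_range(distance_m, lower_m, upper_m):
    if lower_m and distance_m < lower_m:   # below the lower limit
        return False
    if upper_m and distance_m > upper_m:   # above the upper limit
        return False
    return True

within_distribution_range(12.0, 5, 30)   # True: inside the 5m-30m range
within_distribution_range(40.0, 10, 0)   # True: "10m or more"
within_distribution_range(1.0, 0, 0)     # True: no limitation in distance
```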
  • distance information and distance conditions may also be set to control various operations related to a photographing operation, such as execution, storage, editing, and transmission (e.g., permission or prohibition of an execution of a photographing operation).
  • the photographic subject side can control the photographing operation by specifying, in order to protect the copyright or portrait rights, a condition of "prohibition of photographing of the photographic subject at a short distance" or an opposite condition of "permission of photographing of a person as a photographic subject at a distance at which the appearance of the person is most attractive".
  • FIG. 15 shows an example of a sending format corresponding to the permission and prohibition of a photographing operation for example.
  • the sending format has the same structure as those of FIG. 12 and FIGS. 14A and 14B (the light source size information storage section 47, distance storage section 48, and distributed information storage section 49) but is different from FIG. 12 and FIGS. 14A and 14B in including a photographing operation limiting information storage section 50.
  • This photographing operation limiting information storage section 50 is set to show, based on a request by the information provider, any of the following permission category classifications of: (A) photographing and storage are both permitted, (B) photographing is permitted but the storage must be performed with the minimum resolution, (C) photographing is permitted but the storage must be performed by an image including watermark information, (D) photographing is permitted but the storage must be performed by an image including a warning text character, or (E) only display on a monitor is permitted for example.
  • photographing-related limitation can be instructed finely simply by selecting any of the above permission category classifications (A) to (E) and setting the selected permission category classification in the photographing operation limiting information storage section 50.
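One possible encoding of the permission category classifications (A) to (E) carried in the photographing operation limiting information storage section 50 is sketched below; the enum and helper names are illustrative, not part of the embodiment.

```python
from enum import Enum

# Hypothetical encoding of the permission categories (A)-(E).
class PhotoPermission(Enum):
    FULL = "A"            # photographing and storage both permitted
    MIN_RESOLUTION = "B"  # storage only at the minimum resolution
    WATERMARK = "C"       # storage only with watermark information
    WARNING_TEXT = "D"    # storage only with a warning text character
    MONITOR_ONLY = "E"    # only display on a monitor is permitted

def storage_allowed(p: PhotoPermission) -> bool:
    """Categories (A)-(D) permit some form of storage; (E) does not."""
    return p is not PhotoPermission.MONITOR_ONLY

storage_allowed(PhotoPermission.FULL)          # True
storage_allowed(PhotoPermission.MONITOR_ONLY)  # False
```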
  • this embodiment has provided limitation on the display and storage of an image, other limitations also may be additionally used such as limitation on the second use (redistribution) of a photographed image by attaching the image to an e-mail.
  • photographing-related limitation is desirably provided not only by the above distance conditions but also by information for an angle of view of an optical system (e.g., information for telescopic or wide-angle), because the angle of view information can always be obtained at the stage of the distance measurement, or by information for image definition.
  • FIG. 16 is a conceptual diagram illustrating when measurement data for a distance to the light emitting object 33 is used for the focusing control of the optical system 16.
  • the automatic focus controller 18 is controlled via the imaging controller 19 and the main control section 14 based on the distance D' between the imaging apparatus 10 and the light emitting object 33 calculated based on the principle as described above.
  • when the light emitting object 33 is provided at the position of the photographic subject 20 as a target in FIG. 1 and the image 33a caused by light emitted from the light emitting object 33 is imaged by the imaging device 17, the luminance-modulated distance information sent from the light emitting object 33 can be received, and an area value of the received light from the light emitting object 33 can be obtained to determine the distance between the imaging apparatus 10 and the light emitting object 33. Depending on this distance, the optical system 16 of the imaging apparatus 10 can perform a focusing operation. Thus, a troublesome procedure required by a conventional focusing operation is eliminated, in which the original picture composition is restored while the focus is locked before the photographing operation is subsequently performed.
  • a photographing lens desirably has the minimum aperture during the measurement.
  • FIG. 17 is a flowchart illustrating imaging-related processings for an image imaged by using a distance between the imaging apparatus 10 and the light emitting object 33.
  • this distance D' is used to determine whether the imaging apparatus 10 is far from or close to the light emitting object 33 (Step S51).
  • When the imaging apparatus 10 is far from the light emitting object 33, the imaged image is subjected to a sharpness processing (Step S52).
  • When the imaging apparatus 10 is close to the light emitting object 33, the imaged image is subjected to a soft-focus processing (Step S53).
  • the term “sharpness processing” herein means a processing for making an image contour clearer and the term “soft-focus processing” on the contrary is a processing for reducing the sharpness of the contour.
  • an imaged image is subjected to the sharpness processing or the soft-focus processing based on the above distance D' .
  • the image effect as described above corresponding to general photographing techniques can be obtained without using a special lens or filter for example. This can eliminate a labor hour and allows a beginner to easily use the effective photographing technique to photograph an image having a good appearance.
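The FIG. 17 selection can be sketched as follows; the 10m threshold is an illustrative assumption, since the description does not state where "far" ends and "close" begins.

```python
# Hypothetical sketch of the Step S51-S53 branch: choose the image
# processing from the measured distance D'. The threshold is illustrative.
def select_image_processing(distance_m, threshold_m=10.0):
    """Far subjects get sharpness (Step S52), near ones soft-focus (Step S53)."""
    return "sharpness" if distance_m >= threshold_m else "soft-focus"

select_image_processing(25.0)  # sharpness
select_image_processing(2.0)   # soft-focus
```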
  • FIG. 18 is a flowchart illustrating imaging-related processings for controlling, depending on a distance from the imaging apparatus 10 to the light emitting object 33, photographing conditions (an optical system in particular).
  • when the above distance D' is acquired (Step S61), whether the distance between the position of the imaging apparatus 10 and the position of the light emitting object 33 falls within a predetermined range is determined based on this distance D' (Step S62). When the distance therebetween is determined as being in a telephoto range, the zoom lens is moved toward the telescopic side (Step S63). When the distance therebetween is determined as being in a middle range, the zoom lens is moved to a middle distance (middle angle of view) (Step S64).
  • When the distance therebetween is determined as being in a wide-angle range, the zoom lens is moved to the wide-angle side (Step S65).
  • When the distance is even shorter, a macro lens having a shorter minimum photographing distance than the zoom lens is newly used (Step S66).
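The FIG. 18 classification (Steps S62-S66) can be sketched as follows; the range boundaries are illustrative assumptions, as the description does not define the numeric ranges.

```python
# Hypothetical mapping from measured distance D' to a zoom action.
# Range boundaries (0.3m, 5m, 30m) are illustrative only.
def zoom_action(distance_m, macro_limit=0.3, wide_limit=5.0, mid_limit=30.0):
    if distance_m < macro_limit:
        return "switch to macro lens"          # Step S66
    if distance_m < wide_limit:
        return "move to wide-angle side"       # Step S65
    if distance_m < mid_limit:
        return "move to middle angle of view"  # Step S64
    return "move to telescopic side"           # Step S63

zoom_action(100.0)  # move to telescopic side
zoom_action(0.1)    # switch to macro lens
```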
  • FIG. 19 illustrates a billboard 60 corresponding to the light emitting object 33 in the above description of the principle.
  • this billboard 60 is placed at a roof of a building or at a wall surface for example.
  • the billboard 60 is a large light emitting display, with a side of a few meters, in which a great number of LEDs are arranged in a matrix manner, for example.
  • FIG. 20 is a circuit diagram illustrating an imaging apparatus 100 corresponding to the imaging apparatus 10 (see FIG. 1 ) in the above description for the principle.
  • the imaging apparatus 100 further includes an information memorization section 61, a backlight 62, and a backlight driving controller 63.
  • An external interface section 29 includes a wireless communication section to send and receive contents memorized in the image memorization section 28 and the information memorization section 61.
  • a CPU 11 further includes a function as the luminous control section 36 in FIG. 2 .
  • the information memorization section 61 detects an operation of the operation section 23 by a user and stores address book data or mail data inputted, prepared, or edited.
  • the information memorization section 61 also stores information corresponding to contents stored in the guide data memory 351 of the data memory 35 in FIG. 2 and in the self size data memory 352 (the shape or area of the display section 25 in particular), and various pieces of information acquired from an information source (which will be described later).
  • the backlight 62 is a light source consisting of a plurality of LEDs for irradiating the display section 25 from the back face.
  • the backlight driving controller 63 has a function to adjust the luminance of the backlight 62 based on a control signal from the main control section 14 (CPU11) and adjusts the luminance of the backlight 62 based on an operation by a user.
  • the backlight driving controller 63 receives modulated information read from the information memorization section 61 and changes, based on this information, the luminance of the backlight 62 in a time-series manner.
  • although the above description assumes the display section 25 to be a liquid crystal display (in a narrow sense, a transmissive liquid crystal display requiring illumination from the back face), the display section 25 may also include a light-emitting function such as an organic EL material.
  • a display driver 24 has the function and operation of the backlight driving controller 63.
  • the billboard 60 has luminance modulated by arbitrary information and thus can be used as an information source.
  • the imaging apparatus 100A can be used not only as an apparatus for receiving information through visible light communication but also as an apparatus for sending information.
  • Such an information sending apparatus also may be, in addition to the billboard 60, a traffic signal machine, a ceiling light, an interior light, or a street light for example.
  • FIG. 21 shows a positional relation among the billboard 60, the imaging apparatus 100A, and the imaging apparatus 100B.
  • two information-sending apparatuses (the billboard 60 and the imaging apparatus 100A) exist at a distance C (e.g., 1m) and at a distance D (e.g., 100m), respectively.
  • at the distance C is a person 65 holding the imaging apparatus 100A, and at the distance D is the billboard 60 placed at the roof of a building 66.
  • FIG. 22 shows an example of a display by the display section 25 of the imaging apparatus 100B when the present invention is not used.
  • the display section 25 displays the person 65 positioned at the distance C, the imaging apparatus 100A held by the person 65, the building 66 positioned at the distance D, and the image on the billboard 60 provided at the roof of the building 66.
  • the display section 25 displays information sent from the billboard 60 and the imaging apparatus 100A so that the information from the imaging apparatus 100A is displayed as "my message!" in a speech bubble 67 and the information from the billboard 60 is displayed as "AAA station building" in a speech bubble 68, for example.
  • the two pieces of information displayed on the display section 25 have an identical display size.
  • a problem is caused in which an increased number of speech bubbles prevents a user from visually distinguishing the speech bubbles.
  • FIG. 23 shows an example of a display by the display section 25 of the imaging apparatus 100B when the present invention is used in which a display mode is controlled depending on a distance between the imaging apparatus 100B and an information source.
  • the display section 25 displays an image of the imaging apparatus 100A held by the person positioned at the distance C and an image of the billboard 60 provided at the roof of the building 66 positioned at the distance D and also displays pieces of information sent from the imaging apparatus 100A and the billboard 60 surrounded by the speech bubbles 69 and 70.
  • FIG. 23 is different from FIG. 22 in the following point.
  • this application example can provide the above respective pieces of information with a visual perspective.
  • This application example also can display information having a shorter distance with characters having a larger size so that the information can be conspicuous and can display information having a longer distance with characters having a smaller size so that only the existence thereof can be noticed. This is particularly advantageous because a plurality of pieces of information can be displayed and read in an easier manner.
  • this example has provided different information display modes by changing the character size, the invention is not limited to this.
  • Other display modes also may be used by controlling, for example, the color of a character, a character font, or an existence or nonexistence of a border attribute or by controlling the size of a speech bubble of each piece of information, a background color, a color or thickness of a frame border, or the transparency level of a speech bubble. Any display mode may be used so long as information closer to a user can be displayed in a more conspicuous manner because information closer to a user may be useful information.
  • FIG. 24 shows the flow of a processing for obtaining the above improvement example ( FIG. 23 ).
  • whether the imaging apparatus 100B has detected information from an information source or not is first determined (Step S71).
  • When information is detected, the imaging apparatus 100B refers to a received data list (the detection data list storage section 13e of FIG. 3 ) (Step S72) and extracts data having a shorter distance (Step S73).
  • the imaging apparatus 100B determines whether the number of the extracted pieces of data exceeds a predetermined number "n" or not (Step S74).
  • When the number does not exceed "n", the imaging apparatus 100B determines, as described in the above description of the principle, a speech bubble size and a size of characters to be displayed in accordance with a distance between the imaging apparatus 100B and the respective information sources (Step S75) and displays the respective speech bubbles 69 and 70 in the display section 25 (Step S76).
  • When the number exceeds "n", the imaging apparatus 100B displays a predetermined marking in the detection data region (Step S77).
  • the imaging apparatus 100B then determines whether reception of all pieces of data is completed or not (Step S78).
  • When reception is not completed, the processings after Step S72 are repeated.
  • When reception is completed, the processing returns to the step of determining whether the imaging apparatus 100B has detected information from an information source or not (Step S71).
  • As a result, the information from the imaging apparatus 100A positioned at the distance C can be displayed with a larger speech bubble and larger characters, while the information from the billboard 60 positioned at the distance D, farther than the distance C, can be displayed with a smaller speech bubble and smaller characters.
  • Thus, the respective pieces of information can be given a visual perspective.
  • Information closer to the user can be displayed with larger characters so that it is conspicuous, and information farther from the user can be displayed with smaller characters so that it is less conspicuous and only its existence is noticed by the user.
  • This is particularly advantageous because a plurality of pieces of information can be displayed and read more easily.
  • When the predetermined number is exceeded, the imaging apparatus 100B displays the detection data region with a predetermined marking on the display section 25.
  • In this way, the imaging apparatus 100B can limit displayed data to the maximum number of displayable pieces (n), preventing the display section 25 from being filled with speech bubbles for all received pieces of information (or images with added text).
  • Character sizes based on distance may basically be determined so that information at a longer distance is displayed with smaller characters, for example by displaying information within 5 m of the user with 12-point characters and information about 100 m away with 6-point characters.
  • For intermediate distances, a character size may be determined by linearly interpolating based on the calculated distance from the detected region to the user.
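The linear interpolation between the example endpoints (12 points at 5 m, 6 points at 100 m) can be sketched as follows; the clamping behaviour outside that range is an assumption for illustration:

```python
def character_point_size(distance_m, near_m=5.0, far_m=100.0,
                         near_pt=12.0, far_pt=6.0):
    """Linearly interpolate character size between the text's example
    endpoints: 12 pt at 5 m and 6 pt at 100 m.

    Clamping outside [near_m, far_m] is an assumption for illustration.
    """
    if distance_m <= near_m:
        return near_pt
    if distance_m >= far_m:
        return far_pt
    fraction = (distance_m - near_m) / (far_m - near_m)
    return near_pt + fraction * (far_pt - near_pt)
```

For example, an information source half-way through the range (52.5 m) would be rendered at 9 points.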
  • As described above, characters may also be displayed with different colors depending on the distance from the user, by displaying information closer to the user with darker characters and information farther away with lighter characters.
  • Information may also be displayed with different chroma saturations or luminances of the speech bubble color depending on the distance from the user, by displaying information farther away with a lighter-colored speech bubble and information closer with a more conspicuous color.
  • Information may also be displayed with different speech bubble transparencies depending on the distance from the user, by displaying information farther away with a more transparent speech bubble so that it is less noticeable.
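The same interpolation idea extends to the color and transparency variants just described. The concrete value ranges below are assumptions, since the text specifies only the qualitative direction (closer is darker and more opaque):

```python
def distance_fraction(distance_m, near_m=5.0, far_m=100.0):
    """Map a distance to a clamped 0.0 (near) .. 1.0 (far) fraction."""
    fraction = (distance_m - near_m) / (far_m - near_m)
    return max(0.0, min(1.0, fraction))

def character_gray_level(distance_m):
    # Closer information -> darker characters (0 = black, 255 = white).
    return round(200 * distance_fraction(distance_m))

def bubble_alpha(distance_m):
    # Farther information -> more transparent speech bubble
    # (1.0 = opaque, 0.2 = nearly transparent at 100 m and beyond).
    return 1.0 - 0.8 * distance_fraction(distance_m)
```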
  • By using display modes with characters of different sizes, colors, or shapes, information corresponding to the detection distance can be displayed in accordance with the result of measuring the distance of the detected region.
  • the imaging apparatus 100 can use information acquired from another information source (e.g., billboard 60, imaging apparatus 100) as information to be sent therefrom.
  • information can be transmitted in a wide range without requiring increased traffic or communication cost.
  • The present invention is not limited to this example.
  • The present invention can also be applied to general electronic cameras (e.g., a mobile telephone equipped with a camera, or a mobile imaging apparatus equipped with a camera).
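The distance measurement itself, as set out in the claims, combines the size information transmitted in the modulated luminance with the apparent size of the light receiving area on the sensor. A pinhole-camera sketch of that calculation follows; the sensor width, focal length, and the one-dimensional treatment of the element ratio are all illustrative assumptions, not values from the patent:

```python
def estimate_distance_m(source_size_m, lit_elements, total_elements,
                        sensor_width_m=4.8e-3, focal_length_m=6.0e-3):
    """Estimate distance to a light source of known (transmitted) size.

    Pinhole model: image_size / focal_length == source_size / distance,
    so distance == focal_length * source_size / image_size.
    Sensor and lens parameters are illustrative assumptions.
    """
    # Apparent size on the sensor, from the ratio of light receiving
    # elements that detected the source (a 1-D simplification of the
    # claimed ratio of lit elements to all elements).
    image_size_m = (lit_elements / total_elements) * sensor_width_m
    return focal_length_m * source_size_m / image_size_m
```

With these example parameters, a 0.5 m wide source lighting 10% of a sensor row would be estimated at 6.25 m; a source lighting half as many elements would appear twice as far away.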

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)

Claims (14)

  1. Method, comprising the steps of:
    determining (S12) whether luminances received by at least one light receiving element among a plurality of light receiving elements of an apparatus are equal to or greater than a predetermined value;
    determining (S13) whether or not the luminances vary in time sequence when the luminances received by the at least one light receiving element are equal to or greater than the predetermined value;
    characterized by
    acquiring information (S2) relating to a size of a light source (33) from the time-sequential variation when the luminances vary in time sequence, wherein the information is contained in the time-sequentially varying luminances provided by the light source (33);
    detecting (S3) a size of a light receiving area based on a ratio of the light receiving elements having received a luminance of the predetermined value or more to the plurality of light receiving elements; and
    calculating (S35, S36) a distance between the apparatus and the light source (33) based on the size of the light receiving area and the acquired information.
  2. Method according to claim 1, further comprising the steps of: determining (S32, S33) a shape of the light receiving area, wherein the distance to the light source is calculated based on the size of the light receiving area, the acquired information, and the shape of the light receiving area.
  3. Method according to claim 1 or 2, further comprising the steps of: controlling (S44, S45) processings relating to an imaging operation based on the calculated distance.
  4. Method according to claim 3, wherein the processings relating to an imaging operation include processings for reproducing or storing (S44) the acquired information based on the calculated distance.
  5. Method according to claim 3 or 4, wherein the processings relating to an imaging operation include a processing (S44, S45) for displaying auxiliary information relating to imaging based on the calculated distance.
  6. Method according to any one of claims 3 to 5, wherein the processings relating to an imaging operation include a processing for controlling (S63-S66) a focusing section based on the calculated distance.
  7. Method according to any one of claims 3 to 6, wherein the processings relating to an imaging operation include a processing (S52, S53) for processing an image having undergone an imaging operation based on the calculated distance.
  8. Method according to any one of claims 3 to 7, wherein the processings relating to an imaging operation include a processing for controlling (S63-S66) an optical system mechanism based on the calculated distance.
  9. Method according to any one of claims 1 to 8, further comprising the steps of: controlling a display mode of the acquired information on a display section.
  10. Method according to claim 9, wherein the step of controlling the display mode includes a step of displaying the acquired information together with images having undergone an imaging operation by the plurality of light receiving elements.
  11. Method according to claim 9 or 10, wherein the step of controlling the display mode includes a step of displaying the acquired information in association with the image of the light source included in the images having undergone an imaging operation by the plurality of light receiving elements.
  12. Apparatus, comprising:
    a light receiving section (17) including a plurality of light receiving elements;
    a first determination section (14) configured to determine whether luminances received by at least one light receiving element among the plurality of light receiving elements are equal to or greater than a predetermined value;
    a second determination section (14) configured to determine whether or not the luminances vary in time sequence when the luminances received by the at least one light receiving element are equal to or greater than the predetermined value;
    characterized by
    an acquisition section (14) configured to acquire information relating to a size of a light source (33) from the time-sequential variation when the luminances vary in time sequence, wherein the information is contained in the time-sequentially varying luminances provided by the light source (33); a detection section (14) for detecting a size of a light receiving area based on a ratio of the light receiving elements having received a luminance of the predetermined value or more to the plurality of light receiving elements;
    a calculation section (14) configured to calculate a distance to the light source (33) based on the size of the light receiving area and the acquired information.
  13. Computer program product configured to perform the method steps according to any one of claims 1 to 11.
  14. System comprising:
    a light source (33) including:
    a modulation section (36); and
    a luminance section (34) for emitting light with a luminance modulated by said modulation section, and
    a light receiving apparatus (10) including:
    a light receiving section (17) in which a plurality of light receiving elements are regularly arranged to receive light from the light source;
    a demodulation section (14) configured to demodulate a luminance variation of the light received by said light receiving section;
    characterized in that
    the modulation section is configured to modulate information relating to a size of the light source (33) into a time-sequential luminance variation;
    the demodulation section is configured to demodulate the time-sequential luminance variation into the information relating to the size of the light source, and
    the light receiving apparatus further includes:
    a detection section (14) configured to detect a size of a light receiving area based on a ratio of the light receiving elements having received the light to the plurality of light receiving elements; and
    a measurement section (14) configured to measure, based on the size of the region detected by the detection section (14) and the size information demodulated by the demodulation section (14), a distance between the light source and a position of the light receiving apparatus (10).
EP07015216A 2006-08-03 2007-08-02 Procédé pour mesurer la distance à un objet Expired - Fee Related EP1884798B1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2006211589A JP4637066B2 (ja) 2006-08-03 2006-08-03 情報取得装置、情報取得方法、情報取得プログラム、及び、距離情報取得システム
JP2006261988A JP4210955B2 (ja) 2006-09-27 2006-09-27 撮像装置、撮像制御方法、及び、撮像制御プログラム
JP2006262078A JP4178419B2 (ja) 2006-09-27 2006-09-27 情報表示装置、表示制御方法、及び、表示制御プログラム

Publications (3)

Publication Number Publication Date
EP1884798A2 EP1884798A2 (fr) 2008-02-06
EP1884798A3 EP1884798A3 (fr) 2010-11-10
EP1884798B1 true EP1884798B1 (fr) 2012-01-18

Family

ID=38658167

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07015216A Expired - Fee Related EP1884798B1 (fr) 2006-08-03 2007-08-02 Procédé pour mesurer la distance à un objet

Country Status (2)

Country Link
US (1) US7724353B2 (fr)
EP (1) EP1884798B1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201015202A (en) * 2008-10-03 2010-04-16 Altek Corp Image pickup device capable of providing GPS coordinates of subject to be shot and method for detecting GPS coordinates thereof
DE102008061035C5 (de) * 2008-12-08 2013-09-05 Sick Ag Verfahren und optischer Sensor zur Erfassung von Objekten
US8374201B2 (en) * 2009-09-16 2013-02-12 Samsung Electronics Co., Ltd. Preamble design for supporting multiple topologies with visible light communication
US8803915B2 (en) * 2009-12-18 2014-08-12 Panasonic Intellectual Property Corporation Of America Information display device, integrated circuit for display control, display control program, and display control method
CN102484678B (zh) * 2010-08-20 2017-03-01 松下电器(美国)知识产权公司 接收显示装置、信息发送装置、光无线通信系统、接收显示用集成电路、信息发送用集成电路、接收显示程序、信息发送程序及光无线通信方法
US9127806B2 (en) 2012-01-26 2015-09-08 Hit Technologies Inc. Providing a rail mounting system for a mobile device case
JP5983184B2 (ja) * 2012-08-24 2016-08-31 ブラザー工業株式会社 画像処理システム、画像処理方法、画像処理装置、および画像処理プログラム
US11143749B2 (en) * 2014-05-23 2021-10-12 Signify Holding B.V. Object detection system and method
USD785614S1 (en) 2015-06-01 2017-05-02 Hit Technologies Inc. Shield for a mobile device case
CN107764233B (zh) 2016-08-15 2020-09-04 杭州海康威视数字技术股份有限公司 一种测量方法及装置
US10783265B2 (en) * 2016-09-28 2020-09-22 Samsung Electronics Co., Ltd. Data access device and apparatus comprising same
JP6897389B2 (ja) * 2017-07-25 2021-06-30 富士通株式会社 判別用コンピュータプログラム、判別装置及び判別方法ならびに通信システム
US11151737B1 (en) 2018-12-20 2021-10-19 X Development Llc Automatic field of view detection
JP7136141B2 (ja) * 2020-02-07 2022-09-13 カシオ計算機株式会社 情報管理装置、情報管理方法及びプログラム

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3617128A (en) * 1968-10-28 1971-11-02 Eastman Kodak Co Automatic rangefinder means
JPH06105291B2 (ja) 1987-04-24 1994-12-21 松下電器産業株式会社 レ−ザ測距装置
AU703825B2 (en) * 1995-08-07 1999-04-01 Komatsu Limited Distance measuring apparatus and shape measuring apparatus
JPH0996528A (ja) 1995-09-29 1997-04-08 Toyota Motor Corp 車間距離検出装置及び方法
JPH1090591A (ja) 1996-09-18 1998-04-10 Olympus Optical Co Ltd 焦点検出装置
JP3986220B2 (ja) 1999-10-19 2007-10-03 オリンパス株式会社 情報表示部材及びそれを用いた位置検出方法
US6577249B1 (en) 1999-10-19 2003-06-10 Olympus Optical Co., Ltd. Information display member, position detecting method using the same, apparatus and method of presenting related information, and information presenting apparatus and information presenting method
JP3713171B2 (ja) 1999-11-10 2005-11-02 富士通株式会社 車両走行制御システムおよび車両制御装置
JP3666642B2 (ja) 2000-01-14 2005-06-29 富士電機システムズ株式会社 移動体の位置検出方法
JP3750132B2 (ja) * 2000-03-01 2006-03-01 カシオ計算機株式会社 撮像装置
JP2005077291A (ja) 2003-09-02 2005-03-24 Nippon Gps Solutions Corp 三次元測位システム
US7064810B2 (en) * 2003-09-15 2006-06-20 Deere & Company Optical range finder with directed attention
US20050206874A1 (en) * 2004-03-19 2005-09-22 Dougherty Robert P Apparatus and method for determining the range of remote point light sources

Also Published As

Publication number Publication date
US7724353B2 (en) 2010-05-25
EP1884798A3 (fr) 2010-11-10
US20080030711A1 (en) 2008-02-07
EP1884798A2 (fr) 2008-02-06

Similar Documents

Publication Publication Date Title
EP1884798B1 (fr) Procédé pour mesurer la distance à un objet
KR102019412B1 (ko) 카메라의 공간적 특성들을 결정하는 방법 및 시스템
CN101267501B (zh) 影像信息处理装置
US9843810B2 (en) Method of using laser scanned point clouds to create selective compression masks
US20090262071A1 (en) Information Output Apparatus
JP5404956B2 (ja) 情報配信装置、情報配信システム、情報配信方法及び携帯通信機
CN104205175A (zh) 信息处理装置,信息处理系统及信息处理方法
JP4178419B2 (ja) 情報表示装置、表示制御方法、及び、表示制御プログラム
CN106251799B (zh) 驱动显示装置的方法
JP2012068481A (ja) 拡張現実表現システムおよび方法
JP7043681B2 (ja) 光通信装置を用いた自律移動可能な機器の案内方法
CN109842790B (zh) 影像信息显示方法与显示器
US20120154443A1 (en) Reception display apparatus, information transmission apparatus, optical wireless communication system, reception display integrated circuit, information transmission integrated circuit, reception display program, information transmission program, and optical wireless communication method
JP2006251596A (ja) 視覚障害者支援装置
KR101690311B1 (ko) 증강현실객체 배치, 공유 및 전시 시스템 및 배치, 공유 및 전시 방법
JPH08129216A (ja) 撮影情報を記録可能なカメラ
Rafian et al. Remote sighted assistants for indoor location sensing of visually impaired pedestrians
JP4637066B2 (ja) 情報取得装置、情報取得方法、情報取得プログラム、及び、距離情報取得システム
JP2004257872A (ja) 位置情報取得システム、位置情報取得装置、位置情報取得方法、及びプログラム
JP4210955B2 (ja) 撮像装置、撮像制御方法、及び、撮像制御プログラム
JP2018105799A (ja) 測定装置、方法およびプログラム
JP2009239621A (ja) 画像上方位表示方法及び装置並びに写真
TW201926018A (zh) 影像資訊顯示方法、影像資訊顯示系統與顯示器
US20040183904A1 (en) Enhanced, downlink-capable, fire-data gathering and monitoring
JP2003244488A (ja) 画像情報付加装置及びシステム

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070802

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK YU

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

AKX Designation fees paid

Designated state(s): DE FR GB

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 1/70 20060101ALI20110610BHEP

Ipc: G01S 11/12 20060101AFI20110610BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602007020102

Country of ref document: DE

Effective date: 20120315

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20121019

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602007020102

Country of ref document: DE

Effective date: 20121019

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 10

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20170628

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20170828

Year of fee payment: 11

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20180802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180802

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20190723

Year of fee payment: 13

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602007020102

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210302