WO2008001503A1 - Communication party identifying apparatus, communication party identifying method, and communication party identifying program - Google Patents

Communication party identifying apparatus, communication party identifying method, and communication party identifying program

Info

Publication number
WO2008001503A1
WO2008001503A1 (PCT/JP2007/000712)
Authority
WO
WIPO (PCT)
Prior art keywords
communication
communication partner
appearance
feature
relative position
Prior art date
Application number
PCT/JP2007/000712
Other languages
French (fr)
Japanese (ja)
Inventor
Shohei Nomoto
Original Assignee
Nec Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nec Corporation
Publication of WO2008001503A1
Priority to ATE461867T1 (AT 08014843 T)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to a communication partner identification device, a communication partner identification method, and a communication partner identification program, and in particular to a device, method, and program for identifying a communication partner using information on the appearance shape of the communication partner.
  • ITS (Intelligent Transport System)
  • In a known ITS scheme, vehicle control information and position information acquired by GPS are transmitted to a base station via wireless communication. Based on the received position information, the base station identifies the positional relationship among a plurality of vehicles and transmits the control information of a preceding vehicle to a specific succeeding vehicle via wireless communication. The succeeding vehicle is automatically controlled based on the control information of the preceding vehicle received by wireless communication. As a result, when the preceding vehicle brakes suddenly, the brakes of the following vehicle are applied automatically, thereby avoiding a rear-end collision.
  • GPS (Global Positioning System)
  • an image of the surroundings of the vehicle taken by the camera and the position information acquired by the GPS are transmitted to the surrounding vehicles via wireless communication.
  • The vehicle that has received the image and the position information via wireless communication compares the position information of the host vehicle acquired by GPS with the position information received by wireless communication, thereby identifying the positional relationship between the transmission source and the host vehicle, and displays the image received via wireless communication on a display according to that relationship.
  • In this technique, a camera unit monitors the number plate of the following vehicle from the preceding vehicle, and an ID signal, a synchronization signal, and the information to be transmitted are sent from the preceding vehicle to the following vehicle.
  • By confirming that the information from the preceding vehicle concerns the own vehicle, technology has been developed in which the vehicles communicate with each other and avoid accidents.
  • Patent Document 5 discloses a technique in which the current orientation and size of a subject are predicted from the orientation and size of the subject calculated in an image taken before the comparison target image, together with the vehicle speed and the steering angle, and a template to be used for comparison is selected using the prediction information.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2002-222491
  • Patent Document 2: Japanese Patent Laid-Open No. 2006-031328
  • Patent Document 3: Japanese Patent Laid-Open No. 5-2700
  • Patent Document 4: Japanese Patent Laid-Open No. 9-98125
  • Patent Document 5: Japanese Unexamined Patent Publication No. 2005-318546
  • According to Patent Document 4, it is possible to establish a communication transmission path between the preceding vehicle and the succeeding vehicle and perform mutual communication to avoid an accident or the like.
  • However, the mutual communication is limited to vehicles located directly in front of and behind each other, and communication cannot be established with a vehicle traveling diagonally behind the host vehicle. Therefore, the technique of Patent Document 4 has the problem that danger cannot be avoided in advance when a vehicle traveling in front of or behind the host vehicle changes lanes.
  • In Patent Document 5, the criterion for selecting a template is limited to information about the own vehicle only, so there remains the problem that it is difficult to cope with traffic conditions that change from moment to moment.
  • An object of the present invention is to provide a communication partner identification device, a communication partner identification method, and a communication partner identification program that solve the above-mentioned problems while taking the protection of personal information into consideration.
  • The communication partner identification device of the present invention is arranged in each of the own device and other devices, and identifies another device that is the communication partner from among the other devices located around the own device. It comprises:
  • An external feature extraction unit that analyzes an image obtained by capturing an external shape of another device located around the device, and extracts features of the external shape of the other device from the image;
  • Appearance feature comparison means for comparing the appearance shape information extracted by the appearance feature extraction means of the other device with the features of the appearance shape possessed by the own device;
  • an identification unit that identifies the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means of the other device.
  • The communication means of the own device may transmit the feature data of the appearance shape of the other device, extracted by the appearance feature extraction means of the own device, to the other device, and may receive the comparison result produced by the appearance feature comparison means of the other device.
  • The device may further have imaging region detection means for analyzing the position data of another device in an image obtained by photographing the periphery of the own device, and relative position calculation means for calculating the relative position of the other device with respect to the own device based on the position data of the other device.
  • A configuration may also be adopted in which means is provided for selecting feature data of the appearance shape corresponding to the relative position and transmitting the selected feature data of the appearance shape to the appearance feature extraction means of the other device.
  • a configuration may be adopted in which a communication partner is selected based on the information on the relative position, and information on the selected communication partner is transmitted to the communication unit of the own apparatus.
  • the communication partner specifying method of the present invention is a communication partner specifying method for specifying another device as a communication partner from any other device located around the own device,
  • It executes an appearance feature comparison step that compares the appearance shape information with the features of the appearance shape of the own device, and an identification step that identifies the other device of the communication partner based on the comparison result sent from the appearance feature comparison means of the other device.
  • the feature data of the external shape of the other device extracted by the own device may be transmitted to the other device, and the external feature comparison result in the other device may be received.
  • A configuration is also possible in which a relative position calculation step of calculating the relative position of the other device with respect to the own device is further executed.
  • It is also possible to select feature data of the appearance shape corresponding to the relative position and compare the features of the appearance shape based on the selected feature data.
  • a communication partner may be selected based on the information on the relative position, and the information may be transmitted to the selected communication partner.
  • The communication partner identification program according to the present invention is executed by a computer constituting a communication partner identification device that identifies another device serving as a communication partner from among the other devices located around the own device.
  • The computer may transmit the feature data of the appearance shape of the other device extracted by the own device to the other device, and may receive the appearance feature comparison result from the other device.
  • The computer may also be configured to execute a function of analyzing the position data of the other device in an image obtained by photographing the periphery of the own device, and a function of calculating the relative position of the other device with respect to the own device based on that position data.
  • The computer may also be configured to execute a function of selecting feature data of the appearance shape corresponding to the relative position and comparing the features of the appearance shape based on the selected feature data.
  • the computer may be configured to execute a function of selecting a communication partner based on the relative position information and transmitting the information to the selected communication partner.
  • The communication partner identification device of the present invention is arranged in each of the own device and other devices, and identifies another device as a communication partner from among the other devices located in the surroundings of the own device.
  • the communication partner specifying method of the present invention is a communication partner specifying method for specifying another device as a communication partner from any other device located around the own device,
  • The communication partner identification program according to the present invention is executed by a computer constituting a communication partner identification device that identifies another device serving as a communication partner from among the other devices located around the own device.
  • the communication device of the communication partner can be accurately identified by using the image information of the external shape of the communication device.
  • By using an image of the appearance of the communication device to perform communication, the other party's communication device can be identified with high accuracy.
  • The positional relationship with the communication device of the communication partner can be accurately identified. Also, it is possible to communicate only with a specific communication device by broadcasting an appearance image of the communication device to be communicated with.
  • FIG. 1 is a block diagram for explaining an operation of specifying a communication partner using the communication partner specifying apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing a configuration of a communication partner identifying apparatus according to Embodiment 1 of the present invention.
  • FIG. 3 is a diagram for explaining a positional relationship among a plurality of communication partner identifying apparatuses.
  • FIG. 4 is a flowchart explaining a communication partner specifying method according to Embodiment 1 of the present invention.
  • FIG. 5 is a flowchart explaining a communication partner specifying method according to Embodiment 1 of the present invention.
  • FIG. 6 is a block diagram for explaining an operation of identifying a communication partner using the communication partner identifying apparatus according to Embodiment 2 of the present invention.
  • FIG. 7 is a block diagram showing a configuration of a communication partner identifying apparatus according to Embodiment 2 of the present invention.
  • FIG. 8 is a diagram for explaining a positional relationship among a plurality of communication partner specifying devices.
  • FIG. 9 is a diagram illustrating a process of calculating a relative position.
  • FIG. 10 is a flowchart for explaining a communication partner specifying method according to Embodiment 2 of the present invention.
  • FIG. 11 is a flowchart for explaining a communication partner specifying method according to Embodiment 2 of the present invention.
  • FIG. 12 is a block diagram for explaining an operation of specifying a communication partner using the communication partner specifying apparatus according to Embodiment 3 of the present invention.
  • FIG. 13 is a block diagram showing a configuration of a communication partner identifying apparatus according to Embodiment 3 of the present invention.
  • FIG. 14 is a flowchart for explaining a communication partner specifying method according to Embodiment 3 of the present invention.
  • FIG. 15 is a block diagram illustrating an operation of specifying a communication partner using the communication partner specifying apparatus according to Embodiment 4 of the present invention.
  • FIG. 16 is a block diagram showing a configuration of a communication partner identifying apparatus according to Embodiment 4 of the present invention.
  • FIG. 17 is a flowchart for explaining a communication partner specifying method according to Embodiment 4 of the present invention.
  • FIG. 18 is a block diagram showing a configuration of a communication partner identifying apparatus according to Embodiment 5 of the present invention.
  • FIG. 19 is a diagram for explaining a positional relationship among a plurality of communication partner specifying devices.
  • FIG. 20 is a flowchart for explaining a communication partner specifying method according to Embodiment 5 of the present invention.
  • FIG. 21 is a flowchart for explaining a communication partner specifying method according to Embodiment 5 of the present invention.
  • The first embodiment uses communication partner identification devices A to C, arranged in the own device (own automobile) and in other devices (other automobiles), to identify another device (another automobile) that is a communication partner from among the other devices located around the own device.
  • The communication partner identification devices A to C each comprise: communication means 11 for exchanging information between the own device and other devices (automobiles); appearance feature extraction means 12 that analyzes an image obtained by photographing the appearance shape of another device (another automobile) located around the own device (own automobile) and extracts the features of the appearance shape of the other device from the image; appearance feature comparison means 13 that compares the appearance shape information extracted by the appearance feature extraction means 12 of the other device (another automobile) with the features of the appearance shape of the own device (own automobile); and control means 10 that identifies the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means 13 of the other device (another automobile).
  • As shown in FIG. 2, the communication partner identification device includes control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, and storage means 14.
  • The control means 10, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, and the storage means 14 are connected by a system bus.
  • the communication partner identification device shown in Fig. 2 is installed in its own device and in the other device as the communication partner.
  • the block indicated by a dotted line indicates that the communication partner is in a suspended state
  • the block indicated by a solid line indicates that it is in an operating state.
  • the X on communication path R in Fig. 1 indicates that no personal information is sent.
  • The control means 100 of communication partner identification device A shown in FIG. 1, the control means 200 of communication partner identification device B, and the control means 300 of communication partner identification device C correspond to the control means 10 of FIG. 2.
  • The communication means 101 of communication partner identification device A shown in FIG. 1, the communication means 201 of communication partner identification device B, and the communication means 301 of communication partner identification device C correspond to the communication means 11 of FIG. 2.
  • The appearance feature extraction means 102 of communication partner identification device A shown in FIG. 1, the appearance feature extraction means 202 of communication partner identification device B, and the appearance feature extraction means 302 of communication partner identification device C correspond to the appearance feature extraction means 12 of FIG. 2.
  • The appearance feature comparison means 103 of communication partner identification device A shown in FIG. 1, the appearance feature comparison means 203 of communication partner identification device B, and the appearance feature comparison means 303 of communication partner identification device C correspond to the appearance feature comparison means 13 of FIG. 2.
  • The storage means 104 of communication partner identification device A shown in FIG. 1, the storage means 204 of communication partner identification device B, and the storage means 304 of communication partner identification device C correspond to the storage means 14 of FIG. 2.
  • The control means 10 controls the operations of the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, and the storage means 14. The control means 10 also has an interface with an external device (not shown), receives an image of the surroundings of the own device and various sensor information via the interface, and outputs data such as the communication partner identification result. Further, the control means 10 identifies the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means of the other device.
  • a camera is used as an external device that captures an image around the device itself.
  • For example, the camera (external device) is attached to the rear of the vehicle and images other vehicles traveling right behind or obliquely behind the vehicle.
  • Alternatively, the camera is attached in an orientation that covers the surroundings of the vehicle and images the surrounding situation. Since a general technique is applied to a camera that captures the surroundings of the own device, a detailed description of its configuration and imaging method is omitted.
  • The communication means 11 transmits and receives the extracted feature data of the appearance shape and the comparison results of the appearance shape features using communication standards such as IEEE 802.11b and UWB.
  • the appearance feature extraction unit 12 extracts the feature of the external shape of the other device that is reflected in the image around the own device received by the control unit 10 from the external device. Images taken around the device include unnecessary backgrounds as well as other devices. Therefore, the appearance feature extraction unit 12 cuts out the target other device from the image, and extracts the feature of the appearance shape of the other device from the cut out image. Since a general technique is applied to a configuration and method for extracting features of another target device from an image, detailed description of the configuration and extraction method is omitted.
  • the external feature extraction unit 12 sends the extracted external shape feature data of the other device to the communication means 11.
  • the outer shape of the vehicle, the vehicle type, and the license plate are used as the characteristics of the vehicle outer shape.
  • Examples of the outer shape of the vehicle include the vehicle silhouette, the tail lamp mounting position, the shapes of the front grille and rear grille, the vehicle width and height, the front and rear tire mounting width, and the window frame shape.
  • The vehicle type (for example, truck, motorcycle, or sedan) is estimated by pattern matching against the outer shape of the vehicle.
  • the appearance color is extracted as a feature of the vehicle.
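  • A minimal illustrative sketch (not taken from the patent text) of how such appearance features might be quantified, assuming a simple average-color and silhouette-area representation of the cropped vehicle image; the concrete extraction method is left open in the description:

```python
import numpy as np

def extract_appearance_features(cropped_bgr: np.ndarray) -> dict:
    """Quantify rough appearance features of a vehicle cut out of a camera image.

    Assumption: `cropped_bgr` is an H x W x 3 uint8 array containing only the
    target vehicle region. The chosen features (mean color, bounding size,
    silhouette area ratio) stand in for the silhouette, grille shape, lamp
    position, etc. mentioned in the description.
    """
    h, w, _ = cropped_bgr.shape
    mean_color = cropped_bgr.reshape(-1, 3).mean(axis=0)   # digitized color feature
    gray = cropped_bgr.mean(axis=2)
    silhouette = gray > gray.mean()                        # crude foreground mask
    area_ratio = float(silhouette.sum()) / float(h * w)    # area enclosed by the outline
    return {
        "mean_color": mean_color.tolist(),
        "width_px": w,        # apparent width in pixels
        "height_px": h,       # apparent height in pixels
        "area_ratio": area_ratio,
    }
```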
  • The appearance feature comparison means 13 compares the features of the appearance shape of the other device, received via the communication means 11, with the features of the appearance shape of the own device stored in the storage means 14, and transmits the comparison result to the communication partner via the communication means 11.
  • If the appearance shape features of the other device extracted by the appearance feature extraction means 12 of the communication partner match the appearance shape features of the own device stored in the storage means 14, the appearance feature comparison means 13 sends, via the communication means 11, the comparison result that the object captured in the image taken by the communication partner is the own device.
  • If the appearance shape features of the other device extracted by the appearance feature extraction means 12 of the communication partner do not match the appearance shape features of the own device stored in the storage means 14, the appearance feature comparison means 13 sends, via the communication means 11, the comparison result that the object reflected in the image taken by the communication partner is not the own device.
  • An example of a method by which the appearance feature comparison means 13 compares the features of the appearance shape of the other device, extracted by the appearance feature extraction means 12 of the communication partner and received via the communication means 11, with the features of the appearance shape of the own device stored in the storage means 14 is shown below.
  • The own device receives the appearance features of the communication device (vehicle) extracted from an image by another device traveling nearby, and compares the received appearance features with the appearance features of the own device that it holds, to determine whether they match. The comparison starts from rough features (color, size, shape, etc.), and a degree of coincidence is calculated for each feature. If the calculated degree of coincidence is less than a specified value, it is determined that the sent image and the own device are inconsistent; if the degree of coincidence is greater than the specified value, the next feature is compared. If none of the features stored in the storage means 14 is judged to be inconsistent, it is judged that the sent image matches the own device.
  • When color is used as a feature of the appearance shape of the vehicle, the color can be digitized and the color difference can be calculated and evaluated.
  • When the outer shape is used as a feature, a predetermined analysis is performed and the feature parameters of the outer shapes are compared. For this parameter comparison, for example, regions where the pixel values (density values) of the image change by more than a certain amount are extracted as the vehicle outline and partial lines, and the area of the part enclosed by those lines is calculated, so that the outer shape portion can be specified.
  • By quantifying the features of the appearance shape in this way, it is possible to compare the features of the appearance shape extracted from the image acquired by the other device with the corresponding features of the appearance shape of the own device.
  • The value obtained as a result of this comparison is taken as the degree of coincidence. For example, if a perfect match corresponds to 100% and the specified value is 60%, it can be determined that the features of the two appearance shapes are inconsistent when the degree of coincidence is 60% or less.
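  • A minimal sketch of the comparison procedure described above, assuming the hypothetical feature dictionaries from the previous sketch, comparison from rough to fine features, and a 60% threshold on the degree of coincidence; the feature names and scales are illustrative assumptions:

```python
def coincidence(own: float, received: float, scale: float) -> float:
    """Degree of coincidence in percent for one numeric feature (100 = perfect match)."""
    return max(0.0, 100.0 * (1.0 - abs(own - received) / scale))

def matches_own_device(own_features: dict, received_features: dict,
                       threshold: float = 60.0) -> bool:
    """Return True if the received appearance features are judged to describe the own device.

    The comparison starts from rough features and stops as soon as one feature
    falls at or below the specified value, as in the description above.
    """
    checks = [
        ("mean_color", 255.0),   # color difference, evaluated per channel below
        ("area_ratio", 1.0),
        ("width_px", 200.0),
        ("height_px", 200.0),
    ]
    for name, scale in checks:
        own_val, recv_val = own_features[name], received_features[name]
        if name == "mean_color":
            degree = min(coincidence(o, r, scale) for o, r in zip(own_val, recv_val))
        else:
            degree = coincidence(float(own_val), float(recv_val), scale)
        if degree <= threshold:          # 60% or less: judged inconsistent
            return False
    return True
```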
  • If the features of the appearance shape of the other device extracted by the appearance feature extraction means 12 of the communication partner match the features of the appearance shape of the own device stored in the storage means 14, the communication means 11 sends to the communication partner the comparison result that the object reflected in the image taken by the communication partner is the own device.
  • In this way, the appearance feature comparison means 13 compares the features of the appearance shape of the other device extracted by the appearance feature extraction means 12 of the communication partner with the features of the appearance shape of the own device stored in the storage means 14; that is, the comparison of the appearance shape features is performed by the own device. For this reason, appearance feature data that could include personal information is not sent to the device that captured the image. Therefore, personal information contained in the appearance shape features is not leaked to the communication partner, and the present invention is excellent in terms of protecting personal information.
  • Only the result of comparing the appearance shape features of the other device extracted by the appearance feature extraction means 12 of the communication partner with the appearance shape features of the own device stored in the storage means 14 is sent to the communication partner.
  • The comparison result is information on whether or not the object captured in the image taken by the communication partner is the own device, that is, "YES" or "NO" information, and does not contain personal information. Therefore, even if the comparison results of the appearance shape features are exchanged, personal information is not leaked to the communication partner, and the present invention is excellent in terms of protecting personal information.
  • The storage means 14 provides a working storage area required when the control means 10, the communication means 11, the appearance feature extraction means 12, and the appearance feature comparison means 13 execute their operations. The storage means 14 also stores data on the appearance shape of the own device and identification ID data for identifying the device. The information stored in the storage means 14 is only the minimum necessary appearance feature data and identification ID data, so the required storage capacity is extremely small.
  • the communication partner identifying apparatus according to the first embodiment shown in FIG. 1 is constructed as hardware, but is not limited thereto.
  • The communication partner identification device according to Embodiment 1 may instead be constructed as software. In this case, a communication partner identification program is incorporated into the computer constituting the communication partner identification device shown in FIG. 1, and the program is read out and executed by the CPU of the computer, so that the computer realizes the functions of the control means 10, the appearance feature extraction means 12, and the appearance feature comparison means 13.
  • The communication partner identification method will be described for the case where vehicles (automobiles) traveling on a road serve as the own device and the other devices, and each vehicle is equipped with the communication partner identification device shown in FIG. 2.
  • In FIG. 3, other vehicles (other devices) travel behind and diagonally behind the host vehicle (own device).
  • The communication partner identification device mounted on the own device is referred to as own communication device A, and the communication partner identification devices mounted on the other devices are referred to as other communication devices B and C.
  • In FIG. 3, it is assumed that a communication path R is formed between the own communication device A and the other communication device B, and communication is performed between them.
  • The control means 100 of the own communication device A acquires from an external device (camera, etc.) an image obtained by photographing the surroundings of the own vehicle, and transmits the acquired image data to the appearance feature extraction means 102 (step S1 in FIG. 4).
  • The appearance feature extraction means 102 of the own communication device A extracts the features of the appearance shape of the other communication device (B or C) reflected in the image acquired from the external device and transmitted from the control means 100, and transmits the extracted feature data of the appearance shape to the communication means 101 (step S2 in FIG. 4).
  • the communication unit 101 transmits the feature data of the outer shape transmitted from the appearance feature extraction unit 102 to the other communication device B of the communication partner (step S3 in Fig. 4).
  • The communication means 201 of the other communication device B in communication with the own communication device A (connected by the communication path R) receives the feature data of the appearance shape extracted by the own communication device A, and transmits the received feature data to the appearance feature comparison means 203 of the other communication device B (step S4 in FIG. 5).
  • The appearance feature comparison means 203 of the other communication device B compares the appearance shape feature data transmitted from the own communication device A with the appearance shape feature data of the other communication device B stored in advance in the storage means 204 (step S5 in FIG. 5).
  • If, in step S5 of FIG. 5, the appearance shape features extracted by the appearance feature extraction means 102 of the communication partner match the appearance shape features of the other communication device (own device) B stored in the storage means 204, the appearance feature comparison means 203 transmits to the communication partner, via the communication means 201, the comparison result that the object captured in the image taken by the communication partner is the other communication device (own device) B, together with the identification ID identifying the other communication device B stored in the storage means 204 (step S6 in FIG. 5).
  • If, in step S5 of FIG. 5, the appearance shape features extracted by the appearance feature extraction means 102 of the communication partner do not match the appearance shape features of the other communication device (own device) B stored in the storage means 204, the appearance feature comparison means 203 transmits to the communication partner, via the communication means 201, the comparison result that the object reflected in the image taken by the communication partner is not the other communication device (own device) B, together with the identification ID identifying the other communication device B stored in the storage means 204 (step S6 in FIG. 5).
  • The communication means 101 of the own communication device A, connected to the communication means 201 of the other communication device B via the communication path R, receives the comparison result and the identification ID identifying the other communication device B from the communication means 201 of the communication partner, and transmits the received comparison result and identification ID to the control means 100 (step S7 in FIG. 4).
  • Based on the comparison result transmitted from the communication means 101, the control means 100 of the own communication device A detects whether the appearance shape features extracted by the appearance feature extraction means 102 of the own communication device A match the appearance shape features of the other communication device B (step S8 in FIG. 4).
  • If, in step S8 of FIG. 4, the control means 100 of the own communication device A determines that the comparison result transmitted from the other communication device B indicates a match, that is, that the vehicle captured in the image taken by the external device of the own communication device A is the other communication device B of the communication partner (step S8 in FIG. 4; YES), this fact is stored in the storage means 104 in association with the identification ID of the other communication device B (step S9 in FIG. 4).
  • If, in step S8 of FIG. 4, the control means 100 of the own communication device A determines that the comparison result transmitted from the other communication device B indicates a mismatch, that is, that the appearance shape features transmitted from the own communication device A do not match the appearance shape features of the other communication device B captured in the image (step S8 in FIG. 4; NO), the process returns to step S1 in FIG. 4, an image of the surroundings of the own communication device A is acquired again from the external device, and the processing continues (steps S1 to S3 in FIG. 4).
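  • A minimal end-to-end sketch of the exchange in steps S1 to S9 (FIGS. 4 and 5), reusing the hypothetical helper functions above and assuming generic `camera`, `transport`, and `storage` interfaces; only the YES/NO result and the identification ID cross the communication path, never the feature data held by the receiving side:

```python
# Own communication device A (FIG. 4): photograph, extract, send, evaluate the reply.
def run_device_a(camera, transport, storage: dict):
    image = camera.capture()                                   # step S1
    features = extract_appearance_features(image)              # step S2: features of the vehicle in view
    transport.send({"features": features})                     # step S3
    reply = transport.receive()                                # step S7: {"match": bool, "id": ...}
    if reply["match"]:                                         # step S8: YES
        storage[reply["id"]] = "vehicle in image is partner"   # step S9
        return reply["id"]
    return None                                                # step S8: NO, retry from step S1

# Other communication device B (FIG. 5): compare against its own stored features.
def run_device_b(transport, own_features: dict, own_id: str):
    message = transport.receive()                              # step S4
    match = matches_own_device(own_features, message["features"])  # step S5
    transport.send({"match": match, "id": own_id})             # step S6: YES/NO plus ID only
```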
  • the communication device of the communication partner can be accurately identified.
  • Because the communication partner is identified based on features that appear over the entire external appearance of the own and other devices, rather than on a characteristic portion that appears only in a specific part, the communication partner can be identified without being affected by the relative position of the devices.
  • the process of capturing an image of a specific object and the process of extracting features from the captured image are performed by one device, and the features are compared by the other device.
  • The storage means only stores the feature data of the external appearance of the device itself and the identification ID data, so the information stored in the storage means is the minimum necessary and the storage capacity can be kept extremely low. Therefore, the present invention can be applied without changing the specifications of an existing car navigation system.
  • The appearance feature comparison means of the other communication device compares the appearance shape features extracted by the appearance feature extraction means of the communication partner's own communication device with the appearance shape features stored in the storage means of the other communication device; that is, the comparison of appearance shape features is performed by the device being photographed itself. For this reason, appearance shape feature data that could include personal information is not transmitted to the imaging device. Therefore, it is possible to prevent personal information contained in the appearance shape features from being unnecessarily diffused, and to strengthen the protection of personal information.
  • the comparison result compared by the appearance feature comparison means of the other communication device of the communication partner is transmitted to the own communication device.
  • the comparison result is information on whether or not the object captured in the image taken by the communication partner is the device itself, that is, information of “YES” or “NO”, and this information includes personal information. Is not included. Therefore, it is possible to enhance the protection of personal information even if the comparison results of the appearance shape characteristics are exchanged.
  • an expiration date is set for the calculated relative position data, and the relative position data is managed.
  • As shown in FIG. 7, the communication partner identification device of Embodiment 2 further includes photographing means 15, imaging region detection means 16, and relative position calculation means 17.
  • the circuit configuration is as described above.
  • The control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, storage means 14, photographing means 15, imaging region detection means 16, and relative position calculation means 17 are connected by a system bus.
  • the communication partner identification device shown in FIG. 7 is installed in the own device and the other device as the communication partner.
  • the block indicated by the dotted line indicates that the communication partner is in a suspended state
  • the block indicated by a solid line indicates that it is in an operating state.
  • The control means 100 of communication partner identification device A shown in FIG. 6, the control means 200 of communication partner identification device B, and the control means 300 of communication partner identification device C correspond to the control means 10 of FIG. 7.
  • The communication means 101 of communication partner identification device A shown in FIG. 6, the communication means 201 of communication partner identification device B, and the communication means 301 of communication partner identification device C correspond to the communication means 11 of FIG. 7.
  • The appearance feature extraction means 102 of communication partner identification device A shown in FIG. 6, the appearance feature extraction means 202 of communication partner identification device B, and the appearance feature extraction means 302 of communication partner identification device C correspond to the appearance feature extraction means 12 of FIG. 7.
  • The appearance feature comparison means 103 of communication partner identification device A shown in FIG. 6, the appearance feature comparison means 203 of communication partner identification device B, and the appearance feature comparison means 303 of communication partner identification device C correspond to the appearance feature comparison means 13 of FIG. 7.
  • The storage means 104 of communication partner identification device A shown in FIG. 6, the storage means 204 of communication partner identification device B, and the storage means 304 of communication partner identification device C correspond to the storage means 14 of FIG. 7.
  • The photographing means 105 of communication partner identification device A shown in FIG. 6, the photographing means 205 of communication partner identification device B, and the photographing means 305 of communication partner identification device C correspond to the photographing means 15 of FIG. 7.
  • The imaging region detection means 106 of communication partner identification device A shown in FIG. 6, the imaging region detection means 206 of communication partner identification device B, and the imaging region detection means 306 of communication partner identification device C correspond to the imaging region detection means 16 of FIG. 7.
  • The relative position calculation means 107 of communication partner identification device A shown in FIG. 6, the relative position calculation means 207 of communication partner identification device B, and the relative position calculation means 307 of communication partner identification device C correspond to the relative position calculation means 17 of FIG. 7.
  • the photographing means 15 obtains image data obtained by photographing the entire surroundings of the own device with the entire periphery of the own device as the photographing range.
  • the imaging means 15 adds imaging parameters to the acquired image data, and transmits these data to the imaging area detection means 16.
  • The shooting parameters are information about the camera that captured the image to be transmitted, such as the mounting position of the camera on the device and the direction in which the camera was facing when it shot. Since general camera technology is applied to the photographing means 15, details of the configuration, imaging method, imaging parameters, and so on are omitted.
  • the imaging means 15 has a configuration in which a plurality of cameras are installed around the own device and the image of the entire surroundings of the own device is acquired to acquire image data, or a single camera is installed on the own device. Any configuration may be employed in which image data is acquired by photographing the situation around the entire device while moving to the surroundings. In short, the configuration of the imaging means 15 may be any configuration as long as it can acquire image data by capturing the situation around the entire device.
  • In this embodiment, the photographing means 15 is installed as one component of the communication partner identification device, but the present invention is not limited to this; the image data of the entire surroundings of the own device and the imaging parameters corresponding to that image data may instead be acquired from an external device via the interface.
  • The imaging region detection means 16 analyzes the image based on the imaging parameters and image transmitted from the photographing means 15 to detect the area where another device is captured in the image, transmits the position of the other device in that area of the image to the relative position calculation means 17, and transmits the section image cut out from the area to the appearance feature extraction means 12.
  • The position of the other device is identified using, for example, the coordinates of the center of gravity of the detected other device, the coordinate group of the point sequence that constitutes the detected silhouette of the other device, or the coordinates of a rectangular area that encloses the silhouette, but is not limited to these. Since a general technique is applied to specifying the position of the other device, a detailed description is omitted.
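  • A minimal sketch of representing the position of a detected device in the image by its centroid and bounding rectangle, assuming a boolean silhouette mask produced by the imaging region detection step:

```python
import numpy as np

def device_position(silhouette: np.ndarray) -> dict:
    """Centroid and bounding rectangle of a detected device.

    Assumption: `silhouette` is a boolean H x W mask of the other device.
    """
    ys, xs = np.nonzero(silhouette)
    return {
        # center-of-gravity coordinates (xg, yg) in the image
        "centroid": (float(xs.mean()), float(ys.mean())),
        # rectangular area enclosing the silhouette: (x_min, y_min, x_max, y_max)
        "bbox": (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())),
    }
```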
  • The relative position calculation means 17 calculates the relative position of the other device with respect to the own device using the position of the other device in the image transmitted from the imaging region detection means 16 and the imaging parameters transmitted from the photographing means 15.
  • the relative position data calculated by the relative position calculation means 17 is stored in the storage means 14.
  • In the image received by the relative position calculation means 17 from the photographing means 15, another device such as a vehicle appears, for example, at the position (xg, yg) in the screen coordinate system with horizontal axis Xg and vertical axis Yg (FIG. 9(a)).
  • The relative position calculation means 17 applies the projective transformation shown in FIG. 9(b) to the screen coordinate system shown in FIG. 9(a), transforming it onto the real-world camera coordinate system (Xp, Yp) shown in FIG. 9(c).
  • Relative position calculation means 17 derives the relationship between yp and yg based on the following equation when performing projective transformation to the real-world camera coordinate system shown in Fig. 9 (c).
  • the relative position calculation means 17 converts the image coordinate system shown in Fig. 9 (a) into a real world camera coordinate system by projective transformation shown in Fig. 9 (b). Then, the relationship between the own device (for example, the own vehicle) and the other device (for example, another vehicle) is plotted on the real world coordinate system with the camera (imaging means 15) as the center.
  • Here, (Xv, Yv) is the real-world vehicle coordinate system.
  • The relative position calculation means 17 calculates the coordinate position (xv, yv) of the other vehicle of the communication partner based on the following equation.
  • the relative position calculation method described above is merely an example, and the present invention is not limited to this. Since a general technique is applied to the relative position calculation method, detailed description of the relative position calculation methods other than those described above is omitted.
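  • The equations themselves appear only in the figures, so the following is a minimal sketch of one common flat-road back-projection, assuming a camera of known height, focal length, and mounting offset; it stands in for the projective transformation of FIG. 9 but is not the patent's own formula:

```python
import math

def image_to_vehicle_coords(xg: float, yg: float,
                            focal_px: float, cam_height_m: float,
                            cx: float, cy: float,
                            cam_offset_m: tuple = (0.0, 0.0),
                            cam_yaw_rad: float = 0.0) -> tuple:
    """Project an image point of a road-contact feature to vehicle coordinates (xv, yv).

    Assumptions: pinhole camera looking horizontally over a flat road, the image
    point lies on the road surface, (cx, cy) is the principal point, and the
    camera is mounted at `cam_offset_m` with yaw `cam_yaw_rad` on the own vehicle.
    """
    v = yg - cy
    if v <= 0:
        raise ValueError("point is at or above the horizon; no road intersection")
    yp = cam_height_m * focal_px / v          # forward distance in camera coordinates
    xp = (xg - cx) * yp / focal_px            # lateral offset in camera coordinates
    # Rotate by the camera yaw and shift by the mounting position to obtain vehicle coordinates.
    xv = cam_offset_m[0] + xp * math.cos(cam_yaw_rad) + yp * math.sin(cam_yaw_rad)
    yv = cam_offset_m[1] - xp * math.sin(cam_yaw_rad) + yp * math.cos(cam_yaw_rad)
    return xv, yv
```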
  • The appearance feature extraction means 12 receives the section image cut out from the area where the other device appears, transmitted from the imaging region detection means 16, and extracts the features of the appearance shape of the other device from the received section image. The data relating to the appearance shape features of the other device extracted by the appearance feature extraction means 12 is stored in the storage means 14.
  • The storage means 14 stores, as a pair, the relative position data of the other device calculated by the relative position calculation means 17 of the own device and the appearance shape feature data of the other device extracted by the appearance feature extraction means 12 of the own device, both relating to the same other device detected by the imaging region detection means 16 of the own device, and this pair of data is transmitted to the communication means 11.
  • the storage unit 14 stores the relative position regarding the same other device transmitted from the communication unit 11 and the feature of the external shape of the other device as a pair.
  • The storage means 14 also stores, as a set, the identification ID for identifying the communication partner, the relative position of the other device of the communication partner transmitted from the control means 10, and the expiration date of the relative position data of the other device set by the control means 10.
  • The storage means 14 stores in advance the features of the appearance shape of the own device.
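  • A minimal sketch of the kind of record the storage means 14 could keep for each detected or reported device, assuming a simple expiring entry keyed by identification ID; the field names and the 1-second validity are illustrative assumptions, not values from the patent:

```python
import time
from dataclasses import dataclass, field

@dataclass
class PartnerRecord:
    """One stored set: relative position + appearance features, tied to an ID and an expiry."""
    identification_id: str
    relative_position: tuple          # (xv, yv) with respect to the own device
    appearance_features: dict         # e.g. output of extract_appearance_features()
    expires_at: float = field(default_factory=lambda: time.time() + 1.0)  # assumed 1 s validity

    def is_valid(self, now: float | None = None) -> bool:
        """True while the stored relative position is still within its expiration date."""
        return (now if now is not None else time.time()) < self.expires_at

# The own device's appearance features and identification ID are stored in advance
# alongside these records.
```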
  • The storage means 14 also provides a working storage area necessary for the control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, photographing means 15, imaging region detection means 16, and relative position calculation means 17 to execute their operations.
  • The communication means 11 transmits to the communication partner the pairs of relative position and appearance shape features, stored in the storage means 14, for the other devices located around the own device, together with the identification ID identifying the communication means 11 of the own device.
  • The communication means 11 receives, from another device equipped with the same communication partner identification device, the pairs of relative position and appearance shape features for the devices located around that other device, together with the identification ID identifying the communication means of the other device, and transmits these data to the storage means 14.
  • That is, the communication means 11 receives, from another device having the communication partner identification device shown in FIG. 7, the pairs of relative position data and appearance shape feature data for the devices located in its vicinity, together with the identification ID identifying the communication means 11 of that other device. The information received by the communication means 11 is stored in the storage means 14.
  • The communication performed by the communication means 11 includes ad hoc communication in which the own device and another device communicate directly, and inter-vehicle communication in which part or all of the communication path between the own device and the other device uses road-to-vehicle communication. Since a general technique is applied to the communication of the communication means 11, a detailed description of the configuration and the communication method is omitted.
  • the identification ID is an identifier assigned to the own device and the other device.
  • The identifier may be, for example, an identifier assigned to each car navigation device, or an identifier assigned to the own device and the other device by the communication protocol between them; any identifier can be used as long as it can identify the own device and the other device.
  • the communication means 11 transmits an identification ID for identifying its own device to the communication partner.
  • the communication unit 11 receives the identification ID for identifying the communication unit 11 of the other device of the communication partner, and transmits the identification ID to the control unit 10.
  • the communication means 11 transmits the comparison result of the appearance characteristic comparison means 13 of its own device to the communication partner.
  • the communication unit 11 receives the comparison result of the appearance feature comparison unit 13 of the other device of the communication partner, and transmits the comparison result to the control unit 10.
  • The appearance feature comparison means 13 compares the appearance shape features transmitted from another device having the same communication partner identification device with the appearance shape features of the own device stored in the storage means 14.
  • the appearance feature comparison unit 13 transmits the comparison result to the communication unit 11 and the control unit 10 of its own device.
  • The control means 10 controls the operations of the photographing means 15, imaging region detection means 16, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, storage means 14, and relative position calculation means 17.
  • the control means 10 has an interface with an external device, and receives an image around the device and various sensor information, and outputs a communication partner identification result.
  • The control means 10 collates the identification ID for identifying the communication partner, transmitted from the communication means 11, with the identification IDs stored in the storage means 14. If there is no matching identification ID, or if the expiration date of the relative position stored as a pair with the matching identification ID has passed, the control means 10 acquires new relative position data for the communication partner with respect to the own device and stores the relative position data and the identification ID in the storage means 14.
  • The control means 10 also refers at regular intervals to the expiration date of the relative position data of the communication partner stored in the storage means 14. If the expiration date has passed, the control means 10 acquires new relative position data and stores the relative position data and the identification ID in the storage means 14.
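  • A minimal sketch of this collation and refresh logic, assuming the PartnerRecord store sketched earlier and a hypothetical `recalculate_relative_position` callback that reruns the imaging and relative position calculation for the given partner:

```python
def ensure_fresh_relative_position(store: dict, partner_id: str,
                                   recalculate_relative_position) -> "PartnerRecord":
    """Collate the reported identification ID against the store and refresh stale data.

    If no matching ID exists, or the stored relative position has expired, new
    relative position data is acquired and stored as a pair with the ID,
    mirroring the behaviour of the control means 10 described above.
    """
    record = store.get(partner_id)
    if record is None or not record.is_valid():
        relative_position, appearance_features = recalculate_relative_position(partner_id)
        record = PartnerRecord(partner_id, relative_position, appearance_features)
        store[partner_id] = record
    return record
```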
  • the communication partner specifying apparatus is constructed as hardware, but is not limited to this.
  • The communication partner identification device shown in FIG. 7 may be constructed as software by incorporating a communication partner identification program into the computer constituting it; by reading out and executing the program with the CPU of the computer, the computer realizes the functions of the imaging region detection means 16, the control means 10, the appearance feature extraction means 12, the appearance feature comparison means 13, and the relative position calculation means 17.
  • An example of a method in which the communication partner identification device shown in FIG. 7 is mounted on each vehicle as the own device and the other devices, the communication partner is identified, and the relative position between the communication partner identification devices is calculated will be described with reference to FIG. 8, FIG. 10, and FIG. 11.
  • the communication partner is specified even when another device exists behind, diagonally rearward, or diagonally forward of the own device.
  • The communication partner identification device of FIG. 7 installed in the vehicle serving as the own device is referred to as own communication device A, and the communication partner identification devices of FIG. 7 installed in the vehicles traveling behind are the other communication devices of the communication partner.
  • As shown in FIG. 8, other communication devices B, C, D, and E exist around the own communication device A.
  • The own communication device A performs the processing shown in FIG. 10 and the other communication device B performs the processing shown in FIG. 11, so that the communication partner is identified and the relative position with respect to the communication partner is calculated.
  • Conversely, the own communication device A may perform the processing shown in FIG. 11 and the other communication device B may perform the processing shown in FIG. 10, so that the communication partner is identified and the relative position with respect to the communication partner is calculated.
  • The communication partner may also be identified and the relative position calculated between the own communication device A and the other communication device C, or between the other communication device B and the other communication device C.
  • When the control means 100 of the own communication device A acquires, via the communication means 101, an identification ID identifying the other communication device B it is communicating with (step S10 in FIG. 10; YES), the process proceeds to step S11 in FIG. 10; if no identification ID is acquired (step S10 in FIG. 10; NO), the process proceeds to step S13 in FIG. 10.
  • In step S11 of FIG. 10, the control means 100 of the own communication device A searches the information stored in the storage means 104 for an identification ID that matches the identification ID acquired from the other communication device B.
  • If the control means 100 of the own communication device A determines that the identification ID acquired from the other communication device B is not present in the information stored in the storage means 104, or that the validity period associated with the matching identification ID has passed (step S12 in FIG. 10; YES), the process proceeds to step S15 in FIG. 10.
  • If the identification ID acquired from the other communication device B is present in the information stored in the storage means 104 and its validity period has not passed (step S12 in FIG. 10; NO), the control means 100 of the own communication device A returns the process to step S10 in FIG. 10.
  • In step S13 of FIG. 10, the control means 100 of the own communication device A checks the expiration date of the relative position data stored in the storage means 104. If the expiration date has passed (step S13; YES), the control means 100 advances the process to step S15 in FIG. 10; if it has not passed, the process returns to step S10 in FIG. 10.
  • In step S15 of FIG. 10, the photographing means 105 captures an image of the surroundings of the own communication device A under the control of the control means 100 and acquires the image data.
  • the image data is transmitted to the imaging region detection means 106.
  • The imaging region detection means 106 detects the areas in which the other communication devices appear in the image data transmitted from the photographing means 105 (step S16 in FIG. 10). With the number of detected vehicles taken as n (step S17 in FIG. 10), the following processing (steps S18 to S22 in FIG. 10) is performed for each other communication device.
  • the relative position calculation means 107 calculates the relative position of the other communication device B with respect to the own communication device A.
  • specifically, the relative position calculation means 107 calculates the relative position of the other communication device B with respect to the own communication device A based on the position in the image of the other communication device B detected by the imaging region detection means 106, the mounting position information of the camera of the imaging means 105 on the own communication device A, and the shooting parameters indicating the direction in which the camera is facing (step S19 in FIG. 10).
  • the camera parameters may be stored in advance in the storage means 104, and the storage means 104 may provide the camera parameters to the relative position calculation means 107.
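As one possible realization of step S19, the pixel position of the detected vehicle can be converted into a bearing and a rough distance using the camera mounting pose and intrinsic parameters. The following is a minimal sketch under assumed parameter names (`focal_px`, `cam_yaw_deg`, `cam_offset`); it is not a formula prescribed by the original description.

```python
import math

def relative_position(cx_px, bottom_px, img_w, img_h,
                      focal_px, cam_height_m, cam_yaw_deg, cam_offset=(0.0, 0.0)):
    """Estimate (x, y) of a detected vehicle relative to the own device.
    cx_px, bottom_px: horizontal center and lower edge of the detected region.
    Assumes a flat road and a camera of known height looking roughly horizontally."""
    # bearing of the target relative to the camera axis
    bearing = math.atan2(cx_px - img_w / 2.0, focal_px)
    # distance to the ground-contact point (pinhole model, flat-road assumption)
    pitch = math.atan2(bottom_px - img_h / 2.0, focal_px)
    distance = cam_height_m / max(math.tan(pitch), 1e-6)
    # rotate into the vehicle frame and add the camera mounting offset
    yaw = math.radians(cam_yaw_deg)
    x = distance * math.sin(bearing + yaw) + cam_offset[0]   # lateral offset [m]
    y = distance * math.cos(bearing + yaw) + cam_offset[1]   # longitudinal offset [m]
    return x, y
```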
  • the appearance feature extraction means 102 extracts the features of the external shape of the other communication device B whose relative position has been calculated (step S20 in FIG. 10).
  • for example, the outline of the vehicle, the vehicle type, the color, the license plate, and the like are used as features of the external shape of the vehicle.
  • as the outline of the vehicle, the mounting position of the rear tail lamps, the shape of the front grille and rear grille, the vehicle width and height, the mounting width of the front and rear tires, the shape of the window frames, and the like are used.
  • the vehicle type is estimated to be, for example, a truck, a minivan, or a sedan by pattern matching against the outline of the vehicle.
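One simple way to realize the vehicle-type estimation mentioned above is template matching of the extracted outline against a few stored class silhouettes. The sketch below assumes 64x64 binary silhouettes and the class names truck/minivan/sedan purely for illustration; the original description leaves the matching method open.

```python
import numpy as np

# hypothetical 64x64 binary silhouettes, one per class (loaded elsewhere in practice)
TEMPLATES = {
    "truck":   np.zeros((64, 64), dtype=np.uint8),
    "minivan": np.zeros((64, 64), dtype=np.uint8),
    "sedan":   np.zeros((64, 64), dtype=np.uint8),
}

def estimate_vehicle_type(outline_mask):
    """Classify a normalized 64x64 binary outline by maximum overlap (IoU)
    with stored class templates -- a stand-in for the pattern matching of step S20."""
    best_label, best_score = None, -1.0
    for label, template in TEMPLATES.items():
        inter = np.logical_and(outline_mask, template).sum()
        union = np.logical_or(outline_mask, template).sum()
        score = inter / union if union else 0.0     # IoU as the matching score
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```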
  • the control means 100 stores the relative position of the other communication device B calculated by the relative position calculation means 107 and the external shape features of the other communication device B extracted by the appearance feature extraction means 102 as a pair in the storage means 104 (step S21 in FIG. 10).
  • the control means 100 then increments the counter (step S22 in FIG. 10), returns the processing to step S18 in FIG. 10, and causes the above-described processing to be executed for the next other communication device C, D, E existing around the own communication device A.
  • when the above processing (steps S19 to S22 in FIG. 10) has been completed for all other communication devices (step S18 in FIG. 10; NO), the control means 100 of the own communication device A transmits, to the other communication device B via the communication means 101, the pairs of the relative positions of all other communication devices B, C, D, E (No. 1 to n) and the external shape features of those vehicles (other devices), together with the identification ID for identifying the communication means 101 of the own communication device A (step S23 in FIG. 10).
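The payload transmitted in step S23 is essentially a list of (relative position, appearance feature) pairs plus the sender's identification ID. A possible serialization is sketched below; the field names and the JSON encoding are assumptions made for the example, not a format defined by the original description.

```python
import json, time

def build_identification_message(sender_id, detections):
    """detections: list of dicts with keys 'relative_position' (x, y in metres)
    and 'features' (e.g. vehicle type, color, outline descriptor)."""
    return json.dumps({
        "sender_id": sender_id,                 # identification ID of communication means 101
        "timestamp": time.time(),
        "vehicle_count": len(detections),       # corresponds to n in step S17
        "vehicles": [
            {
                "index": i + 1,                 # No. 1 .. n
                "relative_position": det["relative_position"],
                "features": det["features"],
            }
            for i, det in enumerate(detections)
        ],
    })

# example
# msg = build_identification_message("A-101", [
#     {"relative_position": (1.2, -8.5),
#      "features": {"type": "sedan", "color": "silver", "width_m": 1.7}},
# ])
```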
  • the control means 200 of the other communication device B receives, from the own communication device A via the communication means 201, the pair data of the external shape features and relative positions of the vehicles around the own communication device A, together with the identification ID data for identifying the communication means 101 of the own communication device A (step S24 in FIG. 11).
  • the received pair data and the data on the number of pieces of vehicle information sent (n) are stored in the storage means 204 (steps S25 and S26 in FIG. 11).
  • the control means 200 of the other communication device B then performs the following processing for each received vehicle (steps S27 to S30 in FIG. 11).
  • while received vehicles remain to be processed (step S27 in FIG. 11), the process proceeds to step S28 and i is incremented (step S28 in FIG. 11).
  • the appearance feature comparison means 203 of the other communication device B compares the external shape features of the i-th received vehicle stored in the storage means 204 with the external shape features of the vehicle of the other communication device B stored in advance in the storage means 204 (step S29 in FIG. 11).
  • if the appearance feature comparison means 203 determines that the external shape features of the vehicles match (step S30 in FIG. 11; YES), the process proceeds to step S31 in FIG. 11.
  • if it is determined in step S30 of FIG. 11 that the external shape features of the vehicles do not match (step S30 in FIG. 11; NO), the process returns to step S27 in FIG. 11.
  • the above processing is performed for all received vehicles. If the appearance feature comparison means 203 determines that the external shape features do not match for any of the received vehicles, the process proceeds to step S33 in FIG. 11.
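Steps S27 to S30 amount to scanning the received list for an entry whose appearance features match the receiver's own stored features. A compact sketch follows; the similarity measure and the 0.8 threshold are illustrative assumptions, since the original description leaves the comparison method to general techniques.

```python
def feature_similarity(a, b):
    """Toy similarity: fraction of identical feature values over shared keys."""
    keys = set(a) & set(b)
    if not keys:
        return 0.0
    return sum(1 for k in keys if a[k] == b[k]) / len(keys)

def find_self_in_received(received_vehicles, own_features, threshold=0.8):
    """Return (index, entry) of the received vehicle whose features match the
    own device's stored appearance features, or None (corresponds to S27-S30)."""
    for i, entry in enumerate(received_vehicles, start=1):
        if feature_similarity(entry["features"], own_features) >= threshold:
            return i, entry          # matching features found -> proceed to S31
    return None                      # no match for any vehicle -> proceed to S33
```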
  • in step S31 of FIG. 11, the control means 200 of the other communication device B reads from the storage means 204 the relative position data paired with the matching external shape feature data. Since this relative position is the position of the other communication device B as viewed from the own communication device A, the control means 200 reverses the relative position and calculates the relative position of the own communication device A as viewed from the other communication device B.
  • the control means 200 of the other communication device B stores, as a set in the storage means 204, the identification ID for identifying the own communication device A as the communication partner, the calculated relative position, and the expiration date for the calculated relative position (step S32 in FIG. 11).
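The "reverse calculation" of step S31 simply expresses the same displacement from the other end: if B is at (x, y) as seen from A, then A is at the negated vector as seen from B (assuming the two vehicles share a common heading reference; with differing headings a rotation would also be needed). A one-line sketch:

```python
def reverse_relative_position(position_of_b_seen_from_a):
    """Relative position of A as seen from B, assuming a shared heading reference."""
    x, y = position_of_b_seen_from_a
    return (-x, -y)
```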
  • the relative position of the communication partner changes with time, and the error between the relative position of the communication partner estimated at an arbitrary time and the actual relative position of the communication partner becomes large. Therefore, the control means 200 sets the expiration date of the relative position information to an appropriate number (for example, 10), and if the relative position is not updated, decrements the number at appropriate intervals (for example, every 100 ms) until the value reaches 0.
  • using this communication partner identification method, the control means 200 sequentially updates the relative positions whose expiration date has reached 0 after being decremented.
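This expiration handling is a small time-to-live counter per stored relative position. The sketch below assumes a dictionary keyed by the partner's identification ID and follows the example values given in the text (initial count 10, a 100 ms decrement interval).

```python
INITIAL_TTL = 10          # example expiration count from the text
TICK_SECONDS = 0.1        # example decrement interval (100 ms)

def store_relative_position(table, partner_id, rel_pos):
    """Step S32: save (partner ID, relative position, expiration count) as a set."""
    table[partner_id] = {"relative_position": rel_pos, "ttl": INITIAL_TTL}

def tick(table):
    """Called every TICK_SECONDS: decrement counters and report expired partners
    whose relative position must be re-identified (expiration reached 0)."""
    expired = []
    for partner_id, entry in table.items():
        entry["ttl"] -= 1
        if entry["ttl"] <= 0:
            expired.append(partner_id)
    return expired
```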
  • in step S33 of FIG. 11, the control means 200 transmits the result of the comparison of the vehicle external shape features obtained in step S29 of FIG. 11, together with the identification ID of the other communication device B, to the own communication device A via the communication means 201.
  • the communication means 101 of the own communication device A receives the comparison result of the vehicle external shape features and the identification ID of the other communication device B from the other communication device B of the communication partner (step S34 in FIG. 10), and transmits these data to the control means 100.
  • when the control means 100 obtains from the other communication device B of the communication partner a comparison result indicating that the vehicle external shape features match (step S35 in FIG. 10; YES), the process proceeds to step S36; if they do not match (step S35 in FIG. 10; NO), the process returns to step S10.
  • the control means 100 of the own communication device A reads from the storage means 104 the matching vehicle external shape features and the paired relative position data. The control means 100 then stores the read relative position data, the expiration date for the relative position information, and the identification ID of the other communication device B of the communication partner as a set in the storage means 104 (step S36 in FIG. 10), and the process returns to step S10 in FIG. 10.
  • as described above, the own communication device A, which is one of the communication partners, transmits to the other communication device B the external shape feature data of the other communication device B, which is the other communication partner, extracted from an image of the surroundings of the own device, together with the relative position with respect to the other communication device B, and the other communication device B of the communication partner returns the comparison result.
  • based on that comparison result, the own communication device A can identify the other communication device B of the communication partner, the features of its external shape, and its relative position.
  • the other communication device B of the communication partner can identify the own communication device A of the communication partner and its relative position by reversing the received relative position information.
  • in addition, since an expiration date is set for the calculated relative position data and the data is managed accordingly, identification can always be performed based on the latest relative position data.
  • Embodiment 3: Next, an example in which the communication partner identification device according to Embodiment 2 of the present invention is modified will be described as Embodiment 3.
  • the communication partner identification device of Embodiment 3 has the configuration shown in FIG. 7 as a basic configuration, and is constructed as the circuit configuration shown in FIG. 13, in which the appearance feature selection means 18 is added.
  • the control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, storage means 14, imaging means 15, imaging region detection means 16, relative position calculation means 17, and appearance feature selection means 18 are connected by a system bus.
  • the communication partner identification device shown in FIG. 13 is installed in the own device and in the other devices serving as communication partners.
  • the blocks indicated by dotted lines are in a suspended state, and the blocks indicated by solid lines are in an operating state.
  • the communication means 101 included in the communication partner identification device A shown in FIG. 12, the communication means 201 included in the communication partner identification device B, and the communication means 301 included in the communication partner identification device C correspond to the communication means 11 in FIG. 13.
  • the appearance feature extraction means 202 included in the communication partner identification device B and the appearance feature extraction means 302 included in the communication partner identification device C correspond to the appearance feature extraction means 12 in FIG. 13.
  • the appearance feature comparison means 203 included in the communication partner identification device B and the appearance feature comparison means 303 included in the communication partner identification device C correspond to the appearance feature comparison means 13 in FIG. 13.
  • the imaging means 105 included in the communication partner identification device A shown in FIG. 12, the imaging means 205 included in the communication partner identification device B, and the imaging means 305 included in the communication partner identification device C correspond to the imaging means 15 in FIG. 13.
  • the imaging region detection means 106 included in the communication partner identification device A shown in FIG. 12, the imaging region detection means 206 included in the communication partner identification device B, and the imaging region detection means 306 included in the communication partner identification device C correspond to the imaging region detection means 16 in FIG. 13.
  • the relative position calculation means 207 included in the communication partner identification device B and the relative position calculation means 307 included in the communication partner identification device C correspond to the relative position calculation means 17 in FIG. 13.
  • the appearance feature selection means 208 included in the communication partner identification device B and the appearance feature selection means 308 included in the communication partner identification device C correspond to the appearance feature selection means 18 in FIG. 13.
  • the storage means 14 classifies the features of the external shape of the own device, extracted in advance from images obtained by photographing the own device from arbitrary relative positions around it, and stores them for each relative position.
  • the appearance feature selection means 18 searches the storage means 14 using the relative position information received by the communication means 11 as a key, selects the external shape features corresponding to that relative position, and transmits them to the appearance feature comparison means 13.
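The per-relative-position feature store used by the appearance feature selection means can be as simple as a lookup table keyed by a quantized viewing direction. The sketch below assumes 45-degree bearing sectors and near/far distance bands as the key; this quantization scheme is an illustrative choice, not something fixed by the original description.

```python
import math

def position_key(rel_x, rel_y, sector_deg=45, near_far_split_m=15.0):
    """Quantize a relative position into (bearing sector, near/far band)."""
    bearing = math.degrees(math.atan2(rel_x, rel_y)) % 360
    sector = int(bearing // sector_deg)
    band = "near" if math.hypot(rel_x, rel_y) < near_far_split_m else "far"
    return (sector, band)

class AppearanceFeatureStore:
    """Features of the own device's external shape, classified per relative position."""
    def __init__(self):
        self._table = {}                        # key -> feature dict

    def register(self, rel_pos, features):      # prepared in advance (storage means 14)
        self._table[position_key(*rel_pos)] = features

    def select(self, rel_pos):                  # appearance feature selection means 18
        return self._table.get(position_key(*rel_pos))
```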
  • in the above description, the communication partner identification device is constructed as hardware, but the configuration is not limited to this: a communication partner identification program may be installed in the computer constituting the communication partner identification device shown in FIG. 13, and the CPU of the computer may read out and execute the communication partner identification program so that the functions of the imaging region detection means 16, the control means 10, the appearance feature extraction means 12, the appearance feature comparison means 13, the relative position calculation means 17, and the appearance feature selection means 18 are executed.
  • next, an example of the communication partner identification method in which the communication partner identification device shown in FIG. 13 is mounted on each of the vehicles serving as the own and other devices and the relative positions between the mounted communication partner identification devices are identified will be described with reference to FIG. 8, FIG. 12, and FIG. 14.
  • in this example, the communication partner is identified even when another device exists directly behind, diagonally rearward, or diagonally forward of the own device.
  • the communication partner identification device of FIG. 13 installed in the vehicle serving as the own device is the own communication device A, the communication partner identification device of FIG. 13 installed in the vehicle traveling directly behind is the other communication device B of the communication partner, and the communication partner identification devices shown in FIG. 13 mounted on vehicles traveling diagonally rearward or diagonally forward are the other communication devices C, D, and E.
  • the control means 200 of the other communication device B receives, via the communication means 201, the pair data of the external shape features and relative positions of the vehicles located around the own communication device A, together with the identification ID data for identifying the communication means 101 of the own communication device A (step S40 in FIG. 14).
  • the control means 200 of the other communication device B stores the received pair data of the vehicle external shape features and relative positions and the data on the number of pieces of vehicle information sent (n) in the storage means 204 (step S41 in FIG. 14).
  • the control means 200 of the other communication device B performs the following processing for each received vehicle (steps S43 to S47 in FIG. 14).
  • the appearance feature selection means 208 searches the information stored in the storage means 204 using, as a key, the relative position information of the i-th received vehicle stored in the storage means 204, selects the external shape features of the other communication device B corresponding to that relative position, and reads the selected external shape features from the storage means 204 (step S45 in FIG. 14). The appearance feature selection means 208 transmits the read external shape features to the appearance feature comparison means 203.
  • the appearance feature comparison means 203 compares the external shape features of the other communication device B selected by the appearance feature selection means 208 with the external shape features of the i-th received vehicle stored in the storage means 204 (step S46 in FIG. 14).
  • if the appearance feature comparison means 203 determines in step S47 of FIG. 14 that there is a matching external shape feature (step S47; YES), the process proceeds to step S48 in FIG. 14.
  • if the appearance feature comparison means 203 determines that the external shape features do not match for any of the received vehicles (step S43 in FIG. 14; NO), the process proceeds to step S50 in FIG. 14.
  • in step S48 of FIG. 14, the control means 200 reads from the storage means 204 the information on the relative position paired with the matching external shape features. Since this relative position is the position of the other communication device B as viewed from the own communication device A, the control means 200 of the other communication device B reverses the information on the relative position and calculates the relative position of the own communication device A as viewed from the other communication device B.
  • in step S49 of FIG. 14, the control means 200 of the other communication device B stores in the storage means 204, as a set, the identification ID for identifying the own communication device A of the communication partner, the calculated relative position, and the expiration date for the calculated relative position.
  • the relative position of the communication partner changes with time, and the error between the relative position of the communication partner estimated at an arbitrary time and the actual relative position of the communication partner becomes large. Therefore, the control means 200 sets the expiration date of the relative position information to an appropriate number (for example, 10), and if the relative position is not updated, decrements the number at appropriate intervals (for example, every 100 ms) until the value reaches 0.
  • using this communication partner identification method, the control means 200 sequentially updates the relative position information whose expiration date has reached 0 after being decremented.
  • in step S50 of FIG. 14, the control means 200 transmits the result of the comparison of the vehicle external shape features obtained in step S47 of FIG. 14, together with the identification ID of the other communication device B, to the own communication device A via the communication means 201.
  • as described above, the own communication device A transmits to the other communication device B of the communication partner the information on the external shape features of the other communication device extracted from an image of the surroundings of the own device and the information on the relative position.
  • the other communication device B classifies the external shape features extracted from images taken from various relative positions and stores them in advance for each relative position, selects its own external shape features using the relative position data received from the own communication device A, and compares the selected external shape features with the external shape features received from the own communication device A. The own communication device A can therefore identify the other communication device B that communicated, the features of its external shape, and its relative position.
  • the other communication device B of the communication partner can identify the own communication device A of the communication partner in an arbitrary positional relationship, and its relative position, by reversing the received relative position.
  • Embodiment 4: Next, an example in which the communication partner identification device according to Embodiment 3 of the present invention is modified will be described as Embodiment 4.
  • the communication partner identification device of Embodiment 4 has the circuit configuration shown in FIG. 13 as a basic configuration, and is constructed as the circuit configuration shown in FIG. 16, in which the communication target selection means 19 is added.
  • the control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, storage means 14, imaging means 15, imaging region detection means 16, relative position calculation means 17, appearance feature selection means 18, and communication target selection means 19 are connected by a system bus.
  • the communication partner identification device shown in FIG. 16 is installed in the own device and in the other devices serving as communication partners.
  • the blocks indicated by dotted lines are in an idle state, and the blocks indicated by solid lines are in an operating state.
  • the control means 100 included in the communication partner identification device A shown in FIG. 15, the control means 200 included in the communication partner identification device B, and the control means 300 included in the communication partner identification device C correspond to the control means 10 in FIG. 16.
  • the communication means 101 included in the communication partner identification device A shown in FIG. 15, the communication means 201 included in the communication partner identification device B, and the communication means 301 included in the communication partner identification device C correspond to the communication means 11 in FIG. 16.
  • the appearance feature extraction means 202 included in the communication partner identification device B and the appearance feature extraction means 302 included in the communication partner identification device C correspond to the appearance feature extraction means 12 in FIG. 16.
  • the appearance feature comparison means 203 included in the communication partner identification device B and the appearance feature comparison means 303 included in the communication partner identification device C correspond to the appearance feature comparison means 13 in FIG. 16.
  • the storage means 104 included in the communication partner identification device A shown in FIG. 15, the storage means 204 included in the communication partner identification device B, and the storage means 304 included in the communication partner identification device C correspond to the storage means 14 in FIG. 16.
  • the imaging means 205 included in the communication partner identification device B and the imaging means 305 included in the communication partner identification device C correspond to the imaging means 15 in FIG. 16.
  • the imaging region detection means 206 included in the communication partner identification device B and the imaging region detection means 306 included in the communication partner identification device C correspond to the imaging region detection means 16 in FIG. 16.
  • the relative position calculation means 207 included in the communication partner identification device B and the relative position calculation means 307 included in the communication partner identification device C correspond to the relative position calculation means 17 in FIG. 16.
  • the appearance feature selection means 208 included in the communication partner identification device B and the appearance feature selection means 308 included in the communication partner identification device C correspond to the appearance feature selection means 18 in FIG. 16.
  • the communication target selection means 109 included in the communication partner identification device A shown in FIG. 15, the communication target selection means 209 included in the communication partner identification device B, and the communication target selection means 309 included in the communication partner identification device C correspond to the communication target selection means 19 in FIG. 16.
  • the communication target selection means 19 selects the communication target, based on the relative position data calculated by the relative position calculation means 17, from among the other communication devices existing in the regions detected by the imaging region detection means 16.
  • the communication target selection means 19 transmits to the communication means 11 the pair of the external shape features extracted by the appearance feature extraction means 12 and the relative position calculated by the relative position calculation means 17, which are stored in the storage means 14.
  • the communication means 11 broadcasts to the surrounding vehicles the pair of the external shape features and the relative position selected by the communication target selection means 19, together with the identification ID for identifying the communication means 11 of the own device.
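Putting the selection and broadcast together, the communication target selection means can filter the stored (relative position, feature) pairs by a position condition before handing them to the communication means. The filtering predicate in the sketch below (for example "up to 20 m behind the own vehicle") is only an assumed example of a relative position condition, not one fixed by the original description.

```python
import json, time

def select_and_broadcast(store, sender_id, broadcast_fn,
                         condition=lambda x, y: y < 0 and abs(y) < 20.0):
    """store: list of {'relative_position': (x, y), 'features': {...}} entries kept
    in the storage means. condition: assumed relative-position predicate (here:
    'up to 20 m behind the own vehicle'). broadcast_fn sends the message."""
    selected = [e for e in store if condition(*e["relative_position"])]
    if not selected:
        return 0
    message = json.dumps({
        "sender_id": sender_id,          # identification ID of the own communication means
        "timestamp": time.time(),
        "vehicles": selected,
    })
    broadcast_fn(message)                # broadcast to surrounding vehicles
    return len(selected)
```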
  • although the communication partner identification device described above is constructed as hardware, the configuration is not limited to this: a communication partner identification program may be installed in the computer constituting the communication partner identification device shown in FIG. 16, and the CPU of the computer may read out and execute the communication partner identification program so that the functions of the respective means, including the communication target selection means 19, are executed.
  • next, an example of the communication partner identification method in which the communication partner identification device shown in FIG. 16 is mounted on each of the vehicles serving as the own and other devices and the relative positions between the mounted communication partner identification devices are identified will be described with reference to FIG. 8, FIG. 15, and FIG. 17.
  • in this example, the communication partner is identified even when another device exists directly behind, diagonally rearward, or diagonally forward of the own device.
  • the communication partner identification device of FIG. 16 installed in the vehicle serving as the own device is the own communication device A, the communication partner identification device of FIG. 16 installed in the vehicle traveling directly behind is the other communication device B of the communication partner, and the communication partner identification devices shown in FIG. 16 mounted on vehicles traveling diagonally rearward or diagonally forward are the other communication devices C, D, and E.
  • Steps S60 to S72 in Fig. 17 are the same as steps S10 to S22 in Fig. 10, and therefore steps S73 to S77 will be described.
  • in step S73 of FIG. 17, based on the relative position condition for the other communication devices designated by the external device via the control means 100, the relative position of the vehicle meeting the relative position condition and the external shape features of the paired vehicle are selected from the relative positions of all the surrounding vehicles stored in the storage means 104.
  • the communication means 101 of the own communication device A broadcasts the selected pair data of the relative position and the vehicle features, together with the identification ID data for identifying the communication means 101 of the own communication device A, to the surrounding other communication devices B, C, D, and E (step S74 in FIG. 17).
  • in step S75 of FIG. 17, the own communication device A receives, via the communication means 101, the comparison results of the external shape features from the surrounding other communication devices B, C, D, and E, together with the identification IDs of the respective communication devices B, C, D, and E.
  • the control means 100 of the own communication device A stores in the storage means 104 the identification ID of the other communication device B for which the received comparison result of the external shape features indicates a match (steps S76 and S77 in FIG. 17).
  • as described above, the own communication device A broadcasts to the surrounding other communication devices the external shape features of the other communication devices extracted from an image of the surroundings of the own device and the data on their relative positions.
  • each of the other communication devices in the vicinity of the own communication device A classifies the external shape features extracted from images taken from various relative positions and stores them in advance for each relative position, selects its own external shape feature data using the relative position data received from the own communication device A, compares the selected external shape features with the external shape features received from the own communication device A, and sends the comparison result to the own communication device A. Based on the received comparison results, the own communication device A can identify the other communication device B that communicated, the features of its external shape, and its relative position.
  • in addition, the other communication device B can identify the communication partner (the own communication device A) in an arbitrary positional relationship, and its relative position, by reversing the received relative position.
  • the communication partner identification device includes a control means 20, a communication means 21, an appearance feature extraction means 22, an appearance feature comparison means 23, and a storage means 24.
  • the control means 20, the communication means 21, the appearance feature extraction means 22, the appearance feature comparison means 23, and the storage means 24 are connected to each other by a system bus.
  • the control means 20 controls the operations of the communication means 21, the appearance feature extraction means 22, the appearance feature comparison means 23, and the storage means 24. The control means 20 also has an interface with an external device (not shown), receives images of the surroundings of the own device and various sensor information via the interface, and transmits data such as communication partner identification results. Further, the control means 20 identifies the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means 23.
  • a camera is used as an external device that captures an image around the device itself.
  • in one example, the camera of the external device is attached to the rear of the vehicle and images other vehicles traveling directly behind or diagonally behind the own vehicle.
  • in another example, the camera is attached in a posture that allows it to capture the entire surroundings of the vehicle and images the surrounding situation. Since general techniques are applied to a camera that captures the periphery of the own device, detailed description of its configuration and imaging method is omitted.
  • the communication means 21 transmits and receives the extracted external shape feature data and the external shape feature comparison results using a communication standard such as IEEE 802.11b or UWB.
  • the appearance feature extraction unit 22 extracts the feature of the external shape of the other device that is reflected in the image around the own device received by the control unit 20 from the external device. Images taken around the device include unnecessary backgrounds as well as other devices. Therefore, the appearance feature extraction means 22 cuts out the target other device from the image, and extracts the feature of the appearance shape of the other device from the cut out image. Since a general technique is applied to a configuration and method for extracting features of another target device from an image, detailed description of the configuration and extraction method is omitted. The appearance feature extraction means 22 sends the extracted feature data of the appearance shape of the other device to the communication means 21.
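The extraction step described above (cut out the target vehicle from the image, then derive appearance features from the cut-out region) could look like the following sketch. The chosen features (dominant color, aspect ratio, outline descriptor) and the NumPy representation are assumptions for illustration; the original description deliberately leaves the extraction method to general techniques.

```python
import numpy as np

def extract_appearance_features(image, bbox):
    """image: HxWx3 uint8 array of the surroundings; bbox: (x0, y0, x1, y1) region
    of the target other device found by region detection. Returns a feature dict."""
    x0, y0, x1, y1 = bbox
    patch = image[y0:y1, x0:x1]                       # cut out the target device

    mean_color = patch.reshape(-1, 3).mean(axis=0)    # rough body color
    height, width = patch.shape[:2]
    aspect_ratio = width / max(height, 1)             # wide/low vs. tall/boxy shapes

    # crude outline descriptor: column-wise occupancy of bright pixels, resampled to 16 bins
    gray = patch.mean(axis=2)
    occupancy = (gray > gray.mean()).mean(axis=0)
    outline = np.interp(np.linspace(0, 1, 16),
                        np.linspace(0, 1, len(occupancy)), occupancy)

    return {
        "color": mean_color.round(1).tolist(),
        "aspect_ratio": round(aspect_ratio, 2),
        "outline": outline.round(2).tolist(),
    }
```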
  • the appearance feature comparison means 23 compares the external shape features of the other device, extracted by the appearance feature extraction means 22 of the communication partner and received via the communication means 21, with the external shape features of the own device stored in the storage means 24, and sends the comparison result to the communication partner via the communication means 21.
  • if the external shape features of the other device extracted by the appearance feature extraction means 22 of the communication partner match the external shape features of the own device stored in the storage means 24, the appearance feature comparison means 23 causes the communication means 21 to send to the communication partner the comparison result that the object captured in the image taken by the communication partner is the own device.
  • if the external shape features of the other device extracted by the appearance feature extraction means 22 of the communication partner do not match the external shape features of the own device stored in the storage means 24, the appearance feature comparison means 23 causes the communication means 21 to send to the communication partner the comparison result that the object captured in the image taken by the communication partner is not the own device.
  • the storage means 24 provides a working storage area required when the control means 20, the communication means 21, the appearance feature extraction means 22, and the appearance feature comparison means 23 are executed. . Further, the storage means 24 stores data of the feature of the external shape of the own device.
  • the communication partner specifying apparatus may be constructed as software.
  • in this case, the communication partner identification program is installed in the computer constituting the communication partner identification device shown in FIG. 18, and the CPU of the computer reads out and executes the communication partner identification program so that the functions of the control means 20, the appearance feature extraction means 22, and the appearance feature comparison means 23 are executed.
  • Embodiment 5: Next, the case where vehicles traveling on a road are used as the own and other devices, the communication partner identification device shown in FIG. 18 is mounted on each vehicle, and the own communication device identifies the other communication device of the communication partner will be described with reference to FIGS. 19 to 21.
  • in the following, the communication partner identification device mounted on the own device will be described as the own communication device A, and the communication partner identification devices mounted on the other devices will be described as the other communication devices B and C.
  • as shown in FIG. 19, it is assumed that a communication path R is formed between the own communication device A and the other communication device B. The same applies when a communication path is formed between the own communication device A and the other communication device C instead of the other communication device B.
  • the control means 20 of the other communication device B, which is the communication partner of the own communication device A, transmits to the own communication device A, via the communication means 21, the information on the external shape features of its own vehicle stored in the storage means 24 shown in FIG. 18 and the information on an identification ID for uniquely identifying the communication between the other communication device B and the own communication device A (step S80 in FIG. 20).
  • when the control means 20 of the own communication device A receives the information on the external shape features and the identification ID of the other communication device B from the other communication device B via the communication means 21, it provides the information on the external shape features to the appearance feature comparison means 23 and stores the information on the identification ID in the storage means 24 (step S81 in FIG. 21).
  • the control means 20 obtains from the external device the information on an image of the surroundings of the own communication device A, and transmits the image information to the appearance feature extraction means 22 (step S82 in FIG. 21).
  • using the working storage area of the storage means 24, the appearance feature extraction means 22 extracts from the image provided by the control means 20 the external shape features of the other communication device (B or C) reflected in the image, and transmits the extracted external shape feature data to the appearance feature comparison means 23 (step S83 in FIG. 21).
  • using the working storage area of the storage means 24, the appearance feature comparison means 23 compares the data on the external shape features of the other communication device B received via the communication means 21 with the external shape features extracted by the appearance feature extraction means 22 (step S84 in FIG. 21).
  • in step S85 of FIG. 21, if the comparison of the external shape features by the appearance feature comparison means 23 results in a mismatch (step S85; NO), the control means 20 repeats the processing from step S81 to step S85.
  • in step S85 of FIG. 21, if the comparison of the external shape features by the appearance feature comparison means 23 results in a match (step S85; YES), the control means 20 associates the other communication device B having the matching external shape features with the identification ID of the communication partner (the other communication device B) received via the communication means 21 (step S86 in FIG. 21).
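Step S86 is the point where a visual match is bound to a communication identity. A minimal sketch of such an association table is shown below; the class and method names are assumptions made for the example, not structures named in the original description.

```python
class PartnerRegistry:
    """Associates a visually matched device with the identification ID received
    over the communication means (corresponds to step S86 in FIG. 21)."""
    def __init__(self):
        self._partners = {}                      # identification ID -> matched info

    def associate(self, identification_id, matched_features, bbox=None):
        self._partners[identification_id] = {
            "features": matched_features,        # external shape features that matched
            "image_region": bbox,                # where the partner appeared, if known
        }

    def is_known(self, identification_id):
        return identification_id in self._partners

# usage: after step S85 returns a match
# registry = PartnerRegistry()
# registry.associate("B-21", matched_features={"type": "sedan", "color": "blue"})
```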
  • the control means 20 of the own communication device A then performs the necessary communication between the communication means 21 of the own communication device A and the communication means 21 of the other communication device B that is the communication partner.
  • as described above, by comparing the external shape features of the other communication device extracted from an image of the surroundings of the own communication device with the external shape features received from the other communication device of the communication partner, the communication partner can be associated with the other communication device.
  • in the above embodiments, the case where the communication partner identification device is mounted on a vehicle and the communication partner is identified between vehicles has been described.
  • however, the communication partner identification device is not limited to vehicles: it can be applied to any situation in which a communication partner needs to be identified, for example public transportation, and it can also be applied to direct communication without using a network by being incorporated into a mobile phone or the like, so its range of application is wide.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A communication means allows a self-device and the other device to perform mutual information exchange. An appearance feature extraction means allows an image obtained by photographing an appearance of the other device positioned around the self-device to be analyzed to extract appearance features of the other device from the image. An appearance feature comparison means allows the appearance information extracted by the appearance feature extraction means of the self-device and the appearance features held by the other device to be compared with each other. An identification unit allows the other device of the communication party to be identified according to the comparison result transmitted from the appearance feature comparison means of the other device.

Description

Specification

Communication partner identification device, communication partner identification method, and communication partner identification program

Technical Field

[0001] The present invention relates to a communication partner identification device, a communication partner identification method, and a communication partner identification program, and in particular to a communication partner identification device, a communication partner identification method, and a communication partner identification program for identifying a communication partner using shape information of the communication partner.

Background Art

[0002] In recent years, services in which a mobile radio communication device exchanges information only with a specific partner have attracted attention. In particular, in the field of ITS (Intelligent Transport Systems), systems have appeared in which a mobile radio communication device is installed in a vehicle and local information is exchanged between vehicles, thereby improving the safety and convenience of drivers.

[0003] For example, in Patent Document 1, vehicle control information and position information acquired by GPS (Global Positioning System) are transmitted to a base station via wireless communication; the base station specifies the positional relationship among a plurality of vehicles based on the received position information and transmits the control information of a preceding vehicle to a specific following vehicle via wireless communication; and the specific following vehicle automatically controls itself based on the received control information of the preceding vehicle. As a result, when the preceding vehicle brakes suddenly, the brakes of the following vehicle are activated automatically, so that a pile-up accident can be avoided.

[0004] In Patent Document 2, an image of the surroundings of a vehicle taken by a camera and position information acquired by GPS are transmitted to surrounding vehicles via wireless communication. A vehicle that has received the image and the position information via wireless communication compares the position information of the host vehicle acquired by GPS with the received position information, thereby specifying the positional relationship between the transmission source and the host vehicle, and displays the received image at a position on the display corresponding to that positional relationship. This allows the driver to understand intuitively from which position around the host vehicle the image was taken, and to judge from the image the situation at positions that are blind spots from the host vehicle.
[0005] However, in the inventions described in Patent Documents 1 and 2, the position information of each vehicle is acquired using GPS and the positional relationship between the vehicles is grasped using that position information, but GPS position information includes an error of several hundred meters. For this reason, in the invention described in Patent Document 3 (DGPS), error correction information broadcast by FM multiplexing is received and the position information obtained by GPS is corrected, but even the position information calculated by DGPS contains an error of several tens of meters.

[0006] For this reason, in the invention described in Patent Document 1, the control information of a vehicle that has braked suddenly is also transmitted to vehicles unrelated to that braking operation, and the unrelated vehicles may perform erroneous vehicle control. In the invention of Patent Document 2, an image received from a surrounding vehicle may be displayed at an incorrect position on the display, causing the driver to make an incorrect judgment.

[0007] Therefore, in Patent Document 4, a technique has been developed in which a camera unit of a preceding vehicle monitors the license plate of a following vehicle, an ID signal, a synchronization signal, and information to be transmitted are sent from the preceding vehicle toward the following vehicle, and the following vehicle communicates with the preceding vehicle and avoids accidents when the information from the preceding vehicle is addressed to the own vehicle.

[0008] Patent Document 5 discloses a technique of predicting the current orientation and size of a subject from the orientation and size of the subject calculated in an image taken a certain time before the image to be compared, together with the vehicle speed and steering angle, and selecting a template to be used for the comparison using the prediction information.
Patent Document 1: Japanese Patent Application Laid-Open No. 2002-222491
Patent Document 2: Japanese Patent Application Laid-Open No. 2006-031328
Patent Document 3: Japanese Patent Application Laid-Open No. H5-27005
Patent Document 4: Japanese Patent Application Laid-Open No. H9-98125
Patent Document 5: Japanese Patent Application Laid-Open No. 2005-318546
Disclosure of the Invention

Problems to be Solved by the Invention

[0009] Certainly, if the technique of Patent Document 4 is used, a communication transmission path can be established between a preceding vehicle and a following vehicle, and mutual communication can be performed to avoid accidents. However, this mutual communication is limited to vehicles located directly in front of and behind each other, and communication cannot be established with a vehicle traveling diagonally behind the host vehicle. The technique of Patent Document 4 therefore leaves the problem that danger cannot be avoided in advance when vehicles traveling in front and behind change lanes.

[0010] In Patent Document 5, the criterion for selecting a template is limited to information about the own vehicle only, which leaves the problem that it is difficult to cope with traffic conditions that change from moment to moment.

[0011] An object of the present invention is to provide a communication partner identification device, a communication partner identification method, and a communication partner identification program that solve the above problems while also taking the protection of personal information into consideration.
Means for Solving the Problems

[0012] To achieve the above object, a communication partner identification device according to the present invention is a communication partner identification device that is arranged in each of an own device and other devices and identifies, from arbitrary other devices located around the own device, another device to be a communication partner, the device comprising:
communication means for exchanging information between the own device and the other devices;
appearance feature extraction means for analyzing an image obtained by photographing the external shape of another device located around the own device and extracting features of the external shape of the other device from the image;
appearance feature comparison means for comparing the external shape information extracted by the appearance feature extraction means of the other device with the features of the external shape held by the own device; and
an identification unit that identifies the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means of the other device.
[0013] The communication means of the own device may transmit the feature data of the external shape of the other device extracted by the appearance feature extraction means of the own device to the other device, and may receive the comparison result from the appearance feature comparison means of the other device.

[0014] The device may be configured to include imaging region detection means for analyzing position data of the other device in an image obtained by photographing the surroundings of the own device, and relative position calculation means for calculating the relative position of the other device with respect to the own device based on the position data of the other device.

[0015] The device may be configured to include appearance feature selection means that selects feature data of the external shape corresponding to the relative position using the information on the relative position as a key, and transmits the selected feature data of the external shape to the appearance feature extraction means of the other device.

[0016] The device may be configured to include communication target selection means that selects a communication partner based on the information on the relative position and transmits information on the selected communication partner to the communication means of the own device.
[0017] A communication partner identification method of the present invention is a communication partner identification method for identifying, from arbitrary other devices located around an own device, another device to be a communication partner, the method executing: an appearance feature extraction step of analyzing an image obtained by photographing the external shape of another device located around the own device and extracting features of the external shape of the other device from the image; an appearance feature comparison step of comparing the external shape information extracted by the appearance feature extraction means of the other device with the features of the external shape held by the own device; and an identification step of identifying the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means of the other device.

[0018] The method may be configured so that the feature data of the external shape of the other device extracted by the own device is transmitted to the other device, and the appearance feature comparison result from the other device is received.

[0019] The method may be configured to execute an imaging region detection step of analyzing position data of the other device in an image obtained by photographing the surroundings of the own device, and a relative position calculation step of calculating the relative position of the other device with respect to the own device based on the position data of the other device.

[0020] The method may be configured to select feature data of the external shape corresponding to the relative position using the information on the relative position as a key, and to compare features of the external shape based on the selected feature data of the external shape.

[0021] The method may be configured to select a communication partner based on the information on the relative position and to transmit information to the selected communication partner.
[0022] A communication partner identification program of the present invention causes a computer constituting a communication partner identification device, which identifies another device to be a communication partner from arbitrary other devices located around an own device, to execute: a function of analyzing an image obtained by photographing the external shape of another device located around the own device and extracting features of the external shape of the other device from the image; a function of comparing the external shape information extracted by the appearance feature extraction means of the other device with the features of the external shape held by the own device; and a function of identifying the other device of the communication partner based on the comparison result transmitted from the appearance feature comparison means of the other device.

[0023] The program may cause the computer to execute a function of transmitting the feature data of the external shape of the other device extracted by the own device to the other device and receiving the appearance feature comparison result from the other device.

[0024] The program may cause the computer to execute a function of analyzing position data of the other device in an image obtained by photographing the surroundings of the own device, and a function of calculating the relative position of the other device with respect to the own device based on the position data of the other device.

[0025] The program may cause the computer to execute a function of selecting feature data of the external shape corresponding to the relative position using the information on the relative position as a key, and comparing features of the external shape based on the selected feature data.

[0026] The program may cause the computer to execute a function of selecting a communication partner based on the information on the relative position and transmitting information to the selected communication partner.
[0027] The communication partner identifying apparatus of the present invention is a communication partner identifying apparatus which is arranged in each of the own apparatus and other apparatuses and identifies another apparatus to be a communication partner from among arbitrary other apparatuses located around the own apparatus, and which comprises: communication means for exchanging information between the own apparatus and the other apparatuses; appearance feature extraction means for analyzing an image capturing the external appearance of another apparatus located around the own apparatus and extracting features of the external appearance of the other apparatus from the image; appearance feature comparison means for comparing the appearance feature information extracted by the appearance feature extraction means of the own apparatus with appearance features acquired from another apparatus; and an identification unit for identifying the other apparatus that is the communication partner based on the comparison result transmitted from the appearance feature comparison means.
[0028] The communication partner identifying method of the present invention is a communication partner identifying method for identifying another apparatus to be a communication partner from among arbitrary other apparatuses located around the own apparatus, and executes: an appearance feature extraction step of analyzing an image capturing the external appearance of another apparatus located around the own apparatus and extracting features of the external appearance of the other apparatus from the image; an appearance feature comparison step of comparing the appearance feature information extracted by the own apparatus with appearance features acquired from another apparatus; and an identification step of identifying the other apparatus that is the communication partner based on the comparison result.
[0029] The communication partner identifying program of the present invention causes a computer constituting a communication partner identifying apparatus, which identifies another apparatus to be a communication partner from among arbitrary other apparatuses located around the own apparatus, to execute: a function of analyzing an image capturing the external appearance of another apparatus located around the own apparatus and extracting features of the external appearance of the other apparatus from the image; a function of comparing the appearance feature information extracted by the own apparatus with appearance features acquired from another apparatus; and a function of identifying the other apparatus that is the communication partner based on the comparison result.
Effects of the Invention

[0030] According to the present invention, the communication apparatus that is the communication partner can be identified with high accuracy by using image information of the external appearance of the communication apparatus.
[0031] Also according to the present invention, the communication apparatus that is the communication partner can be identified with high accuracy by using an image of the appearance of the communication apparatus. Further, by calculating the relative position of the communication partner from the image, the positional relationship with the communication apparatus of the communication partner can be identified with high accuracy. Further, by broadcasting an image of the appearance of the communication apparatus to be communicated with, communication can be performed only with a specific communication apparatus.
The above-described object, and other objects, features, and advantages will become more apparent from the preferred embodiments described below and the accompanying drawings.

Brief Description of Drawings
[Fig. 1] A block diagram explaining the operation of identifying a communication partner using the communication partner identifying apparatus according to Embodiment 1 of the present invention.
[Fig. 2] A block diagram showing the configuration of the communication partner identifying apparatus according to Embodiment 1 of the present invention.
[Fig. 3] A diagram explaining the positional relationship among a plurality of communication partner identifying apparatuses.
[Fig. 4] A flowchart explaining the communication partner identifying method according to Embodiment 1 of the present invention.
[Fig. 5] A flowchart explaining the communication partner identifying method according to Embodiment 1 of the present invention.
[Fig. 6] A block diagram explaining the operation of identifying a communication partner using the communication partner identifying apparatus according to Embodiment 2 of the present invention.
[Fig. 7] A block diagram showing the configuration of the communication partner identifying apparatus according to Embodiment 2 of the present invention.
[Fig. 8] A diagram explaining the positional relationship among a plurality of communication partner identifying apparatuses.
[Fig. 9] A diagram explaining the process of calculating a relative position.
[Fig. 10] A flowchart explaining the communication partner identifying method according to Embodiment 2 of the present invention.
[Fig. 11] A flowchart explaining the communication partner identifying method according to Embodiment 2 of the present invention.
[Fig. 12] A block diagram explaining the operation of identifying a communication partner using the communication partner identifying apparatus according to Embodiment 3 of the present invention.
[Fig. 13] A block diagram showing the configuration of the communication partner identifying apparatus according to Embodiment 3 of the present invention.
[Fig. 14] A flowchart explaining the communication partner identifying method according to Embodiment 3 of the present invention.
[Fig. 15] A block diagram explaining the operation of identifying a communication partner using the communication partner identifying apparatus according to Embodiment 4 of the present invention.
[Fig. 16] A block diagram showing the configuration of the communication partner identifying apparatus according to Embodiment 4 of the present invention.
[Fig. 17] A flowchart explaining the communication partner identifying method according to Embodiment 4 of the present invention.
[Fig. 18] A block diagram showing the configuration of the communication partner identifying apparatus according to Embodiment 5 of the present invention.
[Fig. 19] A diagram explaining the positional relationship among a plurality of communication partner identifying apparatuses.
[Fig. 20] A flowchart explaining the communication partner identifying method according to Embodiment 5 of the present invention.
[Fig. 21] A flowchart explaining the communication partner identifying method according to Embodiment 5 of the present invention.
Best Mode for Carrying Out the Invention

[0033] Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
(Embodiment 1)

[0034] Embodiment 1 concerns communication partner identifying apparatuses A to C which are arranged in the own apparatus and other apparatuses (automobiles), respectively, and each of which identifies another apparatus (another automobile) to be a communication partner from among arbitrary other apparatuses (other automobiles) located around the own apparatus (own automobile).

Each of the communication partner identifying apparatuses A to C has: communication means 11 for exchanging information between the own and other apparatuses (automobiles); appearance feature extraction means 12 for analyzing an image capturing the external appearance of another apparatus (other automobile) located around the own apparatus (own automobile) and extracting features of the external appearance of the other apparatus from the image; appearance feature comparison means 13 for comparing the appearance feature information extracted by the appearance feature extraction means 12 of the other apparatus (other automobile) with the appearance features held by the own apparatus (own automobile); and control means 10 for identifying the other apparatus that is the communication partner based on the comparison result transmitted from the appearance feature comparison means 13 of the other apparatus (other automobile).
[0035] As shown in Fig. 2, the communication partner identifying apparatus according to Embodiment 1 has control means 10, communication means 11, appearance feature extraction means 12, appearance feature comparison means 13, and storage means 14. The control means 10, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, and the storage means 14 are interconnected by a system bus. As shown in Fig. 1, the communication partner identifying apparatus shown in Fig. 2 is mounted on each of the own apparatus and the other apparatus serving as the communication partner. In Fig. 1, blocks drawn with dotted lines indicate components that are idle when the communication partner is identified, and blocks drawn with solid lines indicate components that are operating.

The X mark on the communication path R in Fig. 1 indicates that no personal information is transmitted.
[0036] The control means 100 included in communication partner identifying apparatus A, the control means 200 included in communication partner identifying apparatus B, and the control means 300 included in communication partner identifying apparatus C shown in Fig. 1 correspond to the control means 10 in Fig. 2. The communication means 101, 201, and 301 included in apparatuses A, B, and C correspond to the communication means 11. The appearance feature extraction means 102, 202, and 302 included in apparatuses A, B, and C correspond to the appearance feature extraction means 12. The appearance feature comparison means 103, 203, and 303 included in apparatuses A, B, and C correspond to the appearance feature comparison means 13. The storage means 104, 204, and 304 included in apparatuses A, B, and C correspond to the storage means 14 in Fig. 2.
[0037] The control means 10 controls the operations of the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, and the storage means 14. The control means 10 also has an interface with an external device (not shown), and via this interface receives images of the surroundings of the own apparatus, various sensor information, and the like, and transmits data such as the result of identifying the communication partner. Further, the control means 10 identifies the other apparatus that is the communication partner, based on the comparison result transmitted from the appearance feature comparison means of the other apparatus.
[0038] For example, a camera is used as the external device that captures images of the surroundings of the own apparatus. When the own apparatus is an automobile, the camera (external device) is mounted at the rear of the vehicle and images other vehicles traveling directly behind and diagonally behind the vehicle. The camera is also mounted in an orientation that covers the entire circumference of the vehicle and images the surrounding situation. Since common techniques are applied to a camera that images the surroundings of the own apparatus, a detailed description of its configuration and imaging method is omitted.
[0039] The communication means 11 transmits and receives data such as the extracted appearance feature data and the appearance feature comparison results, using communication standards such as IEEE 802.11b and UWB.
[0040] The appearance feature extraction means 12 extracts the appearance features of the other apparatus appearing in the image of the surroundings of the own apparatus that the control means 10 has received from the external device. The image of the surroundings of the own apparatus contains not only other apparatuses but also unnecessary background and the like. The appearance feature extraction means 12 therefore cuts out the target other apparatus from the image and extracts the appearance features of the other apparatus from the cut-out image. Since common techniques are applied to the configuration and method for extracting the features of the target other apparatus from an image, a detailed description of the configuration and extraction method is omitted. The appearance feature extraction means 12 transmits the extracted appearance feature data of the other apparatus to the communication means 11.
[0041] As the features of the external appearance of a vehicle, for example, the outer shape of the vehicle, the vehicle type, and the license plate are used. As the outer shape of the vehicle, the silhouette of the vehicle, the mounting positions of the rear tail lamps, the shapes of the front grille and the rear grille, the vehicle width and height, the mounting widths of the front and rear tires, the shape of the window frames, and the like are used. The vehicle type is estimated, for example as a truck, a minivan, or a sedan, by pattern matching against the outer shape of the vehicle. The color of the external appearance is also extracted as a feature of the vehicle.
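As an illustration only, and not as part of the specification, the appearance features listed above could be held as a single record along the following lines; the language (Python) and every field name are assumptions introduced here for explanation.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class AppearanceFeatures:
    # Hypothetical record of the vehicle appearance features named above.
    color_rgb: Tuple[int, int, int]                 # body color
    silhouette: List[Tuple[float, float]]           # outline point sequence
    tail_lamp_positions: List[Tuple[float, float]]  # mounting positions of the rear tail lamps
    grille_shape: List[float]                       # front/rear grille shape descriptors
    width_m: float                                  # vehicle width
    height_m: float                                 # vehicle height
    tire_track_m: float                             # mounting width of the front and rear tires
    window_frame: List[float]                       # window frame shape descriptors
    vehicle_type: Optional[str] = None              # "truck" / "minivan" / "sedan", by pattern matching
    license_plate: Optional[str] = None             # license plate, if readable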
[0042] The appearance feature comparison means 13 compares the appearance features of the other apparatus, which were extracted by the appearance feature extraction means 12 of the communication partner and received by the communication means 11, with the appearance features of the own apparatus stored in the storage means 14, and transmits the comparison result to the communication partner via the communication means 11.

[0043] When the appearance features of the other apparatus extracted by the appearance feature extraction means 12 of the communication partner match the appearance features of the own apparatus stored in the storage means 14, the appearance feature comparison means 13 transmits to the communication partner, via the communication means 11, a comparison result indicating that the object appearing in the image captured by the communication partner is the own apparatus.

When the appearance features of the other apparatus extracted by the appearance feature extraction means 12 of the communication partner do not match the appearance features of the own apparatus stored in the storage means 14, the appearance feature comparison means 13 transmits to the communication partner, via the communication means 11, a comparison result indicating that the object appearing in the image captured by the communication partner is not the own apparatus.
[0044] An example of the method by which the appearance feature comparison means 13 compares the appearance features of the other apparatus, extracted by the appearance feature extraction means 12 of the communication partner and received by the communication means 11, with the appearance features of the own apparatus stored in the storage means 14 is described below.

[0045] The own apparatus receives the appearance features of a communication apparatus (vehicle) that another apparatus traveling nearby has extracted from an image, compares the received appearance features with the appearance features held by the own apparatus, and determines whether the received appearance features match its own. This operation is performed by starting the comparison from coarse features (such as color, size, or shape) and calculating a matching degree for each feature. If the calculated matching degree is equal to or less than a prescribed value, the transmitted image is determined not to match the own apparatus. If the matching degree is greater than the prescribed value, another feature is compared. If no mismatch is determined for any of the features stored in the storage means 14, the transmitted image is determined to match the own apparatus.
[0046] In the above match/mismatch determination method, when color is used as a feature of the external appearance of the vehicle, the color can be expressed numerically and evaluated by calculating the color difference. When the outer shape is used as a feature of the external appearance of the vehicle, a predetermined analysis is performed and characteristic parts of the outer shape are compared by their parameters. In this parameter comparison, for example, regions where the pixel values (density values) of the image change by a certain amount are extracted as the contour and partial lines of the vehicle, and the area enclosed by the lines is calculated, whereby the relevant part of the outer shape can be specified.

By expressing the appearance features numerically in this way, the appearance features extracted from the image acquired by the other apparatus can be compared with the corresponding appearance features of the own apparatus. The value obtained as a result of this comparison is taken as the matching degree. For example, if a perfect match is 100% and the prescribed value is 60%, the appearance features of the two can be determined not to match when the matching degree is 60% or less.

[0047] Note that this determination method is merely one example, and various other modes can be adopted.
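As one purely illustrative sketch of such a mode, the coarse-to-fine comparison with a matching-degree threshold described in paragraphs [0045] and [0046] could be written as follows; the particular features, scoring formulas, and their ordering are assumptions and not a prescribed implementation.

def color_match(own, other):
    # Match degree (0-100) from a simple RGB color difference.
    diff = sum(abs(a - b) for a, b in zip(own, other))      # 0 .. 765
    return 100.0 * (1.0 - diff / 765.0)

def size_match(own, other):
    # Match degree (0-100) from relative differences in width/height.
    rel = [abs(a - b) / max(a, b) for a, b in zip(own, other)]
    return 100.0 * (1.0 - sum(rel) / len(rel))

def compare_features(own, received, threshold=60.0):
    # Coarse-to-fine comparison: evaluate each feature in turn and stop at the
    # first one whose matching degree falls to the threshold or below.  Only if
    # no stored feature is judged a mismatch is the received image judged to
    # show the own apparatus.
    checks = [
        ("color", color_match, own["color"], received["color"]),
        ("size",  size_match,  own["size"],  received["size"]),
        # finer features (silhouette, grille shape, ...) would follow here
    ]
    for name, fn, own_val, recv_val in checks:
        degree = fn(own_val, recv_val)
        if degree <= threshold:          # e.g. 60% or less -> mismatch
            return False, name, degree
    return True, None, None

# usage: a near-identical color and size yield a "match" result
own_features = {"color": (200, 30, 30), "size": (1.80, 1.45)}
received     = {"color": (205, 35, 28), "size": (1.79, 1.46)}
print(compare_features(own_features, received))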
[0048] When the appearance features of the other apparatus extracted by the appearance feature extraction means 12 of the communication partner match the appearance features of the own apparatus stored in the storage means 14, a comparison result indicating that the object appearing in the image captured by the communication partner is the own apparatus is transmitted to the communication partner via the communication means 11.

[0049] In the process of identifying the communication partner, when another apparatus is photographed from the own apparatus and the other apparatus is an automobile, the image of the external appearance of the automobile may include an image of the occupants of that automobile. In this case, the occupant image raises the issue of portrait rights. Likewise, when the own apparatus and the other apparatus are mobile phones, if the mobile phone of the communication partner is photographed from the mobile phone of the own apparatus, the user of that mobile phone is included in the captured image. In this case as well, the user's image raises the issue of portrait rights. Recently, the Personal Information Protection Law has come into force, and careful handling of information relating to personal information has become important. The handling of personal information is therefore also an issue when identifying the communication partner.
[0050] In Embodiment 1, the appearance feature comparison means 13 compares the appearance features of the other apparatus extracted by the appearance feature extraction means 12 of the communication partner with the appearance features of the own apparatus stored in the storage means 14, so the comparison of appearance features is performed by the own apparatus. For this reason, appearance feature data that may contain personal information are never transmitted to the apparatus that captures the image. Accordingly, personal information contained in the appearance features is not leaked to the communication partner, and the present invention is also excellent from the standpoint of protecting personal information.

[0051] Furthermore, in Embodiment 1, the result of comparing the appearance features of the other apparatus extracted by the appearance feature extraction means 12 of the communication partner with the appearance features of the own apparatus stored in the storage means 14 is transmitted to the communication partner. However, the comparison result is information indicating whether or not the object appearing in the image captured by the communication partner is the own apparatus, that is, "YES" or "NO" information, and this information never contains personal information. Therefore, even though the appearance feature comparison results are exchanged, personal information is not leaked to the communication partner, and the present invention is also excellent from the standpoint of protecting personal information.
[0052] Since common techniques are applied to the comparison between the appearance features of the own apparatus read from the storage means 14 and the appearance features of the apparatus transmitted from the appearance feature extraction means 12 of the communication partner, a detailed description of the configuration and comparison method is omitted.

[0053] The storage means 14 provides a working storage area required when the control means 10, the communication means 11, the appearance feature extraction means 12, and the appearance feature comparison means 13 execute their operations. The storage means 14 also stores the appearance feature data of the own apparatus and identification ID data for identifying the own and other apparatuses. The information stored in the storage means 14 is the necessary minimum, namely the appearance feature data of the own apparatus and the identification ID data, so its storage capacity can be kept extremely small.

[0054] Although the communication partner identifying apparatus according to Embodiment 1 shown in Fig. 1 is constructed as hardware, the invention is not limited to this. The communication partner identifying apparatus according to Embodiment 1 may also be constructed as software. In this case, a communication partner identifying program is installed in the computer constituting the communication partner identifying apparatus shown in Fig. 1, and the CPU of the computer reads out and executes the program, whereby the computer is made to execute the functions of the control means 10, the appearance feature extraction means 12, and the appearance feature comparison means 13.
[0055] Next, a communication partner identifying method for identifying a communication partner using the communication partner identifying apparatus shown in Fig. 2 will be described with reference to Figs. 1 to 5.

[0056] The communication partner identifying method according to Embodiment 1 will be described for the case where vehicles (automobiles) traveling on a road are used as the own apparatus and the other apparatuses, and the communication partner identifying apparatus shown in Fig. 2 is mounted on each vehicle.
[0057] As shown in Fig. 3, in Embodiment 1 it is assumed that two other vehicles (other apparatuses) are traveling diagonally behind the own vehicle (own apparatus). In Fig. 3, the communication partner identifying apparatus mounted on the own apparatus is referred to as own communication apparatus A, and the communication partner identifying apparatuses mounted on the other apparatuses are referred to as other communication apparatuses B and C.

[0058] In Fig. 3, it is assumed that a communication path R is formed between own communication apparatus A and other communication apparatus B, and that communication is being performed between own communication apparatus A and other communication apparatus B. The same applies when a communication path R is formed between own communication apparatus A and other communication apparatus C instead of other communication apparatus B.
[0059] First, the control means 100 of own communication apparatus A acquires an image of the surroundings of the own vehicle captured by an external device (such as a camera), and transmits the acquired image data to the appearance feature extraction means 102 (step S1 in Fig. 4).

[0060] The appearance feature extraction means 102 of own communication apparatus A extracts the appearance features of the other communication apparatus (B or C) appearing in the image acquired from the external device and transmitted from the control means 100, and transmits the extracted appearance feature data to the communication means 101 (step S2 in Fig. 4).

[0061] The communication means 101 transmits the appearance feature data transmitted from the appearance feature extraction means 102 to other communication apparatus B of the communication partner (step S3 in Fig. 4).
[0062] The communication means 201 of other communication apparatus B, which is in a communication state with own communication apparatus A (connected via the communication path R), receives the appearance feature data extracted by own communication apparatus A, and transmits the received appearance feature data to the appearance feature comparison means 203 of other communication apparatus B (step S4 in Fig. 5).

[0063] The appearance feature comparison means 203 of other communication apparatus B compares the appearance feature data transmitted from own communication apparatus A with the appearance feature data of other communication apparatus B stored in advance in the storage means 204 (step S5 in Fig. 5).
[0064] In step S5 of Fig. 5, when the appearance features of the other apparatus extracted by the appearance feature extraction means 102 of the communication partner match the appearance features of other communication apparatus (own apparatus) B stored in the storage means 204, the appearance feature comparison means 203 transmits to the communication partner, via the communication means 201, a comparison result indicating that the object appearing in the image captured by the communication partner is other communication apparatus (own apparatus) B, together with the identification ID identifying other communication apparatus B stored in the storage means 204 (step S6 in Fig. 5).

[0065] In step S5 of Fig. 5, when the appearance features of the other apparatus extracted by the appearance feature extraction means 102 of the communication partner do not match the appearance features of other communication apparatus (own apparatus) B stored in the storage means 204, the appearance feature comparison means 203 transmits to the communication partner, via the communication means 201, a comparison result indicating that the object appearing in the image captured by the communication partner is not other communication apparatus (own apparatus) B, together with the identification ID identifying other communication apparatus B stored in the storage means 204 (step S6 in Fig. 5).
[0066] The communication means 101 of own communication apparatus A, which is connected via the communication path R to the communication means 201 of other communication apparatus B, receives from the communication means 201 of the communication partner the comparison result and the identification ID identifying other communication apparatus B, and transmits the received comparison result and identification ID to the control means 100 (step S7 in Fig. 4).

[0067] Based on the comparison result transmitted from the communication means 101, the control means 100 of own communication apparatus A detects whether the appearance features extracted by the appearance feature extraction means 102 of own communication apparatus A match or do not match the appearance features of other communication apparatus B of the communication partner (step S8 in Fig. 4).

[0068] In step S8 of Fig. 4, when the comparison result transmitted from other communication apparatus B indicates a match, that is, when it is determined that the appearance features transmitted from own communication apparatus A match the appearance features of other communication apparatus B appearing in the image (step S8 in Fig. 4; YES), the control means 100 of own communication apparatus A stores in the storage means 104, in association with the identification ID of other communication apparatus B, the fact that the vehicle appearing in the image captured by the external device is other communication apparatus B of the communication partner (step S9 in Fig. 4).

[0069] In step S8 of Fig. 4, when the comparison result transmitted from other communication apparatus B indicates a mismatch, that is, when it is determined that the appearance features transmitted from own communication apparatus A do not match the appearance features of other communication apparatus B appearing in the image (step S8 in Fig. 4; NO), the control means 100 of own communication apparatus A returns the processing to step S1 in Fig. 4, again acquires an image of the surroundings of own communication apparatus A from the external device, and continues the processing (steps S1 to S3 in Fig. 4).
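The round trip of steps S1 to S9 can be summarized, again only as an illustrative sketch and not as the claimed implementation, by the following single-process simulation; the message fields and function names are assumptions.

# Apparatus B side (steps S4-S6): compare the received appearance features
# with its own stored features and answer with a match/no-match result plus
# its identification ID.
def apparatus_b_respond(received_features, own_features_b, own_id_b="ID-B"):
    match = received_features == own_features_b    # stand-in for the real feature comparison
    return {"match": match, "id": own_id_b}

# Apparatus A side (steps S3 and S7-S9): send the features extracted from its
# camera image, then act on the returned comparison result.
def apparatus_a_identify(extracted_features, own_features_b):
    reply = apparatus_b_respond(extracted_features, own_features_b)   # exchange over path R
    if reply["match"]:                             # step S8: match (YES)
        # step S9: remember that the imaged vehicle is communication partner B
        return {"partner_id": reply["id"], "identified": True}
    return {"identified": False}                   # step S8: mismatch (NO) -> retry from step S1

features_of_b = {"color": "red", "type": "sedan"}
print(apparatus_a_identify({"color": "red", "type": "sedan"}, features_of_b))
print(apparatus_a_identify({"color": "blue", "type": "truck"}, features_of_b))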
[0070] As described above, according to Embodiment 1, the communication apparatus that is the communication partner can be identified with high accuracy by using image information of the external appearance of the communication apparatus.

[0071] According to Embodiment 1, the communication partner is identified based on the appearance features of the own and other apparatuses as a whole, without being restricted to characteristic portions appearing at specific parts of the apparatuses, so the communication partner can be identified regardless of the relative positions of the own and other apparatuses.

[0072] According to Embodiment 1, the process of imaging the identification target and the process of extracting features from the captured image are performed by one apparatus, and the comparison between the features is performed by the other apparatus, so no excessive load is placed on the processing in either apparatus. Therefore, even if the communication partner identifying apparatus is applied to a car navigation system for vehicle guidance, the vehicle guidance operation is not affected. Further, the storage means only stores the appearance feature data of the own apparatus and the identification ID data, and since the information stored in the storage means is kept to the necessary minimum, its storage capacity can be kept extremely small. Therefore, the present invention can be applied without changing the specifications of existing car navigation systems.
[0073] In the process of identifying the communication partner, when another apparatus is photographed from the own apparatus and the other apparatus is a vehicle, the image of the external appearance of the vehicle may include an image of the occupants riding in the vehicle. In this case, the occupant image raises the issue of portrait rights. In Embodiment 1, however, the appearance feature comparison means of the other communication apparatus compares the appearance features extracted by the appearance feature extraction means of the own communication apparatus of the communication partner with the appearance features stored in the storage means of the other communication apparatus, so the comparison of appearance features is performed by the apparatus itself. For this reason, appearance feature data containing personal information are never transmitted to the apparatus that captures the image. Accordingly, personal information contained in the appearance features can be prevented from spreading more than necessary, and the protection of personal information can be strengthened.

[0074] In Embodiment 1, the comparison result produced by the appearance feature comparison means of the other communication apparatus of the communication partner is transmitted to the own communication apparatus. However, the comparison result is information indicating whether or not the object appearing in the image captured by the communication partner is the apparatus itself, that is, "YES" or "NO" information, and this information never contains personal information. Therefore, even though the appearance feature comparison results are exchanged, the protection of personal information can be strengthened.
[0075] (Embodiment 2)

Next, an example configured to calculate the relative position between the own communication apparatus and the other communication apparatus engaged in communication will be described as Embodiment 2. In Embodiment 2, a validity period is also set for the calculated relative position data, and the relative position data are managed accordingly.

[0076] As shown in Fig. 7, the communication partner identifying apparatus according to Embodiment 2 has a circuit configuration in which imaging means 15, imaging region detection means 16, and relative position calculation means 17 are added to the circuit configuration shown in Fig. 2. The control means 10, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, the storage means 14, the imaging means 15, the imaging region detection means 16, and the relative position calculation means 17 are interconnected by a system bus. As shown in Fig. 6, the communication partner identifying apparatus shown in Fig. 7 is mounted on each of the own apparatus and the other apparatus serving as the communication partner. In Fig. 6, blocks drawn with dotted lines indicate components that are idle when the communication partner is identified, and blocks drawn with solid lines indicate components that are operating.
[0077] The control means 100 included in communication partner identifying apparatus A, the control means 200 included in communication partner identifying apparatus B, and the control means 300 included in communication partner identifying apparatus C shown in Fig. 6 correspond to the control means 10 in Fig. 7. The communication means 101, 201, and 301 included in apparatuses A, B, and C correspond to the communication means 11. The appearance feature extraction means 102, 202, and 302 correspond to the appearance feature extraction means 12. The appearance feature comparison means 103, 203, and 303 correspond to the appearance feature comparison means 13. The storage means 104, 204, and 304 correspond to the storage means 14. The imaging means 105, 205, and 305 correspond to the imaging means 15. The imaging region detection means 106, 206, and 306 correspond to the imaging region detection means 16. The relative position calculation means 107, 207, and 307 included in apparatuses A, B, and C correspond to the relative position calculation means 17 in Fig. 7.

The X mark on the communication path R in Fig. 6 indicates that no personal information is transmitted.
[0078] The imaging means 15 takes the entire circumference of the own apparatus as its imaging range and acquires image data capturing the situation around the entire circumference of the own apparatus. The imaging means 15 adds imaging parameters to the acquired image data and transmits these data to the imaging region detection means 16. The imaging parameters are information about the camera that captured the image to be transmitted, and include information such as the position at which the camera is mounted on the own apparatus and the orientation of the camera when the image was captured. Since common camera techniques are applied to the imaging means 15, details of its configuration, imaging method, imaging parameters, and the like are omitted.

[0079] The imaging means 15 may have a configuration in which a plurality of cameras are installed around the own apparatus to capture the situation around the entire circumference of the own apparatus and acquire image data, or a configuration in which a single camera is moved around the own apparatus to capture the situation around the entire circumference of the own apparatus and acquire image data. In short, the imaging means 15 may have any configuration capable of capturing the situation around the entire circumference of the own apparatus and acquiring image data.

[0080] In the embodiment shown in Fig. 7, the imaging means 15 is provided as one component of the communication partner identifying apparatus, but the invention is not limited to this. The image data capturing the situation around the entire circumference of the own apparatus, and the imaging parameters corresponding to the image data, such as imaging position information and information on the imaging orientation, may instead be acquired from an external device via an interface.
[0081] Based on the imaging parameters and the image transmitted from the imaging means 15, the imaging region detection means 16 analyzes the image, detects the region of the image in which another apparatus appears, transmits the position of the other apparatus within the image for that region to the relative position calculation means 17, and transmits a section image cut out from that region to the appearance feature extraction means 12.

[0082] The position of the other apparatus is specified using the coordinates of the center of gravity of the detected other apparatus, the coordinate group of the point sequence constituting the silhouette of the detected other apparatus, or the coordinates of a rectangular region containing the silhouette. However, the invention is not limited to these. Since common techniques are applied to specifying the position of the other apparatus, a detailed description of them is omitted.
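As an illustrative sketch of the position conventions mentioned in paragraph [0082], the centroid and the enclosing rectangle of a detected silhouette could be computed as follows; the function name and input format are assumptions.

def region_position(silhouette_points):
    # Given the point sequence of a detected silhouette (pixel coordinates),
    # return both the centroid and the enclosing rectangle, either of which
    # the text above allows as the "position" handed to the relative position
    # calculation means.
    xs = [p[0] for p in silhouette_points]
    ys = [p[1] for p in silhouette_points]
    centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
    bounding_box = (min(xs), min(ys), max(xs), max(ys))
    return centroid, bounding_box

print(region_position([(10, 40), (60, 40), (60, 90), (10, 90)]))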
[0083] The relative position calculation means 17 calculates the relative position of the other apparatus with respect to the own apparatus, using the position of the other apparatus in the image transmitted from the imaging region detection means 16 and the imaging parameters transmitted from the imaging means 15. The relative position data calculated by the relative position calculation means 17 are stored in the storage means 14.

[0084] An example in which the relative position calculation means 17 calculates the relative position will now be described. It is assumed that another apparatus, for example a vehicle, appears at the position (xg, −yg) on the coordinates of the horizontal axis Xg and the vertical axis Yg in the image that the relative position calculation means 17 received from the imaging means 15 (Fig. 9(a)).
[0085] The relative position calculation means 17 projects the captured image in the screen coordinate system shown in Fig. 9(a) onto the real-world camera coordinate system (Xp, Yp) shown in Fig. 9(c) by the projective transformation shown in Fig. 9(b).

[0086] When performing the projective transformation onto the real-world camera coordinate system shown in Fig. 9(c), the relative position calculation means 17 derives the relationship between yp and yg from the following proportion:

f : −yg = (f + yp) : h, where f << yp.
[Equation 1]

yp ≈ f·h / (−yg)
[0087] When performing the projective transformation onto the real-world camera coordinate system shown in Fig. 9(c), the relative position calculation means 17 derives the relationship among xp, yp, and xg from the following proportion:

f : xg = (f + yp) : xp, where f << yp.
[Equation 2]

xp ≈ xg·yp / f = h·xg / (−yg)

[0088] As shown in Fig. 9(c), the relative position calculation means 17 converts the image coordinate system shown in Fig. 9(a) into the real-world camera coordinate system by the projective transformation shown in Fig. 9(b), and plots the relationship between the own apparatus (for example, the own vehicle) and the other apparatus (for example, another vehicle) on the real-world coordinate system with the camera (imaging means 15) as the origin.
[0089] In Fig. 9(c), (Xp, Yp) is the real-world camera coordinate system, and (Xv, Yv) is the real-world own-vehicle coordinate system. In the real-world coordinate system shown in Fig. 9(c), the relative position calculation means 17 calculates the coordinate position (xv, yv) of the other vehicle of the communication partner based on the following equation.
[Equation 3]

xv = xp·cos(θ) + yp·sin(θ)
yv = −xp·sin(θ) + yp·cos(θ)
[0090] Note that the relative position calculation method described above is merely one example, and the invention is not limited to it. Since common techniques are applied to relative position calculation methods, a detailed description of relative position calculation methods other than the above is omitted.
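As an illustrative numerical sketch of Equations 1 to 3, assuming that f is the focal length, h the camera height above the road, and theta the mounting angle of the camera in the own-vehicle coordinate system (none of which is fixed by the specification beyond Fig. 9), the conversion could be written as follows.

import math

def image_to_vehicle(xg, yg, f, h, theta):
    # Convert an image position (xg, yg) of the other vehicle into the
    # own-vehicle coordinate system, following Equations 1-3:
    #   yp ~ f*h / (-yg)                        (Equation 1, assuming f << yp)
    #   xp ~ xg*yp / f                          (Equation 2)
    #   xv =  xp*cos(theta) + yp*sin(theta)
    #   yv = -xp*sin(theta) + yp*cos(theta)     (Equation 3, camera-to-vehicle rotation)
    yp = f * h / (-yg)
    xp = xg * yp / f
    xv = xp * math.cos(theta) + yp * math.sin(theta)
    yv = -xp * math.sin(theta) + yp * math.cos(theta)
    return xv, yv

# example: a vehicle imaged 0.002 m right of and 0.001 m below the image centre,
# with a 0.008 m focal length, camera 1.2 m above the road, facing straight back
print(image_to_vehicle(xg=0.002, yg=-0.001, f=0.008, h=1.2, theta=math.pi))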
[0091] The appearance feature extraction means 12 receives the section image, transmitted from the imaging region detection means 16, that was cut out from the region in which the other apparatus appears, and extracts the appearance features of the other apparatus from the received section image. The data on the appearance features of the other apparatus extracted by the appearance feature extraction means 12 are stored in the storage means 14.
[0092] 記憶手段 1 4は、 自装置の撮影領域検出手段 1 6で検出された同一の他装 置に関して、 自装置の相対位置検出手段 1 7で算出された他装置の相対位置 のデータと、 自装置の外観特徴抽出手段 1 2で抽出された他装置の外観形状 の特徴データとをペアとして保存すると共に、 このペアのデータを通信手段 1 1に送信する。 また、 記憶手段 1 4は、 通信手段 1 1から送信された同一 の他装置に関する相対位置と他装置の外観形状の特徴をペアとして記憶する 。 また、 記憶手段 1 4は、 通信相手を識別する識別 I Dと、 制御手段 1 0か ら送信された通信相手の他装置の相対位置と、 制御手段 1 0が設定した他装 置の相対位置データの有効期間をペアとして記憶する。 また、 記憶手段 1 4 は、 予め自装置の外観形状の特徴を記憶する。 また、 記憶手段 1 4は、 制御 手段 1 0と、 通信手段 1 1 と、 外観特徴抽出手段 1 2と、 外観特徴比較手段The storage means 14 relates to the relative position data of the other apparatus calculated by the relative position detection means 17 of the own apparatus with respect to the same other apparatus detected by the imaging region detection means 16 of the own apparatus. The feature data of the appearance shape of the other device extracted by the appearance feature extraction means 12 of the own device is stored as a pair, and the data of this pair is transmitted to the communication means 11. In addition, the storage unit 14 stores the relative position regarding the same other device transmitted from the communication unit 11 and the feature of the external shape of the other device as a pair. The storage means 14 includes an identification ID for identifying the communication partner, the relative position of the other device of the communication partner transmitted from the control means 10, and the relative position data of the other device set by the control means 10. Is stored as a pair. Storage means 1 4 Stores in advance the features of the external shape of its own device. The storage means 14 includes a control means 10, a communication means 1 1, an appearance feature extraction means 1 2, and an appearance feature comparison means.
1 3と、 記憶手段 1 4と、 撮影手段 1 5と、 撮影領域検出手段 1 6と、 相対 位置算出手段 1 7が動作を実行する際に必要な作業記憶領域を提供する。 1 3, storage means 14, imaging means 15, imaging area detection means 16, and relative position calculation means 17 provide a working storage area necessary for executing operations.
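To make the pairing described above concrete, the following minimal sketch models the records that the storage means might hold. The class names and field layouts are illustrative assumptions and are not structures defined by this specification.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObservedVehicle:
    """Pair of a detected device's relative position and its appearance features."""
    relative_position: Tuple[float, float]   # (x, y) in the observer's frame
    appearance_features: dict                # e.g. {"type": "sedan", "color_rgb": [...]}

@dataclass
class PartnerRecord:
    """Identified communication partner with a validity counter for its position."""
    partner_id: str
    relative_position: Tuple[float, float]
    validity: int                            # decremented until the data expires

@dataclass
class Storage:
    observations: List[ObservedVehicle] = field(default_factory=list)
    partners: List[PartnerRecord] = field(default_factory=list)
    own_features: dict = field(default_factory=dict)   # stored in advance
```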
[0093] The communication means 11 transmits, to the communication partner, the pairs of relative position and appearance shape features for the other devices located around the own device, which are stored in the storage means 14, together with the identification ID identifying the communication means 11 of the own device. The communication means 11 also receives, from another device equipped with a similar communication partner identifying apparatus, the pairs of relative position and appearance shape features for further devices present around that other device and the identification ID identifying the communication means 11 of the other device, and transmits these data to the storage means 14.
[0094] The communication means 11 receives the pairs of relative position data and appearance shape feature data for surrounding other devices having the communication partner identifying apparatus shown in Fig. 7, together with the identification ID identifying the communication means 11 of the other device. The information received by the communication means 11 is stored in the storage means 14.
[0095] The communication performed by the communication means 11 includes ad hoc communication in which the own device and the other device communicate directly with each other, as well as vehicle-to-vehicle communication in which part or all of the communication path between the own device and the other device is realized as road-to-vehicle communication. Since general techniques can be applied to the communication of the communication means 11, a detailed description of its configuration and communication method is omitted.
[0096] The identification ID may be any identifier capable of distinguishing the own device from the other device, such as an identifier assigned to the own device and the other device, an identifier assigned to each car navigation apparatus when the own device and the other device are car navigation apparatuses, or an identifier assigned to the own device and the other device by the communication protocol used between the communication means 11.
[0097] The communication means 11 transmits the identification ID identifying the own device to the communication partner. The communication means 11 receives the identification ID identifying the communication means 11 of the other device of the communication partner and transmits that identification ID to the control means 10.
[0098] The communication means 11 transmits the comparison result of the appearance feature comparison means 13 of the own device to the communication partner. The communication means 11 receives the comparison result of the appearance feature comparison means 13 of the other device of the communication partner and transmits that comparison result to the control means 10.
[0099] The appearance feature comparison means 13 compares the features of the appearance shape of the own device stored in the storage means 14 with the appearance shape features that have been transmitted from another device having a similar communication partner identifying apparatus and stored in the storage means 14 of the own device. The appearance feature comparison means 13 transmits the comparison result to the communication means 11 and the control means 10 of the own device.
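The specification does not fix a particular matching rule for the comparison performed by the appearance feature comparison means. The sketch below compares two feature dictionaries by exact match on categorical attributes and a relative tolerance on numeric ones; the field names and the threshold are assumptions chosen purely for illustration.

```python
def features_match(a: dict, b: dict, tol: float = 0.2) -> bool:
    """Return True when two appearance-feature sets are considered the same
    vehicle: categorical fields must be equal, numeric fields must agree
    within a relative tolerance (illustrative rule, not from the patent)."""
    for key in set(a) & set(b):
        va, vb = a[key], b[key]
        if isinstance(va, (int, float)) and isinstance(vb, (int, float)):
            if abs(va - vb) > tol * max(abs(va), abs(vb), 1e-6):
                return False
        elif va != vb:
            return False
    return True

print(features_match({"type": "sedan", "width_m": 1.78},
                     {"type": "sedan", "width_m": 1.80}))   # True
```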
[0100] The control means 10 controls the operations of the imaging means 15, the imaging area detection means 16, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, the storage means 14, and the relative position calculation means 17. The control means 10 has an interface with external devices, receives images of the surroundings of the own device and various sensor information, and outputs the result of identifying the communication partner.
[0101] Specifically, the control means 10 collates the identification ID identifying the communication partner transmitted from the communication means 11 with the identification IDs stored in the storage means 14. When there is no matching identification ID, or when the validity period of the relative position stored as a pair with the matching identification ID has expired, the control means 10 acquires the relative position data of the communication partner with respect to the own device and stores the relative position and the identification ID in the storage means 14.
[0102] The control means 10 refers, at regular intervals, to the validity period of the relative position data of the communication partner stored in the storage means 14. When the validity period has expired, the control means 10 acquires the relative position data of the communication partner with respect to the own device and stores the relative position and the identification ID in the storage means 14.
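Paragraphs [0101] and [0102] describe two triggers for (re)acquiring the relative position: an unknown sender ID and an expired validity period. The sketch below combines both checks into a single predicate; the dictionary layout of the partner records is an illustrative assumption.

```python
def needs_identification(partner_id, partners):
    """Return True when the relative position for `partner_id` must be
    (re)acquired: either no record exists or its validity has run out."""
    record = partners.get(partner_id)
    return record is None or record["validity"] <= 0

partners = {"device-B": {"position": (-0.3, 8.5), "validity": 0}}
print(needs_identification("device-B", partners))   # True: record expired
print(needs_identification("device-C", partners))   # True: unknown sender
```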
[0103] Although the communication partner identifying apparatus according to Embodiment 2 shown in Fig. 7 is constructed as hardware, the present invention is not limited to this. A communication partner identifying program may be installed in the computer constituting the communication partner identifying apparatus shown in Fig. 7, and the CPU of the computer may read and execute the program so that the computer performs the functions of the imaging area detection means 16, the control means 10, the appearance feature extraction means 12, the appearance feature comparison means 13, and the relative position calculation means 17.
[0104] Next, an example of a communication partner identifying method in which the communication partner identifying apparatus shown in Fig. 7 is mounted on each of the vehicles serving as the own device and the other device, and the relative position between the communication partner identifying apparatuses mounted on the own and other devices is identified, will be described with reference to Figs. 8, 10 and 11.
[0105] As shown in Fig. 8, in Embodiment 2, the communication partner is identified even when the other device is located directly behind, diagonally behind, or diagonally in front of the own device. In Fig. 8, the communication partner identifying apparatus of Fig. 7 mounted on the vehicle serving as the own device is referred to as the own communication apparatus A, and the communication partner identifying apparatus of Fig. 7 mounted on the vehicle traveling directly behind is referred to as the other communication apparatus B of the communication partner. In Fig. 8, around the own communication apparatus A there also exist, in addition to the other communication apparatus B of the communication partner, communication partner identifying apparatuses C, D and E of Fig. 7 mounted on vehicles traveling diagonally behind or diagonally ahead.
[0106] In Embodiment 2, the own communication apparatus A performs the processing shown in Fig. 10 and the other communication apparatus B performs the processing shown in Fig. 11, whereby the communication partner is identified and the relative position with respect to the communication partner is calculated. Alternatively, the own communication apparatus A may perform the processing shown in Fig. 11 and the other communication apparatus B may perform the processing shown in Fig. 10 to identify the communication partner and calculate the relative position. The communication partner may also be identified and the relative position calculated between the own communication apparatus A and the other communication apparatus C, or between the other communication apparatus B and the other communication apparatus C.
[0107] First, when the control means 100 of the own communication apparatus A acquires, via the communication means 101, an identification ID identifying communication with the other communication apparatus B (step S10 in Fig. 10; YES), the processing proceeds to step S11 in Fig. 10; when the identification ID is not acquired (step S10 in Fig. 10; NO), the processing proceeds to step S13 in Fig. 10.
[0108] In step S11 of Fig. 10, the control means 100 of the own communication apparatus A searches the stored information of the storage means 104 for an identification ID that matches the identification ID acquired from the other communication apparatus B.
[0109] In step S11 of Fig. 10, when the control means 100 of the own communication apparatus A determines that the identification ID acquired from the other communication apparatus B does not exist in the stored information of the storage means 104, or that the validity period of the matching identification ID has passed and the ID is invalid (S12 in Fig. 10; YES), the processing proceeds to step S15 in Fig. 10. When the control means 100 of the own communication apparatus A determines that the identification ID acquired from the other communication apparatus B exists in the stored information of the storage means 104 and is still within the validity period, the processing returns to step S10 in Fig. 10.
[0110] In step S13 of Fig. 10, the control means 100 of the own communication apparatus A checks the validity period of the relative position data stored in the storage means 104. When the validity period has expired (YES in S13), the control means 100 advances the processing to step S15 in Fig. 10; when it has not expired, the processing returns to step S10 in Fig. 10.
[0111] In step S15 of Fig. 10, the imaging means 105, under the control of the control means 100, captures the situation around the own communication apparatus A to acquire image data, and transmits the acquired image data to the imaging area detection means 106.
[0112] The imaging area detection means 106 detects, in the image data transmitted from the imaging means 105, the regions in which other communication apparatuses such as the other communication apparatus B appear (step S16 in Fig. 10), sets the number of detected vehicle information items to n (step S17 in Fig. 10), and performs the following processing (steps S18 to S22 in Fig. 10) for each other communication apparatus.
[0113] First, since i = 1 is initially set (step S18 in Fig. 10; YES), the processing proceeds to step S19. The relative position calculation means 107 calculates the relative position of the other communication apparatus B with respect to the own communication apparatus A. Here, the relative position calculation means 107 calculates the relative position of the other communication apparatus B with respect to the own communication apparatus A based on the position of the other communication apparatus B in the image detected by the imaging area detection means 106, the mounting position information, provided by the imaging means 105, of the camera that captured the image on the own communication apparatus A, and the imaging parameters indicating the direction in which the camera is facing (step S19 in Fig. 10).
[0114] The camera parameters may be stored in the storage means 104 in advance, and the storage means 104 may provide the camera parameters to the relative position calculation means 107.
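As one way of realizing step S19, the sketch below estimates a ground-plane position from the pixel position of a detected vehicle region under a flat-road, pinhole-camera assumption. The focal lengths, principal point, mounting height and downward pitch are hypothetical parameters standing in for the mounting position information and imaging parameters mentioned above, not values taken from the patent.

```python
import math

def pixel_to_relative_position(u, v, cam):
    """Estimate the (lateral, forward) position on a flat road of a point
    detected at pixel (u, v), e.g. the bottom-centre of the vehicle region.
    `cam` holds hypothetical parameters: focal lengths fx, fy, principal
    point (cx, cy), camera mounting height h (m) and downward pitch (rad)."""
    angle_down = cam["pitch"] + math.atan2(v - cam["cy"], cam["fy"])
    forward = cam["h"] / math.tan(angle_down)           # distance along the road
    lateral = forward * (u - cam["cx"]) / cam["fx"]     # offset to the side
    return lateral, forward

cam = {"fx": 800.0, "fy": 800.0, "cx": 640.0, "cy": 360.0,
       "h": 1.2, "pitch": math.radians(8.0)}
print(pixel_to_relative_position(700.0, 500.0, cam))
```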
[0115] The appearance feature extraction means 102 extracts the features of the appearance shape of the other communication apparatus B whose relative position has been calculated (step S20 in Fig. 10). Here, the outer shape of the vehicle, the vehicle type, the color, the license plate and the like are used as the features of the appearance shape of the vehicle. As the outer shape of the vehicle, the silhouette of the vehicle, the mounting positions of the rear tail lamps, the shapes of the front grille and rear grille, the vehicle width and height, the mounting widths of the front and rear tires, the shape of the window frames and the like are used. The vehicle type, for example whether the vehicle is a truck, a minivan or a sedan, is estimated by pattern matching against the outer shape of the vehicle.
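The paragraph above lists the kinds of features used (outer shape, vehicle type, color, license plate). The sketch below shows one hypothetical way to assemble such a feature record from a cropped vehicle image; the aspect-ratio rule stands in for the pattern matching described in the text, which is not specified in detail by the patent.

```python
import numpy as np

def extract_appearance_features(patch: np.ndarray) -> dict:
    """Build a simple appearance-feature record from a cropped vehicle image
    (H x W x 3 array). The aspect-ratio thresholds for the vehicle class are
    placeholders for the pattern matching mentioned in the specification."""
    h, w = patch.shape[:2]
    aspect = w / h
    mean_color = patch.reshape(-1, 3).mean(axis=0)        # average RGB
    if aspect > 1.6:
        vehicle_type = "sedan"
    elif aspect > 1.2:
        vehicle_type = "minivan"
    else:
        vehicle_type = "truck"
    return {"type": vehicle_type,
            "aspect_ratio": round(aspect, 2),
            "color_rgb": [int(c) for c in mean_color]}

# Example with a dummy 90x160 bluish patch.
dummy = np.zeros((90, 160, 3), dtype=np.uint8)
dummy[..., 2] = 200
print(extract_appearance_features(dummy))
```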
[0116] The control means 100 stores the relative position of the other communication apparatus B calculated by the relative position calculation means 107 and the appearance shape features of the other communication apparatus B obtained by the appearance feature extraction means 102 as a pair in the storage means 104 (step S21 in Fig. 10).
[0117] Next, the control means 100 increments i (step S22 in Fig. 10), returns the processing to step S18 in Fig. 10, and causes the above-described processing to be executed for the next of the other communication apparatuses C, D and E present around the own communication apparatus A.
[0118] When the above processing (steps S19 to S22 in Fig. 10) has been completed for all of the surrounding other communication apparatuses B, C, D and E detected by the imaging area detection means 106 (step S18 in Fig. 10; NO), the control means 100 of the own communication apparatus A transmits the pairs of the relative positions of all the other communication apparatuses B, C, D and E (numbers 1 to n) stored in the storage means 104 and the appearance shape features of the corresponding vehicles (other devices), together with the identification ID identifying the communication means 101 of the own communication apparatus A, to the other communication apparatus B of the communication partner via the communication means 101 of the own communication apparatus A (step S23 in Fig. 10).
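The data sent in step S23 is the list of n (relative position, appearance feature) pairs together with the sender's identification ID. A minimal sketch of how such a message might be serialized is shown below; JSON and the field names are illustrative assumptions, since the patent does not define a message format.

```python
import json

def build_partner_message(sender_id: str, observations: list) -> str:
    """Serialize the observation pairs and the sender ID for transmission.
    `observations` is a list of dicts with 'relative_position' and
    'features' keys (illustrative field names only)."""
    return json.dumps({"sender_id": sender_id,
                       "count": len(observations),
                       "vehicles": observations})

msg = build_partner_message("device-A",
    [{"relative_position": [0.3, -8.5], "features": {"type": "sedan"}}])
print(msg)
```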
[0119] Next, the communication partner identifying method in the other communication apparatus B of the communication partner will be described with reference to Fig. 11.
[0120] First, the control means 200 of the other communication apparatus B receives, from the own communication apparatus A of the communication partner, the pair data of the appearance shape features and relative positions of the vehicles around the own communication apparatus A and the data of the identification ID identifying the communication means 101 of the own communication apparatus A (step S24 in Fig. 11), and stores the received pair data of the vehicle appearance shape features and relative positions and the data of the number (n) of transmitted vehicle information items in the storage means 204 (steps S25 and S26 in Fig. 11).
[0121] The control means 200 of the other communication apparatus B performs the following processing for each received vehicle (steps S27 to S30 in Fig. 11).
[0122] First, since i = 1 is initially set (step S27 in Fig. 11), the processing proceeds to step S28, where i is incremented (step S28 in Fig. 11). The appearance feature comparison means 203 of the other communication apparatus B compares the vehicle appearance shape features of the i-th vehicle stored in the storage means 204 with the vehicle appearance shape features of the other communication apparatus B stored in advance in the storage means 204 (step S29 in Fig. 11).
[0123] When the appearance feature comparison means 203 determines that there are matching vehicle appearance shape features (step S30 in Fig. 11; YES), the processing proceeds to step S31 in Fig. 11.
[0124] When the appearance feature comparison means 203 determines that the vehicle appearance shape features do not match (step S30 in Fig. 11; NO), the processing returns to step S27 in Fig. 11, and the above processing is performed for all the received vehicles. When the appearance feature comparison means 203 determines that the vehicle appearance shape features do not match for any of the received vehicles, the processing proceeds to step S33 in Fig. 11.
[0125] In step S31 of Fig. 11, the control means 200 of the other communication apparatus B reads from the storage means 204 the relative position data paired with the matched vehicle appearance shape feature data. Since this relative position is information on the position of the other communication apparatus B as seen from the own communication apparatus A, the control means 200 performs the inverse calculation of the relative position and calculates the relative position of the own communication apparatus A as seen from the other communication apparatus B.
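The inverse calculation in step S31 turns "position of B as seen from A" into "position of A as seen from B". The patent does not state the exact formula, so the sketch below assumes the two vehicles' headings are treated as parallel, in which case the calculation reduces to a sign flip, with an optional rotation when a heading difference is known (an additional assumption).

```python
import math

def invert_relative_position(pos, heading_diff_rad=0.0):
    """Given the partner's position (x, y) as seen from the sender, return the
    sender's position as seen from the partner. With identical headings this is
    a sign flip; a non-zero heading difference (assumption) adds a rotation."""
    x, y = -pos[0], -pos[1]
    c, s = math.cos(heading_diff_rad), math.sin(heading_diff_rad)
    return (c * x + s * y, -s * x + c * y)

print(invert_relative_position((0.3, -8.5)))   # partner sees the sender at (-0.3, 8.5)
```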
[0126] After calculating the relative position, the control means 200 of the other communication apparatus B stores, as a set, the identification ID identifying the own communication apparatus A of the communication partner, the calculated relative position, and the validity period of the calculated relative position in the storage means 204 (step S32 in Fig. 11).
[0127] The relative position of the communication partner changes with the passage of time, and the error between the relative position of the communication partner estimated at an arbitrary point in time and the actual relative position of the communication partner becomes large. The control means 200 therefore sets an appropriate number (for example, 10) corresponding to the validity period of the relative position information and, when the relative position is not updated, decrements this number at appropriate intervals (for example, every 100 ms) until it reaches 0. The control means 200 sequentially updates, using the present communication partner identifying method, each relative position whose validity period has been decremented to 0.
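Paragraph [0127] describes a countdown: the validity number (for example, 10) is decremented at each fixed interval (for example, every 100 ms) until it reaches 0, at which point the relative position must be identified again. A minimal sketch of that bookkeeping is shown below, with the interval handled by an explicit tick function rather than a timer; the class and method names are assumptions for illustration.

```python
class RelativePositionEntry:
    """Relative-position record whose validity counts down until re-identification."""

    def __init__(self, partner_id, position, validity=10):
        self.partner_id = partner_id
        self.position = position
        self.validity = validity          # e.g. 10 ticks of 100 ms each

    def tick(self):
        """Call once per interval (e.g. every 100 ms) when no update arrived."""
        if self.validity > 0:
            self.validity -= 1

    def refresh(self, position, validity=10):
        """Store a newly identified relative position and reset the countdown."""
        self.position = position
        self.validity = validity

    @property
    def expired(self):
        return self.validity == 0

entry = RelativePositionEntry("device-A", (-0.3, 8.5))
for _ in range(10):
    entry.tick()
print(entry.expired)   # True: the relative position must be identified again
```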
[0128] In step S33 of Fig. 11, the control means 200 transmits the result of comparing the vehicle appearance shape features obtained in step S29 of Fig. 11 and the identification ID of the other communication apparatus B to the own communication apparatus A via the communication means 201.
[0129] Through the above processing steps, the processing in the own communication apparatus A and in the other communication apparatus B serving as its communication partner is completed, and the processing shifts back to the own communication apparatus A.
[0130] The communication means 101 of the own communication apparatus A receives the comparison result concerning the vehicle appearance shape features and the identification ID of the other communication apparatus B from the other communication apparatus B of the communication partner (step S34 in Fig. 10), and transmits these data to the control means 100.
[0131] When the control means 100 acquires from the other communication apparatus B of the communication partner a comparison result indicating that the vehicle appearance shape features match (step S35 in Fig. 10; YES), the processing proceeds to step S36; when they do not match (step S35 in Fig. 10; NO), the processing returns to step S10.
[0132] The control means 100 of the own communication apparatus A reads from the storage means 104 the relative position data paired with the matched vehicle appearance shape features. The control means 100 then stores the read relative position, the validity period of the relative position information, and the identification ID of the other communication apparatus B of the communication partner as a set in the storage means 104 (step S36 in Fig. 10), and returns to the processing from step S10 in Fig. 10.
[0133] According to Embodiment 2, the own communication apparatus A, which is one of the communication parties, transmits to the other communication apparatus B the appearance shape features of the other communication apparatus B, which is the other communication party, extracted from the image of the surroundings of the own apparatus, together with the relative position with respect to the other communication apparatus B. The other communication apparatus B of the communication partner compares the appearance shape features, and by receiving the comparison result the own communication apparatus A can identify the other communication apparatus B of the communication partner, its appearance shape features, and its relative position.
[0134] In the other communication apparatus B of the communication partner, the own communication apparatus A of the communication partner and its relative position can be identified by inversely calculating the received relative position information.
[0135] In Embodiment 2, a validity period is set for the calculated relative position data and the relative position data are managed accordingly, so that identification can always be performed based on recent relative position data.
[0136] (Embodiment 3)
Next, an example in which the communication partner identifying apparatus according to Embodiment 2 of the present invention is modified will be described as Embodiment 3.
[0137] As shown in Fig. 13, the communication partner identifying apparatus of Embodiment 3 has the circuit configuration shown in Fig. 7 as its basic configuration, and is constructed as a circuit configuration in which an appearance feature selection means 18 is added to the circuit configuration shown in Fig. 7.
[0138] The control means 10, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, the storage means 14, the imaging means 15, the imaging area detection means 16, the relative position calculation means 17, and the appearance feature selection means 18 are connected to one another by a system bus. As shown in Fig. 12, the communication partner identifying apparatus shown in Fig. 13 is mounted on each of the own device and the other device serving as the communication partner. In Fig. 12, blocks shown by dotted lines indicate components that are idle when the communication partner is identified, and blocks shown by solid lines indicate components that are in operation.
[0139] The control means 100, 200 and 300 included in the communication partner identifying apparatuses A, B and C shown in Fig. 12 correspond to the control means 10 of Fig. 13. The communication means 101, 201 and 301 included in the communication partner identifying apparatuses A, B and C of Fig. 12 correspond to the communication means 11 of Fig. 13. The appearance feature extraction means 102, 202 and 302 correspond to the appearance feature extraction means 12 of Fig. 13. The appearance feature comparison means 103, 203 and 303 correspond to the appearance feature comparison means 13 of Fig. 13. The storage means 104, 204 and 304 correspond to the storage means 14 of Fig. 13. The imaging means 105, 205 and 305 correspond to the imaging means 15 of Fig. 13. The imaging area detection means 106, 206 and 306 correspond to the imaging area detection means 16 of Fig. 13. The relative position calculation means 107, 207 and 307 correspond to the relative position calculation means 17 of Fig. 13. The appearance feature selection means 108, 208 and 308 included in the communication partner identifying apparatuses A, B and C of Fig. 12 correspond to the appearance feature selection means 18 of Fig. 13.
[0140] The differences between the configurations of Fig. 7 and Fig. 13 are described below. The storage means 14 classifies the features of the appearance shape of the own device, extracted from images of the own device taken from arbitrary relative positions around it, according to those relative positions, and stores them in advance.
[0141] The appearance feature selection means 18 searches the storage means 14 using the relative position information received by the communication means 11 as a key, selects the appearance shape features corresponding to that relative position from among the features stored in the storage means 14, and transmits them to the appearance feature comparison means 13.
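The storage means holds the own-vehicle appearance features classified by the relative position from which they would be observed. The sketch below indexes them by a coarse bearing sector and looks one set up from a received relative position; the eight-sector granularity, the dictionary layout, and the parallel-heading simplification are assumptions made only for illustration.

```python
import math

# Own-vehicle appearance features, pre-classified by viewing direction
# (hypothetical eight-sector layout: sector 0 = seen from straight ahead).
OWN_FEATURES_BY_SECTOR = {
    0: {"view": "front", "type": "sedan", "color_rgb": [20, 20, 160]},
    4: {"view": "rear",  "type": "sedan", "color_rgb": [20, 20, 160]},
    # ... remaining sectors would hold side and oblique views
}

def select_own_features(received_position, sectors=8):
    """`received_position` is our position (x: right, y: forward) as seen from
    the sender. The sender therefore views us from the opposite direction, so
    we look up the feature set for that viewing bearing (parallel-heading
    assumption)."""
    x, y = -received_position[0], -received_position[1]    # sender's position as seen from us
    bearing = math.atan2(x, y) % (2 * math.pi)              # 0 = sender directly ahead of us
    width = 2 * math.pi / sectors
    sector = int((bearing + width / 2) // width) % sectors
    return OWN_FEATURES_BY_SECTOR.get(sector)

# The sender reports us 12 m behind it, so it views us from ahead (front view).
print(select_own_features((0.2, -12.0)))
```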
[0142] Although the communication partner identifying apparatus according to Embodiment 3 shown in Fig. 13 is constructed as hardware, the present invention is not limited to this. A communication partner identifying program may be installed in the computer constituting the communication partner identifying apparatus shown in Fig. 13, and the CPU of the computer may read and execute the program so that the computer performs the functions of the imaging area detection means 16, the control means 10, the appearance feature extraction means 12, the appearance feature comparison means 13, the relative position calculation means 17, and the appearance feature selection means 18.
[0143] Next, an example of a communication partner identifying method in which the communication partner identifying apparatus shown in Fig. 13 is mounted on each of the vehicles serving as the own device and the other device, and the relative position between the communication partner identifying apparatuses mounted on the own and other devices is identified, will be described with reference to Figs. 8, 12 and 14.
[0144] As shown in Fig. 8, in Embodiment 3, the communication partner is identified even when the other device is located directly behind, diagonally behind, or diagonally in front of the own device. In Fig. 8, the communication partner identifying apparatus of Fig. 13 mounted on the vehicle serving as the own device is referred to as the own communication apparatus A, and the communication partner identifying apparatus of Fig. 13 mounted on the vehicle traveling directly behind is referred to as the other communication apparatus B of the communication partner. In Fig. 8, around the own communication apparatus A there also exist, in addition to the other communication apparatus B of the communication partner, communication partner identifying apparatuses C, D and E of Fig. 13 mounted on vehicles traveling diagonally behind or diagonally ahead.
[0145] In the communication partner identifying method using the communication partner identifying apparatus shown in Fig. 13, the processing procedure performed by the own communication apparatus A is the same as in Embodiment 2. Therefore, the processing procedure of the other communication apparatus B, which is the communication partner of the own communication apparatus A, will be described in detail below with reference to Fig. 14.
[0146] The control means 200 of the other communication apparatus B receives, from the own communication apparatus A via the communication means 201, the pair data of the appearance shape features and relative positions of the vehicles located around the own communication apparatus A and the data of the identification ID identifying the communication means 101 of the own communication apparatus A (step S40 in Fig. 14). The control means 200 of the other communication apparatus B stores the received pair data of the vehicle appearance shape features and relative positions and the data of the number (n) of transmitted vehicle information items in the storage means 204 (step S41 in Fig. 14).
[0147] The control means 200 of the other communication apparatus B performs the following processing for each received vehicle (steps S43 to S47 in Fig. 14).
[0148] First, the appearance feature selection means 208 searches the stored information of the storage means 204 using the relative position information concerning the i-th other communication apparatus stored in the storage means 204 as a key, selects the appearance shape features of the other communication apparatus B corresponding to that relative position, and reads the selected appearance shape features from the storage means 204 (step S45 in Fig. 14). The appearance feature selection means 208 transmits the read appearance shape features to the appearance feature comparison means 203.
[0149] The appearance feature comparison means 203 compares the appearance shape features of the other communication apparatus B selected by the appearance feature selection means 208 with the appearance shape features of the i-th received vehicle stored in the storage means 204 (step S46 in Fig. 14).
[0150] In step S47 of Fig. 14, when the appearance feature comparison means 203 determines that there are matching vehicle appearance shape features (step S47; YES), the processing proceeds to step S48 in Fig. 14.
[0151] When the appearance feature comparison means 203 determines that the vehicle appearance shape features do not match, the processing returns to step S43 in Fig. 14, and the above processing is performed for all the received vehicles. When the appearance feature comparison means 203 determines that the vehicle appearance shape features do not match for any of the received vehicles (step S43 in Fig. 14; NO), the processing proceeds to step S50 in Fig. 14.
[0152] In step S48 of Fig. 14, the control means 200 reads from the storage means 204 the information on the relative position paired with the matched appearance shape features. Since this relative position is the position of the other communication apparatus B as seen from the own communication apparatus A, the control means 200 of the other communication apparatus B performs the inverse calculation of the relative position information and calculates the relative position of the own communication apparatus A as seen from the other communication apparatus B.
[0153] In step S49 of Fig. 14, the control means 200 of the other communication apparatus B stores, as a set, the identification ID identifying the own communication apparatus A of the communication partner, the calculated relative position, and the validity period of the calculated relative position data in the storage means 204.
[0154] The relative position of the communication partner changes with the passage of time, and the error between the relative position of the communication partner estimated at an arbitrary point in time and the actual relative position of the communication partner becomes large. The control means 200 therefore sets an appropriate number (for example, 10) corresponding to the validity period of the relative position information and, when the relative position is not updated, decrements this number at appropriate intervals (for example, every 100 ms) until it reaches 0. The control means 200 sequentially updates, using the present communication partner identifying method, the relative position information whose validity period has been decremented to 0.
[0155] In step S50 of Fig. 14, the control means 200 transmits the result of comparing the vehicle appearance shape features obtained in step S47 of Fig. 14 and the identification ID of the other communication apparatus B to the own communication apparatus A via the communication means 201.
[0156] According to Embodiment 3, the own communication apparatus A transmits to the other communication apparatus B of the communication partner the information on the appearance shape features of the other communication apparatus extracted from the image of its surroundings and the information on its relative position. The other communication apparatus B classifies, by relative position, the appearance shape features extracted from images taken from various relative positions and holds them in advance, selects the appearance shape features of the other communication apparatus B using the relative position data received from the own communication apparatus A, and compares the selected appearance shape features with the appearance shape features received from the own communication apparatus A. In this way, the own communication apparatus A can identify the other communication apparatus B of the communication partner in an arbitrary positional relationship, its appearance shape features, and its relative position.
[0157] In the other communication apparatus B of the communication partner, the own communication apparatus A of the communication partner in an arbitrary positional relationship and its relative position can be identified by inversely calculating the received relative position.
[0158] (Embodiment 4)
Next, an example in which the communication partner identifying apparatus according to Embodiment 3 of the present invention is modified will be described as Embodiment 4.
[0159] As shown in Fig. 16, the communication partner identifying apparatus according to Embodiment 4 of the present invention has the circuit configuration shown in Fig. 13 as its basic configuration, and is constructed as a circuit configuration in which a communication target selection means 19 is added to the circuit configuration shown in Fig. 13.
[0160] The control means 10, the communication means 11, the appearance feature extraction means 12, the appearance feature comparison means 13, the storage means 14, the imaging means 15, the imaging area detection means 16, the relative position calculation means 17, the appearance feature selection means 18, and the communication target selection means 19 are connected to one another by a system bus. As shown in Fig. 15, the communication partner identifying apparatus shown in Fig. 16 is mounted on each of the own device and the other device serving as the communication partner. In Fig. 15, blocks shown by dotted lines indicate components that are idle when the communication partner is identified, and blocks shown by solid lines indicate components that are in operation.
[0161] The control means 100, 200 and 300 included in the communication partner identifying apparatuses A, B and C shown in Fig. 15 correspond to the control means 10 of Fig. 16. The communication means 101, 201 and 301 correspond to the communication means 11 of Fig. 16. The appearance feature extraction means 102, 202 and 302 correspond to the appearance feature extraction means 12 of Fig. 16. The appearance feature comparison means 103, 203 and 303 correspond to the appearance feature comparison means 13 of Fig. 16. The storage means 104, 204 and 304 correspond to the storage means 14 of Fig. 16. The imaging means 105, 205 and 305 correspond to the imaging means 15 of Fig. 16. The imaging area detection means 106, 206 and 306 correspond to the imaging area detection means 16 of Fig. 16. The relative position calculation means 107, 207 and 307 correspond to the relative position calculation means 17 of Fig. 16. The appearance feature selection means 108, 208 and 308 correspond to the appearance feature selection means 18 of Fig. 16. The communication target selection means 109, 209 and 309 included in the communication partner identifying apparatuses A, B and C of Fig. 15 correspond to the communication target selection means 19 of Fig. 16.
[0162] The differences between the configurations of Fig. 13 and Fig. 16 are described below. The communication target selection means 19 selects a communication partner from the regions, detected by the imaging area detection means 16, in which other communication apparatuses are present, based on the relative position data with respect to the own device calculated by the relative position calculation means 17, and transmits to the communication means 11, for the selected communication partner, the pair of the appearance shape features extracted by the appearance feature extraction means 12 and the relative position calculated by the relative position calculation means 17, which are stored in the storage means 14.
[0163] The communication means 11 broadcasts the pair of the appearance shape features and the relative position selected by the communication target selection means 19, together with the identification ID identifying the communication means 11 of the own device, to the surrounding vehicles.
[0164] Although the communication partner identifying apparatus according to Embodiment 4 shown in Fig. 16 is constructed as hardware, the present invention is not limited to this. A communication partner identifying program may be installed in the computer constituting the communication partner identifying apparatus shown in Fig. 16, and the CPU of the computer may read and execute the program so that the computer performs the functions of the imaging area detection means 16, the control means 10, the appearance feature extraction means 12, the appearance feature comparison means 13, the relative position calculation means 17, the appearance feature selection means 18, and the communication target selection means 19.
[0165] Next, an example of a communication partner identifying method in which the communication partner identifying apparatus shown in Fig. 16 is mounted on each of the vehicles serving as the own device and the other device, and the relative position between the communication partner identifying apparatuses mounted on the own and other devices is identified, will be described with reference to Figs. 8, 15 and 17.
[0166] As shown in Fig. 8, in Embodiment 4, the communication partner is identified even when the other device is located directly behind, diagonally behind, or diagonally in front of the own device. In Fig. 8, the communication partner identifying apparatus of Fig. 16 mounted on the vehicle serving as the own device is referred to as the own communication apparatus A, and the communication partner identifying apparatus of Fig. 16 mounted on the vehicle traveling directly behind is referred to as the other communication apparatus B of the communication partner. In Fig. 8, around the own communication apparatus A there also exist, in addition to the other communication apparatus B of the communication partner, communication partner identifying apparatuses C, D and E of Fig. 16 mounted on vehicles traveling diagonally behind or diagonally ahead.
[0167] 図 1 6に示す通信相手特定装置を利用した通信相手特定方法では、 自通信 装置 Aで行われる処理手順は、 実施形態 2の場合と同様であるため、 以下で は、 自通信装置 Aの通信相手となる他通信装置 Bの処理手順について、 図 1 7を用いて詳細に説明する。  [0167] In the communication partner identification method using the communication partner identification device shown in Fig. 16, the processing procedure performed in the own communication device A is the same as in the case of the second embodiment. The processing procedure of the other communication device B that is the communication partner of A will be described in detail with reference to FIG.
[0168] Steps S60 to S72 in FIG. 17 are the same as steps S10 to S22 in FIG. 10; therefore, only steps S73 to S77 will be described.
[0169] In step S73 of FIG. 17, the communication target selection means 109 of the own communication device A selects, from the relative positions of all the surrounding vehicles stored in the storage means 104, the relative position of the vehicle that matches the relative position condition of the other communication device B designated by an external device via the control means 100, together with the paired appearance shape feature of that vehicle.
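A minimal Python sketch of this selection step follows; the NeighborRecord structure, the select_targets name, and the rectangular form of the relative position condition are assumptions of the sketch, since the specification does not fix any concrete data format.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class NeighborRecord:
    feature: List[float]               # appearance shape feature of one surrounding vehicle
    rel_position: Tuple[float, float]  # (x, y) position relative to the own device, in metres

def select_targets(records, condition):
    # condition = (x_min, x_max, y_min, y_max): an assumed rectangular relative position condition
    x_min, x_max, y_min, y_max = condition
    return [r for r in records
            if x_min <= r.rel_position[0] <= x_max
            and y_min <= r.rel_position[1] <= y_max]

# Example: keep only vehicles roughly behind the own device (negative y).
stored = [NeighborRecord([0.1, 0.9, 0.4], (0.5, -8.0)),
          NeighborRecord([0.7, 0.2, 0.3], (3.0, 12.0))]
print(len(select_targets(stored, (-2.0, 2.0, -30.0, 0.0))))  # -> 1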
[0170] Next, the communication means 101 of the own communication device A broadcasts the paired data of the relative position and the vehicle feature, together with the data of the identification ID identifying the communication means 101 of the own communication device A, to the surrounding other communication devices B, C, D, and E (step S74 in FIG. 17).
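The broadcast of step S74 might be sketched as follows, assuming the message is serialized as JSON and sent as a UDP broadcast; the specification names IEEE 802.11b and UWB as the radio layer, so the transport, port number, and field names used here are illustrative assumptions only.

import json
import socket

def broadcast_pairs(sender_id, pairs, port=50007):
    # pairs: list of [relative_position, feature] pairs selected in step S73
    message = json.dumps({"sender_id": sender_id, "pairs": pairs}).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow link-local broadcast
    sock.sendto(message, ("255.255.255.255", port))
    sock.close()

broadcast_pairs("vehicle-A", [[[0.5, -8.0], [0.1, 0.9, 0.4]]])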
[0171] Next, the other communication devices B, C, D, and E execute the procedure shown in FIG. 14 (steps S40 to S50).
[0172] Finally, in step S75 of FIG. 17, the own communication device A receives via the communication means 101 the results of the appearance shape feature comparisons performed in the surrounding other communication devices B, C, D, and E, together with the identification IDs of the devices B, C, D, and E.
[0173] The control means 100 of the own communication device A stores in the storage means 104 the identification ID of the other communication device B whose received comparison result indicates that the appearance shape features matched (steps S76 and S77 in FIG. 17).
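Steps S75 to S77 could be reduced to a sketch like the following, where replies is an assumed list of dictionaries carrying an "id" and a "match" flag received from the surrounding devices; the reply format itself is not defined by the specification.

def record_matching_partner(replies, storage):
    # replies: comparison results and identification IDs received in step S75
    for reply in replies:
        if reply.get("match"):                                          # S76: did the features match?
            storage.setdefault("partner_ids", []).append(reply["id"])   # S77: save the ID
    return storage

store = record_matching_partner(
    [{"id": "B", "match": True}, {"id": "C", "match": False}], {})
print(store)  # {'partner_ids': ['B']}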
[0174] According to Embodiment 4, the own communication device A broadcasts to the surrounding other communication devices the appearance shape features of the other communication devices extracted from the image of its own surroundings, together with the data of their relative positions. Each other communication device located around the own communication device A holds in advance its own appearance shape features, extracted from images taken from various relative positions and classified by relative position; it selects its own feature data using the relative position data received from the own communication device A, compares the selected feature with the appearance shape feature received from the own communication device A, and transmits the comparison result to the own communication device A. Based on the received comparison results, the own communication device A can then identify the specific other communication device B with which it has communicated, together with its appearance shape feature and its relative position.
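The responder side described here can be pictured with the following sketch, in which each surrounding device keeps its own appearance features keyed by relative position and answers a request from device A; the nearest-key lookup, the Euclidean distance, and the threshold value are assumptions, since the specification states only that the features are classified by relative position and compared.

import math  # math.dist requires Python 3.8 or later

class Responder:
    def __init__(self, device_id, features_by_position):
        # features_by_position: {(dx, dy): own appearance feature as seen from that relative position}
        self.device_id = device_id
        self.features_by_position = features_by_position

    def reply(self, received_position, received_feature, threshold=0.5):
        # choose the stored view whose relative position is closest to the one received from device A
        key = min(self.features_by_position,
                  key=lambda p: math.dist(p, received_position))
        distance = math.dist(self.features_by_position[key], received_feature)
        return {"id": self.device_id, "match": distance < threshold}

b = Responder("B", {(0.0, 8.0): [0.1, 0.9, 0.4]})
print(b.reply((0.5, 8.0), [0.12, 0.88, 0.41]))  # {'id': 'B', 'match': True}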
[0175] In addition, the other communication device B can identify the communication partner in an arbitrary positional relationship (the own communication device A) and its relative position by inverting the received relative position.
[0176] (Embodiment 5)
As shown in FIG. 18, the communication partner identifying apparatus according to Embodiment 5 includes a control means 20, a communication means 21, an appearance feature extraction means 22, an appearance feature comparison means 23, and a storage means 24. The control means 20, the communication means 21, the appearance feature extraction means 22, the appearance feature comparison means 23, and the storage means 24 are connected to one another by a system bus.
[0177] The control means 20 controls the operations of the communication means 21, the appearance feature extraction means 22, the appearance feature comparison means 23, and the storage means 24. The control means 20 also has an interface with an external device (not shown); via this interface it receives images of the surroundings of the own device and various sensor information, and transmits data such as the result of identifying the communication partner. Furthermore, the control means 20 identifies the other device that is the communication partner on the basis of the comparison result transmitted from the appearance feature comparison means 23.
[0178] A camera, for example, is used as the external device that captures images of the surroundings of the own device. When the own device is an automobile, the camera (external device) is mounted at the rear and images other vehicles travelling directly behind or diagonally behind the vehicle. The camera may also be mounted in an orientation that captures the situation all the way around the vehicle. Since general techniques apply to a camera that images the surroundings of the own device, a detailed description of its configuration and imaging method is omitted.
[0179] The communication means 21 transmits and receives data such as the extracted appearance shape feature data and the results of comparing appearance shape features, using communication standards such as IEEE 802.11b and UWB.
[0180] The appearance feature extraction means 22 extracts the appearance shape features of another device captured in the image of the surroundings of the own device that the control means 20 has received from the external device. An image of the surroundings of the own device contains not only other devices but also unnecessary background. The appearance feature extraction means 22 therefore cuts the target other device out of the image and extracts the appearance shape features of the other device from the cut-out portion. Since general techniques apply to the configuration and method for extracting the features of the target other device from an image, a detailed description is omitted. The appearance feature extraction means 22 sends the extracted appearance shape feature data of the other device to the communication means 21.
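One possible realisation of this extraction, given that the specification leaves the concrete technique open, is sketched below with NumPy: the target device is cut out of the surrounding image with a known bounding box, and a normalised intensity histogram stands in for the appearance shape feature. The box format, the histogram feature, and the bin count are assumptions of the sketch.

import numpy as np

def extract_appearance_feature(image, box, bins=16):
    # image: H x W greyscale array of the surroundings; box: (top, left, bottom, right) of the other device
    top, left, bottom, right = box
    crop = image[top:bottom, left:right]                  # cut the target device out of the image
    hist, _ = np.histogram(crop, bins=bins, range=(0, 255))
    return hist / max(hist.sum(), 1)                      # normalised histogram as a toy feature vector

frame = np.random.randint(0, 256, size=(240, 320))        # stand-in for a camera frame
print(extract_appearance_feature(frame, (60, 100, 180, 220)).shape)  # (16,)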
[0181] The appearance feature comparison means 23 compares the appearance shape feature of the other device, extracted by the appearance feature extraction means 22 of the communication partner and received by the communication means 21, with the appearance shape feature of the own device stored in the storage means 24, and transmits the comparison result to the communication partner via the communication means 21.
[0182] When the appearance shape feature of the other device extracted by the appearance feature extraction means 22 of the communication partner matches the appearance shape feature of the own device stored in the storage means 24, the appearance feature comparison means 23 transmits to the communication partner, via the communication means 21, a comparison result indicating that the object captured in the image taken by the communication partner is the own device.
When the appearance shape feature of the other device extracted by the appearance feature extraction means 22 of the communication partner does not match the appearance shape feature of the own device stored in the storage means 24, the appearance feature comparison means 23 transmits to the communication partner, via the communication means 21, a comparison result indicating that the object captured in the image taken by the communication partner is not the own device.
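The match/mismatch decision itself could be as simple as the following sketch; the L1 distance and the threshold are assumptions, since the specification states only that a general comparison technique is applied.

def compare_with_own_feature(received_feature, own_feature, threshold=0.1):
    # True  -> the object in the partner's image is this device (features match)
    # False -> the object in the partner's image is not this device
    distance = sum(abs(a - b) for a, b in zip(received_feature, own_feature))
    return distance < threshold

print(compare_with_own_feature([0.2, 0.8], [0.22, 0.78]))  # True
print(compare_with_own_feature([0.2, 0.8], [0.9, 0.1]))    # False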
[0183] Since general techniques apply to the comparison between the appearance shape feature of the own device read out from the storage means 24 and the appearance shape feature of the device transmitted from the appearance feature extraction means 22, a detailed description of the configuration and comparison method is omitted.
[0184] The storage means 24 provides the working storage area required when the control means 20, the communication means 21, the appearance feature extraction means 22, and the appearance feature comparison means 23 carry out their operations. The storage means 24 also stores the data of the appearance shape features of the own device.
[0185] The communication partner identifying apparatus according to Embodiment 5 of the present invention shown in FIG. 18 is constructed as hardware, but the invention is not limited to this; the apparatus according to Embodiment 5 may also be constructed as software. In that case, a communication partner identifying program is installed in a computer constituting the communication partner identifying apparatus shown in FIG. 18, and the CPU of the computer reads out and executes the program so that the computer carries out the functions of the control means 20, the appearance feature extraction means 22, and the appearance feature comparison means 23.
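A hypothetical skeleton of such a software construction is sketched below, with one object standing in for the means of FIG. 18: a dictionary plays the role of the storage means 24, the comparison is carried out against the own device's stored feature as described in paragraphs [0181] and [0182], and handing the result back to the partner is left to a caller playing the role of the communication means 21. The feature format, threshold, and names are assumptions.

class Embodiment5Software:
    def __init__(self, own_feature):
        self.storage = {"own_feature": own_feature}             # storage means 24

    def extract_feature(self, image_crop):                      # appearance feature extraction means 22
        total = sum(image_crop) or 1
        return [v / total for v in image_crop]                  # toy normalisation in place of a real feature

    def compare_with_partner(self, partner_extracted_feature, threshold=0.05):  # comparison means 23
        own = self.storage["own_feature"]
        distance = sum(abs(a - b) for a, b in zip(partner_extracted_feature, own))
        # True -> the object in the partner's image is this device; the result would then be
        # passed to the communication means 21 and sent back to the partner.
        return distance < threshold

node_b = Embodiment5Software(own_feature=[0.25, 0.25, 0.5])
print(node_b.compare_with_partner([0.26, 0.24, 0.5]))  # True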
[0186] Next, a communication partner identifying method that identifies a communication partner using the communication partner identifying apparatus shown in FIG. 18 will be described with reference to FIGS. 19 to 21.
[0187] In Embodiment 5, vehicles travelling on a road are used as the own and other devices, the communication partner identifying apparatus shown in FIG. 18 is mounted on each vehicle, and the case of identifying the other communication device that is communicating with the own communication device is described with reference to FIGS. 19 to 21.
[0188] As shown in FIG. 19, Embodiment 5 identifies the communication partner not only when one other device is located directly behind the own device but also when another other device is located diagonally behind it.
[0189] The communication partner identifying apparatus mounted on the own device is referred to as the own communication device A, and the communication partner identifying apparatuses mounted on the other devices are referred to as the other communication devices B and C.
[0190] In FIG. 19, it is assumed that a communication path R is formed between the own communication device A and the other communication device B. The same applies when a communication path is formed between the own communication device A and the other communication device C instead of the other communication device B.
[0191] The control means 20 of the other communication device B, the communication partner of the own communication device A, transmits to the own communication device A, via the communication means 21, the information on the appearance shape of its own vehicle stored in the storage means 24 shown in FIG. 18 and the information on an identification ID for uniquely identifying the other communication device B and the own communication device A (step S80 in FIG. 20).
[0192] Upon receiving from the other communication device B, via the communication means 21, the information on the appearance shape feature and the identification ID of the other communication device B, the control means 20 of the own communication device A provides the information on the appearance shape to the appearance feature comparison means 23 and stores the information on the identification ID in the storage means 24 (step S81 in FIG. 21).
[0193] The control means 20 acquires from the external device the information of an image of the surroundings of the own communication device A and sends the image information to the appearance feature extraction means 22 (step S82 in FIG. 21).
[0194] Using the working storage area of the storage means 24, the appearance feature extraction means 22 extracts from the image provided by the control means 20 the appearance shape feature of the other communication device (B or C) captured in the image, and sends the extracted appearance shape feature data to the appearance feature comparison means 23 (step S83 in FIG. 21).
[0195] Using the working storage area of the storage means 24, the appearance feature comparison means 23 compares the data on the appearance shape feature of the other communication device B received via the communication means 21 with the appearance shape feature extracted by the appearance feature extraction means 22 (step S84 in FIG. 21).
[0196] In step S85 of FIG. 21, when the comparison of the appearance shape features in the appearance feature comparison means 23 does not indicate a match (step S85; NO), the control means 20 repeats the processing from step S81 to step S85.
[0197] In step S85 of FIG. 21, when the comparison of the appearance shape features in the appearance feature comparison means 23 indicates a match (step S85; YES), the control means 20 associates the other communication device B having the matching appearance shape feature with the identification ID of the communication partner (the other communication device B) received via the communication means 21 (step S86 in FIG. 21).
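Taken together, steps S81 to S86 on the own communication device A can be pictured with the following sketch; the announcement format, the toy camera, and the feature and comparison functions passed in are assumptions used only to make the flow executable.

def identify_partner(announcement, capture_surroundings, extract, compare, max_tries=3):
    # announcement: {"feature": ..., "id": ...} received from the other communication device B (S80/S81)
    for _ in range(max_tries):                                    # S85 "NO": repeat S81 to S85
        for crop in capture_surroundings():                       # S82: image of the surroundings
            if compare(extract(crop), announcement["feature"]):   # S83, S84, S85
                return announcement["id"]                         # S86: associate the device with its ID
    return None

# Toy stand-ins for the camera and the two means, only to make the sketch runnable.
frames = [[9, 1], [5, 5]]
matched = identify_partner(
    {"feature": [0.5, 0.5], "id": "B"},
    capture_surroundings=lambda: iter(frames),
    extract=lambda crop: [v / sum(crop) for v in crop],
    compare=lambda a, b: sum(abs(x - y) for x, y in zip(a, b)) < 0.05)
print(matched)  # 'B'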
[0198] Accordingly, the control means 20 of the own communication device A carries out the necessary communication between the communication means 21 of the own communication device A and the communication means 21 of the other communication device B, the communication partner.
[0199] According to Embodiment 5, the communication partner can be associated with the other communication device by comparing the appearance shape feature of the other communication device extracted from an image of the surroundings of the own communication device with the appearance shape feature of the communication partner received from the other communication device of the communication partner.
[0200] Furthermore, because the other communication device is identified by obtaining information on its appearance shape features from the communication partner itself, the other communication device communicating with the own communication device can be identified accurately, without misidentification. Moreover, because the appearance shape information that identifies the other communication device is acquired from the communication partner, the amount of information the own communication device must hold can be kept to the necessary minimum.
[0201] In the embodiments above, the communication partner identifying apparatus according to the present invention is mounted on vehicles and the communication partner is identified between vehicles, but the invention is not limited to this. The communication partner identifying apparatus according to the present invention can be incorporated into anything other than a vehicle that needs to identify a communication partner, such as public transportation or a mobile phone, and can also be applied to direct communication performed without using a network; its range of application is therefore broad.

Claims

[1] A communication partner identifying apparatus arranged in each of an own device and other devices, for identifying another device serving as a communication partner from among arbitrary other devices located around the own device, the apparatus comprising:
communication means for exchanging information between the own device and the other devices;
appearance feature extraction means for analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
appearance feature comparison means for comparing the appearance shape information extracted by the appearance feature extraction means of the other device with the appearance shape features held by the own device; and
an identification unit for identifying the other device of the communication partner on the basis of the comparison result transmitted from the appearance feature comparison means of the other device.
[2] The communication partner identifying apparatus according to claim 1, wherein the communication means of the own device transmits the appearance shape feature data of the other device extracted by the appearance feature extraction means of the own device to the other device, and receives the comparison result from the appearance feature comparison means of the other device.
[3] The communication partner identifying apparatus according to claim 1, further comprising:
imaging area detection means for analyzing position data of the other device in an image of the surroundings of the own device; and
relative position calculation means for calculating the relative position of the other device with respect to the own device on the basis of the position data of the other device.
[4] The communication partner identifying apparatus according to claim 3, further comprising appearance feature selection means for selecting, using the information on the relative position as a key, appearance shape feature data corresponding to the relative position, and transmitting the selected appearance shape feature data to the appearance feature extraction means of the other device.
[5] The communication partner identifying apparatus according to claim 3, further comprising communication target selection means for selecting a communication partner on the basis of the information on the relative position, and transmitting information on the selected communication partner to the communication means of the own device.
[6] A communication partner identifying method for identifying another device serving as a communication partner from among arbitrary other devices located around an own device, the method comprising:
an appearance feature extraction step of analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
an appearance feature comparison step of comparing the appearance shape information extracted by the appearance feature extraction means of the other device with the appearance shape features held by the own device; and
an identification step of identifying the other device of the communication partner on the basis of the comparison result transmitted from the appearance feature comparison means of the other device.
[7] The communication partner identifying method according to claim 6, wherein the appearance shape feature data of the other device extracted by the own device is transmitted to the other device, and an appearance feature comparison result from the other device is received.
[8] The communication partner identifying method according to claim 6, further comprising:
an imaging area detection step of analyzing position data of the other device in an image of the surroundings of the own device; and
a relative position calculation step of calculating the relative position of the other device with respect to the own device on the basis of the position data of the other device.
[9] The communication partner identifying method according to claim 8, wherein appearance shape feature data corresponding to the relative position is selected using the information on the relative position as a key, and the appearance shape features are compared on the basis of the selected appearance shape feature data.
[10] The communication partner identifying method according to claim 8, wherein a communication partner is selected on the basis of the information on the relative position, and information is transmitted to the selected communication partner.
[11] A communication partner identifying program for causing a computer constituting a communication partner identifying apparatus, which identifies another device serving as a communication partner from among arbitrary other devices located around an own device, to execute:
a function of analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
a function of comparing the appearance shape information extracted by the appearance feature extraction means of the other device with the appearance shape features held by the own device; and
a function of identifying the other device of the communication partner on the basis of the comparison result transmitted from the appearance feature comparison means of the other device.
[12] The communication partner identifying program according to claim 11, further causing the computer to execute a function of transmitting the appearance shape feature data of the other device extracted by the own device to the other device, and receiving an appearance feature comparison result from the other device.
[13] The communication partner identifying program according to claim 11, further causing the computer to execute:
a function of analyzing position data of the other device in an image of the surroundings of the own device; and
a function of calculating the relative position of the other device with respect to the own device on the basis of the position data of the other device.
[14] The communication partner identifying program according to claim 13, further causing the computer to execute a function of selecting, using the information on the relative position as a key, appearance shape feature data corresponding to the relative position, and comparing the appearance shape features on the basis of the selected appearance shape feature data.
[15] The communication partner identifying program according to claim 13, further causing the computer to execute a function of selecting a communication partner on the basis of the information on the relative position, and transmitting information to the selected communication partner.
[16] A communication partner identifying apparatus arranged in each of an own device and other devices, for identifying another device serving as a communication partner from among arbitrary other devices located around the own device, the apparatus comprising:
communication means for exchanging information between the own device and the other devices;
appearance feature extraction means for analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
appearance feature comparison means for comparing the appearance shape information extracted by the appearance feature extraction means of the own device with the appearance shape features acquired from the other device; and
an identification unit for identifying the other device of the communication partner on the basis of the comparison result transmitted from the appearance feature comparison means.
[17] A communication partner identifying method for identifying another device serving as a communication partner from among arbitrary other devices located around an own device, the method comprising:
an appearance feature extraction step of analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
an appearance feature comparison step of comparing the appearance shape information extracted by the own device with the appearance shape features acquired from the other device; and
an identification step of identifying the other device of the communication partner on the basis of the comparison result.
[18] A communication partner identifying program for causing a computer constituting a communication partner identifying apparatus, which identifies another device serving as a communication partner from among arbitrary other devices located around an own device, to execute:
a function of analyzing an image in which the appearance shape of another device located around the own device is captured, and extracting features of the appearance shape of the other device from the image;
a function of comparing the appearance shape information extracted by the own device with the appearance shape features acquired from the other device; and
a function of identifying the other device of the communication partner on the basis of the comparison result.
PCT/JP2007/000712 2006-06-30 2007-06-28 Communication party identifying apparatus, communication party identifying method, and communication party identifying program WO2008001503A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AT08014843T ATE461867T1 (en) 2007-06-28 2008-08-21 PACKAGING ELEMENT FOR A HONEYCOMB-SHAPED STRUCTURED BODY AND METHOD FOR TRANSPORTING A HONEYCOMB-SHAPED STRUCTURED BODY

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006181673A JP2009255600A (en) 2006-06-30 2006-06-30 Communication party identifying apparatus, communication party identifying method and communication party identifying program
JP2006-181673 2006-06-30

Publications (1)

Publication Number Publication Date
WO2008001503A1 true WO2008001503A1 (en) 2008-01-03

Family

ID=38845278

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/000712 WO2008001503A1 (en) 2006-06-30 2007-06-28 Communication party identifying apparatus, communication party identifying method, and communication party identifying program

Country Status (2)

Country Link
JP (1) JP2009255600A (en)
WO (1) WO2008001503A1 (en)


Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5606353B2 (en) * 2011-02-14 2014-10-15 日本電信電話株式会社 Wireless router and location information notification method
JP2013080286A (en) * 2011-09-30 2013-05-02 Pioneer Electronic Corp Moving body identification device and moving body information transmission device
JP5786753B2 (en) * 2012-02-15 2015-09-30 株式会社デンソー VEHICLE DEVICE AND VEHICLE SYSTEM
JP5692204B2 (en) * 2012-11-14 2015-04-01 沖電気工業株式会社 Information processing apparatus, program, and information processing method
JP6056540B2 (en) * 2013-02-20 2017-01-11 株式会社デンソー Peripheral vehicle identification system, feature amount transmission device, and peripheral vehicle identification device
DE102014205511A1 (en) * 2014-03-25 2015-10-01 Conti Temic Microelectronic Gmbh METHOD AND DEVICE FOR DISPLAYING OBJECTS ON A VEHICLE INDICATOR
JP2016106326A (en) * 2016-02-29 2016-06-16 パイオニア株式会社 Moving body identification device
JP6523196B2 (en) 2016-03-17 2019-05-29 株式会社東芝 Estimation apparatus, method and program
JP2018045732A (en) * 2017-12-25 2018-03-22 パイオニア株式会社 Moving body identification device
JP2019179566A (en) * 2019-06-05 2019-10-17 パイオニア株式会社 Moving body identification device
JP7444736B2 (en) 2019-12-30 2024-03-06 株式会社Subaru traffic control system


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002024994A (en) * 2000-07-12 2002-01-25 Equos Research Co Ltd Method and system for transmitting information
JP2002183888A (en) * 2000-12-14 2002-06-28 Matsushita Electric Ind Co Ltd Vehicle communication device and system
JP2004078786A (en) * 2002-08-22 2004-03-11 Alpine Electronics Inc Inter-vehicle communication device
JP2004280645A (en) * 2003-03-18 2004-10-07 Fuji Photo Film Co Ltd Vehicular communication device
JP2005207999A (en) * 2004-01-26 2005-08-04 Alpine Electronics Inc Navigation system, and intersection guide method
JP2005215753A (en) * 2004-01-27 2005-08-11 Toyota Motor Corp Communication equipment for vehicle

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010068282A (en) * 2008-09-11 2010-03-25 Sharp Corp Communication device, communication system, communication method, program, and recording medium
JP2010211535A (en) * 2009-03-10 2010-09-24 Aisin Aw Co Ltd Communication system, communication method, and communication program
JP2014106858A (en) * 2012-11-29 2014-06-09 Mitsubishi Electric Corp Message communication device, message communication system, and message communication method
US10360453B2 (en) 2015-03-31 2019-07-23 Sony Corporation Information processing apparatus and information processing method to link devices by recognizing the appearance of a device
US10789476B2 (en) 2015-03-31 2020-09-29 Sony Corporation Information processing apparatus and information processing method
EP3446718A1 (en) 2017-08-24 2019-02-27 Herbert Wittekind Moulded body for use as odour agent reservoir in a fragrance dispenser
WO2019038438A1 (en) 2017-08-24 2019-02-28 Herbert Wittekind Moulding for use as an aromatic-substance reservoir in a fragrance dispenser
TWI766242B (en) * 2019-12-06 2022-06-01 新加坡商鴻運科股份有限公司 Short-range communication system and method thereof

Also Published As

Publication number Publication date
JP2009255600A (en) 2009-11-05

Similar Documents

Publication Publication Date Title
WO2008001503A1 (en) Communication party identifying apparatus, communication party identifying method, and communication party identifying program
JP6468062B2 (en) Object recognition system
KR101740462B1 (en) Mobile terminal standby method, device, program and recording medium
US20100020169A1 (en) Providing vehicle information
WO2021147637A1 (en) Lane recommendation method and apparatus, and vehicular communication device
WO2017057044A1 (en) Information processing device and information processing method
JP6509361B2 (en) Parking support device and parking support method
WO2008001493A1 (en) Position estimating device for vehicles, position estimating method for vehicles, and position estimating program for vehicles
CN111292351A (en) Vehicle detection method and electronic device for executing same
CN106314424B (en) Householder method of overtaking other vehicles, device and automobile based on automobile
CN107818694A (en) alarm processing method, device and terminal
US11200800B2 (en) Vehicle search system, vehicle search method, and vehicle used therefor
JP2006107521A (en) Mobile communication equipment
CN106682970B (en) Method, client and system for opening shared traffic tool
JP2007128235A (en) Access report device
JP2013080286A (en) Moving body identification device and moving body information transmission device
CN113454692B (en) Driving information providing method, vehicle map providing server and method
WO2022005478A1 (en) Systems and methods for detecting projection attacks on object identification systems
JP4523445B2 (en) Communication vehicle display device
KR20210056632A (en) Method for image processing based on message and electronic device implementing the same
JP4164805B2 (en) Driving support system and in-vehicle terminal
CN114664081A (en) Local high-confidence information-based dynamic environment awareness validation and training service
JP2020113106A (en) Stolen vehicle tracking system
JP2018045732A (en) Moving body identification device
US20090267750A1 (en) Mobile unit communication apparatus and computer-readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07766956

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 07766956

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP