CN101713664A - Information presentation apparatus for vehicle and information presentation method for vehicle - Google Patents

Information presentation apparatus for vehicle and information presentation method for vehicle

Info

Publication number
CN101713664A
CN101713664A, CN200910178100A, CN101713664B
Authority
CN
China
Prior art keywords
vehicle
presents
information
indicator
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200910178100A
Other languages
Chinese (zh)
Other versions
CN101713664B (en)
Inventor
太田克己
三田村健
保泉秀明
大野健
山根雅夫
堺宏征
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of CN101713664A publication Critical patent/CN101713664A/en
Application granted granted Critical
Publication of CN101713664B publication Critical patent/CN101713664B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3626: Details of the output of route guidance instructions
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00: Registering or indicating the working of vehicles
    • G07C5/008: Registering or indicating the working of vehicles communicating information to a remotely located station

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

An information presentation apparatus for a vehicle includes an indicator mounted on the vehicle for indicating a target direction and a control unit for controlling the indicator. The control unit includes a function for acquiring a current position of the vehicle and a reference direction of the vehicle, a function for specifying a target and acquiring a target position, a function for calculating a target angle that indicates a target direction against the reference direction based on the target position, the current position and the reference direction, and a function for generating a presentation command based on the calculated target angle so as to indicate the target direction by the indicator. According to the apparatus, information for specifying the target position can be provided.

Description

Information presentation apparatus for vehicle and information presentation method for vehicle
Technical field
The present invention relates to a vehicular information presentation apparatus and a vehicular information presentation method that use a three-dimensional indicator, such as a robot, to indicate to an occupant of a vehicle the direction of an object such as a facility.
Background art
There is a known auxiliary display device for a navigation system that, in cooperation with the route guidance of the navigation system, keeps facing the guidance direction from a predetermined position before an intersection in order to notify the occupant of the guidance direction, and then faces forward again after the turn (right/left turn) toward the guidance direction is completed (see Japanese Patent Application Laid-Open No. 2001-304899).
However, this conventional auxiliary display device attempts to keep facing the guidance direction relative to the vehicle when providing the occupant with positional information about an object. This raises the following problem: because the relative position of the object changes as the vehicle travels, the occupant can only learn the approximate direction of the object and cannot pinpoint its position.
Summary of the invention
An object of the present invention is to provide an information presentation apparatus and method that use a three-dimensional indicator to present information on the direction of an object according to an object angle indicating the object direction relative to a reference direction of the vehicle. Here, the object angle is calculated from the object position, the current position of the vehicle and the reference direction of the vehicle.
According to the present invention, the object direction is presented relative to the reference direction of the vehicle. Therefore, even if the reference direction of the vehicle changes as the vehicle travels, the occupant can still pinpoint the object position.
According to one aspect of the present invention, a vehicular information presentation apparatus comprises: an indicator mounted on the vehicle for presenting information indicating a direction; and a control unit for controlling the information presentation performed by the indicator, wherein the control unit comprises: a vehicle information acquisition unit for acquiring the current position of the vehicle and the reference direction of the vehicle; an object specifying unit for specifying an object to be presented to an occupant of the vehicle; an object information acquisition unit for acquiring the object position, i.e. the position where the object exists; an object angle calculation unit for calculating, based on the object position, the current position and the reference direction, an object angle indicating the object direction relative to the reference direction, the object direction being the direction in which the object exists; and a presentation command generation unit for generating a presentation command based on the calculated object angle, wherein the indicator presents information indicating the object direction according to the presentation command.
According to another aspect of the present invention, a vehicular information presentation method comprises: specifying an object to be presented to an occupant of a vehicle; calculating an object angle indicating the object direction based on the position of the specified object, the current position of the vehicle and the reference direction of the vehicle, wherein the object angle indicates the object direction relative to the reference direction; and presenting information indicating the object direction based on the calculated object angle.
Description of drawings
Fig. 1 shows the schematic configuration of the information presentation apparatus of the first embodiment;
Fig. 2 shows a typical arrangement of the information presentation apparatus of the first embodiment;
Fig. 3 shows another typical arrangement of the information presentation apparatus of the first embodiment;
Fig. 4 is a block diagram of the information presentation apparatus of the first embodiment;
Figs. 5A and 5B show an example of the robot 100 mounted on a vehicle;
Fig. 6 is a diagram illustrating an example of a method for calculating the absolute angle of an object;
Fig. 7 is a diagram illustrating another example of a method for calculating the absolute angle of an object;
Fig. 8 is a diagram illustrating a method for calculating the object angle of an object relative to the reference direction of the vehicle;
Fig. 9 is a diagram illustrating a method for calculating the object angle after a predetermined time;
Fig. 10 is a diagram illustrating a method for calculating the object angle of an object after a predetermined time;
Fig. 11 is a flowchart illustrating the processing of the information presentation apparatus of the first embodiment;
Fig. 12 is a flowchart illustrating a second exemplary process of searching for an object to be presented;
Fig. 13 is a flowchart illustrating a third exemplary process of searching for an object to be presented;
Fig. 14 is a flowchart illustrating a fourth exemplary process of searching for an object to be presented;
Fig. 15 is a flowchart of a subroutine illustrating the object angle calculation shown in Fig. 11;
Fig. 16 is a flowchart of another typical subroutine illustrating the object angle calculation shown in Fig. 11;
Figs. 17A to 17C are diagrams illustrating the motion of the robot 100 indicating the object direction while continuously tracking the position of the object over time;
Fig. 18 is a flowchart illustrating the processing of the information presentation apparatus of the second embodiment;
Fig. 19 is a flowchart of a subroutine illustrating the object angle calculation shown in Fig. 18;
Fig. 20 is a diagram illustrating the calculation of the object angle after a predetermined time in the second embodiment;
Figs. 21A to 21C are diagrams illustrating the motion of the robot 100 in the second embodiment;
Fig. 22 is a diagram illustrating the need to recalculate the object angle when the travel direction of the vehicle is used as the reference direction;
Fig. 23 is a diagram illustrating the need to recalculate the object angle when an azimuth is used as the reference direction;
Fig. 24 is a diagram illustrating the process of determining whether the object angle needs to be recalculated when the travel direction of the vehicle is used as the reference direction;
Fig. 25 is a diagram illustrating the process of determining whether the object angle needs to be recalculated when an azimuth is used as the reference direction;
Fig. 26 is a diagram illustrating the process of recalculating the angular velocity when the travel direction of the vehicle is used as the reference direction;
Fig. 27 is a diagram illustrating the process of recalculating the angular velocity when an azimuth is used as the reference direction;
Fig. 28A is a schematic diagram of the structure of a three-dimensional display 3100;
Fig. 28B is a plan view of the three-dimensional display 3100; and
Fig. 29 is a block diagram illustrating the schematic configuration of a holographic display device.
Embodiment
First embodiment
An information presentation apparatus 1000 according to the first embodiment will be described with reference to the accompanying drawings.
As shown in Fig. 1, the information presentation apparatus 1000 according to the present embodiment comprises a presentation device Q for presenting information indicating a predetermined direction, and a control device R for controlling the information presentation performed by the presentation device Q. In addition, the information presentation apparatus 1000 is connected to on-board devices M of the vehicle through a wired or wireless communication unit so that they can exchange information with each other.
The presentation device Q of the present embodiment is an indicator 100 that presents "information indicating a predetermined direction" by its orientation. The indicator 100 is a three-dimensional body such as a figure of an animal, a figure of a human or a humanoid statue, an artificial body part such as a hand or a finger, or an arrow-shaped solid. The indicator 100 indicates a direction by a rotating motion, thereby presenting "information indicating a predetermined direction" to the occupant.
For example, the indicator 100 can indicate the object direction by turning its front toward the object direction. Alternatively, the indicator 100 can indicate the object direction by pointing its hand, foot, finger, tail or the like toward it. Note that the object direction is the direction in which the object exists.
As shown in Figs. 2 and 3, the information presentation apparatus 1000 according to the present embodiment is mounted on a vehicle. The indicator (robot) 100 serving as the presentation device Q is mounted on the top surface of the instrument panel of the vehicle, and a robot controller 200 is housed inside the instrument panel.
The indicator 100 shown in Figs. 2 and 3 presents information indicating a predetermined direction by rotating about a rotation axis G and turning its front face f toward that direction. Here, the indicator 100 can be installed at any position, as long as it is within the occupant's field of view. For example, the indicator 100 may be mounted on a pillar.
The components included in the information presentation apparatus 1000 will be described in detail with reference to Fig. 4.
First, the indicator (robot) 100 will be described. As shown in Fig. 4, the indicator 100 of the present embodiment comprises a motion controller 110, a robot rotary drive unit 120 and a speaker 130.
The motion controller 110 controls the robot rotary drive unit 120 and the speaker 130 according to control commands from the robot controller 200. Under the control of the motion controller 110, the robot rotary drive unit 120 rotates the robot 100 according to the presentation command acquired from the robot controller 200, so that the front of the robot 100 (or the hand, foot, finger or tail of the robot 100) faces the predetermined direction.
As shown in Figs. 5A and 5B, the indicator (robot) 100 of the present embodiment is a three-dimensional body having a face, like an animal or a human. Hereinafter, the indicator 100 is also referred to as the robot 100. The face of the robot 100 has eyes e1 and e2 on its front face f.
The robot rotary drive unit 120 comprises: a base 121 for fixing or engaging the robot 100; the rotation axis G of the robot 100; and a motor mechanism 122 for rotating the robot body 101 about the rotation axis G in any direction and at any rotational speed. As shown in Fig. 5B, the robot 100 is rotated about the rotation axis G by the motor mechanism 122. By rotating the robot 100 about the rotation axis G by a predetermined angle, the robot rotary drive unit 120 can turn the front face f of the robot 100 to any direction. In addition, the robot rotary drive unit 120 can rotate the robot 100 about the rotation axis G at a predetermined angular velocity. The concrete structure of the robot rotary drive unit 120 is not particularly limited, and any known components may be adopted.
In addition, as shown in Fig. 5A, the robot 100 may have a hand member H1 simulating a human hand and a hand drive mechanism HK1 that moves the tip of the hand member H1 vertically and horizontally according to the presentation command from the robot controller 200. A horn may be provided on the head of the robot 100 together with the hand member H1, or instead of the hand member H1. Alternatively, a tail may be provided on the back of the robot 100 together with the hand member H1, or instead of the hand member H1.
Furthermore, under the control of the motion controller 110, the speaker 130 included in the robot 100 provides (outputs) guidance information about a specific object (for example, a facility) according to the presentation command from the robot controller 200. The speaker 130 in the present embodiment has a text-to-speech (TTS) function 131 for reading aloud information about the object. The information about the object includes guidance information about the object (facility, point of interest, etc.), warning information about the object (left/right turn points, pedestrians and other vehicles), and the like. The speaker 130 can also play back pre-stored audio guidance information about the object.
Next, the robot controller 200 will be described. The robot controller 200 controls the information presentation operation performed by the robot 100.
As shown in Fig. 4, the robot controller 200 comprises: a ROM (read-only memory) 201 for storing programs for executing the control processing of the robot 100; a CPU (central processing unit) 202 serving as the operating circuit of the robot controller by executing the programs stored in the ROM 201; and a RAM (random access memory) 203 serving as an accessible storage device. Here, an MPU (micro processing unit), a DSP (digital signal processor), an ASIC (application-specific integrated circuit) and/or an FPGA (field-programmable gate array) can be used as the operating circuit together with the CPU, or instead of the CPU.
As shown in Fig. 4, the robot controller 200 is connected to on-board devices M such as a navigation system 300, a vehicle control device 400, a road-vehicle communication device 500 and an on-board imaging device 600. These devices are mounted on the vehicle and are connected to the robot controller 200 via a CAN (Controller Area Network) or other in-vehicle LAN to exchange information.
The functions of the robot controller 200 will now be described. The robot controller 200 has at least a vehicle information acquisition function, an object specifying function, an object information acquisition function, an object angle calculation function and a presentation command generation function. The vehicle information acquisition function executes processing for acquiring information on the vehicle. The object specifying function executes processing for specifying the object to be presented to the occupant. The object information acquisition function executes processing for acquiring information including the position of the specified object. The object angle calculation function executes processing for calculating the object angle indicating the object direction relative to the reference direction. The presentation command generation function executes processing for generating a presentation command for causing the robot 100 to present information indicating the object direction.
Each function of the robot controller 200 will be described below.
First, the vehicle information acquisition function of the robot controller 200 will be described. The robot controller 200 acquires the current position of the vehicle and the reference direction of the vehicle from the navigation system 300. The current position and the reference direction of the vehicle (including the travel direction and a predetermined azimuth such as north; the predetermined azimuth is not limited to north, and south, east, west or the like may also be used as the predetermined azimuth) are detected by the GPS (Global Positioning System) unit 301 of the navigation system 300. The GPS unit 301 receives radio waves transmitted from positioning satellites through a GPS antenna and determines the current position of the vehicle. The GPS unit 301 further refines the current position of the vehicle based on the positioning result and a map-matching result using positioning data. Here, the positioning data are input from the gyro sensor 401 and the geomagnetic sensor 402 of the vehicle control device 400 and from an odometer. The robot controller 200 acquires the current position and the reference direction of the vehicle based on the positioning result of the GPS unit 301 and the positioning data. Note that the reference direction includes the travel direction, i.e. the moving direction at the current position of the vehicle, and an azimuth (for example, north) as observed from the vehicle.
The robot controller 200 acquires the vehicle speed from the vehicle speed sensor 403 of the vehicle control device 400.
Next, the object specifying function of the robot controller 200 will be described. The robot controller 200 specifies the object to be presented to the occupant of the vehicle. The object specifying function specifies objects by the following four methods. First, an object is specified based on the current position of the vehicle, the reference direction and the object position. Second, the destination set by the occupant when using route guidance or the like is specified as the object. Third, a facility or place satisfying a search condition set by the occupant when using an information providing system or the like is specified as the object. Concrete search conditions may be conditions on the category (attribute) of facilities such as grocery stores, public parking lots, gas stations and restaurants, or conditions on the occupant's purpose such as sightseeing, recreation or dining. Fourth, a facility (place) matching the occupant's preferences is specified as the object.
Each specifying process of the object specifying function will be described below.
In the first method, the robot controller 200 refers to the map information 303 of the navigation system 300 and compares object positions with the current position of the vehicle and expected travel positions on the route along the travel direction. The robot controller 200 then specifies as the "object" a place within a predetermined distance of the current position of the vehicle or in a region along the travel direction of the vehicle, or a pedestrian, an obstacle on the road, or another vehicle. The map information 303 of the navigation system 300 includes information on the position of each place and on the facility associated with that place (position). As will be described in detail later, moving objects such as pedestrians and vehicles are acquired from detection devices installed on the roadside.
The robot controller 200 can further narrow down the scope of the "object" based on places designated in advance by the user or places inferred from the user's usage history. Therefore, instead of presenting information about all places near the vehicle, the apparatus can select and present only information about places the occupant is interested in.
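The first specifying method amounts to a proximity filter around the vehicle. A minimal sketch, under assumed names and a flat local coordinate frame in metres (none of these specifics are prescribed by the patent):

```python
import math

def nearby_places(places, vehicle_xy, max_dist_m=500.0):
    """Keep places whose planar distance from the vehicle is within max_dist_m.

    places: list of (name, (x, y)) tuples, coordinates in metres.
    vehicle_xy: (x, y) of the vehicle in the same frame.
    The 500 m default is an illustrative 'predetermined distance'.
    """
    vx, vy = vehicle_xy
    hits = []
    for name, (px, py) in places:
        if math.hypot(px - vx, py - vy) <= max_dist_m:
            hits.append(name)
    return hits
```

A real implementation would additionally restrict the candidates to the region along the travel direction, as the text describes.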
In the second method, the robot controller 200 specifies the destination (for example, a landmark building) input by the occupant through the route search system 302 of the navigation system 300 as the object. Specifically, the robot controller 200 acquires the destination information input through the destination setting function 3021 of the route search system 302. The robot controller 200 then sets this destination information as the specifying information of the object. The robot controller 200 acquires the position of the destination from the map information 303 of the navigation system 300. In this case, the robot controller 200 can also acquire information about the facility associated with the destination (position) from the navigation system 300.
In the third method, the robot controller 200 specifies as the object a facility or place satisfying a search condition (for example, a selected category) input by the occupant through the route search system 302. Here, the search condition is a facility category such as grocery stores, public parking lots, gas stations and restaurants, or a purpose category such as dining or recreation. Specifically, the robot controller 200 acquires the search condition input through the search function 3022 of the route search system 302, which also realizes an information providing function. The robot controller 200 then sets the information on facilities or places satisfying the set search condition as the object specifying information. The robot controller 200 acquires the positions of places satisfying the search condition from the map information 303 of the navigation system 300. In this case, the robot controller 200 can also acquire information about the facilities associated with these places (positions) from the navigation system 300.
The robot controller 200 can further narrow down the scope of the "object" based on the relation between the current position of the vehicle and the reference direction of the vehicle. Therefore, instead of presenting information about all places satisfying the occupant's search condition, the apparatus can select and present information about places near the vehicle.
In the fourth method, the robot controller 200 specifies a facility (place) matching the occupant's preferences as the object, based on information request inputs and information selection results of the route search system 302. Here, the occupant's preferences are judged from the user's input history and/or output history when using the route guidance or information providing system of the navigation system 300. The robot controller 200 acquires the positions of places matching the occupant's preferences from the map information 303 of the navigation system 300. Specifically, the robot controller 200 acquires from the map information 303 places whose past visit count or visit frequency (frequency = visit count / time period) exceeds a predetermined value, and places that were set as the destination more than a predetermined number of times in the past.
The robot controller 200 can further narrow down the scope of the "object" based on the relation between the current position of the vehicle and the reference direction of the vehicle. Therefore, instead of presenting information about all places matching the occupant's preferences, the apparatus can select and present information about places near the vehicle.
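The fourth method's selection rule (visit count above a threshold, or frequency = visit count / time period above a threshold) can be sketched as follows. The record fields and threshold values are assumptions for illustration, not from the patent:

```python
def preferred_places(history, min_visits=5, min_freq=0.1):
    """Pick places whose past visit count or visit frequency exceeds a threshold.

    history: list of dicts with assumed keys "name", "visits", "period_days".
    A place qualifies if visits > min_visits, or if
    frequency = visits / period_days > min_freq (visits per day).
    """
    picks = []
    for rec in history:
        freq = rec["visits"] / rec["period_days"]  # frequency = visit count / time period
        if rec["visits"] > min_visits or freq > min_freq:
            picks.append(rec["name"])
    return picks
```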
Next, the object information acquisition function of the robot controller 200 will be described. The robot controller 200 acquires the object position, i.e. the position where the object exists.
Here, the "object" (whose information is to be presented to the occupant) includes "places to be introduced" such as landmarks, POIs (points of interest) and facilities, and "objects to be warned about" such as obstacles on the road (utility poles, telephone boxes, bus stops), left/right turn points (points for turning right/left, defined at a predetermined distance from the center of an intersection), pedestrians and other vehicles.
Two methods will be described here: one for acquiring the position of a stationary object, and another for acquiring the position of a moving object.
First, the method for acquiring the position of a stationary object will be described. The positions of stationary objects (places and facilities about which information is to be provided, obstacles on the road, etc.) are acquired from the map information 303 in which the objects are defined.
On the other hand, the positions of moving objects (pedestrians, other vehicles, etc.) are acquired through the road-vehicle communication device 500 from detection devices installed on the roadside. When a detection device detects an object such as a pedestrian or another vehicle, it notifies the road-vehicle communication device 500 of the position of the object, the detection time and the identity of the detection device. Since each detection device detects objects near itself, the position of the detection device can be regarded as the object position. Therefore, the robot controller 200 can determine the position of a moving object based on the detected presence, detection time and detection position (position of the detection device) acquired from the road-vehicle communication device 500.
In addition, the robot controller 200 acquires information related to the object. This related information includes text, audio or image information for guiding or warning about the object. Since the related information and the positional information of stationary objects are stored in association with each other in the facility information 304 or the POI information 305, the related information is acquired from the navigation system 300. The positional information and guidance information of objects may also be acquired from an external server through a communication network. Since guidance information (warning information) for each category of moving object (pedestrians, other vehicles, etc.) is stored in advance in the robot controller 200, the guidance information for pedestrians, other vehicles and the like can be acquired from the robot controller 200.
Next, the object angle calculation function of the robot controller 200 will be described. The robot controller 200 calculates the object angle, which indicates the object direction relative to the reference direction of the vehicle, based on the object position, the current position of the vehicle and the reference direction of the vehicle (travel direction or azimuth).
First, the method for calculating the absolute angle θ of the object relative to the travel direction of the vehicle (an angle based on a predetermined azimuth, for example, north) will be described with reference to Fig. 6. This absolute angle θ is used for calculating the object angle. Fig. 6 shows a case where the travel direction of the vehicle is north, the current position of the vehicle is position 1 (longitude A = AAA degrees AA.AAAAAA minutes, latitude B = BB degrees BB.BBBBBB minutes), and the object position is position 2 (longitude C = CCC degrees CC.CCCCCC minutes, latitude D = DD degrees DD.DDDDDD minutes). The distance (vector) between the current position of the vehicle and the object (POI) can be calculated by the following equation (3).
(X direction): XX = (A - C) × LO
(Y direction): YY = (B - D) × LA … (3)
" LA " and " LO " in the top equation 3 is the constant that is used for based on the lat/longitude computed range.LA equals each predetermined latitude length of last 1 second, and LO equals each predetermined longitude length of last 1 second, and wherein, 1 second length of latitude (latitude 35 degree) be about 30.8 meters, and 1 second length of longitude (longitude 35 is spent) is about 25 meters.Based on current location of this vehicle that calculates in the equation 3 and the distance between the object's position, the equation 4 below using calculates the absolute angle θ based on the object (POI) of the current location of this vehicle.
θ = arctan(XX/YY)       (YY > 0)
θ = arctan(XX/YY) + π   (YY < 0)
θ = π/2                 (YY = 0, XX > 0)
θ = -π/2                (YY = 0, XX < 0) … (4)
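As a sanity check, the offset vector of equation (3) and the piecewise arctangent of equation (4) can be sketched in Python. The function names and tuple conventions are illustrative; the sign convention (vehicle coordinates minus object coordinates) and the constants LA and LO follow the text:

```python
import math

# Lengths of one second of arc near latitude/longitude 35 degrees, per the text.
LA = 30.8  # metres per second of latitude
LO = 25.0  # metres per second of longitude

def offset_vector(vehicle, target):
    """Equation (3): offset (XX, YY) in metres.

    vehicle, target: (longitude, latitude) pairs in seconds of arc.
    The sign convention (vehicle minus target) follows the text verbatim.
    """
    A, B = vehicle
    C, D = target
    return (A - C) * LO, (B - D) * LA

def absolute_angle(xx, yy):
    """Equation (4): piecewise arctangent giving the absolute angle theta.

    For yy < 0 and xx < 0 the result differs from math.atan2(xx, yy) by
    2*pi, which denotes the same direction.
    """
    if yy > 0:
        return math.atan(xx / yy)
    if yy < 0:
        return math.atan(xx / yy) + math.pi
    return math.pi / 2 if xx > 0 else -math.pi / 2
```

For example, XX = YY > 0 gives θ = π/4, and XX = 0 with YY < 0 gives θ = π.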
In addition, Fig. 7 shows a case where the absolute angle θ of the object is calculated relative to a predetermined azimuth (for example, north). The absolute angle θ is used for calculating the object angle.
The current position of the vehicle is position 11 (longitude A = AAA degrees AA.AAAAAA minutes, latitude B = BB degrees BB.BBBBBB minutes), and the object position is position 12 (longitude C = CCC degrees CC.CCCCCC minutes, latitude D = DD degrees DD.DDDDDD minutes). Similarly to the case described above, the distance (vector) between the current position of the vehicle and the object (POI) can be calculated by the following equation (13).
(Eastward): XX = (A − C) × LO
(Northward): YY = (B − D) × LA   ... (13)
"LA" and "LO" in Equation 13 above are constants used for calculating distances from latitude/longitude. LA is the predetermined length of one second of latitude and LO is the predetermined length of one second of longitude, where one second of latitude (at latitude 35 degrees) is about 30.8 meters and one second of longitude (at latitude 35 degrees) is about 25 meters. Based on the distance between the current position of the host vehicle and the object position calculated by Equation 13, the absolute angle θ of the object (POI) with respect to the current position of the host vehicle is calculated using Equation 14 below.
θ = arctan(XX/YY)        (YY > 0)
θ = arctan(XX/YY) + π    (YY < 0)
θ = π/2                  (YY = 0, XX > 0)
θ = −π/2                 (YY = 0, XX < 0)   ... (14)
As described above, when the current position is given by position 11 and the object position is given by position 12, the distance (vector) between the current position and the object position can be calculated by Equation 13.
Similarly to the case shown in Fig. 6, the absolute angle θ of the object can be calculated using Equation 14. Note that, as shown in Fig. 6, when the travel direction coincides with the predetermined orientation, the absolute angle can be calculated based on either the travel direction or the predetermined orientation.
Next, the method of calculating the object angle of the object with respect to the travel direction of the host vehicle will be described with reference to Fig. 8. The travel direction θ′ of the host vehicle is obtained from the gyro sensor 401 and the geomagnetic sensor 402 of the vehicle controller 400. From the absolute angle θ of the object (POI) with respect to the current position of the host vehicle and the travel direction θ′ of the host vehicle, the object angle α, which serves as the facing angle (azimuth angle) of the robot 100, is calculated using Equation 5 below.
α = θ − θ′   ... (5)
Note that when the travel direction of the vehicle is the predetermined orientation, i.e., the reference of the absolute angle θ as shown in Fig. 6, θ′ is zero and the absolute angle θ therefore coincides with the object angle α.
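Equation 5 can be sketched as follows; the wrapping of the result into (−π, π] is an assumption added here so that a rotation command never exceeds a half turn, and is not stated in the text:

```python
import math

def object_angle(theta, theta_prime):
    """Equation 5: object angle alpha = theta - theta_prime, i.e. the
    facing angle for the robot 100 relative to the vehicle's travel
    direction theta_prime.  Wrapping into (-pi, pi] is an added
    assumption, not part of the patent's equation."""
    alpha = theta - theta_prime
    # atan2 of (sin, cos) wraps any angle into (-pi, pi]
    return math.atan2(math.sin(alpha), math.cos(alpha))
```

When θ′ = 0 (the travel direction is the predetermined orientation), the function simply returns θ, matching the note above.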
The robot controller 200 calculates the object angle α at predetermined time intervals. Although this time interval may be set arbitrarily, it is preferably set according to the processing time required to control the rotation of the robot 100. By calculating the object angle at intervals equal to or shorter than the processing time, the deviation between the direction indicated by the robot 100 and the direction in which the object (POI) actually lies can be minimized.
In addition, the robot controller 200 of the present embodiment estimates, based on the current position and speed of the vehicle included in the vehicle information, the position of the host vehicle after a predetermined time and the reference direction (travel direction, orientation, etc.) of the vehicle at that time. The robot controller 200 then calculates the object angle with respect to the reference direction of the vehicle based on the estimated vehicle position, the estimated reference direction, and the object position included in the object information. This object angle indicates the object direction.
Specifically, as shown in Fig. 9, the robot controller 200 estimates the position and travel direction of the host vehicle at time t2, a predetermined time Δt after the current time t1, based on the current position, travel direction and speed f1 of the vehicle at time t1. The predetermined time Δt is preferably the time required for the presentation function of the robot 100, i.e., the sum of the time required to calculate the object angle and the time required to control the rotation of the robot 100. Accordingly, the tracking delay that would otherwise arise from the time lag when the robot 100 indicates the object direction based on the current position can be prevented. Note that when Δt is a very short interval, the travel direction of the vehicle after Δt can be assumed to be the same as the initial travel direction at time t1. Alternatively, the travel direction of the host vehicle after Δt can be estimated from the initial travel direction at time t1 and the information from the gyro sensor 401 and/or the geomagnetic sensor 402.
Alternatively, the robot controller 200 of the present embodiment estimates the position of the host vehicle and the object direction after the predetermined time based on the current position and speed of the vehicle included in the vehicle information. The robot controller 200 then calculates the object angle indicating the object direction with respect to the reference direction (orientation) based on the estimated vehicle position and the estimated object direction.
Specifically, as shown in Fig. 10, the robot controller 200 estimates the position of the host vehicle at time t2, a predetermined time Δt after the current time t1, based on the current position, reference direction (orientation), and speed f1 of the vehicle at time t1. The predetermined time Δt is defined as above. Fig. 10 illustrates the case of calculating the object angle indicating the object direction after Δt under the condition that the reference direction is the orientation (north). The travel direction of the vehicle is defined as θ′ and the vehicle speed as V; θ′ and V are detected by the gyro sensor 401 and the geomagnetic sensor 402. The speed components are given by Equation 21 below. Based on the speed components calculated by Equation 21, the relative distance (vector) between the vehicle position after Δt and the object position is calculated by Equation 22. The absolute angle is then calculated by the same method as described for Fig. 7 (Equation 14), and the calculated absolute angle is used to obtain the object angle indicating the object direction with respect to the reference direction (orientation).
Northward speed: Vn = V × cos θ′
Eastward speed: Ve = V × sin θ′   ... (21)
(Eastward): XX(Δt) = ((A − C) × LO) − (Ve × Δt)
(Northward): YY(Δt) = ((B − D) × LA) − (Vn × Δt)   ... (22)
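Equations 21 and 22 together can be sketched as one function (the name is illustrative); with V = 0 it reduces to Equation 13:

```python
import math

LA, LO = 30.8, 25.0  # metres per arcsecond at ~35 degrees, as given above

def predicted_vector(a, b, c, d, v, theta_prime, dt):
    """Equations 21 and 22: relative distance (vector) between the
    vehicle position predicted dt seconds ahead and the object
    position.  Coordinates a..d are arcseconds, v is speed in m/s,
    theta_prime is the travel direction in radians from north."""
    vn = v * math.cos(theta_prime)   # northward speed component (Eq. 21)
    ve = v * math.sin(theta_prime)   # eastward speed component (Eq. 21)
    xx = ((a - c) * LO) - (ve * dt)  # eastward component after dt (Eq. 22)
    yy = ((b - d) * LA) - (vn * dt)  # northward component after dt (Eq. 22)
    return xx, yy
```

For example, a vehicle heading due north (θ′ = 0) at 10 m/s closes 20 m of the northward separation in 2 s, while the eastward separation is unchanged.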
The operating angle of the robot 100 is calculated based on the object angle obtained in the manner described above.
Next, the presentation-command generating function of the robot controller 200 will be described. When presenting the information relevant to the specified object, the robot controller 200 generates, based on the calculated object angle, a presentation command for displaying information indicating the object direction. The presentation command is a command for rotating the robot 100, based on the calculated object angle, so as to indicate the object direction when the information relevant to the object is presented. Specifically, the motor mechanism 122 of the robot rotary drive unit 120 rotates the main body of the robot 100 in accordance with the presentation command so as to indicate the object direction.
In response to this command, the robot rotary drive unit 120 rotates the robot 100 about a predetermined rotation axis. The robot 100 turns its front (face) toward the object direction, thereby notifying the occupant of the object position. After the robot 100 has rotated, the occupant can accurately identify the object position from the indicated direction.
In addition, when the object angle is calculated at predetermined time intervals, the robot controller 200 generates, at each time interval and based on the object angle calculated at that interval, a presentation command for causing the robot 100 to display information indicating the object direction. The robot 100 continuously displays the information indicating the object direction in accordance with the presentation commands issued during each interval.
As described above, the robot 100 continuously executes the presentation commands generated from the object angles calculated at each predetermined interval. Here, the object angle, which changes continuously as the vehicle travels, is successively recalculated based on the current position of the host vehicle. Therefore, even though the relative position of the object changes, the robot 100 can keep indicating the object direction as the vehicle travels.
In addition to the above presentation command for displaying the object direction by the robot 100, the robot controller 200 generates another presentation command for outputting, through the speaker 130 of the robot 100, guidance information relevant to the specified object to be presented to the occupant. This other presentation command is a command for reading aloud text data of the guidance information or for playing audio data of the guidance information. The information used for reading aloud the text data or playing the audio data may be stored in a storage device included in the robot controller 200, or may be stored in the storage device of the navigation device 300 as the facility information 304 or the POI information 305.
Furthermore, the robot controller 200 generates a start command for controlling the timing at which execution of the presentation command begins. Two types of start command are described below.
In the first case, when an image captured by the on-vehicle imaging device 600 includes a feature of the specified object, the robot controller 200 generates a start command for causing the robot 100 to begin executing the presentation command, and outputs the start command to the robot 100.
The on-vehicle imaging device 600 captures images toward the travel direction of the vehicle at predetermined time intervals, and transmits the captured images to the robot controller 200.
The above feature is, for example, a predefined appearance feature of the object. A predefined appearance feature of an object is a feature related to the outer shape of the object, such as a facility or building. That is, the appearance features of an object include its contour shape (tower, arch, etc.), average height, peak height, height-to-width ratio, color, and so on. These appearance features are stored in the navigation device 300 as the facility shape information 306. In addition, an appearance feature of an object may be defined as a facility master image 307 viewed from a predetermined location. Since the appearance features are stored in association with each object, the robot controller 200 can obtain the features of the object (specified object) to be presented to the occupant.
The robot controller 200 judges whether the feature of the specified object is included in the image captured by the on-vehicle imaging device 600 while the vehicle is traveling. For example, the robot controller 200 crops, from each captured image, a region at a predetermined height above the ground as a region corresponding to a building. The robot controller 200 then compares the height-to-width ratio of the cropped region with the height-to-width ratio of the specified object. If their difference is equal to or smaller than a predetermined value, the captured image is judged to include the specified object. In addition, the robot controller 200 compares the features of the cropped region with the features of the facility master image 307 of the specified object (building) as viewed from an observation point near the current position. Note that the features of the facility master image 307 are the contour shape of the building and the like. If the degree of agreement is equal to or higher than a predetermined value, the captured image is judged to include the specified object.
If the feature of the specified object is judged to be included in the image captured by the on-vehicle imaging device 600 while the vehicle is traveling, the robot controller 200 judges that the host vehicle is at a place from which the specified object can be seen, or is just passing the specified object. The robot controller 200 then generates start commands for beginning execution of the presentation command for indicating the object direction by the robot 100 and of the other presentation command for outputting the guidance information or warning information (relevant information) of the object.
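The height-to-width ratio test described above can be sketched as follows; the function name, argument names, and the tolerance value are illustrative, since the text only says the difference must be equal to or smaller than a predetermined value:

```python
def matches_aspect_ratio(region_height, region_width,
                         object_ratio, max_difference=0.2):
    """Judge whether a cropped building region plausibly shows the
    specified object, by comparing height/width ratios (the
    max_difference threshold is an illustrative assumption)."""
    if region_width <= 0:
        return False  # degenerate crop: cannot form a ratio
    return abs(region_height / region_width - object_ratio) <= max_difference
```

In the embodiment this ratio test is only a first filter; a positive result is further checked against the contour features of the facility master image 307.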
In the second case, when a route guidance image provided by the on-vehicle navigation device 300 includes a predefined appearance feature of the specified object, the robot controller 200 generates a start command for causing the robot 100 to begin executing the presentation command, and outputs the start command to the robot 100.
The navigation device 300 of the present embodiment includes a route search system 302 that finds a suitable route based on the current position and the destination. The route search system 302 of the present embodiment displays a bird's-eye view image including the route and the buildings along the route, to provide route guidance to the occupant.
The robot controller 200 judges whether an appearance feature of the specified object is included in the route guidance image provided by the navigation device 300. The method of defining object features and of judging whether an image includes such a feature is the same as the method described above for images captured by the on-vehicle imaging device 600.
If the route guidance image provided by the navigation device 300 includes a feature of the specified object, the robot controller 200 judges that the vehicle is passing a place from which the specified object can be seen. The robot controller 200 then generates start commands for beginning execution of the presentation command for indicating the object direction by the robot 100 and of the other presentation command for outputting the guidance information of the object.
As described above, when a feature of the specified object is included in an image captured by the on-vehicle imaging device 600 or in a route guidance image provided by the navigation device 300, it can be judged that the vehicle is approaching the object and that the occupant can see it. At that moment, the robot controller 200 controls the robot 100 to indicate the object direction. The occupant can therefore actually see the object along the direction indicated by the robot 100. If the object direction were indicated at a moment when the occupant cannot actually see the object, the occupant would not be able to recognize what the robot 100 is pointing at. According to the present embodiment, since the robot 100 indicates the object direction when the occupant can actually see the object, the occupant can identify the object direction indicated by the robot 100 by actually seeing the object.
Next, the control procedures executed by the robot controller 200 will be described with reference to the flowcharts shown in Figs. 11 to 16. Fig. 11 is a flowchart showing the general control flow performed by the information presentation apparatus 1000 according to the present embodiment.
When the information presentation apparatus 1000 is started, the robot controller 200 obtains the current position of the host vehicle from the navigation device 300 (S100).
In the next step S101, the robot controller 200 searches for objects to be presented to the occupant (facilities, POIs, obstacles on the road, pedestrians, other vehicles) (S101). In this search process, the robot controller 200 of the present embodiment obtains the positions of stationary objects (places, facilities, etc.) from the map information 303 of the navigation device 300. Alternatively, the robot controller 200 obtains the positions of moving objects (pedestrians, other vehicles) detected by roadside detection devices from the road-vehicle communication device 500.
Although the object search method is not particularly limited, in the present embodiment objects are searched for based on the position of the host vehicle, a reference direction such as the travel direction or orientation of the vehicle, and the object position. Specifically, the robot controller 200 searches for objects that are located on the route found by the route search system 302 of the navigation device 300 based on the current position and destination of the vehicle, and that lie within a predetermined area arranged along the travel direction of the vehicle. By specifying objects in this manner, information about objects near the route along which the host vehicle travels can be provided to the occupant.
Other search methods will be described later.
In the next step S102, the robot controller 200 judges whether there is an object (POI) satisfying the conditions defined by each search method. If there is an object to be presented to the occupant, that object is specified and the process proceeds to step S103. On the other hand, if there is no object to be presented, it is judged that no object has yet been specified, and the process proceeds to step S120. If an ignition-off signal is detected in step S120, the process ends (S121).
In step S103, the robot controller 200 judges the timing for executing the information presentation processes (the process for indicating the object direction and the process for presenting the guidance information relevant to the object). Specifically, if the image captured by the on-vehicle imaging device 600 includes an image having the features of the object specified in step S102, the robot controller 200 judges that the occupant is in a position from which the object can be seen. In this case, the robot controller 200 also judges that it is time for the robot 100 to execute the information presentation processes (the process for indicating the object direction by the robot 100 and the process for outputting the audio guidance information), i.e., the moment to generate the start command. Alternatively, if the degree of agreement between the features in the image ahead of the traveling vehicle provided by the route guidance system 302 for route guidance and the features of the image of the object specified in step S102 is equal to or greater than a predetermined threshold, the robot controller 200 likewise judges that the occupant is in a position from which the object can be seen, and that it is time for the robot 100 to execute the information presentation processes, i.e., the moment to generate the start command.
If it is judged to be the moment for presenting the object direction or the information relevant to the object, the process proceeds to step S104. In step S104, the robot controller 200 obtains the information (guidance information) relevant to the specified object. The guidance information to be output may instead be obtained after step S101 or S102.
In the next step S105, the robot controller 200 starts reading aloud the guidance information relevant to the specified object (the text of a guidance statement) through the speaker 130 installed in the robot 100. If the guidance information is audio data, the robot controller 200 may play the guidance information instead.
In step S106, the robot controller 200 calculates the object angle with respect to the reference direction of the vehicle (the angle toward which the object lies) based on the current position of the vehicle, the reference direction of the vehicle, and the object position. The calculation of the object angle will be described in detail later.
In the next step S107, the robot controller 200 generates, based on the calculated object angle, a presentation command for rotating the robot 100 so that its front f faces the object direction, and transmits the presentation command to the robot 100.
In step S108, the movement controller 110 of the robot 100 rotates the robot 100 about the rotation axis G in accordance with the presentation command, so that the front f of the robot 100 faces the object.
In step S109, it is judged whether the audio output of the relevant information (guidance information or warning information) of the object, started in S105, has been completed. If the audio output has been completed, the process proceeds to step S110. Note that the audio output of the relevant information of the object may instead be started after the robot 100 has been rotated in step S108.
Finally, in step S110, the robot controller 200 transmits to the robot 100 a reset command for rotating the robot 100 so that its front f faces the reference direction, and the process then returns to step S100.
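One pass through the Fig. 11 flow can be sketched as below. This is a hypothetical outline only: the four callables stand in for the navigation device, the object search of S101, the timing judgement of S103, and the combined audio/rotation presentation of S104 to S110, and none of these names appear in the patent.

```python
def presentation_cycle(get_position, find_object, occupant_can_see,
                       present_object):
    """Sketch of one iteration of the general control flow (S100-S110)."""
    position = get_position()        # S100: current vehicle position
    obj = find_object(position)      # S101: search for a POI to present
    if obj is None:                  # S102: no object specified yet
        return False
    if not occupant_can_see(obj):    # S103: not yet the presentation moment
        return False
    present_object(obj)              # S104-S108: guidance audio + rotation
    return True                      # S110 would then reset the robot
```

In the apparatus this cycle repeats until an ignition-off signal is detected (S120/S121).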
Next, the other object search methods executed in step S101 of Fig. 11 will be described with reference to Figs. 12 to 14. Since the method of specifying an object based on the current position of the vehicle, the travel direction of the vehicle, and the object position has been described above, the second to fourth methods are described below. These are: a method of specifying an object based on the destination set by the occupant, a method of specifying as an object a facility or place satisfying a search condition set by the occupant, and a method of specifying as an object a facility or place matching the occupant's preferences.
Fig. 12 is a partial flowchart of the process for specifying as an object the destination set by the occupant when using route guidance. As shown in Fig. 12, the robot controller 200 judges whether a destination has been input through the destination setting function 3021 of the navigation device 300 (S1011). If a destination has been input, the process proceeds to step S1012. The robot controller 200 sets the input destination as the object to be presented to the occupant (S1012). Subsequently, the processing from step S102 onward is executed.
Note that a facility or place (POI) included in an area defined by the current position and reference direction of the vehicle may be selected from the input destinations and set as the object.
Fig. 13 is a partial flowchart of the process for specifying as an object a facility or place satisfying a search condition set by the occupant when using an information providing system or the like. As shown in Fig. 13, the robot controller 200 judges whether a search condition such as a search category has been input through the search function 3022 of the navigation device 300 (S1015). If a search condition has been input, the process proceeds to step S1016. The robot controller 200 sets the facilities or places (POIs) belonging to the input search category as objects to be presented to the occupant (S1016). Subsequently, the processing from step S102 onward is executed. Concrete search conditions may include conditions on the category (attribute) of facilities, such as grocery stores, public parking lots, gas stations and restaurants, and conditions on the occupant's purpose, such as sightseeing, recreation or dining.
Note that among the facilities and places (POIs) belonging to the input search category, those included in an area defined by the current position and reference direction of the vehicle may be selected and set as objects.
Fig. 14 is a partial flowchart of the process for specifying as an object a facility or place matching the occupant's preferences.
The navigation device 300 continuously accumulates the information input by the occupant (information on destinations set by the occupant and information on search conditions set by the occupant) (step SQ). This input history is organized into a database, which accepts access from the robot controller 200 and allows the robot controller 200 to obtain the input history.
In step S1101, the robot controller 200 refers to the input history of search conditions and destinations input by the occupant.
In the next step S1102, the robot controller 200 refers to the input history and judges whether there is a search condition (category) or destination that has been set more than a predetermined number of times. Since the number of times a search condition or destination has been input is positively correlated with the occupant's preferences, a search condition or destination set more than the predetermined number of times is judged to reflect the occupant's preferences.
If there is no search condition or destination set more than the predetermined number of times, it is judged that the occupant has no particular preference, and the process proceeds to step S102. On the other hand, if such a search condition or destination exists, it is judged that the occupant has a particular preference, and the process proceeds to step S1103.
In step S1103, the robot controller 200 sets as candidate objects the POIs (facilities, etc.) whose search conditions have been set more than the predetermined number of times. The threshold used to judge the occupant's preference, i.e., the number of times a condition must have been set (the predetermined number), may be defined arbitrarily. The threshold may be based on the entire past period or on a limited period such as the most recent one, three or six months.
In the next step S1104, the robot controller 200 obtains the current position and reference direction of the host vehicle. In step S1105, the robot controller 200 narrows the candidate objects (POIs) down to those belonging to an area having a predetermined positional relationship with the current position and reference direction. The method of setting this area is not particularly limited. The area may be defined as a sector whose center line extends from the current position of the vehicle in the reference direction. Alternatively, the area may be defined as the area within a predetermined range of the road on which the vehicle is traveling. Alternatively, when a destination has been set and a route has thus been determined, the area may be defined as the area within a predetermined range along the route.
Subsequently, in step S1106, the robot controller 200 judges whether a POI set more than the predetermined number of times is included in the area having the predetermined positional relationship with the vehicle. If no such POI is included in the area, the process proceeds to step S102. On the other hand, if such a POI is included in the area, the process proceeds to step S1107.
Finally, in step S1107, the POIs narrowed down in step S1105 are set as objects. Subsequently, the processing from step S102 onward is executed.
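One possible reading of the sector-shaped area of step S1105 can be sketched as follows; the half-angle and range values are illustrative assumptions, as the patent only speaks of a "predetermined positional relationship":

```python
import math

def in_sector(vehicle_xy, reference_direction, poi_xy,
              half_angle=math.radians(45), max_range=1000.0):
    """Keep a candidate POI only if it lies within max_range metres of
    the vehicle and within half_angle of the reference direction
    (an angle in radians measured clockwise from north)."""
    dx = poi_xy[0] - vehicle_xy[0]  # eastward offset, metres
    dy = poi_xy[1] - vehicle_xy[1]  # northward offset, metres
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.atan2(dx, dy)  # POI bearing from north
    # wrap the bearing difference into (-pi, pi] before comparing
    diff = math.atan2(math.sin(bearing - reference_direction),
                      math.cos(bearing - reference_direction))
    return abs(diff) <= half_angle
```

The road-corridor and route-corridor variants mentioned above would replace the angular test with a distance-to-polyline test, but follow the same filtering pattern.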
Next, the calculation of the object angle in step S106 of Fig. 11 will be described with reference to Figs. 15 and 16.
Fig. 15 is a partial flowchart of the subroutine for calculating the object angle in step S106 of Fig. 11.
As shown in Fig. 15, in step S1061, the robot controller 200 obtains the position of the specified object from the map information 303. The position information may instead be obtained when the guidance information of the object is obtained in S104 of Fig. 11.
Subsequently, in step S1062, the robot controller 200 obtains the current position and reference direction of the host vehicle from the vehicle controller 400. Although the current position obtained in step S100 of Fig. 11 could be used in this step, to calculate the object angle accurately it is preferable to obtain the current position and reference direction of the vehicle immediately before calculating the object angle. The vehicle information may be obtained in parallel with the processing from step S103 onward.
In the next step S1063, the robot controller 200 calculates the object direction in absolute coordinates with respect to the current position of the host vehicle, using the method described with reference to Fig. 6 or 7.
In step S1064, the robot controller 200 calculates the object direction with respect to the reference direction of the host vehicle (i.e., the object angle), using the methods described with reference to Figs. 8 to 10.
Fig. 16 shows another example of the calculation of the object angle. This calculation uses a method of calculating the object angle at the position the host vehicle is estimated to occupy after a predetermined time, i.e., ahead of the vehicle along its reference direction. The basic processing is the same as that shown in Fig. 15.
As shown in Fig. 16, in step S1061, the robot controller 200 obtains the position of the specified object. Then, in step S1062, the robot controller 200 obtains the current position and reference direction of the host vehicle.
Subsequently, in step S1065, the robot controller 200 obtains the speed of the host vehicle from the vehicle controller 400. The vehicle speed may be obtained together with the reference direction in step S1062.
In step S1066, the robot controller 200 calculates the estimated position of the host vehicle after the predetermined time, based on the current position, reference direction, and speed of the host vehicle. The predetermined time is defined in advance based on the time required to calculate the object angle and/or the time required to control the rotation of the robot 100.
In step S1067, the robot controller 200 calculates the object direction in absolute coordinates based on the estimated position of the host vehicle after the predetermined time.
In the next step S1068, the robot controller 200 calculates the object angle of the object with respect to the reference direction of the vehicle, based on the object direction in absolute coordinates calculated in step S1067.
According to this object-angle calculation method, the object angle can be calculated immediately before the robot 100 is rotated to indicate the object direction. Therefore, the deviation (delay) between the indicated direction and the object position caused by the processing time can be eliminated. Note that when the rotation control of the robot 100 requires some time, the period required for controlling the rotation of the robot 100 is preferably included in the above predetermined time.
In addition, the calculation of the object angle (step S106 in Fig. 11, and Figs. 15 and 16) is executed at predetermined time intervals. Presentation commands are then generated successively based on the object angles calculated at those intervals. In other words, steps S107, S108 and S110 of Fig. 11 are executed each time the object angle is calculated (at each predetermined interval). By presenting the object direction at predetermined intervals, the continuously changing object position can be tracked as the vehicle travels, and the object direction can be presented continuously.
Figs. 17A to 17C are schematic diagrams showing the robot 100 indicating the object direction while continuously tracking the object position. Figs. 17A to 17C show the object directions indicated by the robot 100 installed on the traveling vehicle at times T1, TM and TN (T1 < TM < TN), respectively. Since the vehicle and the robot 100 installed on it are moving, the direction of the object (POI) with respect to the robot 100 changes over time. The robot 100 repeatedly calculates the object angle at predetermined intervals to indicate the object direction continuously.
As described above, to track the object, the robot 100 indicates the object direction continuously while rotating. Note that the shorter the predetermined interval, the smoother the motion of the robot 100 becomes. As a result, the change in the object position relative to the traveling vehicle can be presented to the occupant accurately and without error.
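The repeated S106-to-S108 cycle behind Figs. 17A to 17C can be sketched as follows; `compute_angle` and `rotate_robot` are stand-ins (not names from the patent) for the controller's angle calculation and the drive unit's rotation command:

```python
def track_object(compute_angle, rotate_robot, ticks):
    """Sketch of continuous tracking: at every interval the object
    angle is recomputed (S106) and a fresh presentation command
    re-points the robot (S107-S108), so the indicated direction
    follows the object as the vehicle moves."""
    issued = []
    for tick in range(ticks):
        alpha = compute_angle(tick)  # S106: object angle at this interval
        rotate_robot(alpha)          # S107-S108: command + rotation
        issued.append(alpha)
    return issued
```

A shorter interval between ticks yields smaller angle steps per command, which is why the text notes the robot's motion becomes smoother as the interval shrinks.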
The information presentation apparatus 1000 of the present embodiment, configured and operating as described above, provides the following effects.
Since a presentation device Q such as the robot 100 can indicate the object direction with respect to the reference direction of the vehicle, the occupant can identify the object position even while the vehicle is traveling and the reference direction is therefore changing. The same effect is obtained whether the reference direction of the vehicle is the travel direction or the orientation.
In addition, since the robot 100 is rotated about the predetermined axis G so that its front (face) faces the object direction to notify the occupant of the object position, the occupant can accurately identify the position of the object (facility, etc.) from the direction in which the robot 100 is facing.
In addition, thereby present the order that presents that order utilizes robot 100 to carry out to be generated owing to generate at interval with preset time based on the object angle that goes out with the preset time interval calculation, therefore can track the object's position that changes along with vehicle ', and display object direction continuously.
In addition, owing to estimate the vehicle location after the schedule time and calculate object angle thus, therefore can eliminate by the deviation (delay) between the object's position of the indicated object orientation of robot 100 and reality in the estimated position that goes out.Because based on the needed time of calculating object angle be used for being set the schedule time to making robot 100 rotations wait presenting of robot 100 to handle the time of controlling, thus can eliminate owing to the processing time cause by the deviation (delay) between the object's position of indicated object orientation of robot 100 and reality.
Since the features of a specified object are included in the image captured by the on-vehicle camera 600 or in the bird's-eye view image provided by the navigation system 300 for route guidance, the presentation command for indicating the object direction is executed while the occupant can see the object. In other words, the occupant can see the object indicated by the robot 100 with his or her own eyes and confirm the object position according to the guidance.
In addition, since information related to the object (for example, guidance information) is output through the speaker 130 in the robot 100, the robot 100 can read the guidance information aloud or play back audio data of the guidance information while actually indicating the object.
The robot 100 may be configured such that, while relevant information is being provided by the robot 100, the presence of an occupant is detected using a sensor such as an infrared sensor, and the face of the robot 100 is turned toward the occupant at a separately defined operation interval. Here, the following configuration may be adopted: the distance between the vehicle and the object is calculated by the navigation system 300, and the operation interval of the robot 100 is varied according to this distance. If the operation interval is shortened as the distance decreases, the frequency with which the robot 100 turns its face can intuitively convey how far the vehicle is from the object.
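The distance-dependent operation interval described above can be sketched as follows. This is a minimal illustration only; the function name, the linear mapping, and all constants (minimum/maximum interval, distance scale) are assumptions not specified in this description.

```python
def working_interval(distance_m, min_s=1.0, max_s=10.0, scale_m=1000.0):
    """Map vehicle-to-object distance to the robot's face-turn interval:
    the closer the object, the shorter the interval, so the turning
    frequency itself conveys distance. All constants are assumed."""
    frac = min(max(distance_m, 0.0) / scale_m, 1.0)  # clamp to [0, 1]
    return min_s + (max_s - min_s) * frac
```

Under this assumed mapping, an object 1 km or more away yields the longest interval and a nearby object the shortest, so the robot turns its face more often as the vehicle approaches.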
Second Embodiment
Next, a second embodiment will be described. The main feature of the second embodiment is that the direction of the object is displayed by moving the indicator at a constant angular velocity. The configuration of the information presentation apparatus according to the second embodiment, and the processing it executes, are substantially the same as those of the first embodiment. Therefore, redundant explanation is omitted below, and mainly the differences between the two will be described.
The configuration of the information presentation apparatus 1000 of the second embodiment is the same as that of the information presentation apparatus of the first embodiment shown in Fig. 4.
The object angle calculation function of the robot controller 200 of the present embodiment estimates the position and the reference direction of the vehicle after a predetermined time, based on the current position of the vehicle, the reference direction of the vehicle (including the traveling direction and the compass bearing), and the speed. The robot controller 200 then calculates the object angle relative to the reference direction of the vehicle based on the estimated position of the vehicle, the estimated reference direction of the vehicle, and the object position.
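The estimate-then-measure computation above can be sketched in a few lines. This is a hypothetical flat-plane dead-reckoning illustration, not the patented implementation: the coordinate convention (y = north, heading measured clockwise from north), the straight-line position extrapolation, and all names are assumptions.

```python
import math

def object_angle(veh_pos, heading_deg, speed_mps, obj_pos, dt_s):
    """Estimate the vehicle position dt_s seconds ahead along its current
    heading, then return the object's bearing relative to the vehicle's
    reference direction, normalized to [-180, 180) degrees."""
    # Dead-reckon the future position (flat-earth, constant heading/speed).
    hx = math.sin(math.radians(heading_deg))
    hy = math.cos(math.radians(heading_deg))
    fx = veh_pos[0] + speed_mps * dt_s * hx
    fy = veh_pos[1] + speed_mps * dt_s * hy
    # Absolute bearing from the estimated position to the object.
    bearing = math.degrees(math.atan2(obj_pos[0] - fx, obj_pos[1] - fy))
    # Object angle relative to the vehicle's reference direction.
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0
```

For example, a vehicle at the origin heading north with an object at (10, 10) sees the object at +45 degrees now, but at +90 degrees once it has dead-reckoned 10 m forward, which is why estimating the future position matters.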
In the present embodiment, the "predetermined time" is the period until the presentation of the information related to the object (for example, guidance information or warning information) is completed. In other words, the "predetermined time" is the period required to read aloud the guidance information or warning information related to the object. Alternatively, the "predetermined time" is the period required to play back audio data of the information related to the object (for example, guidance information or warning information).
If reading aloud or playback of the information has already started, the "predetermined time" is the time required to read aloud or play back the information not yet output. The time required to read aloud (or play back) the not-yet-output information can be calculated by subtracting the time spent on the information already output from the time required to read aloud the entire information (or from the total recording time of the information). The time required for reading aloud can be calculated based on the number of characters in the text data to be read. The time required to play back the audio data can be calculated based on the recording time of the information.
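The character-count method for the remaining read-aloud time amounts to a simple subtraction; the sketch below assumes a constant reading rate, which the description does not fix.

```python
def remaining_read_time(total_chars, read_chars, chars_per_sec=5.0):
    """Time still needed to finish reading the guidance text aloud:
    total reading time minus the time already spent, derived from
    character counts at an assumed constant reading rate."""
    return max(total_chars - read_chars, 0) / chars_per_sec
```
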
The presentation command generation function of the robot controller 200 of the second embodiment further calculates an angular velocity based on the calculated object angle and the predetermined time, and generates a presentation command so that the robot 100 displays the information indicating the object direction while moving at that angular velocity.
The robot controller 200 calculates the angle between the direction indicated by the robot 100 at the start time of this processing and the direction corresponding to the calculated object angle, and then calculates the angular velocity by dividing this angle by the predetermined time. The start time of this processing may be defined arbitrarily, for example as the time the object is specified, the time the reading aloud of the guidance information related to the object starts, or the time the object angle at the current position is calculated. The robot controller 200 calculates the difference (angle) between the direction indicated by the robot 100 at the start time and the object direction after the predetermined time from the start time. The angular velocity is then calculated by dividing this difference (angle) by the predetermined time.
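The angular velocity computation described above is a single division; the sketch below adds a signed shortest-turn normalization, which is an assumption on top of the description (the text only says the angle difference is divided by the predetermined time).

```python
def angular_velocity(indicated_deg, target_deg, predetermined_s):
    """Angle between the direction the robot indicates at the start time
    and the object direction expected after the predetermined time,
    divided by that time. Result is in degrees per second; the signed
    shortest-turn convention is an assumption."""
    diff = (target_deg - indicated_deg + 180.0) % 360.0 - 180.0
    return diff / predetermined_s
```

For example, turning from 0 to 90 degrees over a 3-second guidance read-out gives 30 degrees per second, so the robot arrives at the object direction exactly as the read-out finishes.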
The robot controller 200 moves the indicated direction at the calculated angular velocity in the manner described above. In other words, the robot controller 200 rotates the robot 100 at the calculated angular velocity until the reading aloud of the information is completed.
In this way, the direction indicated by the robot 100 (that is, the object direction) moves at the constant angular velocity. The robot controller 200 of the present embodiment rotates the robot 100 at the calculated angular velocity during the predetermined time. The reading aloud of the guidance information then finishes, and the robot 100 indicates the object direction after the predetermined time.
Fig. 18 is a flowchart showing the control procedure performed by the information presentation apparatus of the present embodiment. The processing in steps S100 to S108 of the present embodiment is substantially the same as the processing in steps S100 to S108 of the first embodiment. The processing in steps S201 to S203, performed after step S108, differs from the processing of the first embodiment.
In step S108, the motion controller 110 rotates the robot 100 so that its front faces the direction of the object angle calculated in step S106.
In the next step S201, the robot controller 200 calculates the object angle (object direction) relative to the reference direction of the vehicle at the time the presentation of the guidance information related to the specified object will be completed.
The control procedure of this object angle calculation processing will be described with reference to the flowchart shown in Fig. 19. As shown in Fig. 19, in step S2111, the robot controller 200 obtains the time required to finish reading the guidance information aloud. If reading is already in progress, the time required to finish can be calculated by subtracting the time corresponding to the number of characters already read from the total reading time corresponding to the full text of the guidance information.
In the next step S2112, the robot controller 200 obtains the speed and the reference direction of the vehicle from the vehicle controller 400. The vehicle information used in step S106 can be used here.
In step S2113, the robot controller 200 estimates the vehicle position at the time the reading finishes, based on the time required to finish reading, the vehicle speed, and the reference direction. For this estimation, the route search system 302 and the map information 303 of the navigation system 300 can be used.
In step S2114, the robot controller 200 calculates the object direction in absolute coordinates at the estimated vehicle position after the predetermined time, using the methods described with reference to Figs. 6 and 7 in the first embodiment.
In step S2115, the robot controller 200 calculates the object direction relative to the reference direction of the vehicle at the estimated position, using the method described with reference to Fig. 8.
In the next step S2116, the robot controller 200 calculates the angular velocity based on the object angle after the predetermined time (the object direction relative to the reference direction of the vehicle) and the predetermined time. Specifically, the robot controller 200 calculates the angle between the direction indicated by the robot 100 at the start time of this processing and the direction corresponding to the calculated object angle, and then calculates the angular velocity by dividing this angle by the predetermined time.
After step S2116, the processing flow proceeds to step S202 of Fig. 18. The robot controller 200 generates a presentation command for displaying information indicating the direction that moves at the angular velocity calculated in step S201. Specifically, this presentation command is a command for rotating the robot 100 at the constant angular velocity. The generated presentation command is sent to the motion controller 110 of the robot 100.
The rotation of the robot 100 at the angular velocity will be described with reference to Fig. 20 and Figs. 21A to 21C. Fig. 20 shows the positional relationship between the vehicle and the object at the processing start time T1, a subsequent time TM, and a time TN. Time TN is the time the presentation of the guidance information is completed (the time after the predetermined time from time T1). At the initial time T1, the object direction (object angle) relative to the reference direction of the vehicle is αT1. At time TN, after the predetermined time has elapsed, the object direction (object angle) relative to the reference direction is αTN.
The robot controller 200 of the present embodiment controls the motion of the robot 100 based on the calculated object angle. Specifically, the robot controller 200 controls the motion of the robot 100 as shown in Figs. 21A to 21C: it rotates the robot 100 continuously at the angular velocity from the direction of angle αT1 (Fig. 21A) to the direction of angle αTN (Fig. 21C). As a result, the direction indicated by the robot 100 moves at the constant angular velocity so as to follow the change of the object position. In the present embodiment, the rotation at the constant angular velocity is terminated after the predetermined time has elapsed.
Returning to Fig. 18, the processing of step S203 will be described. In step S203, the object direction after the predetermined time is recalculated.
If the change in the reference direction and/or the speed of the vehicle is equal to or greater than its respective predetermined threshold, the robot controller 200 recalculates the object angle after the predetermined time (the object direction relative to the reference direction of the vehicle) based on the object position and the current position, reference direction, and speed of the vehicle.
Note that in step S201, the position of the vehicle at the time the reading finishes is estimated based on the time required for reading and the vehicle speed. However, if the speed and/or the reference direction (including the traveling direction and the compass bearing) of the vehicle change greatly, as shown in Fig. 22 or 23, the actual vehicle position after the predetermined time may differ from the estimated position. In other words, the object direction relative to the estimated vehicle position after the predetermined time may differ from the object direction relative to the actual vehicle position. Fig. 22 shows the change of the object direction before and after the predetermined time elapses, based on the traveling direction of the vehicle. Fig. 23 shows the change of the object direction before and after the predetermined time elapses, based on the compass bearing observed on the vehicle.
Therefore, when the change Δα in the reference direction (including the traveling direction and the compass bearing) of the vehicle exceeds a predetermined value, or when the change Δf in the vehicle speed exceeds a predetermined value, as shown in Fig. 24 or 25, the robot controller 200 executes the processing of steps S201 and S202 again to calculate the object angle. Fig. 24 shows the change of the object direction before and after the predetermined time elapses, based on the traveling direction of the vehicle. Fig. 25 shows the change of the object direction before and after the predetermined time elapses, based on the compass bearing observed on the vehicle.
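The recomputation trigger reduces to two threshold comparisons; a minimal sketch follows, with threshold values left as parameters because the description only calls them predetermined values.

```python
def should_recompute(delta_alpha_deg, delta_f, alpha_thresh_deg, f_thresh):
    """Rerun steps S201-S202 when the change in the reference direction
    (delta_alpha_deg) or in the vehicle speed (delta_f) since the last
    estimate exceeds its predetermined value. Thresholds are assumed."""
    return abs(delta_alpha_deg) > alpha_thresh_deg or abs(delta_f) > f_thresh
```
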
In this case, the robot controller 200 generates a new presentation command based on the recalculated object angle. This presentation command generation processing is the same as the processing shown in Fig. 19.
Since the reading aloud of the guidance information has already started by the time the presentation command is regenerated, the predetermined time used in the angular velocity calculation also needs to be recalculated. The method for calculating the period from the time the presentation command is regenerated to the time the presentation of the guidance information will be completed is not particularly limited. This period can be calculated based on the number of characters of the prepared guidance information and the number of characters already read. Alternatively, this period can be calculated from the playback time of the prepared guidance information and the difference between the current time, obtained from the timer 403 of the vehicle controller 400, and the playback end time.
Figs. 26 and 27 show the positional relationship between the vehicle and the object at time TM, when the change in the reference direction (including the traveling direction and the compass bearing) exceeds the predetermined value, and at time TN (after the predetermined time from time T1), when the presentation of the guidance information is completed.
As shown in Fig. 26, the object angle is recalculated at time TM. The object direction (object angle) relative to the traveling direction of the vehicle at time TM is αTM'. The object direction (object angle) relative to the reference direction at time TN is αTN'.
After recalculating the object angle, the robot controller 200 rotates the robot 100 at a constant angular velocity from the direction of angle αTM' to the direction of angle αTN'. As a result, the direction indicated by the robot 100 moves at the constant angular velocity so as to follow the changing object position. When the presentation of the guidance information is completed, the robot 100, rotating at the calculated angular velocity, indicates the object direction. When the presentation of the guidance information is completed, the rotation of the robot 100 may be terminated after the predetermined time, or the robot 100 may be turned toward the reference direction (the default set direction).
Fig. 27 shows the case where the object angle is calculated based on the compass bearing observed on the vehicle. The object angle is recalculated at time TM. The object direction (object angle) based on the bearing at time TM is βTM'. The object direction (object angle) based on the bearing at time TN is βTN'. After recalculating the object angle, the robot controller 200 rotates the robot 100 at a constant angular velocity from the direction of angle βTM' to the direction of angle βTN'.
Since the information presentation apparatus 1000 of the second embodiment is configured and operates as described above, it provides the same effects as those provided by the first embodiment described above.
In particular, the information presentation apparatus 1000 of the present embodiment provides the following effects.
As in the first embodiment, since the presentation device Q, such as the robot 100, can indicate the change of the object direction relative to the reference direction of the vehicle, the vehicle occupant can identify the object position even while the vehicle is traveling and the reference direction is therefore changing.
In addition, since the robot 100 is rotated about the predetermined axis G at the constant angular velocity so that the front (face) of the robot 100 continuously faces the object direction, thereby continuously notifying the occupant of the object position, the occupant can accurately identify the position of the object (a facility or the like) based on the direction the robot 100 continuously faces.
According to the present embodiment, since the angular velocity is calculated based on the object angle after the predetermined time, and the robot 100 presents the information related to the object direction while moving at the calculated angular velocity, the relatively moving object position can be indicated without repeating redundant processing at predetermined time intervals. Therefore, the relatively moving object position can be tracked and indicated at a reduced processing cost.
In addition, since the predetermined time is set to the time required to complete the presentation of the information related to the object, the object direction can be indicated at the time the presentation of the information is completed.
In addition, when the change in the reference direction (including the traveling direction and the compass bearing) and/or the speed of the vehicle exceeds its respective predetermined value, the object angle is recalculated, and the robot 100 then presents the information related to the object direction based on the recalculated object angle. Therefore, the object direction can be tracked and indicated even when the reference direction of the vehicle changes.
In addition, when the change in the reference direction (including the traveling direction and the compass bearing) and/or the speed of the vehicle exceeds its respective predetermined value, the object angle is recalculated, and a presentation command for presenting the direction of the object, moving at the angular velocity corresponding to the recalculated object angle, is then generated. Therefore, the object direction can be tracked accurately even while the vehicle is moving.
Next, three typical modified examples of the information presentation apparatus 1000 of the first and second embodiments will be described. In each modified example, a three-dimensional display, a holographic display device, or a two-dimensional display is used in place of the robot 100 as the presentation device Q.
First Typical Modified Example
The information presentation apparatus 1300 according to the first typical modified example comprises a three-dimensional display 3100 as the presentation device Q for presenting the information related to the object direction, and a three-dimensional display controller 3200 as the control device R.
The three-dimensional display 3100 presents information indicating a predetermined direction by the direction indicated by an indicator displayed on the three-dimensional display 3100.
Figs. 28A and 28B show an example of the three-dimensional display 3100. Fig. 28A schematically shows the structure of the three-dimensional display 3100. Fig. 28B shows a plan view of the three-dimensional display 3100.
As shown in Fig. 28A, the three-dimensional display 3100 comprises a hemispherical projection dome 3102, a projector 3103 for projecting an animation of the indicator, a light source 3104, and a rotary drive unit 3105 for rotating the projector 3103. In addition, a base portion 3106 supporting these components includes a speaker, a CPU, and a memory. The CPU serves as the three-dimensional display controller 3200.
The projection dome 3102 is of a light-transmitting type. The animation of the indicator projected by the projector 3103 can be seen from outside the projection dome 3102.
The projector 3103 rotates about a rotation axis G while driven by the rotary drive unit 3105. The rotation axis G extends in a direction approximately perpendicular to the mounting plane of the three-dimensional display 3100. When the projector 3103 rotates, the animation of the indicator projected on the projection surface 3102 rotates.
Owing to the rotation of the projector 3103, the indicator can be projected on the projection surface 3102 so as to indicate any direction. In addition, the projector 3103 has an animation switching function for switching the animation to be projected. The projector 3103 can project animations of different types of indicators according to control commands from the three-dimensional display controller 3200.
The type of indicator is not particularly limited. Fig. 28B shows one such type, which has eyes e1 and e2. The front of this indicator can be defined by the presence of the eyes e1 and e2. The object direction can be indicated by directing the front of the indicator toward the object.
When presenting the information related to the object, the three-dimensional display controller 3200 generates, based on the calculated object angle, a presentation command for projecting an animation of the indicator (for example, an animation directing the front of the indicator toward the object) onto the projection dome 3102 so as to indicate the object direction.
The information presentation apparatus 1300 can indicate the object direction relative to the reference direction of the vehicle by the indicator projected on the three-dimensional display 3100 (a three-dimensional body simulating an animal, a three-dimensional body simulating a human or a human-like figure, a three-dimensional body of a part of the human body such as an artificial hand or finger, a dart-shaped three-dimensional body, or the like). Therefore, the vehicle occupant can identify the object position even when the reference direction changes while the vehicle travels.
Since the three-dimensional display controller 3200 of this modified example has functions equivalent to those of the robot controller 200 of the first and second embodiments and executes equivalent processing, this modified example provides effects equivalent to those provided by the first and second embodiments.
Second Typical Modified Example
The information presentation apparatus 1400 according to the second typical modified example comprises a holographic display device 4100 as the presentation device Q for presenting the information on the object direction using a three-dimensional virtual image, and a holographic display controller 4200 as the control device R.
The holographic display device 4100 presents information indicating a predetermined direction by the direction indicated by an indicator displayed as a three-dimensional virtual image.
Fig. 29 is a block diagram of the information presentation apparatus 1400 of this modified example. As shown in Fig. 29, the information presentation apparatus 1400 comprises the holographic display device 4100 and the holographic display controller 4200.
The holographic display device 4100 comprises a reconstruction light irradiation unit 4110, a hologram setting device 4120, and master holograms 4130.
These components will be described below. The reconstruction light irradiation unit 4110 irradiates the prepared master holograms 4130 with reconstruction light based on control commands from the holographic display controller 4200, so as to reconstruct the master holograms 4130. A lamp such as a halogen lamp or a xenon lamp, a light-emitting diode, or a semiconductor laser can be used as the light source of the reconstruction light.
The master holograms 4130 serve as a series of one or more holograms used as a medium for transmitting information. This series of one or more holograms records the motion process of the indicator at predetermined intervals. This indicator may be the robot 100 described in the first and second embodiments. The holographic display device 4100 can display the robot 100 as a three-dimensional virtual image. The master holograms 4130 include a series of holograms recording, at predetermined intervals, the motion process of the robot 100 with its front facing a predetermined direction.
The hologram generation method is not particularly limited, and any known method can be used. For example, a coherent laser beam is divided by a beam splitter into an illumination beam and a reference beam, and the illumination beam is directed at the moving indicator to obtain an object beam scattered by the indicator. The reference beam is irradiated onto a recording medium as it is. The interference fringes produced by the object beam and the reference beam are recorded on the recording medium.
The master holograms 4130 comprise static hologram images into which the motion process of the indicator is divided at predetermined intervals. Each static hologram image of the indicator is recorded as a master hologram 4130. By displaying the static hologram images of the master holograms 4130 in succession, the moving indicator is presented as an animation. The larger the number of static hologram images (that is, the shorter the time interval), the smoother the presented motion of the indicator.
Note that photosensitive materials such as polyvinylcarbazole, acrylic and other photopolymers, dichromated gelatin, and photoresist materials can be used as the recording medium of the master holograms 4130.
The hologram setting device 4120 sets the one or more prepared master holograms 4130 in turn, along a time axis equal to that used at recording, at the position irradiated with the reconstruction light. By irradiating in turn with the reconstruction light the master holograms 4130 set in turn by the hologram setting device 4120, the motion of the indicator is reconstructed, and a three-dimensional virtual image of the moving indicator can be displayed. This three-dimensional virtual image can be displayed on the windshield. Alternatively, a separate holographic display may be provided on the instrument panel.
The display method of the three-dimensional virtual image is not particularly limited, and any known technique can be used. For example, the three-dimensional image display device described in Japanese Patent Application Laid-Open No. H9-113845 can be used to display the three-dimensional virtual image of the indicator indicating the predetermined direction.
When presenting the information related to the object, the holographic display controller 4200 reconstructs the master holograms 4130 of the indicator based on the calculated object angle, so as to display, on the virtual image display device of the holographic display device 4100, the three-dimensional virtual image of the indicator with its front facing the object direction.
Since the information presentation apparatus 1400 of this modified example can project, by the holographic display device 4100, the indicator indicating the object direction relative to the reference direction of the vehicle, the vehicle occupant can identify the object position even when the reference direction changes while the vehicle travels.
Third Typical Modified Example
The information presentation apparatus according to the third typical modified example comprises a two-dimensional display as the presentation device Q for presenting the information related to the object direction using a two-dimensional image, and an image display controller as the control device R. This two-dimensional display displays information indicating a predetermined direction by the direction indicated by an indicator displayed as a two-dimensional image. In this modified example, the display 308 of the navigation system 300 shown in Fig. 29 is used as the two-dimensional display.
The image display controller stores animation data of the indicator indicating predetermined directions, and displays the indicator indicating the object direction based on the presentation command. The output control method of the two-dimensional image is not particularly limited, and a common method can be used.
When presenting the information related to the object, the display 308 reproduces, based on the calculated object angle, an animation of the indicator facing the object direction, so as to show this animation on the display 308.
Since the display 308 shows the animation of the indicator indicating the object direction relative to the reference direction of the vehicle, the vehicle occupant can identify the object position even when the reference direction changes while the vehicle travels.
Although the two-dimensional image of the indicator is displayed on the display 308 of the navigation system 300 in this modified example, the present invention is not limited thereto. For example, the two-dimensional image of the indicator may be projected onto the windshield using a head-up display installed in the instrument panel.
The foregoing embodiments are presented to facilitate understanding of the present invention and are not intended to limit the present invention. Therefore, each element disclosed in the above embodiments is intended to include all design modifications and equivalents belonging to the technical scope of the present invention.
In other words, although an apparatus comprising the presentation device Q and the control device R has been exemplified as an aspect of the information presentation apparatus according to the present invention, the present invention is not limited to this example.
The indicator corresponds to the presentation device Q. The robot 100 is used as a three-dimensional body (indicator). The three-dimensional display 3100 is used as a three-dimensional display device (indicator). The holographic display device 4100 is used as a virtual image display device (indicator) for displaying a three-dimensional virtual image. A two-dimensional display or the display 308 of the navigation system 300 is used as a flat display for displaying the indicator. However, the indicator according to the present invention is not limited thereto.
The control unit according to the present invention corresponds to the control device R. The robot controller 200 controls the motion of the robot 100. The three-dimensional display controller controls the three-dimensional display 3100. The holographic display controller 4200 controls the holographic display device 4100. The two-dimensional display controller controls the two-dimensional display. These are used as the control device R corresponding to the control unit. However, the controller according to the present invention varies according to the indicator to be controlled and the control processing, and is therefore not limited thereto.
Although the control device R comprising the ROM 201, the CPU 202, and the RAM 203 is used as the control unit, the present invention is not limited thereto.
Although the control device R (the robot controller 200 and the like) having the vehicle information acquisition function, the object information acquisition function, the object specifying function, the object angle calculation function, and the presentation command generation function is used as the control unit according to the present invention, the present invention is not limited thereto.
Although the speaker 130 is used as the output unit included in the indicator, the present invention is not limited thereto.
The entire contents of Japanese Patent Application No. 2008-250220 (filed September 29, 2008), Japanese Patent Application No. 2009-129958 (filed May 29, 2009), and Japanese Patent Application No. 2009-214063 (filed September 16, 2009) are incorporated herein by reference.
Although the present invention has been described above with reference to specific embodiments thereof, the present invention is not limited to the foregoing embodiments. Modifications and variations of the embodiments will occur to those skilled in the art in light of the above teachings. The scope of the present invention is defined with reference to the following claims.

Claims (27)

1. An information presentation apparatus for a vehicle, comprising:
an indicator mounted on the vehicle and configured to present information indicating a direction; and
a control unit configured to control the presentation of information by the indicator, wherein
the control unit comprises:
a vehicle-information acquisition unit configured to acquire a current position of the vehicle and a reference direction of the vehicle;
an object specification unit configured to specify an object to be presented to an occupant of the vehicle;
an object-information acquisition unit configured to acquire an object position, the object position being a position at which the object exists;
an object-angle calculation unit configured to calculate, based on the object position, the current position, and the reference direction, an object angle indicating an object direction relative to the reference direction, the object direction being a direction in which the object exists; and
a presentation-command generation unit configured to generate a presentation command based on the calculated object angle, wherein, in accordance with the presentation command, the indicator presents the information indicating the object direction.
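Outside the claim language, the geometry recited in claim 1 — an object angle measured from the vehicle's reference direction to the direction in which the object exists — can be illustrated with a minimal sketch. The coordinate convention (degrees counter-clockwise from the +x axis of a planar map frame) and the function name are assumptions, not part of the specification:

```python
import math

def object_angle(current_pos, reference_deg, object_pos):
    """Angle (degrees) from the vehicle's reference direction to the object.

    current_pos, object_pos: (x, y) in a common planar map frame.
    reference_deg: reference direction in degrees counter-clockwise from +x
    (an assumed convention). Returns a value in (-180, 180]; positive means
    the object lies counter-clockwise (to the left) of the reference direction.
    """
    dx = object_pos[0] - current_pos[0]
    dy = object_pos[1] - current_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))  # direction in which the object exists
    angle = bearing - reference_deg
    # Normalize into (-180, 180].
    while angle <= -180.0:
        angle += 360.0
    while angle > 180.0:
        angle -= 360.0
    return angle
```

For example, a vehicle at the origin with its reference direction along +y (90°) and an object at (10, 10) yields an object angle of -45°, i.e. the object lies 45° to the right of the reference direction.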
2. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the object-angle calculation unit calculates the object angle at predetermined time intervals, and
the presentation-command generation unit generates a presentation command based on each calculated object angle, thereby causing the indicator to execute the generated presentation commands in succession.
3. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the object-information acquisition unit acquires information relating to the object specified by the object specification unit, and
the presentation-command generation unit generates a presentation command to present the information relating to the object via an output unit of the indicator.
4. The information presentation apparatus for a vehicle according to claim 3, characterized in that
the presentation-command generation unit generates a presentation command to present, via the indicator, the information indicating the object direction while the information relating to the object is being presented.
5. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the reference direction is a traveling direction of the vehicle.
6. The information presentation apparatus for a vehicle according to claim 5, characterized in that
the object specification unit specifies the object based on the current position of the vehicle, the traveling direction of the vehicle, and the object position.
7. The information presentation apparatus for a vehicle according to claim 5, characterized in that
the vehicle-information acquisition unit acquires a speed of the vehicle in addition to the current position and the reference direction, and
the object-angle calculation unit estimates, based on the current position, the traveling direction, and the speed, a position of the vehicle after a predetermined time and a traveling direction of the vehicle after the predetermined time, and calculates an object angle after the predetermined time based on the estimated position, the estimated traveling direction, and the object position.
8. The information presentation apparatus for a vehicle according to claim 7, characterized in that
the presentation-command generation unit calculates an angular velocity based on the calculated object angle and the predetermined time, and generates a presentation command to present, via the indicator, information indicating the object direction moving at the calculated angular velocity.
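Claims 7 and 8 describe estimating the vehicle pose a predetermined time ahead and sweeping the presented direction at a matching angular velocity. A minimal sketch, assuming straight-line dead reckoning in a planar frame (the claims do not fix a motion model, and the function names are illustrative):

```python
import math

def estimate_pose(current_pos, heading_deg, speed, dt):
    """Estimate vehicle position and traveling direction dt seconds ahead.

    Straight-line dead reckoning: the position advances by speed * dt along
    the heading; the heading itself is assumed unchanged over dt.
    """
    rad = math.radians(heading_deg)
    future_pos = (current_pos[0] + speed * dt * math.cos(rad),
                  current_pos[1] + speed * dt * math.sin(rad))
    return future_pos, heading_deg

def angular_velocity(angle_now_deg, angle_future_deg, dt):
    """Angular velocity (deg/s) at which the indicator must sweep so that it
    points at the object when the predetermined time dt has elapsed."""
    return (angle_future_deg - angle_now_deg) / dt
```

For instance, a vehicle at the origin traveling along +x at 10 m/s is estimated to be at (20, 0) after 2 s, and an object angle changing from 30° now to 10° then implies a sweep of -10 °/s.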
9. The information presentation apparatus for a vehicle according to claim 7, characterized in that
the predetermined time is a time required to control the indicator to present information.
10. The information presentation apparatus for a vehicle according to claim 7, characterized in that
the predetermined time is a time required for an output unit of the indicator to complete presentation of the information relating to the object.
11. The information presentation apparatus for a vehicle according to claim 7, characterized in that
when a change in the traveling direction exceeds a predetermined value and/or a change in the speed exceeds a predetermined value, the object-angle calculation unit recalculates the object angle after the predetermined time, and
the presentation-command generation unit generates a presentation command based on the recalculated object angle to present, via the indicator, the information indicating the object direction.
12. The information presentation apparatus for a vehicle according to claim 8, characterized in that
when a change in the traveling direction exceeds a predetermined value and/or a change in the speed exceeds a predetermined value, the object-angle calculation unit recalculates the object angle after the predetermined time, and
the presentation-command generation unit recalculates the angular velocity based on the recalculated object angle and the predetermined time, and generates a presentation command to present, via the indicator, information indicating the object direction moving at the recalculated angular velocity.
13. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the reference direction comprises an azimuth.
14. The information presentation apparatus for a vehicle according to claim 13, characterized in that
the vehicle-information acquisition unit acquires a speed of the vehicle in addition to the current position and the reference direction, and
the object-angle calculation unit calculates a traveling direction relative to the reference direction, estimates, based on the current position, the calculated traveling direction, and the speed, a position of the vehicle after a predetermined time and a traveling direction of the vehicle after the predetermined time, and calculates an object angle after the predetermined time based on the estimated position, the estimated traveling direction, and the object position.
15. The information presentation apparatus for a vehicle according to claim 14, characterized in that
the presentation-command generation unit calculates an angular velocity based on the calculated object angle and the predetermined time, and generates a presentation command to present, via the indicator, information indicating the object direction moving at the calculated angular velocity.
16. The information presentation apparatus for a vehicle according to claim 14, characterized in that
the predetermined time is a time required to control the indicator to present information.
17. The information presentation apparatus for a vehicle according to claim 14, characterized in that
the predetermined time is a time required for an output unit of the indicator to complete presentation of the information relating to the object.
18. The information presentation apparatus for a vehicle according to claim 14, characterized in that
when a change in the traveling direction exceeds a predetermined value and/or a change in the speed exceeds a predetermined value, the object-angle calculation unit recalculates the object angle after the predetermined time, and
the presentation-command generation unit generates a presentation command based on the recalculated object angle to present, via the indicator, the information indicating the object direction.
19. The information presentation apparatus for a vehicle according to claim 15, characterized in that
when a change in the traveling direction exceeds a predetermined value and/or a change in the speed exceeds a predetermined value, the object-angle calculation unit recalculates the object angle after the predetermined time, and
the presentation-command generation unit recalculates the angular velocity based on the recalculated object angle and the predetermined time, and generates a presentation command to present, via the indicator, information indicating the object direction moving at the recalculated angular velocity.
20. The information presentation apparatus for a vehicle according to claim 1, characterized by further comprising an on-vehicle imaging apparatus, wherein
when an image captured by the on-vehicle imaging apparatus includes a feature of the object, the presentation-command generation unit generates an initiation command for causing the indicator to start executing the presentation command.
21. The information presentation apparatus for a vehicle according to claim 1, characterized by further comprising a navigation device, wherein
when an image for route guidance provided by the navigation device includes a feature of the object, the presentation-command generation unit generates an initiation command for causing the indicator to start executing the presentation command.
22. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the indicator is a three-dimensional body rotatable toward a presentation direction to present the information indicating the direction,
the three-dimensional body comprises a drive unit for rotating the three-dimensional body about a rotation axis, and
the presentation-command generation unit generates a presentation command based on the calculated object angle to rotate the three-dimensional body by means of the drive unit, thereby indicating the object direction.
23. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the indicator is a three-dimensional display that presents the information indicating the direction by means of an indicator displayed thereon, and
the presentation-command generation unit generates a presentation command based on the calculated object angle to display an animation of the displayed indicator on the three-dimensional display, thereby indicating the object direction by the animation of the displayed indicator.
24. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the indicator is a virtual-image display device that presents the information indicating the direction by means of an indicator displayed as a three-dimensional virtual image,
the virtual-image display device comprises one or more master holograms recording a motion sequence of the displayed indicator, and
the presentation-command generation unit generates a presentation command based on the calculated object angle to reconstruct a master hologram of the displayed indicator, so that the virtual-image display device displays a three-dimensional virtual image of the displayed indicator and the object direction is indicated by the three-dimensional virtual image.
25. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the indicator is a flat display that presents the information indicating the direction by means of an indicator displayed thereon, and
the presentation-command generation unit generates a presentation command based on the calculated object angle to display an animation of the displayed indicator on the flat display, thereby indicating the object direction by the animation of the displayed indicator.
26. The information presentation apparatus for a vehicle according to claim 1, characterized in that
the object to be presented to the occupant is an object specified based on a destination or a category set by the occupant, an estimated preference of the user, and/or the current position and the reference direction of the vehicle.
27. An information presentation method for a vehicle, comprising:
specifying an object to be presented to an occupant of the vehicle;
calculating an object angle indicating an object direction based on a position of the specified object, a current position of the vehicle, and a reference direction of the vehicle, wherein the object angle indicates the object direction relative to the reference direction; and
presenting information indicating the object direction based on the calculated object angle.
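Read together, the steps of method claim 27 amount to a specify / calculate / present pipeline. A compact sketch of one pass (the command structure, field names, and coordinate convention are hypothetical; the claim only requires that information indicating the object direction be presented based on the calculated angle):

```python
import math

def present_object_direction(vehicle, obj):
    """One pass of the claimed method: compute the object angle relative to
    the vehicle's reference direction and emit a presentation command.

    vehicle: {'pos': (x, y), 'reference_deg': float} in an assumed planar
    frame, degrees counter-clockwise from +x.
    obj: {'pos': (x, y), 'name': str}, the object specified for the occupant.
    """
    dx = obj["pos"][0] - vehicle["pos"][0]
    dy = obj["pos"][1] - vehicle["pos"][1]
    angle = math.degrees(math.atan2(dy, dx)) - vehicle["reference_deg"]
    angle = (angle + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    # A hypothetical presentation command; a real indicator (robot, 3D
    # display, hologram, flat display) would consume this downstream.
    return {"target": obj["name"], "object_angle_deg": angle}
```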
CN2009101781002A 2008-09-29 2009-09-29 Information presentation apparatus for vehicle and information presentation method for vehicle Expired - Fee Related CN101713664B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2008-250220 2008-09-29
JP2008250220 2008-09-29
JP2009-129958 2009-05-29
JP2009129958 2009-05-29
JP2009-214063 2009-09-16
JP2009214063A JP5458764B2 (en) 2008-09-29 2009-09-16 Information presentation device

Publications (2)

Publication Number Publication Date
CN101713664A true CN101713664A (en) 2010-05-26
CN101713664B CN101713664B (en) 2012-09-19

Family

ID=41435325

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009101781002A Expired - Fee Related CN101713664B (en) 2008-09-29 2009-09-29 Information presentation apparatus for vehicle and information presentation method for vehicle

Country Status (4)

Country Link
US (1) US8392061B2 (en)
EP (1) EP2169353B1 (en)
JP (1) JP5458764B2 (en)
CN (1) CN101713664B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103063225A (en) * 2012-12-21 2013-04-24 安科智慧城市技术(中国)有限公司 Vehicle-mounted robot, vehicle-mounted navigation system, vehicle-mounted navigation method and vehicle
CN103210279A (en) * 2011-11-15 2013-07-17 松下电器产业株式会社 Position estimation device, position estimation method, and integrated circuit
CN107731005A (en) * 2017-10-17 2018-02-23 汤庆佳 A kind of vehicle-mounted caution system using phantom imaging based on cloud computing
CN110319847A (en) * 2018-03-30 2019-10-11 比亚迪股份有限公司 Vehicle and its navigation indication system and method based on vehicle-mounted display terminal
CN113534807A (en) * 2021-07-21 2021-10-22 北京优锘科技有限公司 Method, device, equipment and storage medium for realizing robot inspection visualization

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4875127B2 (en) * 2009-09-28 2012-02-15 パナソニック株式会社 3D image processing device
DE102010003850A1 (en) * 2010-04-12 2011-10-13 Robert Bosch Gmbh Method for adjusting position of marked object for marking object on display unit of vehicle, involves determining position of marked object corresponding to surrounding information
JP5418669B2 (en) * 2010-04-27 2014-02-19 トヨタ自動車株式会社 In-vehicle device and in-vehicle information processing device
US9026367B2 (en) 2012-06-27 2015-05-05 Microsoft Technology Licensing, Llc Dynamic destination navigation system
US20140094197A1 (en) * 2012-10-03 2014-04-03 Fisoc, Inc. Speed and topology relevant dynamic geo search
JP6763166B2 (en) * 2016-03-23 2020-09-30 株式会社Jvcケンウッド Navigation device, notification method, program
JP2019164477A (en) * 2018-03-19 2019-09-26 本田技研工業株式会社 Information provision system, information provision method and program
WO2021171350A1 (en) * 2020-02-25 2021-09-02 日本電気株式会社 Control device, control method, and recording medium
JP7515384B2 (en) * 2020-11-30 2024-07-12 本田技研工業株式会社 Display method and system

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0772856B1 (en) * 1994-07-29 1998-04-15 Seiko Communications Holding N.V. Dual channel advertising referencing vehicle location
JP3690393B2 (en) * 1995-03-06 2005-08-31 アイシン・エィ・ダブリュ株式会社 Navigation device
JP3268625B2 (en) 1995-08-11 2002-03-25 シャープ株式会社 3D image display device
JP2000057491A (en) * 1998-08-06 2000-02-25 Nissan Motor Co Ltd On-vehicle information presenting device
JP2001304899A (en) 2000-04-25 2001-10-31 Sony Corp Auxiliary display device for car navigation device
US6859686B2 (en) * 2002-11-26 2005-02-22 General Motors Corporation Gesticulating anthropomorphic interface
JP3949073B2 (en) * 2003-03-27 2007-07-25 トヨタ自動車株式会社 Parking assistance device
JP2006284454A (en) * 2005-04-01 2006-10-19 Fujitsu Ten Ltd In-car agent system
JP4512696B2 (en) * 2006-09-04 2010-07-28 株式会社イクシスリサーチ Robot navigation system
JP2008250220A (en) 2007-03-30 2008-10-16 Hamamatsu Photonics Kk Reflective light modulating device
DE102007037073A1 (en) * 2007-08-06 2009-02-12 Bayerische Motoren Werke Aktiengesellschaft Information i.e. direction information, relaying device for motor vehicle, has drive section of display device controllable by control device for adjusting directional signal element in displayed spatial direction
WO2009044797A1 (en) * 2007-10-04 2009-04-09 Nissan Motor Co., Ltd. Information presentation system
JP2009129958A (en) 2007-11-20 2009-06-11 Oki Semiconductor Co Ltd Semiconductor device and method for fabricating same
JP2009214063A (en) 2008-03-12 2009-09-24 Sintokogio Ltd Filter cartridge for pulse-jet type dust collector

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103210279A (en) * 2011-11-15 2013-07-17 松下电器产业株式会社 Position estimation device, position estimation method, and integrated circuit
CN103210279B (en) * 2011-11-15 2016-05-25 松下电器(美国)知识产权公司 Position estimating device, position estimating method and integrated circuit
CN103063225A (en) * 2012-12-21 2013-04-24 安科智慧城市技术(中国)有限公司 Vehicle-mounted robot, vehicle-mounted navigation system, vehicle-mounted navigation method and vehicle
CN107731005A (en) * 2017-10-17 2018-02-23 汤庆佳 A kind of vehicle-mounted caution system using phantom imaging based on cloud computing
CN110319847A (en) * 2018-03-30 2019-10-11 比亚迪股份有限公司 Vehicle and its navigation indication system and method based on vehicle-mounted display terminal
CN113534807A (en) * 2021-07-21 2021-10-22 北京优锘科技有限公司 Method, device, equipment and storage medium for realizing robot inspection visualization
CN113534807B (en) * 2021-07-21 2022-08-19 北京优锘科技有限公司 Method, device, equipment and storage medium for realizing robot inspection visualization

Also Published As

Publication number Publication date
EP2169353A1 (en) 2010-03-31
US8392061B2 (en) 2013-03-05
EP2169353B1 (en) 2012-12-19
JP2011007768A (en) 2011-01-13
CN101713664B (en) 2012-09-19
US20100082234A1 (en) 2010-04-01
JP5458764B2 (en) 2014-04-02

Similar Documents

Publication Publication Date Title
CN101713664B (en) Information presentation apparatus for vehicle and information presentation method for vehicle
JP7040867B2 (en) System, method and program
US10816984B2 (en) Automatic data labelling for autonomous driving vehicles
US10262234B2 (en) Automatically collecting training data for object recognition with 3D lidar and localization
JP2020514145A (en) Evaluation Method for Sensing Requirement of Autonomous Vehicle Based on Simulation
US20190317515A1 (en) Method for generating trajectories for autonomous driving vehicles (advs)
KR102279078B1 (en) A v2x communication-based vehicle lane system for autonomous vehicles
JP2019182411A (en) Method for transforming 2d bounding boxes of objects into 3d positions for autonomous driving vehicle (adv)
US11230297B2 (en) Pedestrian probability prediction system for autonomous vehicles
CN111061261A (en) Autonomous driving using standard navigation maps and lane configuration determined based on previous trajectories of vehicles
CN110782657A (en) Police cruiser using a subsystem of an autonomous vehicle
US20200269759A1 (en) Superimposed-image display device and computer program
EP4280167A1 (en) Ar service platform for providing augmented reality service
US11685398B2 (en) Lane based routing system for autonomous driving vehicles
CN113753072B (en) Automatic comfort degree scoring system based on human body driving reference data
CN112277951B (en) Vehicle perception model generation method, vehicle automatic driving control method and device
US20210150226A1 (en) Way to generate tight 2d bounding boxes for autonomous driving labeling
KR20190067851A (en) Use of map information for smoothing an object generated from sensor data
CN109927625A (en) A kind of information projecting method and device
JP2022058556A (en) Audio logging for model training and onboard validation utilizing autonomous driving vehicle
JP2020199840A (en) Information presentation control device
JP7111121B2 (en) Display control device and display control program
US10788839B2 (en) Planning-control collaboration design for low cost autonomous driving technology
JP2017016467A (en) Display control method, display control program, and information processing terminal
KR102645700B1 (en) Digital twin-based charging station control system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120919

Termination date: 20170929

CF01 Termination of patent right due to non-payment of annual fee