CN205541484U - Electronic device - Google Patents

Electronic device

Info

Publication number
CN205541484U
CN205541484U (application CN201520983089.8U)
Authority
CN
China
Prior art keywords
average speed
speed
event
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201520983089.8U
Other languages
Chinese (zh)
Inventor
高锡弼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai Motor Co
Kia Corp
Original Assignee
星克跃尔株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 星克跃尔株式会社
Application granted granted Critical
Publication of CN205541484U publication Critical patent/CN205541484U/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G01C21/3638 Guidance using 3D or perspective road maps including 3D objects and buildings
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3492 Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3691 Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Ecology (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Atmospheric Sciences (AREA)
  • Computer Graphics (AREA)
  • Environmental Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The utility model relates to an electronic device. The disclosed electronic device includes: a display unit for displaying a screen; an average speed calculation unit that calculates the average speed of the current vehicle when a specific event occurs; an object generation unit that generates an object representing the relation between the calculated average speed and an event speed corresponding to the specific event; and a control unit that controls the display unit to output the generated object through augmented reality.

Description

Electronic device
Technical field
The present utility model relates to an electronic device, a control method of an electronic device, a computer program, and a computer-readable recording medium, and more particularly, to an electronic device, a control method of an electronic device, a computer program, and a computer-readable recording medium that provide driving-related guidance to a user through augmented reality.
Background art
While a vehicle is running, driving safely and preventing traffic accidents are of the utmost importance. To this end, vehicles are equipped with various auxiliary devices that control vehicle posture and the functions of vehicle components, as well as safety devices such as seat belts and airbags.
In addition, devices such as black boxes have recently been installed in vehicles to store the driving images of the vehicle and the data transmitted from various sensors, so that the cause of an accident can be determined when one occurs. Since black box or navigation applications can also be installed on portable terminals such as smartphones and tablet computers, these terminals are increasingly used as such vehicle devices.
In practice, however, the utilization of driving images in such vehicle devices is currently very low. More specifically, even when a driving image of the vehicle is obtained through a vision sensor such as a camera mounted on the vehicle, the vehicle's electronic device merely displays or transmits the image, or at most generates simple peripheral notifications such as whether the vehicle has departed from its lane.
Furthermore, head-up displays (HUD) and augmented reality interfaces have been proposed as new vehicle electronic devices attracting attention, but even in these devices the use of the driving image still does not go beyond simple display or the generation of simple notification information.
Utility model content
The present utility model has been proposed in view of the above necessity. An object of the present utility model is to provide an electronic device, a control method of an electronic device, a computer program, and a computer-readable recording medium that generate an object representing the relation between the average speed of the current vehicle and an event speed corresponding to a specific event, and output the generated object through augmented reality.
A control method of an electronic device according to an embodiment of the present utility model for achieving the above object includes: calculating the average speed of the current vehicle when a specific event occurs; generating an object representing the relation between the calculated average speed and an event speed corresponding to the specific event; and outputting the generated object through augmented reality.
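To make the claimed steps concrete, the following is a minimal Python sketch of the object-selection logic implied by this control method (compare steps S102-S103 of Fig. 5). The class and function names are hypothetical illustrations, not part of the specification.

```python
from dataclasses import dataclass

@dataclass
class SpeedGuidanceObject:
    kind: str             # "first" (below the event speed) or "second" (above it)
    color: str            # e.g. green for the first object, red for the second
    speed_gap_kmh: float  # event speed minus current average speed

def build_guidance_object(avg_speed_kmh: float, event_speed_kmh: float) -> SpeedGuidanceObject:
    """Choose the first or second object from the comparison described above."""
    gap = event_speed_kmh - avg_speed_kmh
    if avg_speed_kmh < event_speed_kmh:
        return SpeedGuidanceObject(kind="first", color="green", speed_gap_kmh=gap)
    return SpeedGuidanceObject(kind="second", color="red", speed_gap_kmh=gap)

# Example: average speed 78 km/h against an event speed (e.g. zone limit) of 80 km/h.
print(build_guidance_object(78.0, 80.0))   # first/green object, 2 km/h of margin
```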
Further, the specific event may include at least one of a section speed enforcement zone entry event, a route guidance start event for a destination, and a traffic-information section entry event.
In the case of the section speed enforcement zone entry event, generating the object representing the relation between the calculated average speed and the event speed corresponding to the specific event may include: comparing the average speed of the current vehicle, calculated from the time point of entering the section speed enforcement zone, with the average speed limit of the zone; and generating, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the average speed limit.
In the case of the route guidance start event for a destination, generating the object may include: calculating a route average speed based on the estimated arrival time computed from real-time traffic information at the route guidance start time; comparing the average speed of the current vehicle with the route average speed; and generating, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the route average speed.
In the case of the traffic-information section entry event, generating the object may include: comparing the average speed of the current vehicle, calculated from the time point of entering the traffic-information section, with the section average speed of that section; and generating, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the section average speed.
Generating the object may include: generating a first object when the calculated average speed of the current vehicle is lower than the event speed; and generating a second object when the average speed of the current vehicle is higher than the event speed.
Further, outputting the generated object through augmented reality may include: performing calibration on the camera to calculate camera parameters; generating a virtual three-dimensional (3D) space for the image captured by the camera based on the camera parameters; and placing the generated object in the virtual 3D space.
In placing the generated object in the virtual 3D space, the first object and the second object may be controlled so as to appear below the vanishing point of the image captured by the camera.
Further, the first object may be located closer to the vanishing point than the second object.
The first object and the second object may be distinguished by different colors.
Further, the display positions of the first object and the second object may change dynamically according to the difference between the average speed of the current vehicle and the event speed.
On the other hand, an electronic device according to an embodiment of the present utility model for achieving the above object includes: a display unit for displaying a screen; an average speed calculation unit that calculates the average speed of the current vehicle when a specific event occurs; an object generation unit that generates an object representing the relation between the calculated average speed and an event speed corresponding to the specific event; and a control unit that controls the display unit to output the generated object through augmented reality.
The specific event may include at least one of a section speed enforcement zone entry event, a route guidance start event for a destination, and a traffic-information section entry event.
In the case of the section speed enforcement zone entry event, the control unit may compare the average speed of the current vehicle, calculated from the time point of entering the section speed enforcement zone, with the average speed limit of the zone, and the object generation unit may generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the average speed limit.
In the case of the route guidance start event for a destination, the control unit may calculate a route average speed based on the estimated arrival time computed from real-time traffic information at the route guidance start time and compare the average speed of the current vehicle with the route average speed, and the object generation unit may generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the route average speed.
In the case of the traffic-information section entry event, the control unit may compare the average speed of the current vehicle, calculated from the time point of entering the traffic-information section, with the section average speed of that section, and the object generation unit may generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the section average speed.
The object generation unit may generate a first object when the average speed of the current vehicle is lower than the event speed, and a second object when the average speed of the current vehicle is higher than the event speed.
Further, the control unit may perform calibration on the camera to calculate camera parameters, generate a virtual 3D space for the image captured by the camera based on the camera parameters, and place the generated object in the virtual 3D space.
The control unit may control the first object and the second object so that they appear below the vanishing point of the image captured by the camera.
Further, the first object may be located closer to the vanishing point than the second object.
The first object and the second object may be distinguished by different colors.
Further, the display positions of the first object and the second object may change dynamically according to the difference between the average speed of the current vehicle and the event speed.
On the other hand, a computer program stored on a recording medium according to an embodiment of the present utility model for achieving the above object may, in combination with an electronic device, perform the steps of: calculating the average speed of the current vehicle when a specific event occurs; generating an object representing the relation between the calculated average speed and an event speed corresponding to the specific event; and outputting the generated object through augmented reality.
On the other hand, a computer-readable recording medium according to an embodiment of the present utility model for achieving the above object may store a computer program that performs the steps of: calculating the average speed of the current vehicle when a specific event occurs; generating an object representing the relation between the calculated average speed and an event speed corresponding to the specific event; and outputting the generated object through augmented reality.
According to the various embodiments of the present utility model described above, in sections with section speed enforcement cameras, sections providing real-time traffic information, and route guidance sections to the destination, guidance information is displayed dynamically through augmented reality, so that the driver's attention can be drawn effectively and safe and convenient driving can be promoted.
Furthermore, according to the various embodiments of the present utility model, the display position of an object on the augmented reality (AR) screen changes dynamically according to the speed of the vehicle, so that the driver can be guided in a more intuitive way.
Brief description of the drawings
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present utility model.
Fig. 2 is a block diagram specifically illustrating the augmented reality providing unit according to an embodiment of the present utility model.
Fig. 3 is a diagram illustrating a vanishing point determination method according to an embodiment of the present utility model.
Fig. 4 is a diagram illustrating a system network connected with the electronic device according to an embodiment of the present utility model.
Fig. 5 is a flowchart schematically illustrating a control method of the electronic device according to an embodiment of the present utility model.
Fig. 6 is a flowchart illustrating a control method of the electronic device when a section speed enforcement zone entry event occurs, according to an embodiment of the present utility model.
Fig. 7 is a flowchart illustrating a control method of the electronic device when a route guidance start event for a destination occurs, according to an embodiment of the present utility model.
Fig. 8 is a flowchart illustrating a control method of the electronic device when a traffic-information section entry event occurs, according to an embodiment of the present utility model.
Figs. 9A-9C are diagrams illustrating augmented reality screens according to an embodiment of the present utility model.
Figs. 10A-10C are diagrams illustrating augmented reality screens according to another embodiment of the present utility model.
Fig. 11 is a diagram illustrating an augmented reality screen according to another embodiment of the present utility model.
Fig. 12 is a diagram illustrating an embodiment in which the camera and the electronic device are separate units, according to an embodiment of the present utility model.
Fig. 13 is a diagram illustrating an embodiment in which the camera and the electronic device are integrated, according to an embodiment of the present utility model.
Fig. 14 is a diagram illustrating an embodiment using a head-up display together with the electronic device according to an embodiment of the present utility model.
Detailed description of the invention
The following description merely illustrates the principles of the present utility model. Therefore, although not explicitly described or shown in this specification, a person of ordinary skill in the art can devise various arrangements that embody the principles of the present utility model and are included within its concept and scope. Furthermore, all conditional terms and embodiments listed in this specification are, in principle, intended solely to aid understanding of the concept of the present utility model, and should not be understood as limiting the present utility model to the specifically listed embodiments and conditions.
Further, all detailed descriptions listing the principles, viewpoints, and embodiments of the present utility model, as well as specific embodiments, should be understood to include structural and functional equivalents of such matters. Such equivalents should be understood to include not only currently known equivalents but also equivalents to be developed in the future, that is, all elements invented to perform the same function regardless of structure.
Therefore, for example, the block diagrams in this specification should be understood to represent conceptual viewpoints of exemplary circuits embodying the principles of the present utility model. Similarly, all flowcharts, state transition diagrams, pseudocode, and the like should be understood to represent various processes that can be substantially represented on a computer-readable medium and performed by a computer or processor, whether or not the computer or processor is explicitly shown.
The functions of the various elements shown in the drawings, including a processor or functional blocks presented as concepts similar to a processor, may be provided not only as dedicated hardware but also as hardware capable of running software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, a single shared processor, or a plurality of individual processors, some of which may be shared.
Moreover, the explicit use of the terms processor, control, or similar concepts should not be interpreted as referring exclusively to hardware capable of running software, and should be understood to implicitly include, without limitation, digital signal processor (DSP) hardware, read-only memory (ROM), random access memory (RAM), and non-volatile memory for storing software. Other well-known conventional hardware may also be included.
In the claims of this specification, elements expressed as means for performing the functions described in the detailed description include, for example, combinations of circuit elements performing those functions, or software in any form including firmware or microcode, combined with appropriate circuitry for executing that software so as to perform those functions. Since the utility model defined by such claims combines the functions provided by the various listed means in the manner required by the claims, any means capable of providing those functions should be understood as equivalent to what is grasped from this specification.
The above objects, features, and advantages will become more apparent through the following detailed description in connection with the accompanying drawings, and accordingly, a person of ordinary skill in the art to which the present utility model pertains can easily implement the technical idea of the present utility model. Further, in describing the present utility model, if it is determined that a detailed description of a related known technology may unnecessarily obscure the gist of the present utility model, that detailed description will be omitted.
Hereinafter, various embodiments of the present utility model will be described in detail with reference to the drawings.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present utility model. Referring to Fig. 1, the electronic device 100 includes all or part of a storage unit 110, an input unit 120, an output unit 130, an average speed calculation unit 140, an augmented reality providing unit 160, a control unit 170, a communication unit 180, a detection unit 190, and a power supply unit 195.
Here, the electronic device 100 may be embodied as various devices that can provide driving-related guidance to a vehicle driver, such as a smartphone, a tablet computer, a notebook computer, a personal digital assistant (PDA), a portable multimedia player (PMP), smart glasses, augmented reality glasses, a navigation device, or a black box, and may be provided in a vehicle.
Here, the driving state of the vehicle may include various states of a vehicle driven by a driver, such as a stopped state, a traveling state, and a parked state of the vehicle.
Driving-related guidance may include various types of guidance for assisting the driver, such as route guidance, lane departure guidance, front-vehicle departure guidance, traffic light change guidance, front-vehicle collision prevention guidance, lane change guidance, lane guidance, fuel consumption of the vehicle, instantaneous acceleration, traffic information, the current travel speed of the vehicle, the distance to a point ahead, and the time required to reach a point ahead.
Here, route guidance may include: augmented reality navigation, which performs navigation by combining various information such as the user's position and direction with an image captured of the area in front of the traveling vehicle; and two-dimensional (2D) or three-dimensional (3D) navigation, which performs navigation by combining various information such as the user's position and direction with 2D or 3D map data. Route guidance here should be understood to include not only navigation when the user drives a vehicle, but also navigation when the user moves by walking or running.
Further, lane departure guidance may indicate whether the traveling vehicle has departed from its lane.
Further, front-vehicle departure guidance may indicate whether a vehicle located in front of the stopped vehicle has departed.
Further, traffic light change guidance may indicate whether a traffic light located in front of the stopped vehicle has changed. As an example, if the light changes from a red light indicating a stop signal to a green light indicating a go signal, this may be guided.
Further, front-vehicle collision prevention guidance may be guidance provided to prevent a collision when the distance between the stopped or traveling vehicle and the vehicle located in front of it falls within a predetermined distance.
Further, lane change guidance may be guidance for changing from the lane in which the vehicle is located to another lane in order to guide the vehicle along the route to the destination.
Further, lane guidance may be guidance on the lane in which the vehicle is currently located.
The driving-related images that enable these various types of guidance may be captured in real time by a camera placed facing the front of the vehicle. Here, the camera may be a camera that is integrated with the electronic device 100 placed in the vehicle and captures the area in front of the vehicle. In this case, the camera may be integrated with a smartphone, a navigation device, or a black box, and the electronic device 100 may receive the image captured by the integrated camera.
As another example, the camera may be a camera that is placed in the vehicle separately from the electronic device 100 and captures the area in front of the vehicle. In this case, the camera may be a separate black box placed facing the front of the vehicle, and the electronic device 100 may receive the captured image through wired or wireless communication with the separately placed black box, or may receive the captured image when a storage medium storing the image captured by the black box is inserted into the electronic device 100.
Hereinafter, based on the foregoing, the electronic device 100 according to an embodiment of the present utility model will be described in more detail.
The storage unit 110 functions to store various data and applications required for the operation of the electronic device 100. In particular, the storage unit 110 may store data required for the operation of the electronic device 100, such as an operating system (OS), a route search application, and map data. Further, the storage unit 110 may store data generated by the operation of the electronic device 100, such as searched route data and received images.
Here, the storage unit 110 may be embodied not only as a built-in storage element such as a random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electronically erasable and programmable ROM (EEPROM), register, hard disk, removable disk, memory card, or universal subscriber identity module (USIM), but also as a removable storage element such as a USB memory.
The input unit 120 functions to convert a physical input from outside the electronic device 100 into a specific electrical signal. Here, the input unit 120 may include all or part of a user input unit 121 and a microphone unit 123.
The user input unit 121 may receive user input such as a touch or a push action. Here, the user input unit 121 may be embodied using at least one of various button forms, a touch sensor receiving touch input, and a proximity sensor receiving an approaching motion.
The microphone unit 123 may receive the user's voice and sounds generated inside and outside the vehicle.
The output unit 130 is a device for outputting data of the electronic device 100. Here, the output unit 130 may include all or part of a display unit 131 and an audio output unit 133.
The display unit 131 is a device through which the electronic device 100 outputs data that can be recognized visually. The display unit 131 may be embodied as a display provided on the front of the housing of the electronic device 100. The display unit 131 may be integrated with the electronic device 100 and output visual data, or may be provided separately from the electronic device 100, such as a head-up display, to output visual data.
The audio output unit 133 is a device through which the electronic device 100 outputs data that can be recognized audibly. The audio output unit 133 may be embodied as a speaker that expresses, as sound, the data of the electronic device 100 that should be notified to the user.
The communication unit 180 may be provided so that the electronic device 100 can communicate with other devices. The communication unit 180 may include all or part of a position data unit 181, a wireless Internet unit 183, a broadcast transceiver unit 185, a mobile communication unit 186, a short-range communication unit 187, and a wired communication unit 189.
The position data unit 181 is a device that obtains position data through a global navigation satellite system (GNSS). GNSS refers to a navigation system that can calculate the position of a receiving terminal using radio signals received from satellites. Specific examples of GNSS include, depending on the operating entity, GPS (Global Positioning System), Galileo, GLONASS (Global Orbiting Navigational Satellite System), COMPASS (BeiDou), IRNSS (Indian Regional Navigational Satellite System), and QZSS (Quasi-Zenith Satellite System). The position data unit 181 of the electronic device 100 according to an embodiment of the present utility model may obtain position information by receiving a GNSS signal serving the area in which the electronic device 100 is used.
The wireless Internet unit 183 is a device that obtains or transmits data by connecting to the wireless Internet. The wireless Internet to which the wireless Internet unit 183 can connect may be WLAN (Wireless LAN), Wibro (Wireless broadband), Wimax (World interoperability for microwave access), HSDPA (High Speed Downlink Packet Access), and the like.
The broadcast transceiver unit 185 is a device that transmits and receives broadcast signals through various broadcast systems. Broadcast systems supported by the broadcast transceiver unit 185 may include DMBT (Digital Multimedia Broadcasting Terrestrial), DMBS (Digital Multimedia Broadcasting Satellite), MediaFLO (Media Forward Link Only), DVBH (Digital Video Broadcast Handheld), ISDBT (Integrated Services Digital Broadcast Terrestrial), and the like. The broadcast signals transmitted and received through the broadcast transceiver unit 185 may include traffic data, life data, and the like.
The mobile communication unit 186 may connect to and communicate with a mobile communication network according to various mobile communication standards such as 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution).
The short-range communication unit 187 is a device for short-range communication. The short-range communication unit 187 may communicate through Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra WideBand), ZigBee, NFC (Near Field Communication), Wi-Fi, and the like.
The wired communication unit 189 is an interface device that enables the electronic device 100 to be connected to other devices by wire. The wired communication unit 189 may be a USB module capable of communicating through a USB port.
The communication unit 180 may communicate with other devices using at least one of the position data unit 181, the wireless Internet unit 183, the broadcast transceiver unit 185, the mobile communication unit 186, the short-range communication unit 187, and the wired communication unit 189.
As an example, when the electronic device 100 does not include a camera function, an image captured by a vehicle camera such as a black box may be received using at least one of the short-range communication unit 187 and the wired communication unit 189.
As another example, when communicating with a plurality of devices, one device may communicate through the short-range communication unit 187 while another device communicates through the wired communication unit 189.
The detection unit 190 is a device that can detect the current state of the electronic device 100. The detection unit 190 may include all or part of a motion detection unit 191 and a light detection unit 193.
The motion detection unit 191 can detect the motion of the electronic device 100 in three-dimensional space. The motion detection unit 191 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. By combining the motion data obtained through the motion detection unit 191 with the position data obtained through the position data unit 181, a more accurate trajectory of the vehicle to which the electronic device 100 is attached can be calculated.
The light detection unit 193 is a device that measures the ambient illuminance of the electronic device 100. Using the illuminance data obtained through the light detection unit 193, the brightness of the display unit 131 can be changed in accordance with the ambient brightness.
The power supply unit 195 is a device that supplies the power required for the operation of the electronic device 100 or of other devices connected to the electronic device 100. The power supply unit 195 may be a device that receives power from a battery built into the electronic device 100 or from an external power source such as the vehicle. Depending on the form in which power is received, the power supply unit 195 may be embodied as a device that receives power by wire or as a device that receives power wirelessly.
The average speed calculation unit 140 can calculate the average speed of the vehicle equipped with the electronic device 100. Specifically, the average speed calculation unit 140 may calculate the average speed of the vehicle based on the signals detected by the detection unit 190 and the motion data obtained through the position data unit 181. Alternatively, the average speed calculation unit 140 may receive the speed information of the vehicle in real time through the vehicle's controller area network (CAN) and calculate the average speed of the vehicle based on it.
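As a rough illustration of how the average speed calculation unit 140 might derive an average from periodically sampled speed readings (for example from the CAN bus or GNSS), the Python sketch below uses hypothetical (timestamp, speed) samples; a distance-over-elapsed-time formulation is equally possible.

```python
def average_speed_kmh(samples: list[tuple[float, float]]) -> float:
    """samples: (timestamp_s, speed_kmh) pairs ordered by time.

    Time-weighted average, so irregular sampling intervals are handled correctly.
    """
    if len(samples) < 2:
        return samples[0][1] if samples else 0.0
    distance_km = 0.0
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        dt_h = (t1 - t0) / 3600.0
        distance_km += 0.5 * (v0 + v1) * dt_h   # trapezoidal integration of speed
    elapsed_h = (samples[-1][0] - samples[0][0]) / 3600.0
    return distance_km / elapsed_h if elapsed_h > 0 else 0.0

# Example: three samples over 20 s at 60, 72 and 66 km/h -> 67.5 km/h average.
print(round(average_speed_kmh([(0, 60.0), (10, 72.0), (20, 66.0)]), 1))
```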
Meanwhile, the electronic device 100 according to an embodiment of the present utility model may include an augmented reality providing unit 160 for providing an augmented reality view mode, which will be described in detail with reference to Fig. 2.
Fig. 2 is a block diagram specifically illustrating the augmented reality providing unit 160 according to an embodiment of the present utility model. Referring to Fig. 2, the augmented reality providing unit 160 may include all or part of a calibration unit 161, a 3D space generation unit 162, an object generation unit 163, and a mapping unit 164.
The calibration unit 161 may perform calibration to estimate the camera parameters corresponding to the camera from the image captured by the camera. Here, the camera parameters may be the parameters constituting the camera matrix, which is the information representing the relation by which real space is mapped onto the photographed image.
The 3D space generation unit 162 may generate a virtual three-dimensional space based on the image captured by the camera. Specifically, the 3D space generation unit 162 may obtain depth information from the captured image using the camera parameters estimated by the calibration unit 161, and may generate the virtual 3D space based on the obtained depth information and the captured image.
The object generation unit 163 may generate objects for guidance in augmented reality, for example, a route guidance object, a lane change guidance object, and a lane departure guidance object. In particular, the object generation unit 163 may generate an object representing the relation (for example, the speed difference) between the average speed of the current vehicle calculated by the average speed calculation unit 140 and the event speed corresponding to a specific event. Here, the object may be embodied as a 3D object, an image, line art, or the like.
The mapping unit 164 may map the object generated by the object generation unit 163 into the virtual 3D space generated by the 3D space generation unit 162.
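To make the calibration, 3D-space, and mapping stages of Fig. 2 more concrete, the sketch below projects a guidance object placed on the road ahead into image coordinates using a simple pinhole camera model; the focal lengths, principal point, and camera height are hypothetical values standing in for the parameters the calibration unit 161 would estimate, not values from the specification.

```python
def project_to_image(x_right_m: float, y_down_m: float, z_ahead_m: float,
                     fx: float = 1000.0, fy: float = 1000.0,
                     cx: float = 640.0, cy: float = 360.0) -> tuple[float, float]:
    """Pinhole projection: camera at the origin, z axis pointing down the road."""
    u = cx + fx * x_right_m / z_ahead_m
    v = cy + fy * y_down_m / z_ahead_m
    return u, v

# Place an object on the road surface (camera assumed ~1.2 m above the road),
# 30 m ahead and centred in the lane, then map it to pixel coordinates.
camera_height_m = 1.2
u, v = project_to_image(x_right_m=0.0, y_down_m=camera_height_m, z_ahead_m=30.0)
print(round(u), round(v))   # lands near the image centre, below the vanishing point
```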
Meanwhile, the control unit 170 controls the overall operation of the electronic device 100. Specifically, the control unit 170 may control all or part of the storage unit 110, the input unit 120, the output unit 130, the average speed calculation unit 140, the augmented reality providing unit 160, the communication unit 180, and the detection unit 190.
In particular, the control unit 170 may determine whether a specific event has occurred. Here, the specific event may include at least one of a section speed enforcement zone entry event, a route guidance start event for a destination, and a traffic-information section entry event.
Section speed enforcement is one of the speed enforcement methods for restricting vehicle speeding: the average speed of the vehicle is calculated from the elapsed time and the distance traveled between point A, the start of the enforcement zone, and point B, its end, in order to determine whether the vehicle has exceeded the speed limit within the zone. Here, the section speed enforcement zone entry event may be the event of entering the start point of such a zone.
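The zone-average computation described here reduces to distance over elapsed time between entry point A and exit point B. A small sketch, with made-up numbers for the zone length, passage time, and limit:

```python
def zone_average_speed_kmh(distance_km: float, entry_time_s: float, exit_time_s: float) -> float:
    """Average speed over a section speed enforcement zone (point A to point B)."""
    elapsed_h = (exit_time_s - entry_time_s) / 3600.0
    return distance_km / elapsed_h

# Hypothetical 5 km zone covered in 200 s: 90 km/h average against an 80 km/h limit.
avg = zone_average_speed_kmh(5.0, entry_time_s=0.0, exit_time_s=200.0)
limit = 80.0
print(round(avg, 1), "over the limit" if avg > limit else "within the limit")
```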
Further, the route guidance start event may be the event in which the electronic device 100 starts route guidance upon receiving a request for route guidance from the vehicle's current position to a destination.
A traffic-information section may be a road section for which a text-based traffic information service is provided through TPEG (Transport Protocol Expert Group), including road traffic messages (RTM) and congestion and travel time information (CTT). Here, the traffic-information section entry event may be the event of entering, among multiple road sections, a road section for which such traffic information is provided.
Meanwhile, if such an event is determined to have occurred, the control unit 170 may control the object generation unit 163 to generate an object representing the relation between the average speed calculated by the average speed calculation unit 140 and the event speed corresponding to the specific event. Here, the relation between the average speed and the event speed may include, for example, the speed difference between them.
As an example, when the specific event is a section speed enforcement zone entry event, the control unit 170 may control the object generation unit 163 to compare the average speed of the current vehicle, calculated from the time point of entering the zone, with the average speed limit of the zone, and to generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the average speed limit.
As another example, when the specific event is a route guidance start event for a destination, the control unit 170 may control the object generation unit 163 to calculate a route average speed based on the estimated arrival time computed from real-time traffic information (for example, TPEG road traffic messages) at the route guidance start time, to compare the average speed of the current vehicle with the route average speed, and to generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the route average speed.
As another example, when the specific event is a traffic-information section entry event, the control unit 170 may control the object generation unit 163 to compare the average speed of the current vehicle, calculated from the time point of entering the traffic-information section, with the section average speed of that section, and to generate, according to the comparison result, an object representing the relation between the average speed of the current vehicle and the section average speed.
Meanwhile, when the average speed of the current vehicle is lower than the event speed, the control unit 170 may control the object generation unit 163 to generate a first object, and when the average speed of the current vehicle is higher than the event speed, the control unit 170 may control the object generation unit 163 to generate a second object.
Further, the control unit 170 may control the mapping unit 164 so that the first object or the second object appears below the vanishing point of the image captured by the camera. Here, the vanishing point can be detected from the captured image; this will be described in detail with reference to Fig. 3.
Referring to Fig. 3, the control unit 170 may extract lane line information 310 and lane line information 320 from the captured image, and may extend the extracted lane lines 310 and 320 to find the point where they intersect. In this case, the control unit 170 may detect the point where the extended lane lines 310 and 320 intersect as the vanishing point 330.
Meanwhile, the control unit 170 may control the mapping unit 164 so that the first object or the second object appears below the vanishing point 330. Referring to Fig. 3, the control unit 170 may determine the lower area 340 below the vanishing point 330, and may control the mapping unit 164 to map the first object or the second object into this lower area. Here, the lower area 340 is preferably a road area.
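A compact way to realize the vanishing point detection of Fig. 3 is to represent each lane marking as a 2D line in image coordinates and intersect the two lines, treating rows below the intersection as the candidate lower area 340. The sketch below assumes lines in slope-intercept form and purely illustrative numbers; a real detector would first fit these lines from the image.

```python
def vanishing_point(line_a: tuple[float, float], line_b: tuple[float, float]) -> tuple[float, float]:
    """Intersect two image-space lines v = m*u + b (left and right lane markings)."""
    (m1, b1), (m2, b2) = line_a, line_b
    if m1 == m2:
        raise ValueError("parallel lane lines: no finite vanishing point")
    u = (b2 - b1) / (m1 - m2)
    v = m1 * u + b1
    return u, v

# Left marking rising toward the right, right marking falling (image coordinates).
left_lane = (-0.55, 700.0)    # v = -0.55*u + 700
right_lane = (0.45, 80.0)     # v =  0.45*u + 80
u_vp, v_vp = vanishing_point(left_lane, right_lane)
in_lower_area = lambda v: v > v_vp       # rows below the vanishing point (road side)
print(round(u_vp), round(v_vp), in_lower_area(400.0))
```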
According to this embodiment of the present utility model, the object representing the speed relation can be displayed as if placed on the road in the augmented reality screen, so that guidance can be provided to the driver in a more intuitive way.
Meanwhile, the first object may be located closer to the vanishing point than the second object. For example, when the average speed of the current vehicle is lower than the average speed limit of the section speed enforcement zone, the control unit 170 may control the mapping unit 164 so that the generated first object is located in the lower area close to the vanishing point. Further, when the average speed of the current vehicle is higher than the average speed limit of the zone, the control unit 170 may control the mapping unit 164 so that the generated second object is located below the position of the first object. That is, in the augmented reality screen, the first object may appear far from the current vehicle and the second object may appear close to the current vehicle. As an example, the second object may appear at the bottom of the augmented reality screen, corresponding to the position of the current vehicle. Thus, the driver can check the first object or the second object on the augmented reality screen and easily recognize the difference between the average speed of the current vehicle and the average speed limit of the section speed enforcement zone.
Further, the first object and the second object may be distinguished by different colors. For example, when the average speed of the current vehicle is lower than the average speed limit of the section speed enforcement zone, the control unit 170 may control the object generation unit 163 to generate a green first object, and when the average speed of the current vehicle is higher than the average speed limit, the control unit 170 may control the object generation unit 163 to generate a red second object.
Further, the display positions of the first object and the second object may change dynamically according to the difference between the average speed of the current vehicle and the event speed. For example, while the average speed of the current vehicle is lower than the average speed limit of the section speed enforcement zone, if the difference between the two changes, the control unit 170 may control the mapping unit 164 so that the position of the first object changes dynamically according to the difference. That is, when the average speed of the current vehicle is lower than the average speed limit, if the difference becomes larger, the first object may appear at a position farther from the current vehicle in the augmented reality screen, and if the difference becomes smaller, the first object may appear at a position closer to the current vehicle.
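The dynamic placement described here can be modelled as a monotone mapping from the speed gap to the object's longitudinal distance from the vehicle in the virtual 3D space; the clamping range below (10-60 m) and the 30 km/h full-scale gap are assumed choices for illustration, not values from the specification.

```python
def object_distance_m(avg_speed_kmh: float, event_speed_kmh: float,
                      near_m: float = 10.0, far_m: float = 60.0,
                      full_scale_kmh: float = 30.0) -> float:
    """A larger margin below the event speed pushes the first object farther ahead;
    at or above the event speed the (second) object stays at the near position."""
    gap = event_speed_kmh - avg_speed_kmh
    if gap <= 0:
        return near_m
    ratio = min(gap / full_scale_kmh, 1.0)
    return near_m + ratio * (far_m - near_m)

print(object_distance_m(70.0, 80.0))   # 10 km/h of margin -> roughly mid-range (~26.7 m)
print(object_distance_m(85.0, 80.0))   # over the event speed -> near position (10 m)
```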
Fig. 4 is a diagram illustrating a system network connected with the electronic device according to an embodiment of the present utility model. Referring to Fig. 4, the electronic device 100 of an embodiment of the present utility model may be embodied as various devices provided in a vehicle, such as a navigation device, a black box, a smartphone, or another vehicle augmented reality interface providing device, and may be connected with various communication networks and other electronic devices 61, 62, 63, 64.
Further, the electronic device 100 may calculate the current position and current time by interworking with a global navigation satellite system according to the radio signals received from satellites 20.
Each satellite 20 may transmit an L-band frequency in a different frequency band. The electronic device 100 may calculate its current position based on the time it takes for the L-band signal transmitted from each satellite 20 to reach the electronic device 100.
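The position fix alluded to here rests on pseudo-ranges: each satellite's signal travel time multiplied by the speed of light gives a distance, and several such distances constrain the receiver position. A toy sketch of the range computation (the actual trilateration solve and clock-bias correction are omitted):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def pseudo_range_m(signal_travel_time_s: float) -> float:
    """Distance to a satellite implied by the one-way signal travel time."""
    return SPEED_OF_LIGHT_M_S * signal_travel_time_s

# A GNSS signal taking ~68 ms to arrive corresponds to roughly 20,000 km,
# a typical medium-Earth-orbit satellite distance.
print(round(pseudo_range_m(0.068) / 1000.0))   # ~20,386 km
```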
Meanwhile, the electronic device 100 may be connected wirelessly to the network 30 through the communication unit 180 via a control station (ACR) 40, a base station (RAS) 50, and the like. When the electronic device 100 is connected to the network 30, it can also be indirectly connected to other electronic devices 61, 62 connected to the network 30 and exchange data with them.
Meanwhile, the electronic device 100 may also be connected to the network 30 indirectly through another device 63 having a communication function. For example, when the electronic device 100 does not have a module that can connect to the network 30, it may communicate with the other device 63 having a communication function through short-range communication or the like.
Fig. 5 is a flowchart schematically illustrating a control method of the electronic device according to an embodiment of the present utility model. Referring to Fig. 5, first, the electronic device 100 may determine whether a specific event has occurred (step S101). According to an embodiment of the present utility model, when at least one of a section speed enforcement zone entry event, a route guidance start event for a destination, and a traffic-information section entry event occurs, the electronic device 100 may determine that a specific event has occurred.
If a specific event occurs, the electronic device 100 may calculate the average speed of the current vehicle (step S102).
Then, the electronic device 100 may generate an object representing the relation between the calculated average speed and the event speed corresponding to the specific event (step S103).
Then, the electronic device 100 may output the generated object through augmented reality (step S104).
Hereinafter, the control method of the electronic device for each event will be described more specifically with reference to Figs. 6 to 8.
Fig. 6 is a flowchart illustrating a control method of the electronic device when a section speed enforcement zone entry event occurs, according to an embodiment of the present utility model. Referring to Fig. 6, the electronic device 100 may determine whether a section speed enforcement zone entry event has occurred (step S201).
If it is determined that the event has occurred, the electronic device 100 may compare the average speed of the current vehicle, calculated from the time point of entering the section speed enforcement zone, with the average speed limit of the zone (step S202).
If the average speed of the current vehicle is lower than the average speed limit of the section speed enforcement zone, the electronic device 100 may generate the first object (step S203).
Otherwise, if the average speed of the current vehicle is higher than the average speed limit of the section speed enforcement zone, the electronic device 100 may generate the second object (step S204).
Then, the electronic device 100 may output the generated object through augmented reality (step S205).
Here, to let the driver recognize that the average speed of the current vehicle is lower than the average speed limit of the zone, the first object may be displayed far from the current vehicle in the augmented reality screen. Further, to let the driver recognize that the average speed of the current vehicle is higher than the average speed limit of the zone, the second object may be displayed at the bottom of the augmented reality screen, corresponding to the position of the current vehicle.
Fig. 7 is a flow chart of the control method of the electronic device according to an embodiment of the present utility model when an event of starting route guidance to a destination occurs. Referring to Fig. 7, the electronic device 100 may determine whether a route guidance start event has occurred (step S301).
If it is determined that the event has occurred, the electronic device 100 may calculate a route average speed on the basis of the expected arrival time calculated by reflecting real-time traffic information at the route guidance start time point (step S302). Here, the real-time traffic information may include congestion and travel time information contained in the traffic information received through the Transport Protocol Experts Group (TPEG).
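Step S302 amounts to dividing the route length by the predicted travel time. A minimal sketch, assuming the route length and the TPEG-based expected arrival time are available as inputs:

```python
from datetime import datetime, timedelta

def route_average_speed_kmh(route_length_km, guidance_start, expected_arrival):
    """Route average speed implied by the expected arrival time: distance still
    to travel divided by the predicted travel time."""
    travel_hours = (expected_arrival - guidance_start).total_seconds() / 3600.0
    if travel_hours <= 0:
        raise ValueError("expected arrival must lie after the guidance start")
    return route_length_km / travel_hours

start = datetime(2015, 12, 1, 9, 0)
arrival = start + timedelta(hours=1, minutes=30)       # expected arrival from traffic data
print(route_average_speed_kmh(120.0, start, arrival))  # 80.0 km/h route average speed
```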
Then, the electronic device 100 may compare the average speed of the current vehicle with the route average speed (step S303).
When the average speed of the current vehicle is lower than the route average speed, the electronic device 100 may generate a first object (step S304).
Alternatively, when the average speed of the current vehicle is higher than the route average speed, the electronic device 100 may generate a second object (step S305).
Then, the electronic device 100 may output the generated object through augmented reality (step S306). Here, so that the driver can recognize that the average speed of the current vehicle is lower than the route average speed, the first object may be displayed far away from the current vehicle on the augmented reality screen. Further, so that the driver can recognize that the average speed of the current vehicle is higher than the route average speed, the second object may be displayed at the lower end of the augmented reality screen corresponding to the position of the current vehicle.
Fig. 8 is a flow chart of the control method of the electronic device according to an embodiment of the present utility model when an event of entering a traffic information providing section occurs. Referring to Fig. 8, the electronic device 100 may determine whether an event of entering a traffic information providing section has occurred (step S401).
If it is determined that the event has occurred, the electronic device 100 may compare the average speed of the current vehicle, calculated on the basis of the time point of entering the traffic information providing section, with the section average speed of the traffic information providing section (step S402). Here, the section average speed of the traffic information providing section may be calculated on the basis of the traffic information received through the Transport Protocol Experts Group (TPEG).
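One plausible way to obtain the section average speed from TPEG-style travel time information is a length-weighted average over the links of the section; the per-link data structure below is an assumption made for the example, not the format of the received traffic information.

```python
def section_average_speed_kmh(links):
    """Length-weighted average speed over the links of a traffic information
    providing section, using per-link travel times.

    `links` is a list of (length_km, travel_time_minutes) pairs - a simplified,
    assumed representation of the received traffic information.
    """
    total_km = sum(length for length, _ in links)
    total_hours = sum(minutes / 60.0 for _, minutes in links)
    return total_km / total_hours

links = [(2.0, 3.0), (1.5, 4.5), (3.5, 3.0)]       # hypothetical section of three links
print(round(section_average_speed_kmh(links), 1))  # 40.0 km/h
```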
When the average speed of the current vehicle is lower than the section average speed of the traffic information providing section, the electronic device 100 may generate a first object (step S403).
Alternatively, when the average speed of the current vehicle is higher than the section average speed of the traffic information providing section, the electronic device 100 may generate a second object (step S404).
Then, the electronic device 100 may output the generated object through augmented reality (step S405). Here, so that the driver can recognize that the average speed of the current vehicle is lower than the section average speed, the first object may be displayed far away from the current vehicle on the augmented reality screen. Further, so that the driver can recognize that the average speed of the current vehicle is higher than the section average speed, the second object may be displayed at the lower end of the augmented reality screen corresponding to the position of the current vehicle.
Hereinafter, the process of displaying the augmented reality screen of the electronic device when an event of entering a section speed enforcement zone occurs will be described in detail with reference to Figs. 9A to 9C.
Figs. 9A to 9C illustrate augmented reality screens according to an embodiment of the present utility model. Referring to Fig. 9A, when the average speed ② of the current vehicle is 91 km/h, which is higher than the limit average speed ① of 90 km/h of the section speed enforcement zone, the second object ③ may be displayed at the lower end of the augmented reality screen corresponding to the position of the current vehicle. The arrow included in the second object ③ may be displayed pointing in the direction opposite to the traveling direction of the current vehicle, and the second object ③ may be expressed in a color that allows the driver to recognize a violation state, for example red. Accordingly, the driver can easily recognize that the vehicle is currently violating the section speed limit and can reduce the vehicle speed.
Referring to Figs. 9B and 9C, when the average speed ② of the current vehicle is 84 km/h, which is lower than the limit average speed ① of 90 km/h of the section speed enforcement zone, the first object ③ may be displayed far away from the current vehicle on the augmented reality screen, and the first object ③ may be expressed in a color that allows the driver to recognize that the vehicle is not in a violation state, for example green.
Meanwhile, the display position of the first object ③ may change dynamically according to the difference between the average speed ② of the current vehicle and the limit average speed ① of the section speed enforcement zone. For example, when the speed difference increases from 6 km/h, as shown in Fig. 9B, to 27 km/h, as shown in Fig. 9C, the first object ③ may be displayed farther away from the current vehicle on the augmented reality screen. Accordingly, the driver can easily recognize that the current average speed is lower than the limit average speed of the section speed enforcement zone and may increase the vehicle speed.
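A minimal sketch of such a dynamic placement, assuming a simple linear mapping from the speed margin to the distance at which the first object is drawn (the scale factor and the cap are illustrative assumptions):

```python
def first_object_distance_m(avg_speed_kmh, limit_avg_kmh,
                            metres_per_kmh=5.0, max_distance_m=150.0):
    """Distance ahead of the vehicle at which the first object is drawn.

    The larger the margin below the limit average speed, the farther away the
    object appears (e.g. 6 km/h under the limit in Fig. 9B versus 27 km/h under
    it in Fig. 9C). The linear scale and the cap are illustrative assumptions.
    """
    margin = max(0.0, limit_avg_kmh - avg_speed_kmh)
    return min(margin * metres_per_kmh, max_distance_m)

print(first_object_distance_m(84.0, 90.0))  # 30.0 m ahead for a 6 km/h margin
print(first_object_distance_m(63.0, 90.0))  # 135.0 m ahead for a 27 km/h margin
```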
Figs. 10A to 10C illustrate augmented reality screens according to another embodiment of the present utility model. Referring to Figs. 10A to 10C, the first object and the second object may be embodied as images of a police officer. As an example, as shown in Fig. 10A, when the average speed of the current vehicle is higher than the limit average speed of the section speed enforcement zone, the second object 901 may be displayed close to the current vehicle on the augmented reality screen. In contrast, when the average speed of the current vehicle is lower than the limit average speed of the section speed enforcement zone, as shown in Figs. 10B and 10C, the first objects 902, 903 are displayed far away from the current vehicle on the augmented reality screen.
Fig. 11 illustrates an augmented reality screen according to still another embodiment of the present utility model. Referring to Fig. 11, when the average speed 1001 of the current vehicle is lower than the limit average speed 1003 of the section speed enforcement zone, the first object 1004 is displayed far away from the current vehicle on the augmented reality screen and is expressed in green so that the driver can recognize that the vehicle is currently not violating the section speed limit.
However, in the case of Fig. 11, since the real-time speed 1002 of the current vehicle exceeds the limit average speed 1003 of the section speed enforcement zone, the route guidance line 1005 is displayed in red so that the driver can recognize that the current real-time speed exceeds the limit average speed.
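The screen of Fig. 11 therefore combines two indications: the object colour follows the average speed, while the colour of the route guidance line follows the real-time speed. A small sketch of that decision logic, with the colour names as illustrative assumptions:

```python
def guidance_colors(avg_speed_kmh, realtime_speed_kmh, limit_avg_kmh):
    """Colour choices for a Fig. 11 style screen: the first/second object
    reflects the average speed, while the route guidance line reflects the
    instantaneous speed."""
    object_color = "green" if avg_speed_kmh < limit_avg_kmh else "red"
    line_color = "red" if realtime_speed_kmh > limit_avg_kmh else "default"
    return {"object": object_color, "guidance_line": line_color}

# Average speed still under the limit, but the momentary speed is over it:
print(guidance_colors(avg_speed_kmh=84.0, realtime_speed_kmh=105.0, limit_avg_kmh=90.0))
# {'object': 'green', 'guidance_line': 'red'}
```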
Fig. 12 illustrates an implementation according to an embodiment of the present utility model in a case where the navigation device does not include a photographing unit. Referring to Fig. 12, a separately provided vehicle navigation device 100 and vehicle black box 200 may constitute a system of an embodiment of the present utility model using a wired/wireless communication method.
The vehicle navigation device 100 may include a display unit 131 provided on the front face of a navigation housing 191, a navigation operation key 121, and a navigation microphone 123.
The vehicle black box 200 may include a black box camera 222, a black box microphone 224, and an attachment part 281.
Fig. 13 illustrates an implementation according to an embodiment of the present utility model in a case where the navigation device includes a photographing unit. Referring to Fig. 13, when the navigation device 100 includes a photographing unit 150, the user may install the navigation device 100 such that the photographing unit 150 of the navigation device 100 captures the front of the vehicle and the display unit of the navigation device 100 is visible to the user. In this way, a system of an embodiment of the present utility model can be implemented.
Fig. 14 illustrates an implementation using a head-up display (HUD) and an electronic device according to an embodiment of the present utility model. Referring to Fig. 14, the HUD may display an augmented reality guidance screen through wired/wireless communication with other devices.
As an example, augmented reality may be provided through a HUD using the front windshield of the vehicle, or through image overlay using a separate image output device, and the augmented reality providing unit 160 may generate a real image or an interface image to be overlaid on the windshield glass. In this way, an augmented reality navigation system, a vehicle infotainment system, or the like can be implemented.
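As a rough illustration of such image overlay, and not of the actual augmented reality providing unit 160, the following sketch alpha-blends an interface image onto a background frame; the array shapes and the use of NumPy are assumptions made for the example.

```python
import numpy as np

def overlay_interface(frame_rgb, interface_rgba):
    """Alpha-blend an interface image (e.g. a generated guidance object) onto a
    camera frame or HUD background. A minimal sketch only."""
    alpha = interface_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (1.0 - alpha) * frame_rgb.astype(np.float32) \
              + alpha * interface_rgba[..., :3].astype(np.float32)
    return blended.astype(np.uint8)

frame = np.zeros((4, 4, 3), dtype=np.uint8)     # stand-in camera frame
overlay = np.zeros((4, 4, 4), dtype=np.uint8)
overlay[1:3, 1:3] = (0, 255, 0, 128)            # half-transparent green object
print(overlay_interface(frame, overlay)[1, 1])  # roughly [0 128 0]
```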
On the other hand, the control methods of the electronic device according to the various embodiments of the present utility model described above may be implemented as programs and provided to servers or devices. Accordingly, each device may access the server or device in which the programs are stored and download the programs.
In addition, the control methods of the electronic device according to the various embodiments of the present utility model described above may be implemented as a program and stored in various non-transitory computer readable media. A non-transitory computer readable medium is not a medium that stores data for a short time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and is readable by a device. Specifically, the various applications or programs described above may be stored in and provided through a non-transitory computer readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.
Although the preferred embodiments of the present utility model have been shown and described above, the present utility model is not limited to the specific embodiments described above. Various modifications may be made by those of ordinary skill in the art to which the present utility model pertains without departing from the gist of the present utility model as claimed in the claims, and such modifications should not be understood separately from the technical idea or prospect of the present utility model.

Claims (11)

1. An electronic device, characterized by comprising:
a display unit for displaying a screen;
an average speed calculating unit which, when a specific event occurs, calculates an average speed of a current vehicle;
an object generating unit which generates an object representing a relation between the calculated average speed and an event speed corresponding to the specific event; and
a control unit which controls the display unit to output the generated object through augmented reality.
2. The electronic device according to claim 1, characterized in that the specific event includes at least one of an event of entering a section speed enforcement zone, an event of starting route guidance to a destination, and an event of entering a traffic information providing section.
3. The electronic device according to claim 2, characterized in that, in the case of the event of entering the section speed enforcement zone, the control unit compares the average speed of the current vehicle, calculated on the basis of the time point of entering the section speed enforcement zone, with a limit average speed of the section speed enforcement zone, and the object generating unit generates, according to the comparison result, an object representing a relation between the average speed of the current vehicle and the limit average speed.
4. The electronic device according to claim 2, characterized in that, in the case of the event of starting route guidance to the destination, the control unit calculates a route average speed on the basis of an expected arrival time calculated by reflecting real-time traffic information at the route guidance start time point and compares the average speed of the current vehicle with the route average speed, and the object generating unit generates, according to the comparison result, an object representing a relation between the average speed of the current vehicle and the route average speed.
5. The electronic device according to claim 2, characterized in that, in the case of the event of entering the traffic information providing section, the control unit compares the average speed of the current vehicle, calculated on the basis of the time point of entering the traffic information providing section, with a section average speed of the traffic information providing section, and the object generating unit generates, according to the comparison result, an object representing a relation between the average speed of the current vehicle and the section average speed.
6. The electronic device according to claim 1, characterized in that the object generating unit generates a first object when the average speed of the current vehicle is lower than the event speed, and generates a second object when the average speed of the current vehicle is higher than the event speed.
7. The electronic device according to claim 6, characterized in that the control unit performs calibration on a camera to calculate camera parameters, generates a virtual three-dimensional space for an image captured by the camera on the basis of the camera parameters, and positions the generated object in the virtual three-dimensional space.
8. The electronic device according to claim 7, characterized in that the control unit performs control such that the first object and the second object are displayed between a vanishing point located in the image captured by the camera and a lower end of the image.
9. The electronic device according to claim 8, characterized in that the first object is positioned closer to the vanishing point than the second object.
10. The electronic device according to claim 8, characterized in that the first object and the second object are distinguished by mutually different colors.
11. The electronic device according to claim 8, characterized in that display positions of the first object and the second object change dynamically according to a difference between the average speed of the current vehicle and the event speed.
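Claims 7 to 9 relate the placement of the first and second objects to calibrated camera parameters and a vanishing point. Purely as an illustrative sketch under an assumed pinhole camera model, and not as the claimed implementation, the following fragment shows why a point drawn farther ahead of the vehicle projects closer to the vanishing point:

```python
def project_to_image(distance_ahead_m, height_below_camera_m,
                     focal_px=700.0, principal_y_px=360.0):
    """Pinhole projection of a point on the road ahead of a calibrated camera.

    With the focal length and principal point taken as the calibrated camera
    parameters (illustrative values), the vertical image coordinate approaches
    the vanishing point (principal_y_px) as the point moves farther ahead -
    which is why the first object, drawn farther from the vehicle, sits closer
    to the vanishing point than the second object.
    """
    return principal_y_px + focal_px * height_below_camera_m / distance_ahead_m

print(project_to_image(10.0, 1.2))   # near the vehicle -> lower in the image (444.0)
print(project_to_image(100.0, 1.2))  # far ahead -> close to the vanishing point (368.4)
```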
CN201520983089.8U 2014-12-01 2015-12-01 Electronic device Active CN205541484U (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20140170048 2014-12-01
KR10-2014-0170048 2014-12-01
KR1020150036147A KR102406490B1 (en) 2014-12-01 2015-03-16 Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
KR10-2015-0036147 2015-03-16

Publications (1)

Publication Number Publication Date
CN205541484U true CN205541484U (en) 2016-08-31

Family

ID=56138966

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201520983089.8U Active CN205541484U (en) 2014-12-01 2015-12-01 Electronic device
CN201510868584.9A Active CN105651299B (en) 2014-12-01 2015-12-01 The control method of electronic device, electronic device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201510868584.9A Active CN105651299B (en) 2014-12-01 2015-12-01 The control method of electronic device, electronic device

Country Status (2)

Country Link
KR (1) KR102406490B1 (en)
CN (2) CN205541484U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108846920A (en) * 2018-07-06 2018-11-20 丘莲清 A kind of automobile data recorder based on AR technology

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10659279B2 (en) * 2016-10-04 2020-05-19 Htc Corporation Method and device for displaying video corresponding to physical object
KR20180039892A (en) * 2016-10-11 2018-04-19 현대자동차주식회사 Navigation apparatus, vehicle comprising the same and control method of the vehicle
KR20180084556A (en) * 2017-01-17 2018-07-25 팅크웨어(주) Method, apparatus, electronic apparatus, computer program and computer readable recording medium for providing driving guide using a photographed image of a camera

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101464153B (en) * 2007-12-17 2012-06-06 阿尔派电子(中国)有限公司 Navigation apparatus
KR100929331B1 (en) * 2008-04-30 2009-12-03 팅크웨어(주) Method and apparatus for generating 3D direction indication
KR100889470B1 (en) * 2008-05-14 2009-03-19 팅크웨어(주) Method and apparatus for 3d path
KR100888156B1 (en) * 2008-05-15 2009-03-10 팅크웨어(주) System and method for displaying guidance symbol
KR20100011041A (en) * 2008-07-24 2010-02-03 주식회사 현대오토넷 Navigation device with a map matching function using three-dimensional gyro sensor and digital elevation value
KR101188105B1 (en) * 2011-02-11 2012-10-09 팅크웨어(주) Apparatus and method for providing argumented reality using image information
KR101318651B1 (en) * 2011-11-15 2013-10-16 현대자동차주식회사 Navigation System and Displaying Method Thereof
KR101271235B1 (en) * 2011-12-12 2013-06-10 자동차부품연구원 Apparatus and method for providing driving imformation
KR101614338B1 (en) * 2012-03-26 2016-04-21 미쓰비시덴키 가부시키가이샤 Camera calibration method, computer-readable storage medium storing camera calibration program and camera calibration device

Also Published As

Publication number Publication date
KR102406490B1 (en) 2022-06-10
CN105651299B (en) 2019-07-12
KR20160065723A (en) 2016-06-09
CN105651299A (en) 2016-06-08

Similar Documents

Publication Publication Date Title
US11814064B2 (en) Method, apparatus, electronic device, computer program and computer readable recording medium for measuring inter-vehicle distance using driving image
CN105300401B (en) Electronic device and its control method
CN105185137B (en) Electronic device, the control method of electronic device and computer readable recording medium storing program for performing
US11881030B2 (en) Method for detecting vehicle and device for executing the same
US11049327B2 (en) Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
CN205541484U (en) Electronic device
CN108680173A (en) Electronic device, the control method of electronic device and computer readable recording medium storing program for performing
CN105654778A (en) Electronic apparatus and control method thereof
KR101711797B1 (en) Automatic parking system for autonomous vehicle and method for controlling thereof
CN110260877A (en) Drive related guidance providing method and device, computer readable recording medium
CN113792589B (en) Overhead identification method and device
CN113516014A (en) Lane line detection method, lane line detection device, electronic apparatus, computer program, and computer-readable recording medium
CN108335507A (en) Utilize the driving guiding providing method and device of the filmed image of camera
CN113804211A (en) Overhead identification method and device
CN115164910B (en) Travel route generation method, travel route generation device, vehicle, storage medium, and chip
KR20200070100A (en) A method for detecting vehicle and device for executing the method
WO2023185622A1 (en) Navigation method and electronic device
CN113820732A (en) Navigation method and device
CN110263688B (en) Driving related guidance providing method and apparatus, and computer readable recording medium
CN115257628A (en) Vehicle control method, device, storage medium, vehicle and chip

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210824

Address after: Seoul, South Korea

Patentee after: Hyundai Motor Co.,Ltd.

Patentee after: Kia Co.,Ltd.

Address before: Gyeonggi Do, South Korea

Patentee before: THINKWARE SYSTEMS Corp.