KR101451765B1 - Electronic device and control method of electronic device - Google Patents

Electronic device and control method of electronic device

Info

Publication number
KR101451765B1
KR101451765B1 (application KR1020110018546A)
Authority
KR
South Korea
Prior art keywords
vehicle
electronic device
server
traffic
Prior art date
Application number
KR1020110018546A
Other languages
Korean (ko)
Other versions
KR20120099981A (en)
Inventor
조창빈
김진영
이해일
우승완
Original Assignee
팅크웨어(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 팅크웨어(주)
Priority to KR1020110018546A priority Critical patent/KR101451765B1/en
Priority to PCT/KR2012/001491 priority patent/WO2012118320A2/en
Publication of KR20120099981A publication Critical patent/KR20120099981A/en
Application granted granted Critical
Publication of KR101451765B1 publication Critical patent/KR101451765B1/en

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096716 Systems involving transmission of highway information, e.g. weather, speed limits where the received information does not generate an automatic action on the vehicle control
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733 Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
    • G08G1/096741 Systems involving transmission of highway information, e.g. weather, speed limits where the source of the transmitted information selects which information to transmit to each vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096775 Systems involving transmission of highway information, e.g. weather, speed limits where the origin of the information is a central station

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

An electronic device, and a method of providing traffic-related content from an electronic device, are disclosed.
According to the present invention, when a specific event occurs while the vehicle is running, the electronic device automatically acquires user content related to the road conditions and automatically transmits the acquired user content to a server.

Description

TECHNICAL FIELD [0001] The present invention relates to an electronic device and a control method of the electronic device.

With the opening of the Internet network and the revision of laws related to location data, the location-based service (LBS) industry is being activated. A typical device using such location-based services is a navigation device, which locates the current position of a vehicle or the like and guides a route to a destination.

2. Description of the Related Art [0002] In recent years, with the spread of communication-enabled navigation devices, users are moving beyond the conventional service model in which traffic information is provided to them unilaterally; methods by which users actively participate in generating traffic-information user content and share it with other users are being actively researched. To this end, improvements to the structural part and/or the software part of the terminal are being considered.

SUMMARY OF THE INVENTION It is an object of the present invention to provide an electronic device and a control method of an electronic device that effectively provide information on the traffic flow on a road while preventing distraction of the driving user.

According to an aspect of the present invention for achieving the above object, an electronic device includes: a position data module; a communication unit for connecting to a server that collects and provides content from at least one electronic device; and a controller that acquires the location of the vehicle through the position data module, automatically acquires content related to a specific event when the event occurs, and transmits the acquired content to the server.

According to another aspect of the present invention for achieving the above object, an electronic device includes: a position data module; a communication unit for connecting to a server that supports network formation; and a controller that acquires the location of the vehicle through the position data module, forms a network with at least one other electronic device selected based on the location of the vehicle, and, when a specific event occurs, automatically transmits content related to the event to the at least one other electronic device.

According to another aspect of the present invention, there is provided a method of controlling an electronic device, the method comprising: connecting to a server that collects and provides content from at least one electronic device; obtaining the position of the vehicle; automatically acquiring content related to a specific event when the event occurs; and transmitting the acquired content to the server.
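The control method above can be read as an event loop: poll the vehicle's state, test for an event, and on an event automatically capture and upload content. The sketch below illustrates one iteration of such a loop; the sudden-deceleration trigger, the threshold, and all function names are illustrative assumptions, not part of the claimed method.

```python
from dataclasses import dataclass


@dataclass
class UserContent:
    location: tuple   # (lat, lon) where the content was captured
    media: bytes      # e.g. a camera frame or a microphone clip
    speed_kmh: float  # vehicle speed at capture time


def sudden_deceleration(prev_kmh: float, curr_kmh: float,
                        threshold_kmh: float = 30.0) -> bool:
    """One illustrative event trigger: speed dropped by more than threshold_kmh."""
    return prev_kmh - curr_kmh > threshold_kmh


def control_step(prev_kmh, curr_kmh, get_position, capture_media, send_to_server):
    """One iteration of the control method: when an event occurs, automatically
    acquire content related to it and transmit it to the server."""
    if sudden_deceleration(prev_kmh, curr_kmh):
        content = UserContent(get_position(), capture_media(), curr_kmh)
        send_to_server(content)
        return True
    return False
```

In a real device this step would run periodically; `get_position`, `capture_media`, and `send_to_server` stand in for the position data module, the camera, and the communication unit described later in this document.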

As one aspect of the present invention for realizing the above object, a computer-readable recording medium according to the present invention records a program that performs any one of the above methods.

According to the present invention, the navigation device automatically determines that an event related to traffic flow or safe driving has occurred and, when it does, transmits automatically acquired user content to a server providing traffic information, so that the server can grasp the road situation. Further, the content generated by the navigation device is transmitted to other electronic devices, so that those devices can recognize the situation on the road in real time based on the content.

Also, because content related to the road situation is collected and transmitted without requiring any operation from the user while driving, the information needed to grasp the traffic flow can be delivered to other users, or to the server providing traffic information, without distracting the driving user.

FIGS. 1 and 2 are schematic diagrams of a system environment related to embodiments of the present invention.
FIG. 3 is a structural diagram showing an electronic device 100 related to the embodiments of the present invention.
FIG. 4 is a structural diagram showing a server 300 related to the embodiments of the present invention.
FIG. 5 is a flowchart showing a control method of the electronic device 100 according to the first embodiment of the present invention.
FIG. 6 is a flowchart illustrating an example of determining the occurrence of an event using map data in the electronic device 100 according to the first embodiment of the present invention.
FIGS. 7 and 8 are flowcharts illustrating examples of determining the occurrence of an event using traffic information received from the outside in the electronic device 100 according to the first embodiment of the present invention.
FIG. 9 illustrates an example of determining the occurrence of an event using the incident information included in the traffic information in the electronic device 100 according to the first embodiment of the present invention.
FIG. 10 is a flowchart illustrating another example of determining the occurrence of an event using traffic information received from the outside in the electronic device 100 according to the first embodiment of the present invention.
FIG. 11 illustrates an example in which the occurrence of an event is determined by comparing the moving speed of the vehicle with the traffic information in the electronic device 100 according to the first embodiment of the present invention.
FIG. 12 is a flowchart illustrating another example of determining the occurrence of an event using traffic information received from the outside in the electronic device 100 according to the first embodiment of the present invention.
FIG. 13 shows another example of determining the occurrence of an event by comparing the traffic information with the moving speed of the vehicle in the electronic device 100 according to the first embodiment of the present invention.
FIGS. 14 to 16 are flowcharts illustrating examples of determining the occurrence of an event using the acceleration of the vehicle in the electronic device 100 according to the first embodiment of the present invention.
FIG. 17 is a graph for explaining a case where the occurrence of an event is determined on the basis of the acceleration of the vehicle in the electronic device 100 according to the first embodiment of the present invention.
FIG. 18 is a flowchart illustrating an example of determining the occurrence of an event based on the amount of impact in the electronic device 100 according to the first embodiment of the present invention.
FIG. 19 shows an example in which the occurrence of an event is determined based on the amount of impact in the electronic device 100 according to the first embodiment of the present invention.
FIGS. 20 and 21 are flowcharts illustrating examples of determining the occurrence of an event based on a traveling image of the vehicle in the electronic device 100 according to the first embodiment of the present invention.
FIG. 22 shows an example of a traveling image of a vehicle equipped with the electronic device 100 according to the first embodiment of the present invention.
FIG. 23 is a flowchart showing an example of determining the occurrence of an event based on the moving speed of the vehicle in the electronic device 100 according to the first embodiment of the present invention.
FIG. 24 illustrates an example in which the electronic device 100 according to the first embodiment of the present invention determines the occurrence of an event using data obtained from the electronic system of the vehicle.
FIG. 25 illustrates an example of determining the occurrence of an event using data received from an external electronic device in the electronic device 100 according to the first embodiment of the present invention.
FIG. 26 shows an example of registering a specific key of an external electronic device as a shortcut key related to automatic content transmission in the electronic device 100 according to the first embodiment of the present invention.
FIG. 27 is a flowchart showing a control method of the electronic device 100 according to the second embodiment of the present invention.
FIG. 28 shows an example of configuring a social network in the electronic device 100 according to the second embodiment of the present invention.

The above objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. It is to be understood, however, that the invention is not limited to the disclosed embodiments but is intended to cover various modifications and equivalents. Like reference numerals designate like elements throughout the specification. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. In addition, ordinal numbers (e.g., first, second, etc.) used in the description of the present invention are merely identifiers for distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

In addition, the suffixes "module" and "part" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

Figures 1 and 2 are schematic diagrams of a system environment related to embodiments of the present invention.

Referring to FIG. 1, a system environment related to embodiments of the present invention may include a plurality of electronic devices 100 and 10, a communication network 200, and a server 300.

The plurality of electronic devices 100 and 10 may be fixed terminals or mobile terminals. For example, the plurality of electronic devices 100 and 10 may be a navigation device, a smart phone, a mobile phone, a computer, a laptop computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), an MID (Mobile Internet Device), or a tablet PC.

Hereinafter, the plurality of electronic devices 100 and 10 will be referred to as a first electronic device 100, a second electronic device 10a, a third electronic device 10b, and a fourth electronic device 10c. As shown in FIG. 1, various embodiments will be described on the assumption that the first electronic device 100 is a navigation device, and that the second electronic device 10a, the third electronic device 10b, and the fourth electronic device 10c are a smart phone, a tablet PC, and a notebook computer, respectively.

Each of the electronic devices 100 and 10 can communicate with other electronic devices by a wireless or wired communication method. In this document, the manner in which the plurality of electronic devices 100 and 10 communicate with one another is not limited; the technical idea of this document can be applied to all existing communication methods between electronic devices and to all future communication methods.

For example, the plurality of electronic devices 100 and 10 can communicate with each other by a communication method such as Universal Plug and Play (UPnP), WiFi, or the like. For example, the plurality of electronic devices 100 and 10 can communicate with each other via the communication network 200 or by a local communication method. Examples of the communication network 200 include a mobile communication network, a wired Internet, a wireless Internet, and a broadcasting network.

In addition, each of the electronic devices 100 and 10 can form a network 400 with other electronic devices, as shown in FIG. 2, and the plurality of electronic devices 100 and 10 included in the network 400 can share contents. Referring to FIG. 2, the first electronic device 100 forms the network 400 with the other electronic devices 10 by establishing relationships between the electronic devices, and can share contents with the other electronic devices 10 included in the network 400.

In this document, various embodiments will be described assuming that the network 400 is a social network. However, the technical idea of this document is applicable to all networks configured similar to social networks.

A social network is formed by a social relationship structure created by interdependent ties between nodes on the Web. The nodes included in the social network represent the individual entities present in the network. In this document, various embodiments are described on the assumption that each node 100, 10 constituting the network is an electronic device. However, the technical idea of this document is applicable even when the nodes constituting the network are other entities.

For example, the nodes constituting the network may be users, buildings, roads, locations, servers, and the like. That is, the first electronic device 100 may form a social network with at least one user, building, road, location, server, and the like. When a node included in the social network is a road, a building, a location, or the like, the node may be an account created to correspond to that road, building, or location.

Referring again to FIG. 1, the server 300 communicates with the plurality of electronic devices 100 and 10 through the communication network 200. The server 300 can acquire various contents through communication with the plurality of electronic devices 100 and 10, and can also transmit various contents to the plurality of electronic devices 100 and 10.

The server 300 can also support the formation of a network 400 composed of the plurality of electronic devices 100 and 10.

Hereinafter, the first electronic device 100 will be described in more detail with reference to the drawings.

In this document, various embodiments will be described assuming that the electronic device 100 is a navigation device as shown in FIG. However, the technical idea disclosed in this document can be applied to various kinds of electronic devices such as a smart phone, a tablet PC, and a notebook computer.

FIG. 3 is a structural diagram illustrating the navigation device 100 in accordance with embodiments of the present invention.

Referring to FIG. 3, the navigation device 100 may include a communication unit 110, an input unit 120, a sensing unit 130, an output unit 140, a storage unit 150, a power unit 160, and a controller 170. The components shown in FIG. 3 are not essential, so the navigation device 100 may be implemented with more or fewer components.

Hereinafter, the components will be described in order.

The communication unit 110 may include one or more modules that enable communication between the navigation device 100 and the communication network 200, between the navigation device 100 and the network in which the navigation device 100 is located, or between the navigation device 100 and another electronic device 10. For example, the communication unit 110 may include a position data module 111, a wireless Internet module 113, a broadcast transmission/reception module 115, a short-range communication module 117, and a wired communication module 119.

The position data module 111 is a module for obtaining position data of the navigation device 100. As a method of acquiring position data, the position data module 111 may obtain position data through a Global Navigation Satellite System (GNSS).

GNSS refers to a navigation system that can calculate the position of a receiving terminal using radio signals received from satellites. Specific examples of GNSS include GPS (Global Positioning System), Galileo, GLONASS (Global Orbiting Navigational Satellite System), COMPASS, IRNSS (Indian Regional Navigational Satellite System), and QZSS (Quasi-Zenith Satellite System).

The position data module 111 of the navigation device 100, in connection with embodiments of the present invention, may receive the GNSS signal serviced in the area where the navigation device 100 is used to obtain position data, and may continuously calculate the current position of the navigation device 100 in real time using the acquired position data. To improve accuracy, the current position may be obtained through map matching of the position data acquired by the position data module 111.

In addition, the position data module 111 may calculate velocity information using the current position of the navigation device 100 obtained in real time.
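One rough way to derive velocity from consecutive position fixes, offered here only as an illustrative sketch (the module's actual algorithm is not specified), is to divide the great-circle distance between two fixes by the time between them:

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))


def speed_kmh(fix_a, fix_b):
    """Speed between two fixes, each given as (lat, lon, unix_time_seconds)."""
    dist = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    dt = fix_b[2] - fix_a[2]
    return dist / dt * 3.6
```

A production receiver would typically use Doppler measurements or a filtered track rather than raw fix differencing, which is noisy at low speeds.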

The wireless Internet module 113 is a device that accesses the wireless Internet and acquires or transmits data. The wireless Internet that can be connected through the wireless Internet module 113 may be a wireless LAN (WLAN), a wireless broadband (WIBRO), a world interoperability for microwave access (WIMAX), or a high speed downlink packet access (HSDPA).

The broadcast transmission / reception module 115 is a device for receiving broadcast signals through various broadcasting systems. The broadcast system that can receive the broadcast signal through the broadcast transmission / reception module 115 may be a DMBT (Digital Multimedia Broadcasting Terrestrial), a DMBS (Digital Multimedia Broadcasting Satellite), a MediaFLO (Media Forward Link Only), a DVBH (Digital Video Broadcast Handheld) Integrated Services Digital Broadcast Terrestrial) and the like. The broadcast signal received through the broadcast transmission / reception module 115 may include traffic data, living data, and the like.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The wired communication module 119 serves to provide an interface with other electronic devices connected to the navigation device 100. [ For example, the wired communication module 119 may be a USB module capable of communicating via a USB port.

The input unit 120 is a module for generating input data for controlling the operation of the navigation system 100. The input unit 120 can convert the physical input from the outside into a specific electric signal to generate input data. The input unit 120 may include a user input module 121, a microphone 123, a camera 125, and the like.

The user input module 121 receives a control input for controlling the operation of the navigation device 100 from a user. The user input module may include a key pad dome switch, a touch pad (static / static), a jog wheel, a jog switch, and the like. For example, the user input module 121 may be implemented with a navigation operation key provided in the body of the navigation system 100. [

The microphone 123 is a device that receives a user's voice and an audio signal generated from the inside and the outside of the vehicle. The microphone 123 may be implemented as a navigation microphone 195 provided in the body of the navigation system 100.

The camera 125 is an apparatus for acquiring images of the inside and the outside of the vehicle. For example, the camera 125 may acquire a running image of the vehicle.

The sensing unit 130 recognizes the current state of the navigation system 100 and generates a sensing signal for controlling the operation of the navigation system 100. The sensing unit 130 may include a motion sensing module 131, an optical sensing module 133, and the like.

The motion sensing module 131 may sense movement of the navigation device 100 in a three-dimensional space. The motion sensing module 131 may include a geomagnetic sensor, an acceleration sensor, and the like. The motion data obtained through the motion sensing module 131 can be combined with the position data acquired through the position data module 111 to calculate a more accurate trajectory of the vehicle to which the navigation device 100 is attached.
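A minimal sketch of how motion-sensor data could refine the trajectory between position fixes is simple dead reckoning: advance the last known position along the sensed heading at the sensed speed. The flat-earth approximation and all names below are illustrative assumptions, not the patent's method.

```python
import math


def dead_reckon(lat, lon, heading_deg, speed_mps, dt_s):
    """Advance (lat, lon) by speed_mps * dt_s metres along heading_deg
    (0 = north, 90 = east), using a flat-earth approximation that is
    adequate for the short gaps between consecutive position fixes."""
    R = 6371000.0  # mean Earth radius, metres
    d = speed_mps * dt_s
    h = math.radians(heading_deg)
    dlat = (d * math.cos(h)) / R
    dlon = (d * math.sin(h)) / (R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)
```

In practice the geomagnetic sensor would supply the heading and the accelerometer (integrated) the speed, with the result periodically corrected by the position data module.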

The optical sensing module 133 is a device for measuring the ambient luminance of the navigation device 100. Using the illumination data acquired through the optical sensing module 133, the brightness of the display unit 145 can be changed to correspond to the ambient brightness.
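The luminance-to-backlight mapping described above could be as simple as a clamped linear ramp; the percentages and the 1000-lux ceiling below are illustrative assumptions, not values from the patent.

```python
def display_brightness(lux, min_pct=10.0, max_pct=100.0, full_lux=1000.0):
    """Map ambient illuminance (lux) to a backlight level in percent,
    clamped so the display stays readable in the dark and caps out
    in direct sunlight."""
    frac = min(max(lux / full_lux, 0.0), 1.0)
    return min_pct + (max_pct - min_pct) * frac
```

A real device would usually also smooth the sensor reading over time so that passing shadows do not make the backlight flicker.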

The output unit 140 is a device in which the navigation device 100 outputs data. The output unit 140 may include a display module 141, an audio output module 143, and the like.

The display module 141 displays information to be processed in the navigation system 100. For example, the display module 141 displays a UI (User Interface) or a GUI (Graphic User Interface) associated with the navigation service.

The display module 141 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light emitting diode display, a flexible display, and a 3D display.

In a case where the display module 141 and a sensor for detecting a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display module 141 can also be used as an input device. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display module 141, or a change in capacitance generated at a specific portion of the display module 141, into an electrical input signal. The touch sensor may be configured to detect not only the touched position and area but also the pressure at the time of the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 170. Accordingly, the controller 170 can know which area of the display module 141 has been touched.

Referring to FIG. 3, a proximity sensor may be disposed in an inner area of the navigation device 100 surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

When the touch screen is electrostatic (capacitive), it is configured to detect the proximity of a pointer by the change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

The audio output module 143 outputs audio data that can be audibly recognized. The audio output module 143 outputs audio signals related to functions performed in the navigation device 100 (for example, a route guidance function). The audio output module 143 may include a receiver, a speaker, a buzzer, and the like.

The storage unit 150 may store a program for the operation of the navigation device 100, and may temporarily store input/output data related to the navigation device 100 (e.g., route information and images).

The storage unit 150 may be embedded in the navigation device 100 or may be removable, and may include at least one storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The navigation device 100 may also operate in association with a web storage that performs the storage function of the storage unit 150 on the Internet.

The power supply unit 160 receives power from an external power supply and supplies power to the components of the navigation system 100 or other devices connected to the navigation system 100.

The control unit 170 typically controls the overall operation of the navigation device 100. Further, the control unit 170 may output a control signal for controlling another device connected to the navigation device 100.

Hereinafter, the server 300 will be described in more detail with reference to the drawings.

4 is a structural diagram showing a server 300 related to the embodiments of the present invention.

Referring to FIG. 4, the server 300 may include a communication unit 310, a storage unit 320, and a control unit 330. Since the components shown in FIG. 4 are not essential, the server 300 may be implemented with more or fewer components.

The communication unit 310 may include one or more modules that enable communication between the server 300 and the communication network 200, between the server 300 and another server, between the server 300 and the other electronic devices 100 and 10, or between the server 300 and the network 400.

The storage unit 320 may store a program for the operation of the server 300, and may temporarily store data input and output in relation to the server 300. In addition, the storage unit 320 stores contents obtained through communication with the plurality of electronic devices 100 and 10.

The control unit 330 typically controls the overall operation of the server 300.

The control unit 330 connects to the plurality of electronic devices 100 and 10 through the communication network 200. When user authentication is required for an electronic device 100 or 10, the control unit 330 performs a user authentication process for the corresponding electronic device 100 or 10 using authentication information received from it.

Also, the control unit 330 acquires user contents, traffic information, and traffic-related contents through communication with a plurality of electronic devices 100 and 10, and stores the contents in the storage unit 320.

The user contents are contents generated by the electronic devices 100 and 10. The control unit 330 can acquire user contents from the plurality of electronic devices 100 and 10 connected through the communication unit 310.

In addition, the user contents may include contents such as text, images (still images, moving pictures), audio, etc. obtained by each of the electronic devices 100 and 10.

In addition, the user content may further include a corresponding location. For example, if the user content includes an image, the user content may include the location from which the image was acquired. It may also include, for example, the location of the electronic device 100, 10 that has transmitted the user content to the server 300.

The user content may further include traffic information obtained by the corresponding electronic device 100, 10. The traffic information acquired by the electronic devices 100 and 10 may be acquired based on the position of the corresponding electronic device 100 or 10, and may include a moving direction, a speed, and the like.

The traffic-related contents are contents for transmitting information on a traffic flow to a user. The control unit 330 may generate traffic-related contents using user contents corresponding to the respective locations.

The traffic-related contents may include user contents received from the electronic devices 100 and 10.

In addition, the traffic-related content may further include a specific location obtained in association with the corresponding user content. For example, the traffic-related content may include the location of the electronic device 100, 10 that has transmitted the corresponding user content to the server 300. Here, the position of each electronic device 100, 10 can be directly received from the electronic device 100, 10 or acquired from a base station existing on the communication network 200.

In addition, the traffic-related content may further include traffic information acquired based on the user content. For example, the control unit 330 analyzes the user contents received from the electronic devices 100 and 10, acquires specific texts and images included in the user contents, and generates traffic-related contents based on the acquired texts and images. For example, when the user content includes text related to a traffic flow, such as "congestion at Kangnam station," the control unit 330 extracts specific keywords such as "Kangnam station" and "congestion" and generates traffic-related contents using them. In addition, for example, when the user contents include traffic information such as the location, moving direction, and speed of the corresponding electronic device 100, 10, the control unit 330 may generate traffic-related contents based on that information.

In addition, the traffic-related content may further include at least one item of traffic information generated by the server 300. For example, the control unit 330 acquires the position of each of the electronic devices 100 and 10 through communication with the plurality of electronic devices 100 and 10, and generates traffic information indicating the traffic flow based on the acquired positions. Traffic-related contents can then be generated using the generated traffic information. That is, the control unit 330 acquires positions over a predetermined period of time for the electronic devices 100 and 10 that transmit user contents, and obtains traffic information such as the moving speed and moving direction of each device from the change in those positions. The acquired traffic information is then included in the traffic-related content. Because the traffic information generated by the server 300 is derived from measured positions, it can convey objective traffic flow information, unlike user contents authored by users. Accordingly, this minimizes the risk of inaccurate information that can arise when traffic-related contents are built only from user content subject to user involvement.
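The server-side derivation of moving speed from positions collected over time can be illustrated with a short sketch (provided for explanation only, not part of the disclosed embodiments; the function names and the haversine approximation are assumptions of this sketch):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def average_speed_kmh(track):
    """Estimate average speed (km/h) from (timestamp_s, lat, lon) samples."""
    dist = 0.0
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        dist += haversine_m(la0, lo0, la1, lo1)
    elapsed = track[-1][0] - track[0][0]
    return (dist / elapsed) * 3.6 if elapsed > 0 else 0.0
```

A device reporting its position once a minute would thus let the server estimate a per-section moving speed without any user-authored content.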

The traffic-related contents may further include traffic information obtained from an external server that provides traffic information. For example, traffic-related contents can be generated based on traffic information obtained from a traffic information providing server that provides traffic information through a broadcast network. The traffic information provided by the external traffic information providing server may include information about accidents and road control occurring on the road. Control situations include road control due to events, rallies, and protests, and road control due to construction or disasters.

When traffic-related contents are requested from one of the electronic devices 100 and 10, the control unit 330 may select some of the traffic-related contents stored in the storage unit 320 based on at least one criterion, and provide the selected traffic-related contents to the requesting electronic device 100, 10.

At least one criterion for selecting traffic-related content may include a specific location. For example, when the electronic devices 100 and 10 request the provision of traffic-related contents, the control unit 330 may transmit the traffic-related contents corresponding to a specific location to the requesting devices.

In addition, at least one criterion for selecting traffic-related contents may include area information. For example, when the electronic devices 100 and 10 request the provision of traffic-related contents, the controller 330 may transmit the traffic-related contents corresponding to a specific area to them.

In addition, at least one criterion for selecting traffic-related contents may include time information. For example, when the electronic devices 100 and 10 request the provision of traffic-related contents, the controller 330 can transmit the traffic-related contents corresponding to a specific time to the corresponding devices. As an example, traffic-related contents obtained at, or after, a specific time can be transmitted to the electronic devices 100 and 10.

In addition, at least one criterion for selecting traffic-related contents may include direction information. For example, when the electronic devices 100 and 10 request the provision of traffic-related contents, the controller 330 may transmit the traffic-related contents corresponding to a specific direction. For instance, the control unit 330 may transmit, to the requesting device, traffic-related contents acquired through communication with electronic devices 100 and 10 located in a specific direction with reference to the requesting device's position.

At least one criterion for selecting the traffic-related content may be directly received from the electronic devices 100 and 10 requesting the provision of traffic-related contents, or may be acquired based on the location of those electronic devices.
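The criteria-based selection described above can be sketched as a simple filter (illustrative only; the `Content` fields, radius, and tolerance values are assumptions of this sketch, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class Content:
    text: str
    lat: float
    lon: float
    timestamp: int       # seconds since epoch
    heading_deg: float   # direction of travel when captured

def select_contents(contents, lat, lon, radius_deg=0.05,
                    after_ts=None, heading=None, heading_tol=45.0):
    """Filter stored contents by area, time, and direction criteria."""
    out = []
    for c in contents:
        # area criterion: simple bounding box around the requested location
        if abs(c.lat - lat) > radius_deg or abs(c.lon - lon) > radius_deg:
            continue
        # time criterion: keep only contents obtained at or after after_ts
        if after_ts is not None and c.timestamp < after_ts:
            continue
        # direction criterion: angular difference within heading_tol degrees
        if heading is not None:
            diff = abs((c.heading_deg - heading + 180) % 360 - 180)
            if diff > heading_tol:
                continue
        out.append(c)
    return out
```

A request would typically supply the requesting device's own position and heading, matching the case where the criteria are acquired based on the device's location.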

The control unit 330 also supports the configuration of the social network 400 among the plurality of electronic devices 100 and 10. For example, when at least one criterion for selecting other electronic devices 10 to constitute the social network 400 is received from a first electronic device 100, the control unit 330 selects at least one electronic device 10 satisfying the criterion, and the selected electronic devices 10 and the first electronic device 100 constitute the social network 400. The criteria for selecting the electronic devices 10 included in the social network 400 are similar to the criteria for selecting traffic-related content, and can be obtained in a similar manner.

The control unit 330 also supports reconfiguration of the network 400 based on a change in the location of each of the electronic devices 100 and 10 included in the social network 400. For example, a social network configured based on the current location of the first electronic device 100 may be reconfigured based on a changed location as the first electronic device 100 moves. Likewise, based on changes in the positions of the electronic devices 10 included in the network 400, a new electronic device may be added to the network 400, or an electronic device already included may be excluded from it.

The control unit 330 generates traffic information based on the locations of the electronic devices 100 and 10 included in the social network 400, and transmits the generated traffic information to the electronic devices 100 and 10 included in the network 400. For example, the control unit 330 acquires positions over a predetermined period of time for each of the electronic devices 100 and 10 included in the network 400, and obtains traffic information such as moving speed and moving direction from the changes in those positions. The acquired traffic information is then transmitted to each of the electronic devices 100 and 10 included in the network 400.

Hereinafter, a control method of the navigation device 100 according to the first embodiment of the present invention and the operation of the navigation device 100 for implementing it will be described in more detail with reference to the necessary drawings. The embodiments disclosed in this document can be implemented in the electronic device 100 described above. In this document, various embodiments are described on the assumption that the electronic device 100 is a navigation device. However, the technical idea disclosed in this document can be applied to other kinds of electronic devices such as a smart phone, a tablet PC, and a notebook computer.

Hereinafter, the operation of the navigation system 100 for implementing the first embodiment disclosed in this document will be described in more detail.

The motion sensing module 131 of the sensing unit 130 may include a gyroscope, an accelerometer, a magnetic sensor, a gravity sensor, and the like. The motion sensing module 131 acquires the movement of the vehicle in which the navigation device 100 is mounted.

The control unit 170 acquires the acceleration of the vehicle equipped with the navigation device 100 and the amount of external impact applied to the vehicle, based on the movement of the navigation device 100 acquired through the sensing unit 130.

In addition, the control unit 170 acquires the current position of the navigation device 100. When the navigation device 100 is mounted on the vehicle, the current position of the navigation device 100 can be used as the current position of the vehicle.

The control unit 170 can acquire the current position of the vehicle using the GPS signal obtained through the position data module 111. However, since the received GPS signal contains some error, the controller 170 performs a software map-matching process on the received GPS signal to match the current position of the vehicle to the road currently being traveled. In addition, when the vehicle enters a shadow area where GPS signals cannot be received, the controller 170 may acquire the traveling direction and speed information of the vehicle using the gyro sensor included in the sensing unit 130, and estimate the current position by dead reckoning.
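The dead-reckoning step in a GPS shadow area can be sketched as follows (an illustrative flat-earth approximation only; the function name and the meters-per-degree constant are assumptions of this sketch, not part of the disclosure):

```python
import math

def dead_reckon(lat, lon, heading_deg, speed_mps, dt_s):
    """Advance an estimated position using heading and speed (flat-earth approx)."""
    d = speed_mps * dt_s                       # meters traveled during dt_s
    m_per_deg_lat = 111_320.0                  # approx. meters per degree latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat))
    dlat = d * math.cos(math.radians(heading_deg)) / m_per_deg_lat
    dlon = d * math.sin(math.radians(heading_deg)) / m_per_deg_lon
    return lat + dlat, lon + dlon
```

Heading would come from the gyro sensor and speed from the last known GPS-derived velocity; the estimate is refreshed each sensing cycle until GPS reception resumes.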

In addition, the controller 170 connects to the server 300 through the communication unit 110.

In addition, the control unit 170 transmits the selected user content to the server 300 based on the user's control input. The control input of the user may be received through the input unit 120, or may be received from an external electronic device connected through the communication unit 110. In the latter case, the control unit 170 may connect to at least one external electronic device through the short-range communication module 117 of the communication unit 110 and receive the control input from the connected external electronic device. The external electronic devices connected through the short-range communication module 117 may include a smart phone, a mobile phone, a notebook computer, a PDA, a tablet PC, and the like.

When a specific event occurs, the control unit 170 automatically acquires the user content related to the generated event, and automatically transmits the acquired user content to the server 300.

The control unit 170 may also transmit the current position of the vehicle to the server 300.

In addition, the control unit 170 may transmit to the server 300 information related to the traffic flow, obtained based on the position of the vehicle. This information may include traffic information obtained based on the location of the vehicle, such as the moving speed and moving direction of the vehicle.

The control unit 170 acquires the moving speed of the navigation device 100 using the position data obtained through the position data module 111. When the navigation device 100 is mounted on a vehicle, the moving speed of the navigation device 100 can be used as the traveling speed of the vehicle.

The control unit 170 can acquire the moving direction of the navigation device 100 based on the position data acquired through the position data module 111 and the azimuth angle acquired using the geomagnetic sensor included in the sensing unit 130. When the navigation device 100 is mounted on the vehicle, the moving direction of the navigation device 100 can be used as the moving direction of the vehicle.

FIG. 5 is a flowchart showing a control method of the navigation device 100 according to the first embodiment of the present invention. FIGS. 6 to 26 are diagrams for explaining the control method of the navigation device 100 according to the first embodiment of the present invention.

Referring to FIG. 5, the controller 170 connects to the server 300 through the communication unit 110 (S101).

When a user authentication process is required to connect to the server 300, the control unit 170 transmits authentication information to the server 300, and connects to the server 300 when the user authentication succeeds.

The authentication information may include user identification information such as an ID, device identification information of the navigation device 100, a password, and the like. The control unit 170 may transmit authentication information input through the input unit 120 to the server 300 in order to request user authentication. Alternatively, the controller 170 may request user authentication using authentication information previously stored in the storage unit 150, which may be authentication information registered by the user.

When a specific event occurs while the vehicle equipped with the navigation device 100 is being driven (S102), the control unit 170 automatically acquires user content (S103), and automatically transmits the acquired user content and the current position of the vehicle to the server 300 (S104).

Then, the display module 141 displays guidance information for notifying that the user content acquired in the navigation system 100 is transmitted to the server 300 (S105).

In this document, an example is described in which an event is detected while the navigation system 100 is connected to the server 300, and content transmission is performed according to the occurrence of the event. However, the technical idea disclosed in this document can also be applied to a case where a new connection is made to the server 300 every time a specific event is generated, and the connection to the server 300 is disconnected when the transmission of user contents is completed.

In step S102 described above, the controller 170 may detect a specific event using the map data stored in the storage unit 150.

FIG. 6 is a flowchart showing an example of determining the occurrence of an event using map data. FIG. 7 is a diagram showing an example in which event occurrence is determined using map data.

Referring to FIG. 6, the controller 170 obtains the current position of the vehicle using the position data module 111 (S201).

Based on the current position of the vehicle, it is determined whether the vehicle has entered a specific position registered in the map data (S202). If the vehicle enters a specific position, it is determined that an event has occurred (S203).

Here, the specific position at which the event occurs may be an accident-prone area, a construction section, a drunk-driving checkpoint section, a traffic-related content collection section, a habitually congested section, and the like. The map data includes a plurality of such specific locations, and the specific locations registered in the map data can be updated as the map data is updated.

In addition, the specific position at which the event occurs may be a position registered by the user. The user may register in the map data at least one location at which user content is to be automatically collected and transmitted. For example, the user can register a position known from experience to be habitually congested, so that when the vehicle enters that position, user content is automatically acquired and transmitted to the server 300.

FIG. 7 shows an example in which the occurrence of an event is judged by entry of a car into a construction section using map data.

Referring to FIG. 7, when a construction section 7a exists on the road on which the vehicle 5 equipped with the navigation device 100 is traveling, and the construction section 7a is registered in the map data, the navigation device 100 can determine from the map data that the vehicle 5 has entered the construction section 7a.

In addition, the navigation device 100 automatically acquires user contents and transmits them to the server 300 when an event occurs in which the vehicle 5 enters the construction section 7a.

For example, when the vehicle 5 enters the construction section 7a, an image of the road obtained at the time of entering the construction section can be transmitted to the server 300. Also, for example, the running image of the vehicle acquired for a predetermined period of time after entering the construction section 7a, or acquired while traveling through the construction section 7a, can be transmitted to the server 300. The navigation device 100 can likewise determine, based on the map data, whether the vehicle 5 has exited the construction section 7a, in a manner similar to determining its entry.

Referring again to FIG. 5, in step S102, the controller 170 may determine whether a specific event has occurred based on traffic information received from the outside through the communication unit 110. The control unit 170 may connect to the server 300 or a broadcasting network through the communication unit 110 and receive traffic information from the connected server 300 or broadcasting network.

8, 10 and 12 are flowcharts illustrating examples of determining occurrence of an event using traffic information received from the outside. 9, 11, and 13 illustrate examples in which event occurrence is determined using traffic information received from the outside.

Referring to FIG. 8, the controller 170 receives traffic information from the server 300 or a broadcasting network (S301). The traffic information received in step S301 may include incident information.

When the traffic information is received from the server 300, the incident information may be information generated by the server 300 using the user content received from the other electronic devices 10, or may be incident information included in traffic information obtained from an external traffic information providing server. When the traffic information is received from a broadcasting network, the incident information may be RTM (Road Traffic Message) information included in TPEG (Transport Protocol Expert Group) information.

Incident information is information that reports various accidents and road control situations occurring on the road, and may include the locations of various accident zones and control zones. The control situations included in the incident information may include road control due to events, rallies, and demonstrations, and road control due to construction or disasters.

Referring again to FIG. 8, the controller 170 obtains the current position of the vehicle using the position data module 111 (S302). Based on the current position of the vehicle, it determines whether the vehicle has entered a location included in the incident information (S303). When the vehicle enters a location included in the incident information, it is determined that an event has occurred (S304).

FIG. 9 shows an example in which event occurrence is determined using the incident information included in the traffic information.

Referring to FIG. 9, the controller 170 receives traffic information from the outside. Based on the incident information included in the traffic information, the controller 170 acquires information on an accident point 9a on the road on which the vehicle 5 equipped with the navigation device 100 is traveling. When the vehicle 5 enters the accident point 9a, the control unit 170 determines that an event has occurred, and accordingly automatically acquires content and transmits it to the server 300.

For example, when the vehicle 5 enters the accident point 9a, the image of the road obtained at the time of entering the accident point 9a can be transmitted to the server 300. Also, for example, the running image of the vehicle acquired for a predetermined time after entering the accident point 9a can be transmitted to the server 300.

Referring to FIG. 10, the controller 170 receives traffic information from the server 300 or the broadcasting network (S401). The traffic information received in step S401 includes the average moving speed of corresponding vehicles by each section or position of the road.

In addition, the control unit 170 obtains the moving speed of the vehicle (S402) and compares it with the average moving speed obtained from the traffic information. If the difference between the two is greater than a predetermined value (S403), it is determined that the traffic information is erroneous or that the actual traffic flow has changed, and thus that an event has occurred (S404).
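The comparison in S401-S404 can be sketched as follows (illustrative only; the function name, return shape, and the 20 km/h default threshold are assumptions of this sketch):

```python
def check_speed_mismatch(reported_kmh, actual_kmh, threshold_kmh=20.0):
    """Compare the broadcast average speed for the current section with the
    vehicle's own speed; a large gap suggests stale traffic information or
    an unreported change in traffic flow (S401-S404)."""
    gap = abs(reported_kmh - actual_kmh)
    if gap > threshold_kmh:
        return {"event": "speed_mismatch", "gap_kmh": gap}
    return None
```

With the figures used in FIG. 11 (80 km/h reported, 15 km/h actual), the 65 km/h gap exceeds the threshold and an event is flagged.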

FIG. 11 shows an example in which the occurrence of an event is determined by comparing the traffic information and the moving speed of the vehicle.

Referring to FIG. 11, according to traffic information received from the outside, the average moving speed for the section in which the vehicle 5 equipped with the navigation device 100 is traveling is 80 km/h. However, as an accident has occurred ahead on the road, the current moving speed of the vehicle 5 is 15 km/h.

The control unit 170 determines that an event has occurred because the difference between the average moving speed of 80 km/h obtained from the traffic information and the actual moving speed of 15 km/h is greater than the predetermined value of 20 km/h. Since the event has occurred, content is automatically acquired and transmitted to the server 300.

For example, the actual traveling speed of the vehicle 5 can be transmitted to the server 300. Also, for example, the traveling image of the vehicle acquired for a predetermined period of time after the occurrence of the event can be transmitted to the server 300.

Referring to FIG. 12, the controller 170 receives traffic information from the server 300 or the broadcasting network (S501). The traffic information received in step S501 includes the average moving speed of corresponding vehicles by each section or position of the road. In addition, the traffic information may include information indicating a traffic flow state of each section of the road. Here, the state information indicating the traffic flow of each section is information indicating the traffic flow of the corresponding section such as congestion, slowing down, and smoothness.

The control unit 170, having received the traffic information, obtains the average moving speed or the traffic flow state information corresponding to the current position of the vehicle from the received traffic information. That is, the control unit 170 obtains the average moving speed of vehicles or the traffic flow state for the section in which the vehicle is currently traveling.

In addition, the controller 170 determines whether the vehicle has entered the congestion section of the traffic information based on the average moving speed or the traffic flow of the vehicles with respect to the section in which the vehicle is currently traveling (S502).

If it is determined that the vehicle has entered a congestion section according to the traffic information, it is determined whether the moving speed of the vehicle is less than a predetermined value (S503). If the moving speed of the vehicle is less than the preset value, it is determined that the vehicle has entered an actual congestion section, and the occurrence of an event is recognized (S504).
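The two-stage check of S502-S504 can be sketched in a few lines (illustrative only; the state labels and the 20 km/h default are assumptions of this sketch):

```python
def confirm_congestion(section_state, vehicle_speed_kmh, congested_limit_kmh=20.0):
    """S502-S504: an event fires only when the broadcast state marks the
    current section as congested AND the vehicle's own speed confirms it."""
    return section_state == "congested" and vehicle_speed_kmh < congested_limit_kmh
```

Requiring both conditions avoids firing on traffic information that has gone stale, or on a vehicle that merely slowed for reasons unrelated to congestion.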

FIG. 13 illustrates an example in which event occurrence is determined using the traffic flow state information included in the traffic information.

Referring to FIG. 13, the control unit 170 determines the traffic flow state, such as congested, slow, or smooth, for each section of the road on which the vehicle 5 equipped with the navigation device 100 is traveling, based on traffic information received from the outside.

Further, the control unit 170 determines entry of the vehicle 5 into the congestion section based on the acquired traffic flow. When the vehicle 5 enters the congestion section, it is determined that an event has occurred. In addition, since the event has occurred, the content is automatically acquired and transmitted to the server 300.

For example, the actual traveling speed of the vehicle 5 can be transmitted to the server 300. Also, for example, the driving image or the traveling image obtained while the vehicle travels through the congestion section can be transmitted to the server 300.

Referring again to FIG. 5, in step S102 described above, the controller 170 can determine the occurrence of an event using the speed change of the vehicle, that is, the acceleration. The acceleration of the vehicle can be obtained through the acceleration sensor in the sensing unit 130, or based on changes in the position data obtained through the position data module 111.

FIGS. 14 to 16 are flowcharts showing examples in which the occurrence of an event is determined using the acceleration of the vehicle. FIG. 17 shows an example in which the occurrence of an event is determined based on the acceleration of the vehicle.

Referring to FIG. 14, the controller 170 acquires the acceleration of the vehicle using the sensing unit 130 or the position data module 111 (S601).

If it is determined that the vehicle is rapidly accelerating or decelerating based on the acceleration of the vehicle (S602), the controller 170 determines that an event has occurred (S603). That is, when the vehicle rapidly decelerates or accelerates, it is determined that an event related to traffic flow or safe driving, such as entry into a congestion section or an accident, has occurred.

On the other hand, as shown in FIG. 14, when rapid deceleration or acceleration of the vehicle alone is recognized as an event, the controller 170 may unnecessarily transmit content to the server 300 in situations unrelated to traffic flow. Accordingly, the control unit 170 may detect the occurrence of an event by further using information other than the acceleration of the vehicle.

Referring to FIG. 15, the controller 170 receives traffic information from the server 300 or the broadcasting network connected through the communication unit 110 (S701). The traffic information, as described above, includes the average moving speed of corresponding vehicles by each section or position of the road. In addition, the traffic information may include information indicating a traffic flow state of each section of the road.

The control unit 170 may acquire the average moving speed or the traffic flow state information corresponding to the current position of the vehicle from the received traffic information, and may determine whether the vehicle enters the congestion section on the traffic information based on the average moving speed or the traffic flow state information.

In addition, the controller 170 acquires the acceleration of the vehicle (S702). When the vehicle is rapidly decelerating, that is, when the acceleration of the vehicle is equal to or less than a predetermined value (S703), the traffic flow state corresponding to the current position of the vehicle is obtained from the traffic information.

That is, the controller 170 obtains the average moving speed or the traffic flow state information corresponding to the current position of the vehicle from the received traffic information, and grasps the traffic flow state of the section in which the vehicle is running based on the obtained average traveling speed or the traffic flow state information.

If the current position of the vehicle corresponds to a congestion section in the traffic information (S704), the controller 170 determines that an event has occurred (S705).

That is, it is determined that the vehicle has entered the congestion section in the actual traffic flow.

Referring to FIG. 16, the controller 170 acquires the acceleration of the vehicle (S801). Then, when the vehicle is rapidly decelerating based on the acceleration of the vehicle, that is, when the acceleration of the vehicle is less than a predetermined value (S802), the moving speed of the vehicle is obtained.

In addition, when the moving speed of the vehicle remains equal to or less than a predetermined value for a predetermined time (S803), the control unit 170 determines that an event has occurred (S804).

That is, it is determined that the vehicle has entered the congestion section in the actual traffic flow.
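The S801-S804 flow, matching the t1/t2 timing of FIG. 17, can be sketched over a log of timestamped speed samples (illustrative only; the function name and the deceleration, speed, and hold-time defaults are assumptions of this sketch):

```python
def detect_congestion_entry(speed_log, decel_mps2=-3.0, low_kmh=20.0, hold_s=30):
    """Scan (t_s, speed_kmh) samples: an event fires when speed drops sharply
    (acceleration below decel_mps2) and then stays at or below low_kmh for
    at least hold_s seconds (S801-S804)."""
    for i in range(1, len(speed_log)):
        t0, v0 = speed_log[i - 1]
        t1, v1 = speed_log[i]
        accel = ((v1 - v0) / 3.6) / (t1 - t0)     # km/h change -> m/s^2
        if accel <= decel_mps2 and v1 <= low_kmh:
            # confirm the speed stays low for hold_s after the drop (time t2)
            enough_data = speed_log[-1][0] >= t1 + hold_s
            stays_low = all(v <= low_kmh
                            for t, v in speed_log[i:] if t <= t1 + hold_s)
            if enough_data and stays_low:
                return t1                          # event time (t1 in FIG. 17)
    return None
```

The hold check corresponds to the preset time t2 in FIG. 17: a brief hard stop (e.g., at a traffic light) does not fire the event unless the low speed persists.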

FIG. 17 is a graph for explaining a case where an event occurrence is determined based on the acceleration of the vehicle, and shows the change in the traveling speed of the vehicle.

Referring to FIG. 17, the traveling speed of the vehicle equipped with the navigation device 100 rapidly decreases at time t1 to 20 km/h or less, and then remains at 20 km/h or less for a certain period of time. Because the moving speed stays at or below 20 km/h, a speed corresponding to congestion, for the preset time t2 after the rapid deceleration, the control unit 170 determines that the vehicle has entered a congestion section.

If it is determined that the vehicle 5 has entered the congestion section, it is determined that an event has occurred. In addition, since the event has occurred, the content is automatically acquired and transmitted to the server 300.

For example, the actual traveling speed of the vehicle 5 can be transmitted to the server 300. Also, for example, the driving image or the traveling image obtained while the vehicle travels through the congestion section can be transmitted to the server 300.

Referring again to FIG. 5, in step S102 described above, the controller 170 can determine occurrence of an event based on the amount of impact from the outside of the vehicle. The amount of impact from the outside can be obtained by using an acceleration sensor, a gravity sensor, or the like included in the sensing unit 130.

FIG. 18 is a flowchart showing an example of determining occurrence of an event based on the amount of impact. FIG. 19 is a diagram showing an example of determining occurrence of an event based on the amount of impact.

Referring to FIG. 18, the controller 170 continuously obtains the amount of external impact on the vehicle (S901).

If the amount of change in the amount of impact is equal to or greater than a predetermined value (S902), it is determined that an event has occurred (S903). That is, if the vehicle experiences a large impact due to a damaged road, an obstacle, or an accident, this is judged as an event related to safe driving.

FIG. 19 shows an example of judging occurrence of an event based on the amount of impact.

As shown in Fig. 19(a), when the vehicle 5 equipped with the navigation system 100 travels on a damaged road, the amount of impact applied to the vehicle 5 from the outside changes abruptly for a certain period t3, as shown in Fig. 19(b). That is, the amplitude of the graph indicating the amount of impact changes abruptly during the period t3.

As shown in FIG. 19 (b), when the amount of impact is abruptly changed, the controller 170 determines that an event has occurred. In addition, since the event has occurred, the content is automatically acquired and transmitted to the server 300.
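The impact check of FIG. 18 (S901–S903) can be sketched minimally as a change-between-samples test. The threshold value and sampling scheme are illustrative assumptions:

```python
# Sketch of the impact-based event check in FIG. 18 (S901-S903).
# The threshold and the units are illustrative assumptions.
IMPACT_DELTA_THRESHOLD = 2.5  # illustrative change-in-impact threshold

def impact_event(impact_samples):
    """impact_samples: consecutive impact-amount readings, e.g. from an
    acceleration or gravity sensor (S901). Returns True when the change
    between consecutive readings reaches the threshold (S902 -> S903)."""
    for prev, cur in zip(impact_samples, impact_samples[1:]):
        if abs(cur - prev) >= IMPACT_DELTA_THRESHOLD:
            return True
    return False
```

Comparing consecutive samples approximates the abrupt amplitude change during the period t3 in FIG. 19(b); a production implementation would likely filter sensor noise first.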

For example, a running image of the vehicle obtained after the event occurs can be transmitted to the server 300. In addition, for example, a running image of the vehicle obtained from a preset time before the time when the event occurred can also be transmitted to the server 300.

Referring again to FIG. 5, in step S102 described above, the controller 170 may determine occurrence of an event based on the traveling image of the vehicle. The traveling image of the vehicle can be obtained through at least one camera 125 included in the navigation system 100. The traveling image of the vehicle can also be obtained from at least one external camera or a vehicle black box connected through the communication unit 110.

FIGS. 20 and 21 are flowcharts showing examples of determining occurrence of an event based on the traveling image of the vehicle. FIG. 22 shows an example in which occurrence of an event is determined based on the traveling image of the vehicle.

Referring to FIG. 20, the controller 170 acquires a running image of the vehicle (S1001). Also, objects included in the running image are extracted based on the image recognition of the running image of the vehicle.

If the control unit 170 determines that a specific object is included in the running image (S1002), it determines that an event has occurred (S1003).

For example, when an object corresponding to an accident vehicle or an obstacle is included in the running image of the vehicle, the controller 170 determines that an event has occurred.

In addition, for example, when the object corresponding to the special vehicle is included in the traveling image of the vehicle, such as an ambulance, the controller 170 determines that an event has occurred.

On the other hand, in step S1002, the control unit 170 may determine whether a specific object is included in the running image based on the outline, color, and the like of each object extracted from the running image.

Referring to FIG. 21, the control unit 170 acquires the traveling image of the vehicle (S1101). Also, objects included in the running image are extracted based on the image recognition of the running image of the vehicle.

The traffic flow state of the road is determined based on at least one object extracted from the traveling image (S1102). Then, when the traffic flow state corresponds to the congestion (S1103), it is determined that an event has occurred (S1104).

In step S1102, the controller 170 may determine a traffic flow using specific objects obtained from the running image.

For example, the control unit 170 may detect other vehicles from the running image, and determine whether the vehicle is stagnant based on the number of vehicles acquired from the running image, or the interval between the vehicles.
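The heuristic of step S1102 — judging congestion from the number of detected vehicles and the gaps between them — might be sketched as below, assuming vehicle detections are already available as bounding-box extents (the detection step itself, e.g. outline/color matching, is outside this sketch, and all thresholds are illustrative assumptions):

```python
# Sketch of the congestion heuristic in FIG. 21 (S1102-S1104).
# Thresholds and the bounding-box representation are illustrative assumptions.
MIN_VEHICLES = 4     # many vehicles visible suggests dense traffic
MAX_MEAN_GAP = 15.0  # pixels: small average gap between boxes suggests congestion

def traffic_congested(vehicle_boxes):
    """vehicle_boxes: list of (x_left, x_right) extents of vehicles detected
    in one frame, in image coordinates. Returns True when many vehicles are
    packed closely together (S1103 -> S1104)."""
    if len(vehicle_boxes) < MIN_VEHICLES:
        return False
    boxes = sorted(vehicle_boxes)
    gaps = [nxt[0] - cur[1] for cur, nxt in zip(boxes, boxes[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return mean_gap <= MAX_MEAN_GAP
```

A real system would also account for perspective (far vehicles appear closer together) and average over several frames.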

FIG. 22 is a view for explaining a case where an event is determined by analyzing an object obtained from the running image of the vehicle, and shows an example of a running image of a vehicle equipped with the navigation system 100.

Referring to FIG. 22, the controller 170 analyzes the running image 7 of the vehicle and extracts the objects included in the image. Based on the outlines, colors, and the like of the extracted objects, the controller 170 determines their types and shapes, and determines that specific objects, such as an accident vehicle OB1 and a tow truck OB2 towing the accident vehicle, are included in the running image 7 of the vehicle.

Also, when the specific objects OB1 and OB2 are extracted from the running image 7 of the vehicle, the controller 170 determines that an event has occurred. In addition, since the event has occurred, the content is automatically acquired and transmitted to the server 300.

For example, the driving image or the traveling image of the vehicle can be transmitted to the server 300.

Referring again to FIG. 5, in step S102 described above, the controller 170 may determine occurrence of an event based on the moving speed of the vehicle.

Fig. 23 is a flowchart showing an example of determining occurrence of an event based on the moving speed of the vehicle.

Referring to FIG. 23, the controller 170 obtains the moving speed of the vehicle (S1201).

In addition, the controller 170 determines that an event has occurred (S1203) when the moving speed of the vehicle remains equal to or less than a predetermined value for a predetermined time (S1202). That is, if the moving speed of the vehicle is equal to or less than the predetermined speed for the predetermined time, it is determined that the vehicle has entered a congestion section.

Referring again to FIG. 5, in step S102 described above, the control unit 170 can determine occurrence of an event based on data received from the vehicle electrical system.

FIG. 24 shows an example in which event occurrence is determined using data obtained from the electrical system of the vehicle.

Referring to FIG. 24, the controller 170 connects to the vehicle's electrical system through the communication unit 110 using short-range wireless or wired communication (S1301).

If it is determined that a brake malfunction of the vehicle has occurred (S1302), the control unit 170 determines that an event related to safe driving has occurred (S1303).

Referring again to FIG. 5, in step S102 described above, the controller 170 may determine occurrence of an event based on data received from an external electronic device.

FIG. 25 shows an example in which event occurrence is determined using data received from an external electronic device.

Referring to FIG. 25, the controller 170 connects to an external electronic device through the communication unit 110 (S1401).

In addition, when the control unit 170 receives specific data from the connected external electronic device (S1402), it determines that an event has occurred (S1403).

The specific data corresponding to the event can be received when a specific key of the external electronic device is manipulated. The specific key may be set by the user.

FIG. 26 shows an example of registering a specific key of an external electronic device with a shortcut key associated with automatic content transmission.

Referring to FIG. 26, the navigation system 100 connects to the external electronic device 10a (S1501). In addition, the navigation system 100 enters a mode for registering a specific key of the external electronic device as a shortcut key of the automatic content transmission menu based on the user's control input (S1502). In this mode, when specific key data is received from the external electronic device 10a as the specific key of the external electronic device 10a is operated, the navigation device 100 registers the specific key as a shortcut key for automatic content transmission.

Accordingly, each time the specific key data is input from the external electronic device 10a connected through the communication unit 110, the navigation device 100 automatically acquires the content and transmits the content to the server 300.
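The register-then-dispatch flow of FIG. 26 can be sketched as a small key-binding table. The class shape, key codes, and callback style are illustrative assumptions, not an API defined by this document:

```python
# Sketch of shortcut-key registration and dispatch (FIG. 26, S1501-S1502).
# Key codes and the callback shape are illustrative assumptions.
class ShortcutRegistry:
    def __init__(self):
        self._shortcuts = {}

    def register(self, key_code, action):
        """Registration mode (S1502): bind the key pressed on the external
        device (e.g. 10a) to an action such as automatic content transmission."""
        self._shortcuts[key_code] = action

    def on_key_data(self, key_code):
        """Called whenever key data arrives over the communication unit;
        fires the bound action and ignores unregistered keys."""
        action = self._shortcuts.get(key_code)
        if action is not None:
            return action()
        return None
```

Binding an arbitrary callable rather than a fixed flag keeps the same registry usable for any menu action, not only automatic content transmission.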

As described above with reference to FIGS. 5 to 26, the navigation system 100 according to the first embodiment of the present invention can automatically recognize an event related to traffic flow or safe driving. Accordingly, content is automatically collected at a position where information needs to be collected and transmitted to the server 300 without any operation by the user while driving, thereby minimizing the user's inconvenience. In addition, since the information is collected and transmitted automatically without diverting the user's attention, the user can drive safely.

Referring again to FIG. 5, in step S103, the controller 170 can acquire content using pre-stored text. When an event occurs in step S102, the controller 170 selects any one of a plurality of previously stored texts based on the attribute of the generated event. Then, the user content generated using the selected text is transmitted to the server 300. The event attribute may include whether the event is related to traffic flow or to safe driving. In addition, the event attribute may include an attribute of the location where the event occurs, for example, a regular congestion section, an accident congestion section, a drunk-driving enforcement section, an enforcement section, an accident occurrence point, a construction section, and the like. In addition, the event attribute may include a traffic flow state corresponding to the event, for example, congestion, slowdown, and the like.

On the other hand, the text corresponding to the attribute of the generated event can be set by the user. For example, the user can register the text 'regular congestion section' in correspondence with a regular congestion section registered in the map data. Thereafter, when the vehicle enters a position registered as a regular congestion section in the map data, the controller 170 transmits the user content generated using the 'regular congestion section' text to the server 300.
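This text-selection step reduces to a lookup from event attribute to registered text. The table entries, attribute keys, and default below are illustrative assumptions (the passage above notes the texts are user-configurable):

```python
# Sketch of selecting pre-stored text by event attribute (step S103).
# Table entries, keys, and the default text are illustrative assumptions.
EVENT_TEXTS = {
    "regular_congestion": "Regular congestion section",
    "accident_congestion": "Accident congestion section",
    "construction": "Construction section",
}

def user_content_for(event_attribute, texts=EVENT_TEXTS):
    """Pick the registered text matching the event attribute; the user may
    override entries by passing a customized table. Unknown attributes fall
    back to a generic text rather than failing."""
    return texts.get(event_attribute, "Traffic event")
```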

In addition, in step S103, when an event occurs, the controller 170 may generate user content using the audio obtained through the microphone 123.

In addition, in step S103, when an event occurs, the controller 170 may generate the user content using the traveling image (image or moving picture) of the vehicle. The traveling image of the vehicle can be obtained through at least one camera 125 included in the navigation system 100. The traveling image of the vehicle can also be obtained from at least one external camera or a vehicle black box connected through the communication unit 110.

The control unit 170 may include the traveling image of the vehicle in the user content in the form of an image or a moving image.

For example, when an event occurs, the control unit 170 acquires at least one image from the running image of the vehicle, and transmits the generated user content to the server 300 using the at least one image.

For example, when an event occurs, the control unit 170 acquires the moving image of the vehicle for a predetermined time in the form of a moving image, and transmits the generated user content to the server 300 using the acquired moving image. In the case of generating the user content using the running image of the vehicle, the controller 170 may generate the user content using the running image of the vehicle obtained for a certain period of time after the occurrence of the event. In addition, the controller 170 may acquire the user content using the traveling image of the vehicle photographed for a predetermined period of time before the preset time based on the time when the event occurred.

The period over which the traveling image of the vehicle is acquired for user content creation may vary depending on the event attribute. When, according to the event attribute, a traveling image of the vehicle in a certain area is required, the controller 170 can acquire the traveling image from when the vehicle enters the corresponding area until it exits, in the form of at least one image or moving picture.

For example, when the vehicle enters a section registered in the map data as a regular congestion section, the control unit 170 can acquire the traveling image from when the vehicle enters the regular congestion section until it exits, in the form of at least one image or moving picture.

When a traveling image of the vehicle is required for a certain period based on the moment an event occurs, such as when the vehicle suddenly decelerates, the control unit 170 can acquire the traveling image for that period in the form of at least one image or moving picture.

In addition, the period over which the traveling image of the vehicle is acquired for generating the user content may vary depending on the subject that acquires the traveling image. When the subject acquiring the traveling image of the vehicle is a vehicle black box, the controller 170 can obtain not only the traveling image currently captured by the vehicle black box but also previously captured traveling images. Accordingly, when a traveling image from before the occurrence of the event is required according to the generated event, the controller 170 can generate user content using the traveling image of the vehicle obtained from a time before the occurrence of the event.
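Supplying footage from before an event implies some form of continuous buffering on the black-box side. A minimal sketch using a ring buffer of recent frames (the buffer size, frame representation, and class shape are illustrative assumptions):

```python
from collections import deque

# Sketch of a pre-event recording buffer, as a black box that can supply
# footage from before an event would need. Sizes are illustrative.
class PreEventBuffer:
    def __init__(self, max_frames=300):   # e.g. roughly 10 s at 30 fps
        self._frames = deque(maxlen=max_frames)

    def record(self, frame):
        """Continuously record; the oldest frames fall off the front."""
        self._frames.append(frame)

    def clip_before_event(self, n_frames):
        """Return up to the last n_frames recorded before the event."""
        return list(self._frames)[-n_frames:]
```

The bounded `deque` makes the pre-event window constant-memory regardless of how long the vehicle has been driving.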

Referring again to FIG. 5, in step S104, the controller 170 may transmit the current position, the moving speed, and the moving direction of the vehicle together with the user content to the server 300.

In step S105, the control unit 170 displays text, an icon, and the like on the screen through the display module 151 to notify the user when the transfer of the user content is completed. However, this document is not limited thereto. The technical idea disclosed in this document can also be applied to a case where, when the transmission of the user content is completed, no text, icon, or the like notifying the user is displayed on the screen.

Hereinafter, with reference to necessary drawings, the control method of the navigation system 100 according to the second embodiment of the present invention and the operation of the navigation system 100 for implementing the method will be described in more detail. Embodiments disclosed in this document can be implemented in the electronic device 100 described with reference to Fig.

Hereinafter, the operation of the navigation system 100 for implementing the second embodiment disclosed in this document will be described in more detail.

The motion sensing module 131 of the sensing unit 130 may include a gyro sensor, an acceleration sensor, a geomagnetic sensor, a gravity sensor, and the like. The motion sensing module 131 acquires the movement of the vehicle equipped with the navigation device 100.

The control unit 170 acquires the acceleration of the vehicle, the amount of impact from the outside, and the like based on the movement of the navigation system 100 acquired through the sensing unit 130.

In addition, the control unit 170 acquires the current position of the vehicle. The method of acquiring the current position of the vehicle is the same as the position acquiring method disclosed in the first embodiment of the present invention, and therefore, detailed description thereof will be omitted.

In addition, the controller 170 connects to the server 300 through the communication unit 110. Further, it is connected to another electronic device 10 through the communication unit 110.

The control unit 170 constitutes the social network 400 with other electronic devices 10 through communication with the server 300.

The control unit 170 also transmits the selected user content to other electronic devices 10 included in the social network 400 based on the user's control input. The control input of the user may be received through the input unit 120 or from an external electronic device connected through the communication unit 110. In the latter case, the control unit 170 may connect to at least one external electronic device through the short-range communication module 117 of the communication unit 110 and receive the control input from the connected external electronic device. The external electronic device connected through the short-range communication module 117 may include a smart phone, a mobile phone, a notebook computer, a PDA, a tablet PC, and the like.

When a specific event occurs, the control unit 170 automatically acquires user content related to the generated event, and automatically transmits the acquired user content to another electronic device 10 included in the social network 400.

The control unit 170 may also transmit the current location of the vehicle to other electronic devices 10 included in the social network 400.

In addition, the control unit 170 may transmit information related to the traffic flow, obtained based on the position of the vehicle, to other electronic devices 10 included in the social network 400. The traffic information obtained based on the position of the vehicle may include the moving speed of the vehicle, its moving direction, and the like.

FIG. 27 is a flowchart showing a control method of the navigation system 100 according to the second embodiment of the present invention. FIG. 28 is a diagram for explaining the control method of the navigation system 100 according to the second embodiment of the present invention.

Referring to FIG. 27, the controller 170 connects to the server 300 through the communication network 200 (S1701). In the second embodiment of the present invention, as in the first embodiment, when a user authentication process is required, it is performed using the authentication information. If the user authentication succeeds, the connection to the server 300 is established.

In addition, the control unit 170 requests the server 300 to configure a social network for sharing the user content obtained on the road, based on the user's control input (S1702). In addition, the controller 170 configures a social network with at least one electronic device 10 selected by the server 300 (S1703). In this document, the case where the navigation system 100 first connects to the server 300 and then requests the social network configuration is described as an example. However, the technical idea disclosed in this document is also applicable to the case of connecting to the server 300 when a request for the social network configuration is received from the user.

In step S1702, the control unit 170 transmits the current position of the navigation device 100, that is, the current position of the vehicle equipped with the navigation device 100, to the server 300, and requests the server 300 to configure a network with electronic devices 10 located near the current position.

The control unit 170 may also transmit the moving direction, the destination, the search radius, etc. of the vehicle together with the current position of the vehicle.

When the moving direction of the vehicle is transmitted to the server 300, the server 300 may support the network configuration by selecting electronic devices 10 whose moving direction matches the moving direction of the vehicle.

In addition, when the destination is transmitted to the server 300, the server 300 may support the network configuration by selecting the electronic devices 10 located on the route on which the vehicle is moving.

In addition, when transmitting the search radius to the server 300, the server 300 may support the network configuration by selecting the electronic devices 10 located within the radius based on the current position of the vehicle.
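The radius-based selection just described might be sketched as below, assuming each candidate device reports a flat (x, y) position in kilometres; real geodesic distance and the direction/destination filters above are omitted, and all names are illustrative assumptions:

```python
import math

# Sketch of selecting network members within a search radius (cf. FIG. 28).
# Flat-plane coordinates are an illustrative simplification of real geodesy.
def select_members(vehicle_pos, devices, radius_km):
    """vehicle_pos: (x, y) of the requesting vehicle; devices: dict of
    device id -> (x, y). Returns the ids located within radius_km of the
    vehicle, in sorted order."""
    vx, vy = vehicle_pos
    return sorted(
        dev_id
        for dev_id, (x, y) in devices.items()
        if math.hypot(x - vx, y - vy) <= radius_km
    )
```

Direction matching could be layered on top by also comparing each device's reported heading against the vehicle's.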

Fig. 28 shows an example of constituting a social network.

Fig. 28 shows an example of configuring a network based on the current position SP of the vehicle. Referring to Fig. 28, when a network configuration for sharing user content is requested by the navigation system 100, the server 300 acquires the current position of the navigation system 100, that is, the current position SP of the vehicle equipped with the navigation system 100. Here, the current position SP may be received from the navigation device 100 or may be obtained from a base station included in the communication network 200.

In addition, the server 300 acquires the change in position of the vehicle over a predetermined period, and acquires the moving direction of the vehicle based on the acquired position change. Further, a certain area A is set based on the current position and the moving direction of the vehicle. The server 300 selects the other electronic devices T1 to T5 located in the set area A and supports a social network between the selected electronic devices T1 to T5 and the navigation device 100.

On the other hand, the social network 400 configured at the request of the navigation system 100 includes the electronic devices 10 selected based on the current position of the navigation device 100, as described above. Therefore, it can be updated according to the current position of the navigation device 100. While the network 400 is maintained, the server 300 continuously acquires the position of the navigation device 100 and continuously updates the electronic devices 10 included in the network 400 based on that position.

Even if the position of the navigation device 100 does not change, when the positions of the other electronic devices 10 included in the network 400 change, the server 300 continuously updates the electronic devices 10 included in the network 400. An electronic device 10 that no longer satisfies the conditions of the network configuration can be excluded from the network, and an electronic device 10 that newly satisfies the criteria can be included in the network 400.

Referring again to FIG. 27, when a specific event occurs while the vehicle equipped with the navigation system 100 is driving (S1704), the controller 170 automatically acquires user content (S1705). The acquired user content is automatically transmitted to at least one electronic device included in the network 400 formed in step S1703 (S1706). When the transmission of the user content is completed, guidance information indicating this is displayed on the screen through the display module 151 (S1707).

In the second embodiment of the present invention, steps S1704 to S1707 are performed in a manner similar to steps S102 to S105 described in the first embodiment of the present invention, and therefore, detailed description thereof will be omitted in this document.

According to embodiments of the present invention, the navigation system automatically determines an event related to traffic flow or safe driving and, when the event occurs, transmits automatically acquired user content to a server providing traffic information, so that the server can grasp the situation on the road in real time. Further, the content generated by the navigation system is transmitted to other electronic devices, so that the other electronic devices can recognize the situation on the road in real time based on the content.

In addition, by collecting and transmitting content related to road conditions without requiring the user to perform any operation while driving, information necessary for grasping the traffic flow can be provided safely to other electronic devices or to a server that provides traffic information.

Embodiments of the present invention include computer-readable media including program instructions for performing various computer-implemented operations. The media record a program for executing the navigation service method described so far. The media may include program instructions, data files, data structures, and the like, alone or in combination. Examples of such media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CDs and DVDs; and ROM, RAM, flash memory, and the like. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that may be executed by a computer using an interpreter.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the invention. Accordingly, such modifications or variations are intended to fall within the scope of the appended claims.

Claims (43)

A method for providing traffic-related contents in an electronic device,
Connecting to an external electronic device through a short-range wireless communication module;
Receiving a control input from the external electronic device;
Acquiring a traveling image of a vehicle for a predetermined period of time based on a time point at which the event occurs when a specific event occurs according to an amount of impact detected by the sensing unit; And
And automatically transmitting traffic-related contents including the obtained running image to a server using a wireless communication network,
Wherein the server stores the traveling image, supports a network configuration for content sharing between a plurality of electronic devices, and transmits the stored traveling image to an electronic device connected to the network.
The method according to claim 1,
Further comprising performing authentication with the server to transmit the traffic-related content, wherein the authentication information used for the authentication includes device identification information of the electronic device.
The method according to claim 1,
Wherein the automatically transmitting comprises:
And automatically transmitting the traffic-related content including the obtained running image based on the received control input.
The method according to claim 1,
Wherein the server is a server for transmitting the stored traffic related content to another electronic device included in a social network connected to the server based on an input received from the external electronic device.
The method according to claim 1,
And when the event occurs, newly connecting to the server for transmission of the traffic-related contents.
A method for providing traffic-related contents in an electronic device,
Connecting to at least one vehicle black box through a short range wireless communication module;
Receiving from a black box a traveling image of a vehicle for a predetermined period of time based on a time point at which the event occurs when a specific event occurs according to an amount of an external impact on the vehicle;
And automatically transmitting traffic-related contents including the received traveling image to a server,
Wherein the server stores the traveling image, supports a network configuration for content sharing between a plurality of electronic devices, and transmits the stored driving image to an electronic device connected to the network.
The method according to claim 6,
Further comprising performing authentication with the server to transmit the traffic-related content, wherein the authentication information used for the authentication includes device identification information of the black box for a vehicle.
The method according to claim 6,
Wherein the receiving comprises:
Receiving a driving image obtained in the vehicle black box based on a control input; And
And automatically transmitting traffic-related contents including the received traveling image.
The method according to claim 6,
Wherein the server is a server for transmitting the stored traffic related content to another electronic device included in a social network connected to the server based on an input received from the electronic device.
The method according to claim 6,
And when the event occurs, newly connecting to the server for transmission of the traffic-related contents.
A traffic-related content providing method of a server,
Receiving, from a first electronic device, traffic-related contents including a traveling image of the vehicle for a predetermined period of time based on a time point at which a specific event occurs, the specific event occurring according to an amount of an external impact obtained through a sensing part of a black box of a vehicle, the traffic-related contents being automatically transmitted from the first electronic device;
Storing traffic related contents including the traveling image;
Supporting a network configuration for content sharing between a plurality of electronic devices; And
And transmitting the stored driving image to a second electronic device connected to the network.
12. The method of claim 11,
Further comprising performing authentication with the first electronic device to receive the traffic related content, wherein the authentication information used for the authentication includes the device identification information of the black box.
12. The method of claim 11,
Wherein the receiving comprises:
And receiving traffic-related contents automatically received by the first electronic device from the black box based on the control input of the first electronic device.
12. The method of claim 11,
And transmitting the stored traffic related content to another electronic device included in a social network connected to the server based on an input received from the first electronic device.
12. The method of claim 11,
And when the event occurs, newly connecting to the first electronic device for receiving the traffic related content.
16. A recording medium on which a program for executing the method disclosed in any one of claims 1 to 15 is recorded in a computer.
KR1020110018546A 2011-03-02 2011-03-02 Electronic device and control method of electronic device KR101451765B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020110018546A KR101451765B1 (en) 2011-03-02 2011-03-02 Electronic device and control method of electronic device
PCT/KR2012/001491 WO2012118320A2 (en) 2011-03-02 2012-02-28 Electronic device and method for controlling electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110018546A KR101451765B1 (en) 2011-03-02 2011-03-02 Electronic device and control method of electronic device

Publications (2)

Publication Number Publication Date
KR20120099981A KR20120099981A (en) 2012-09-12
KR101451765B1 true KR101451765B1 (en) 2014-10-20

Family

ID=46758371

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110018546A KR101451765B1 (en) 2011-03-02 2011-03-02 Electronic device and control method of electronic device

Country Status (2)

Country Link
KR (1) KR101451765B1 (en)
WO (1) WO2012118320A2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102219268B1 (en) * 2014-11-26 2021-02-24 한국전자통신연구원 Navigation System Cooperating Routes of Explorer and Controlling Method Thereof
EP3845427A1 (en) * 2015-02-10 2021-07-07 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation
KR102464898B1 (en) * 2016-01-05 2022-11-09 삼성전자주식회사 Method and apparatus for sharing video information associated with a vihicle
CN105679064A (en) * 2016-03-25 2016-06-15 北京新能源汽车股份有限公司 Vehicle terminal and system used for road condition monitoring, and automobile
KR102297801B1 (en) * 2019-12-11 2021-09-03 (주)케이웍스 System and Method for Detecting Dangerous Road Condition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050108393A * 2003-03-14 2005-11-16 Navitime Japan Co., Ltd. Navigation device, navigation system, navigation method, and program
KR20090104299A (en) * 2008-03-31 2009-10-06 유종태 Vehicle Black Box
JP2009276081A (en) 2008-05-12 2009-11-26 Sony Corp Navigation device and information providing method
KR101019915B1 2010-03-05 2011-03-08 팅크웨어(주) Server, navigation system, navigation for vehicle and video providing method of navigation for vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100926681B1 (en) * 2007-08-29 2009-11-17 주식회사 아이리버 Navigation information provision system and method
KR100914868B1 (en) * 2007-11-14 2009-09-02 (주)컨버전스스퀘어 Terminal and Method of Displaying Information of Contents


Also Published As

Publication number Publication date
KR20120099981A (en) 2012-09-12
WO2012118320A2 (en) 2012-09-07
WO2012118320A3 (en) 2012-12-20

Similar Documents

Publication Publication Date Title
US11271890B2 (en) Electronic device, server, and control method and location information providing method for the electronic device
KR101677638B1 (en) Mobile terminal system and control method thereof
KR20190109625A (en) Method of providing detailed map data and system therefor
CN104380047B (en) Navigation system
KR101451765B1 (en) Electronic device and control method of electronic device
JP2014236493A (en) Message notification system, message transmitter/receiver device, program and recording medium
KR101440334B1 (en) Server for providing traffic information, electonic device and method for providing traffic information
KR20120109899A (en) Electronic device and navigation service method of electronic device
JP6210629B2 (en) Navigation device, map display control method, and map display control program
KR101421613B1 (en) Electronic device, server, mehotd for controlling of the electronic device and method for providing of traffic information
KR101854663B1 (en) Server and method for providing information and electrinic device and method for receiving and using information from server
KR101952341B1 (en) Electronic device, server and method for providing traffic information
KR101861355B1 (en) Traffic information providing method, electronic device, server and system for implementing the method
KR101854665B1 (en) Electronic device, server, method and system for providing user contents
KR101105145B1 (en) Server, electronic device and traffic information providing method of electronic device
KR101296294B1 (en) Electronic device and schedule managing method of electronic device
KR102328015B1 (en) Electronic device, server and method for providing traffic information
KR102219901B1 (en) Electronic device, server and method for providing traffic information
KR102305136B1 (en) Electronic device, server and method for providing traffic information
KR102362471B1 (en) Electronic device, server and method for providing traffic information
KR102335466B1 (en) Electronic device, server and method for providing traffic information
KR102057933B1 (en) Electronic device, server and method for providing traffic information
KR101448264B1 (en) Apparatus for presenting traffic information based on location information and Method thereof
KR20140019124A (en) Path searching method by drawing, apparatus and system therefor
KR20180029000A (en) Electronic device, method and system for providing contents relate to traffic

Legal Events

Date Code Title Description
N231 Notification of change of applicant
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20170927

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20181010

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20191007

Year of fee payment: 6