CN108470162B - Electronic device and control method thereof


Info

Publication number
CN108470162B
CN108470162B (application CN201810214077.7A)
Authority
CN
China
Prior art keywords
signal
vehicle
region
image data
driving
Prior art date
Legal status
Active
Application number
CN201810214077.7A
Other languages
Chinese (zh)
Other versions
CN108470162A (en)
Inventor
李韩雨
Current Assignee
Hyundai Motor Co
Kia Corp
Original Assignee
Hyundai Motor Co
Kia Corp
Priority date
Filing date
Publication date
Application filed by Hyundai Motor Co, Kia Corp
Publication of CN108470162A
Application granted
Publication of CN108470162B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G06T2207/10024 - Color image
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/582 - Recognition of traffic signs

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)

Abstract

Disclosed are an electronic device, a control method of the electronic device, and a computer-readable recording medium. The control method of the electronic device comprises the following steps: generating signal type information using image data of the signal area portion of a traffic light within driving-related image data of a vehicle; and performing driving-related guidance of the vehicle using the generated signal type information.

Description

Electronic device and control method thereof
This application is a divisional application of the invention patent application with parent application number 201510333358.0 (filed June 16, 2015, and entitled "Electronic device, control method for electronic device, and computer-readable recording medium").
Technical Field
The present invention relates to an electronic device, a method for controlling the electronic device, and a computer-readable recording medium, and more particularly, to an electronic device, control method, and computer-readable recording medium capable of recognizing the signal type of a traffic light and executing driving-related guidance for a vehicle based on that signal type.
Background
When a vehicle is traveling, the most important concerns are driving safely and preventing traffic accidents. To this end, vehicles are equipped with various auxiliary devices that control the posture of the vehicle and its structural components, as well as safety devices such as seat belts and airbags.
In addition, devices such as black boxes installed in vehicles have recently been storing driving images of the vehicle and data transmitted from various sensors, so devices for finding the cause of a traffic accident are increasingly installed in vehicles. Black box and navigator applications can also be mounted on portable terminals such as smartphones and tablet computers, which are therefore used as such vehicle devices.
However, such vehicle devices make very little use of the driving image. More specifically, even though a driving image of the vehicle can currently be obtained through a vision sensor such as a camera mounted on the vehicle, the vehicle's electronic device merely displays and transmits such images, or generates only simple surrounding-notification information, such as whether the vehicle has departed from its lane line.
Further, Head-Up Displays (HUD) and augmented reality interfaces have been proposed as new vehicle electronic devices that are currently attracting attention, but even in these devices the use of the vehicle's driving image remains limited to simple display or the generation of simple notification information.
Disclosure of Invention
The present invention has been made to solve the above problems, and an object of the present invention is to provide an electronic device, a method of controlling the electronic device, and a computer-readable recording medium that generate signal type information of a traffic light using driving-related image data of a vehicle and execute driving-related guidance of the vehicle based on the signal type information.
A control method of an electronic apparatus according to an embodiment of the present invention for achieving the above object includes: generating signal type information using image data of a signal area portion of a signal lamp in driving-related image data of a vehicle; and executing the driving-related guidance of the vehicle using the generated signal type information.
The present invention may further include a step of determining whether or not the vehicle is in a stopped state using the driving-related image data, and the step of generating the signal type information may be executed if it is determined that the vehicle is in a stopped state.
The step of determining whether the vehicle is in a stopped state may include: generating gray-scale image data of the driving-related image data; and comparing a plurality of frames included in the generated grayscale image data, respectively, to determine whether or not the vehicle is in a stopped state.
Further, the step of generating the signal type information may include: a step of determining a region of interest including the traffic light in the driving-related image data; and converting the determined image data of the region of interest based on a preset pixel value to generate image data of the region of interest.
The step of generating the signal type information may include: applying a first region having a first area and a second region including the first region and having a second area to the region-of-interest image data to detect image data of the signal region portion of the traffic light; comparing the difference between the area pixel value of the second region and the area pixel value of the first region with a preset area pixel value; and generating signal type information of the traffic light based on the comparison result.
The signal type information may be information for identifying a plurality of types of signals that can be displayed on the signal lamp.
The step of performing the driving-related guidance of the vehicle may include a step of outputting signal guidance using the signal type information.
The step of executing the driving-related guidance of the vehicle may include a step of outputting a signal change guidance using route information used for navigation of the vehicle and the signal type information.
The step of outputting the signal change guidance may be performed when the vehicle has maintained the stopped state for a predetermined time from the signal change time point of the traffic light.
The step of outputting the signal change guide may include: generating an indicator for executing the driving-related guidance; and outputting the generated indicator through augmented reality.
On the other hand, an electronic apparatus according to an embodiment of the present invention for achieving the above object includes: a signal type information generation unit that generates signal type information using image data of a signal area portion of a traffic light in driving-related image data of a vehicle; and a control unit configured to execute a driving-related guidance of the vehicle using the generated signal type information.
The present invention may further include a driving state determination unit that determines whether or not the vehicle is in a stopped state using the driving-related image data, and the control unit may control the signal type information generation unit to generate signal type information of a traffic light if it is determined that the vehicle is in the stopped state.
The driving state determination unit may generate grayscale image data of the driving-related image data, and compare a plurality of frames included in the generated grayscale image data, respectively, to determine whether or not the vehicle is in a stopped state.
The signal type information generating unit may determine a region of interest including the traffic signal in the driving-related image data, and may convert the image data of the determined region of interest with reference to a predetermined pixel value to generate the region of interest image data.
The signal type information generating unit may apply a first region having a first area and a second region including the first region and having a second area to the region-of-interest image data to detect the image data of the signal region portion of the traffic light, compare the difference between the area pixel value of the second region and the area pixel value of the first region with a predetermined area pixel value, and generate the signal type information of the traffic light based on the result of the comparison.
The signal type information may be information for identifying a plurality of types of signals that can be displayed on the signal lamp.
The control unit may control the output unit to output a signal guide using the signal type information.
The control unit may control the output unit to output a signal change guide using route information used for navigation of the vehicle and the signal type information.
The control unit may control the output unit to output the signal change guidance when the vehicle maintains the stopped state for a predetermined time period from the signal change time point of the traffic light.
The control unit may generate an indicator for executing the driving-related guidance and control the output unit to output the generated indicator through augmented reality.
On the other hand, a recording medium of an embodiment of the present invention for achieving the above object may have program codes recorded therein for executing the control method of the above electronic apparatus in a computer.
According to the various embodiments of the present invention described above, signal type information of a traffic light located on the road where a vehicle is stopped or traveling can be generated, and driving-related guidance of the vehicle can be performed based on it.
Also, according to various embodiments of the present invention, signal guidance is presented to the driver of the vehicle using the signal type information, thereby assisting the driver.
Also, according to various embodiments of the present invention, accurate signal change guidance is performed using the navigation route and the signal type information, providing convenience to the user.
Also, according to various embodiments of the present invention, vehicle driving-related guidance such as navigation, signal guidance, and signal change guidance can be performed in augmented reality, thereby providing guidance to the driver in a more intuitive manner.
Drawings
Fig. 1 is a block diagram of an electronic device according to an embodiment of the invention.
Fig. 2 is a diagram illustrating a system network connected to an electronic device according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a control method of an electronic device according to an embodiment of the invention.
Fig. 4 is a flowchart specifically illustrating a method for determining a stopped state of a vehicle according to an embodiment of the present invention.
Fig. 5 is a diagram illustrating a process of generating grayscale image data from driving-related image data according to an embodiment of the present invention.
Fig. 6 is a flowchart specifically illustrating a signal type information generating method of an electronic device according to an embodiment of the invention.
Fig. 7 is a diagram illustrating a process of generating image data of a region of interest from driving-related image data according to an embodiment of the present invention.
Fig. 8 is a diagram illustrating region-of-interest image data corresponding to a stop signal and a method of determining the stop signal according to an embodiment of the present invention.
Fig. 9 is a diagram illustrating region-of-interest image data corresponding to a straight signal and a method of determining the straight signal according to an embodiment of the present invention.
Fig. 10 is a flowchart illustrating a signal change guidance method of an electronic device according to an embodiment of the invention.
Fig. 11 is a diagram showing a signal change guidance screen according to an embodiment of the present invention.
Fig. 12 is a diagram showing an embodiment of the present invention in a case where the camera and the electronic apparatus are separated.
Fig. 13 is a diagram showing an embodiment of the present invention in a case where a camera and an electronic apparatus are integrated.
Fig. 14 is a diagram showing an embodiment of a head-up display and an electronic apparatus according to an embodiment of the present invention.
Detailed Description
The following merely illustrates the principles of the invention. Thus, those skilled in the art to which the invention pertains may devise various arrangements that, although not explicitly described or shown herein, embody the principles of the invention and are included within its concept and scope. In addition, all the conditional terms and embodiments recited in the present invention are used for understanding the concept of the present invention in principle and are not to be construed as limiting the embodiments and states specifically recited as described above.
Moreover, all detailed descriptions that exemplify the principles, aspects, and embodiments of the present invention and set forth particular embodiments are to be understood as including structural and functional equivalents of such items. Also, such equivalents should be understood to include both currently disclosed equivalents as well as equivalents developed in the future, i.e., all elements invented to perform the same function, regardless of structure.
Thus, for example, the block diagrams of the present specification should be understood to represent conceptual views of illustrative circuitry embodying the principles of the invention. Similarly, all flowcharts, state transition diagrams, pseudo code, and the like should be understood as representing various processes that may be substantially represented in a computer-readable medium and executed by a computer or processor, whether or not the computer or processor is explicitly shown.
The functions of the various elements shown in the drawings, including functional blocks depicted as processors or in a similar concept, may be provided not only as dedicated hardware but also as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
Also, explicit use of terms such as processing, control, or similar concepts herein should not be construed as referring exclusively to hardware capable of running software, and should be understood to implicitly include, without limitation, Digital Signal Processor (DSP) hardware, Read Only Memory (ROM) for storing software, Random Access Memory (RAM), and non-volatile storage. Other well-known conventional hardware may also be included.
In the claims of this specification, an element expressed as a means for performing a function described in the detailed description includes, for example, any combination of circuit elements that performs that function, or software in any form, including firmware and microcode, combined with appropriate circuitry for executing that software to perform the function. Since the invention defined by such claims may combine the functionality provided by the enumerated means in any manner the claims require, any means capable of providing the described functionality should be understood as equivalent to the means that can be grasped from this specification.
The above objects, features and advantages will become more apparent from the following detailed description with reference to the accompanying drawings, and thus, it is possible for those skilled in the art to easily implement the technical idea of the present invention. In describing the present invention, if it is determined that detailed description of known techniques may obscure the gist of the present invention, detailed description thereof will be omitted.
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the invention. Referring to fig. 1, electronic device 100 includes all or a part of storage unit 110, input unit 120, output unit 130, signal type information generation unit 140, driving state determination unit 150, augmented reality supply unit 160, control unit 170, communication unit 180, and detection unit 190.
Here, the electronic device 100 may be embodied as a smart phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), smart glasses, augmented reality glasses, a navigator (navigation), a Black box (Black-box), and the like, which can provide driving-related guidance to a vehicle driver.
Here, the driving state of the vehicle may include various states of the vehicle operated by the driver, such as a stopped state of the vehicle, a traveling state of the vehicle, a parked state of the vehicle, and the like.
The driving-related guidance may include various kinds of guidance for assisting the driver in driving the vehicle, such as navigation, lane line departure guidance, preceding vehicle departure guidance, signal guidance, signal change guidance, forward collision prevention guidance, lane change guidance, and lane guidance.
Here, the navigation may include: augmented reality navigation, which performs navigation by capturing an image ahead of the vehicle while the vehicle is traveling and combining it with various information such as the position and direction of the user; and two-Dimensional (2D) or three-Dimensional (3D) navigation, which is performed by combining various information such as the position and direction of the user with two-dimensional or three-dimensional map data. Here, navigation may be interpreted to include not only navigation while the user drives a vehicle but also navigation while the user moves on foot, whether walking or running.
The lane line departure guide may be a guide for guiding whether or not the traveling vehicle departs from the lane line.
The preceding vehicle departure guidance may indicate whether a vehicle located in front of one's own stopped vehicle has departed.
The signal guidance may be guidance of a signal state of a traffic light located in front of the driving vehicle, such as a stop signal, a straight signal, a left turn signal, and a right turn signal. Here, the color and form of the signal corresponding to each signal may be different depending on the country. Taking korea as an example, the stop signal may be a red circle, the straight signal may be a blue circle, the left turn signal may be a blue left arrow, and the right turn signal may be a blue right arrow.
The signal change guidance may be guidance that the signal state of a traffic light located in front of the vehicle being driven has changed. For example, guidance is provided when the stop signal changes to the straight signal.
The guidance for preventing a collision with the preceding vehicle may be guidance for preventing a collision with the preceding vehicle if the distance between the vehicle in a stopped state or a traveling state and the vehicle located in front is within a predetermined distance.
The lane change guidance may be a guidance for guiding the vehicle to change from the lane where the vehicle is located to another lane in order to guide the route to the destination.
The lane guidance may be guidance for a lane on which the vehicle is currently located.
Such driving-related images capable of providing various guides can be photographed in real time by a camera placed toward the front of the vehicle. Here, the camera may be a camera that is integrally formed with the electronic device 100 placed in the vehicle and photographs the front of the vehicle. In this case, the camera may be integrated with the smartphone, the navigator or the black box, and the electronic device 100 may receive an image photographed by the integrated camera.
As another example, the camera may be a camera that is placed in a vehicle separately from the electronic device 100 and that captures an image of the front of the vehicle. In this case, the camera may be a separate black box placed toward the front of the vehicle, and the electronic device 100 may receive the image photographed by the separately placed black box through wired/wireless communication, or if a storage medium for storing the image photographed by the black box is inserted into the electronic device 100, the electronic device 100 may receive the image photographed by the black box.
Hereinafter, the electronic device 100 according to an embodiment of the invention will be described in more detail based on the above description.
The storage unit 110 performs a function of storing various data and applications necessary for the operation of the electronic apparatus 100. In particular, the storage unit 110 may store data required for the operation of the electronic device 100, such as an Operating System (OS), a route search application, map data, and the like. The storage unit 110 may store data generated by the operation of the electronic device 100, such as the searched route data and the received video. The storage unit 110 may store positional relationship information of a plurality of types of signals included in the signal lamp.
Here, the storage unit 110 may be embodied as a removable storage element such as a Universal Serial Bus (USB) memory, as well as a built-in storage element such as a Random Access Memory (RAM), a flash memory, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a register, a hard disk, a removable disk, a memory card, or a Universal Subscriber Identity Module (USIM).
The input unit 120 performs a function of converting a physical input from the outside of the electronic device 100 into a specific electric signal. Here, the input part 120 may include all or a part of the user input part 121 and the microphone part 123.
The user input unit 121 can receive user input such as touch and push operations. Here, the user input unit 121 may be embodied by at least one of various button configurations, a touch sensor that receives a touch input, and a proximity sensor that receives a proximity operation.
The microphone unit 123 can receive the voice of the user and the sound generated from the inside and outside of the vehicle.
The output unit 130 is a device for outputting data of the electronic device 100. Here, the output part 130 may include all or a part of the display part 131 and the audio output part 133.
The display unit 131 is a device that outputs data of the electronic device 100 in a visually recognizable form. The display unit 131 may be embodied as a display provided on the front of the housing of the electronic device 100. Here, the display unit 131 may be integrated with the electronic device 100 to output visual identification data, or may be provided separately from the electronic device 100, such as a head-up display, to output visual identification data.
The audio output unit 133 is a device that outputs data of the electronic device 100 in an audibly recognizable form. The audio output unit 133 may express the data of the electronic device 100 to be notified to the user as sound through a speaker.
The communication unit 180 may be provided so that the electronic device 100 can communicate with other devices. The communication unit 180 may include all or a part of the position data unit 181, the wireless internet unit 183, the broadcast transmitting/receiving unit 185, the mobile communication unit 186, the short-range communication unit 187, and the wired communication unit 189.
The position data unit 181 is a device that obtains position data through a Global Navigation Satellite System (GNSS). A global navigation satellite system is a navigation system that can calculate the position of a receiving terminal using radio signals received from artificial satellites. Specific examples of global navigation satellite systems, depending on the operating entity, include the Global Positioning System (GPS), the Galileo Positioning System (Galileo), the GLONASS navigation system (GLONASS), the COMPASS navigation satellite system (COMPASS), the Indian Regional Navigation Satellite System (IRNSS), and the Quasi-Zenith Satellite System (QZSS). The position data unit 181 of the electronic device 100 according to an embodiment of the present invention may obtain position information by receiving a signal of the global navigation satellite system providing service in the region where the electronic device 100 is used.
The wireless internet unit 183 is a device that obtains data or transmits information by connecting to the wireless internet. The wireless internet connectable through the wireless internet unit 183 may be a Wireless Local Area Network (WLAN), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), or the like.
The broadcast transmitting/receiving unit 185 is a device that transmits and receives broadcast signals through various broadcast systems. The broadcast systems supported by the broadcast transmitting/receiving unit 185 may include Digital Multimedia Broadcasting Terrestrial (DMBT), Digital Multimedia Broadcasting Satellite (DMBS), the mobile television standard proposed by Qualcomm (MediaFLO, Media Forward Link Only), Digital Video Broadcasting Handheld (DVBH), and Integrated Services Digital Broadcasting Terrestrial (ISDBT) of Japan. The broadcast signals transceived by the broadcast transmitting/receiving unit 185 may include traffic data, life data, and the like.
The mobile communication unit 186 can connect to and communicate over a mobile communication network according to various mobile communication specifications such as the third generation mobile communication technology (3G), the third generation partnership project (3GPP), and Long Term Evolution (LTE).
The short-range communication unit 187 is a device for performing short-range communication. The short-range communication unit 187 can communicate through Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), and the like.
The wired communication unit 189 is an interface device that can connect the electronic apparatus 100 to another device by wire. The wired communication unit 189 may be a universal serial bus module capable of communicating through a universal serial bus Port (USB Port).
The communication unit 180 can communicate with other devices using at least one of the location data unit 181, the wireless internet unit 183, the broadcast transmitting/receiving unit 185, the mobile communication unit 186, the short-range communication unit 187, and the wired communication unit 189.
For example, when the electronic device 100 does not include an imaging function, an image captured by a vehicle camera such as a black box can be received by at least one of the near field communication unit 187 and the wired communication unit 189.
As another example, when communicating with a plurality of apparatuses, one apparatus may communicate via the short-range communication unit 187, and the other apparatus may communicate via the wired communication unit 189.
The detection unit 190 is a device that can detect the current state of the electronic device 100. The detection section 190 may include all or a part of the motion detection section 191 and the light detection section 193.
The motion detection unit 191 may detect motion of the electronic device 100 in three-dimensional space. The motion detection unit 191 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. The motion data obtained by the motion detection unit 191 can be combined with the position data obtained by the position data unit 181 to calculate a more accurate trajectory of the vehicle to which the electronic device 100 is attached.
The light detection unit 193 measures the ambient illuminance of the electronic device 100. The luminance of the display unit 131 can be adjusted to match the surrounding brightness using the illuminance data obtained by the light detection unit 193.
The power supply unit 195 supplies power necessary for the operation of the electronic device 100 or the operation of other devices connected to the electronic device 100. The power supply unit 195 may receive power from a battery built into the electronic device 100 or from an external power supply such as the vehicle. Depending on the form in which power is received, the power supply unit 195 may be embodied as a wired power receiving device, such as one using the wired communication unit 189, or as a device that receives power wirelessly.
On the other hand, the control unit 170 controls the overall operation of the electronic device 100. Specifically, the control unit 170 may control all or a part of the storage unit 110, the input unit 120, the output unit 130, the signal type information generation unit 140, the driving state determination unit 150, the augmented reality supply unit 160, the communication unit 180, and the detection unit 190.
The signal type information generating unit 140 may generate the signal type information using image data of a signal area portion of a traffic light in the driving-related image data of the vehicle.
Specifically, the signal type information generating unit 140 may determine a region of interest including a traffic light in the driving-related image data, and convert the image data of the determined region of interest with reference to a preset pixel value to generate the region-of-interest image data. The signal type information generating unit 140 may apply a first region having a first area and a second region including the first region and having a second area to the region-of-interest image data to detect the image data of the signal region portion of the traffic light. The signal type information generating unit 140 may compare the difference between the area pixel value of the second region and the area pixel value of the first region with a predetermined area pixel value, and generate signal type information of the traffic light according to the comparison result.
Here, the signal type information may be information for identifying the plurality of types of signals that can be displayed on the traffic light, and may include stop signal information, straight signal information, left turn signal information, and right turn signal information.
On the other hand, when the driving state determination unit 150 determines that the vehicle is in the stopped state, the signal type information generation operation of the signal type information generation unit 140 described above may be executed.
Specifically, the driving state determination unit 150 may determine whether the vehicle is in a stopped state using the driving-related image data. More specifically, the driving state determination unit 150 may generate grayscale image data from the driving-related image data and sequentially compare, in chronological order, a plurality of frames included in the generated grayscale image data, thereby determining whether the vehicle is in a stopped state.
If the vehicle is stopped as a result of the determination by the driving state determination unit 150, the control unit 170 controls the signal type information generating unit 140 to generate the signal type information.
On the other hand, the control unit 170 may perform the driving-related guidance of the vehicle using the signal type information generated by the signal type information generating unit 140.
For example, the control unit 170 may control the output unit 130 to output signal guidance using the signal type information generated by the signal type information generating unit 140. Specifically, the control unit 170 may control the output unit 130 to output the signal state of the traffic light, for example, whether it is a stop signal, a straight signal, a right turn signal, or a left turn signal, as an image or a sound.
As another example, the control unit 170 may control the output unit 130 to output signal change guidance using the signal type information generated by the signal type information generating unit 140 and the route information used for navigation. Specifically, when the electronic device 100 performs navigation to the destination of the vehicle, the control unit 170 may control the output unit 130 to output the signal change guidance using the route information for navigation and the signal type information. For example, if the route information of a vehicle stopped at an intersection is in the straight traveling direction and the signal type information generated by the signal type information generating unit 140 changes from the stop signal to the left turn signal, the control unit 170 may control the output unit 130 not to output signal change guidance. In contrast, if the route information of the vehicle stopped at the intersection is in the straight traveling direction and the signal type information changes from the stop signal to the straight signal, the control unit 170 may control the output unit 130 to output the signal change guidance once the vehicle has remained stopped for a predetermined time from the signal change time point of the traffic light.
On the other hand, the control unit 170 may cause the electronic device 100 to execute the driving-related guidance in augmented reality. Here, augmented reality may be a method of supplying additional information (for example, a graphic element indicating a Point Of Interest (POI), a graphic element indicating a route to a destination, or the like) visually superimposed on a screen presenting the real world the user actually sees. In this case, the control unit 170 may generate an indicator for executing the driving-related guidance in conjunction with the augmented reality supply unit 160, and output the generated indicator through the output unit 130. For example, augmented reality can be provided by superimposing images on the front windshield of the vehicle using a head-up display, or through another image output device, so the augmented reality supply unit 160 can generate an interface image to be superimposed on the real image or on the glass. In this way, an augmented reality navigator, a vehicle information system, or the like can be embodied.
Fig. 2 is a diagram illustrating a system network connected to an electronic device according to an embodiment of the present invention. Referring to fig. 2, the electronic device 100 according to an embodiment of the present invention may be embodied as various devices disposed in a vehicle, such as a navigator, a black box, a smart phone, or other augmented reality interface providing device for a vehicle, and may be connected to various communication networks and other electronic devices 61, 62, 63, and 64.
The electronic device 100 can also calculate the current position and the current time by linking with the global positioning system based on the radio wave signal received from the artificial satellite 20.
Each satellite 20 may transmit L-band frequencies in a different frequency band. The electronic device 100 may calculate the current position based on the time required for the L-band frequency transmitted from each artificial satellite 20 to reach the electronic device 100.
On the other hand, the electronic device 100 can be wirelessly connected to the network 30 through the communication unit 180 by way of a control station (ACR) 40, a base station (RAS) 50, and the like. When the electronic device 100 is connected to the network 30, it can also indirectly connect to other electronic devices 61 and 62 connected to the network 30 and exchange data.
On the other hand, the electronic device 100 may also be connected to the network 30 indirectly through another device 63 having a communication function. For example, when the electronic device 100 does not have a module connectable to the network 30, it can communicate with the other device 63 having a communication function through short-range communication or the like.
Fig. 3 is a flowchart illustrating a control method of an electronic device according to an embodiment of the invention. Referring to fig. 3, first, the electronic device 100 may determine whether the vehicle is in a stopped state using the driving-related image data (step S101).
If it is determined that the vehicle is in the stopped state, the electronic device 100 may generate the signal type information using the image data of the signal area portion of the traffic light in the driving-related image data of the vehicle (step S102).
Then, the driving-related guidance of the vehicle may be performed using the generated signal type information (step S103).
Fig. 4 is a flowchart specifically illustrating a method for determining a stopped state of a vehicle according to an embodiment of the present invention. Referring to fig. 4, first, the electronic device 100 may generate grayscale image data from the driving-related image data (step S201). Here, the driving-related image of the vehicle may include images of the vehicle while stopped, while traveling, and so on. The driving-related image of the vehicle may be an image captured by a camera module included in the electronic device 100, or an image captured by another device and received by the electronic device 100. Also, the driving-related image of the vehicle may be a color (RGB, Red Green Blue) image. This will be described in detail with reference to fig. 5.
Fig. 5 is a diagram illustrating a process of generating grayscale image data from driving-related image data according to an embodiment of the present invention. Referring to fig. 5, the driving state determination unit 150 may receive driving-related image data as a color image, as shown in part (a) of fig. 5, and may perform grayscale conversion to generate grayscale image data, as shown in part (b) of fig. 5.
Then, the electronic device 100 can compare a plurality of frames included in the generated grayscale image data (step S202). Specifically, the driving state determination unit 150 may compare the plurality of frames included in the grayscale image data in chronological order.
The electronic device 100 may then determine whether the vehicle is in a stopped state according to the comparison result (step S203). For example, the driving state determination unit 150 may calculate the difference between the current frame and the previous frame and compare the calculated value with a preset value, thereby determining whether the vehicle is in a stopped state.
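As an illustration of steps S201 to S203, the sketch below shows one way the grayscale conversion and frame comparison could look in code. It is a minimal sketch using the OpenCV and NumPy libraries; the threshold DIFF_THRESHOLD is an assumed tuning parameter, not a value given in this document.

```python
import cv2
import numpy as np

DIFF_THRESHOLD = 2.0  # assumed mean per-pixel difference threshold (not from this document)

def is_vehicle_stopped(prev_frame_bgr, curr_frame_bgr):
    """Decide whether the vehicle appears stopped by comparing two
    consecutive driving-image frames in grayscale (steps S201-S203)."""
    prev_gray = cv2.cvtColor(prev_frame_bgr, cv2.COLOR_BGR2GRAY)  # step S201
    curr_gray = cv2.cvtColor(curr_frame_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)                      # step S202
    return float(np.mean(diff)) < DIFF_THRESHOLD                  # step S203
```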
According to the method for determining the stopped state of the vehicle shown in fig. 4, using grayscale image data increases the image processing speed, so whether the vehicle is stopped can be determined quickly even while the vehicle alternates between traveling and stopping.
On the other hand, the method for determining the stopped state of the vehicle shown in fig. 4 is merely an example of the present invention, and the present invention is not limited thereto. Various other methods of determining whether the vehicle is stopped may be used, for example one based on the detection information of the motion detection unit 191.
Fig. 6 is a flowchart specifically illustrating a signal type information generating method of an electronic device according to an embodiment of the invention. Referring to fig. 6, first, the electronic device 100 may determine a region of interest including a traffic light in the driving-related image data (step S301).
In this case, the signal type information generating unit 140 may determine the region of interest including the traffic light in the driving-related image data using, for example, a vanishing point. That is, the signal type information generating unit 140 may extract lane lines from a captured image taken by the camera while the vehicle is driving and determine the point where the extracted lane lines intersect as the vanishing point. Since the traffic light is located in the area above the vanishing point, the signal type information generating unit 140 may determine the area above the determined vanishing point as the region of interest.
On the other hand, as another example, the signal type information generating unit 140 may determine a preset, predetermined area of the driving-related image data as the region of interest.
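As a rough illustration of the vanishing-point approach described above, the following sketch intersects two lane lines that are assumed to have been extracted already (for example with a Hough transform) and keeps the image area above the intersection. The function names and the line representation are illustrative assumptions, not part of this document.

```python
def find_vanishing_point(left_line, right_line):
    """Intersect two extracted lane lines, each given as ((x1, y1), (x2, y2)),
    and return the intersection (vanishing) point in image coordinates."""
    (x1, y1), (x2, y2) = left_line
    (x3, y3), (x4, y4) = right_line
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
    return int(px), int(py)

def region_of_interest(frame, left_line, right_line):
    """Traffic lights lie above the vanishing point, so the upper part of
    the frame, above that point, is taken as the region of interest."""
    _, vp_y = find_vanishing_point(left_line, right_line)
    return frame[:max(vp_y, 1), :]
```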
The electronic device 100 can generate the region-of-interest image data by converting the determined image data of the region of interest with reference to a preset pixel value (step S302). In this case, for example, the signal type information generating unit 140 may apply the determined image data of the region of interest to the following mathematical formula 1 to generate the region-of-interest image data.
Mathematical formula 1:
[The formula appears only as an image in the published document; it converts each pixel of the region-of-interest image data using its R, G, and B values, the coefficients x, y, and z, and the reference pixel value K.]
Here, R, G, and B may be the R value, G value, and B value of the determined image data of the region of interest, K may be a predetermined pixel value serving as the conversion reference, and x, y, and z may be specific coefficients. As an example, K may be the average pixel value 128.
Thus, the signal type information generating unit 140 can generate region-of-interest image data in which the signal region portion of the traffic light is clearly distinguished.
On the other hand, the color and/or the form of the signal respectively corresponding to the plurality of kinds of signals included in the signal lamp may differ by country. Taking korea as an example, the stop signal may be a red circle, the straight signal may be a blue circle, the left turn signal may be a blue left arrow, and the right turn signal may be a blue right arrow.
Therefore, the predetermined pixel value and the coefficients x, y, and z may take different values depending on the algorithm.
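Since mathematical formula 1 is available only as an image, the sketch below implements one plausible reading of the surrounding description: each pixel's R, G, and B values are weighted by the coefficients x, y, and z, and the result is binarized against the reference pixel value K so that the bright signal region stands out. The default coefficient values and the thresholding itself are assumptions made for illustration, not the patent's actual formula.

```python
import numpy as np

def convert_roi(roi_bgr, x=0.299, y=0.587, z=0.114, K=128):
    """One plausible reading of mathematical formula 1 (assumed): weight the
    R, G, B channels by coefficients x, y, z and binarize against the
    reference pixel value K. Luma-like default coefficients are an assumption."""
    b = roi_bgr[..., 0].astype(np.float32)
    g = roi_bgr[..., 1].astype(np.float32)
    r = roi_bgr[..., 2].astype(np.float32)
    weighted = x * r + y * g + z * b
    return np.where(weighted >= K, 255, 0).astype(np.uint8)
```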
Such steps S301 and S302 will be specifically described with reference to fig. 7.
Fig. 7 is a diagram illustrating a process of generating region-of-interest image data from driving-related image data according to an embodiment of the present invention. Referring to fig. 7, the signal type information generating unit 140 may determine a region of interest including a traffic light in the driving-related image data, as shown in part (a) of fig. 7.
The signal type information generating unit 140 may then convert the determined image data of the region of interest to generate the region-of-interest image data 702 shown in part (b) of fig. 7. In this case, the signal type information generating unit 140 may convert the image data of the determined region of interest with reference to a predetermined pixel value to generate the region-of-interest image data 702.
In this way, the signal type information generating unit 140 can generate region-of-interest image data in which the signal region portion 703 of the traffic light is clearly distinguished.
On the other hand, the electronic device 100 may apply the first region and the second region to the region-of-interest image data to detect the image data of the signal region portion of the traffic light (step S303). Specifically, the signal type information generating unit 140 may apply a first region having a first area and a second region including the first region and having a second area to the region-of-interest image data to detect the image data of the signal region portion of the traffic light. Here, the first and second regions may have a shape corresponding to the shape of the traffic light's signal. For example, when the signal of the traffic light is circular, elliptical, or square, the first region and the second region may be circular, elliptical, or square, respectively.
On the other hand, the electronic device 100 may compare the difference between the area pixel value of the second region and the area pixel value of the first region with a preset area pixel value (step S304). Here, the preset area pixel value may be varied to reflect the colors and/or shapes corresponding to each of the plurality of signal types included in the traffic light.
The electronic device 100 may generate signal type information of the traffic light according to the comparison result (step S305).
Steps S303, S304, and S305 will be specifically described with reference to fig. 8 and fig. 9.
Fig. 8 is a diagram illustrating region-of-interest image data corresponding to a stop signal and a method of determining the stop signal according to an embodiment of the present invention. When the traffic light displays a red stop signal, the signal type information generating unit 140 converts the image data of the region of interest including the traffic light based on a predetermined pixel value, thereby generating the region-of-interest image data 801 shown in part (a) of fig. 8. Here, as shown at 802, the red stop signal of the traffic light appears in the region-of-interest image data 801.
In this case, the signal type information generating unit 140 may apply the first region 804 and the second region 803 to the region-of-interest image data 801 to detect the image data of the signal region portion 802 of the traffic light.
The signal type information generating unit 140 may compare the difference between the area pixel value of the second region 803 and the area pixel value of the first region 804 with a predetermined area pixel value to generate signal type information of the traffic light. For example, if the difference between the area pixel value of the second region 803 and the area pixel value of the first region 804 is smaller than a predetermined area pixel value (that is, in the case shown in fig. 8 (b), the area pixel value of the first region 804 is large), the signal type information generating unit 140 may determine the signal type information as the stop signal.
Fig. 9 is a diagram illustrating region-of-interest image data corresponding to a straight signal and a method of determining the straight signal according to an embodiment of the present invention. When the traffic light displays a blue straight signal, the signal type information generating unit 140 converts the image data of the region of interest including the traffic light based on a predetermined pixel value, thereby generating the region-of-interest image data 901 shown in part (a) of fig. 9. Here, as shown at 902, the blue straight signal of the traffic light appears in the region-of-interest image data 901.
In this case, the signal type information generating unit 140 may apply the first region 904 and the second region 903 to the region-of-interest image data 901 to detect the image data of the signal region portion 902 of the traffic light.
The signal type information generating unit 140 may compare the difference between the area pixel value of the second region 903 and the area pixel value of the first region 904 with a predetermined area pixel value to generate signal type information of the traffic signal. For example, when the difference between the area pixel value of the second region 903 and the area pixel value of the first region 904 is larger than a predetermined area pixel value (that is, when the area pixel value of the first region 904 is small as shown in fig. 9 (b)), the signal type information generating unit 140 may determine the signal type information as a straight signal.
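The two-region comparison of steps S303 to S305, as applied in figs. 8 and 9, could be sketched as follows. Square regions, the candidate center coordinates, and the threshold value are illustrative assumptions; the decision directions (difference below the preset value indicating a stop signal, above it a straight signal) follow the description above.

```python
import numpy as np

AREA_PIXEL_THRESHOLD = 1000  # assumed preset area pixel value (not from this document)

def classify_signal(roi_converted, cx, cy, r1, r2):
    """Apply a first region of half-size r1 and an enclosing second region of
    half-size r2 around a candidate signal at (cx, cy) in the converted
    region-of-interest image, then compare the difference of their area pixel
    values with the preset value (steps S303-S305)."""
    first = roi_converted[cy - r1:cy + r1, cx - r1:cx + r1]
    second = roi_converted[cy - r2:cy + r2, cx - r2:cx + r2]
    diff = int(np.sum(second)) - int(np.sum(first))
    if diff < AREA_PIXEL_THRESHOLD:
        return "stop"      # first region dominates the pixel sum, as in fig. 8 (b)
    return "straight"      # first region contributes little, as in fig. 9 (b)
```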
On the other hand, although omitted from figs. 8 and 9, turn signals such as the left turn signal and the right turn signal can be determined in a similar manner. That is, the signal type information generating unit 140 may determine a turn signal using the above-described method and/or the shape (for example, an arrow shape) of the image data of the signal area portion of the traffic light.
According to the signal type information generating method shown in fig. 6, the signal type information is determined on region-of-interest image data that has been normalized against the predetermined pixel value, which increases the image processing speed; the signal type information can therefore be determined quickly even while the vehicle alternates between traveling and stopping.
On the other hand, the signal type information generating method shown in fig. 6 is merely an example of the present invention, and the present invention is not limited thereto. Various other signal type information generating methods may be used, such as determining the signal type by detecting a red signal corresponding to a stop signal or a blue signal corresponding to a straight signal directly in the color driving-related image data.
On the other hand, according to an embodiment of the present invention, once the signal type information is determined to be the stop signal, the signal type information generating unit 140 may position the first region and the second region in the region-of-interest image data in advance at the regions where the straight signal and turn signals such as left turn and right turn are displayed. In this case, the signal type information generating unit 140 may position the first region and the second region in advance using the positional relationship information of the plurality of signal types stored in the storage unit 110. This increases the image processing speed, making it possible to determine more quickly whether the stop signal has changed to a straight signal or to a turn signal such as a left or right turn.
Fig. 10 is a flowchart illustrating a signal change guidance method of an electronic device according to an embodiment of the invention. Referring to fig. 10, the electronic device 100 may obtain route information for navigation (step S401). Specifically, when the electronic device 100 performs navigation of the vehicle to a destination, the control unit 170 may obtain the route information for navigation at the current position of the vehicle.
The electronic device 100 may determine whether the signal is changed (step S402). Specifically, the control unit 170 may determine whether or not the signal is changed based on the signal type information generated by the signal type information generating unit 140.
When the signal is changed, the electronic device 100 may determine whether the vehicle should travel based on the changed signal (step S403). For example, in a state where the route information of the vehicle stopped on the intersection is in the straight-ahead direction, the control unit 170 may determine that the vehicle should not travel if the signal type information generated by the signal type information generating unit 140 is changed from the stop signal to the left turn signal. As another example, in a state where the route information of the vehicle stopped on the intersection is in the straight traveling direction, the control unit 170 may determine that the vehicle should travel when the signal type information generated by the signal type information generating unit 140 is changed from the stop signal to the straight traveling signal.
If the vehicle is to travel, the electronic device 100 may determine whether the vehicle has maintained the stopped state for a predetermined time period from the time point when the signal of the traffic light is changed (step S404).
If the vehicle is kept stopped for a predetermined time, the electronic device 100 may output a signal change guide (step S405).
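Condensing the flow of fig. 10 (steps S401 to S405), a sketch of the guidance decision might look as follows. The string encodings of route direction and signal type, the hold time, and the return values are assumptions for illustration.

```python
import time

STOP_HOLD_SECONDS = 3.0  # assumed predetermined time (not specified in this document)

def signal_change_guidance(route_direction, old_signal, new_signal,
                           change_time, vehicle_stopped):
    """Output signal change guidance only when the changed signal matches the
    navigation route (step S403) and the vehicle has stayed stopped for a
    predetermined time after the change time point (steps S404-S405)."""
    if new_signal == old_signal:          # no signal change (step S402)
        return None
    should_travel = (route_direction, new_signal) in (
        ("straight", "straight"),
        ("left", "left_turn"),
        ("right", "right_turn"),
    )
    if not should_travel:                 # e.g. straight route, left-turn signal
        return None
    if vehicle_stopped and time.time() - change_time >= STOP_HOLD_SECONDS:
        return "signal changed: you may proceed"   # step S405
    return None
```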
Fig. 11 is a diagram showing a signal change guidance screen according to an embodiment of the present invention. Referring to fig. 11, the electronic device 100 according to an embodiment of the invention can display a signal change guidance screen in augmented reality.
For example, when the signal state of the traffic light changes from the stop signal shown in part (a) of fig. 11 to the straight signal shown in part (b) of fig. 11, the augmented reality supply unit 160 generates an indicator for signal change guidance to be overlaid in augmented reality, and can output the generated signal change guidance indicator 1101 in augmented reality as shown in part (b) of fig. 11.
Fig. 12 is a diagram showing an embodiment of the present invention in a case where the camera and the electronic apparatus are separated. Referring to fig. 12, the vehicle navigator 100 and the vehicle black box 200, which are provided separately, may be connected by a wired/wireless communication method to constitute a system according to an embodiment of the present invention.
The navigator 100 for a vehicle may include: a display unit 145 provided on the front surface of the housing 191 of the navigator; navigator operation keys 193; and a navigator microphone 195.
The black box 200 for a vehicle can obtain image data of the vehicle both while driving and while parked. That is, it can capture images not only while the vehicle is traveling but also while the vehicle is stopped. The resolution of the images obtained by the black box 200 may be constant or variable; for example, the resolution may be raised before and after an accident and reduced under ordinary conditions, so that the required storage space is minimized while the key images are still preserved.
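A minimal sketch of such variable-quality recording is given below, assuming an impact trigger and per-frame encoding quality levels; all names and numbers are illustrative, not taken from the specification.

```python
import collections

class BlackBoxRecorder:
    HIGH_Q, LOW_Q = 95, 40   # assumed encoding quality levels (e.g. JPEG)
    PRE_FRAMES = 300         # ~10 s at 30 fps kept around an event

    def __init__(self):
        # Ring buffer of recent frames: frames are only written out at
        # low quality once they age out of the buffer, so the moments
        # just before an impact can still be stored at high quality.
        self.pre_buffer = collections.deque(maxlen=self.PRE_FRAMES)
        self.post_remaining = 0
        self.stored = []     # stand-in for the storage medium

    def on_frame(self, frame, impact_detected):
        if impact_detected:
            for buffered in self.pre_buffer:   # flush pre-event frames
                self.store(buffered, self.HIGH_Q)
            self.pre_buffer.clear()
            self.post_remaining = self.PRE_FRAMES  # high quality after, too

        if self.post_remaining > 0:
            self.store(frame, self.HIGH_Q)
            self.post_remaining -= 1
        else:
            if len(self.pre_buffer) == self.pre_buffer.maxlen:
                aged_out = self.pre_buffer.popleft()
                self.store(aged_out, self.LOW_Q)  # ordinary driving footage
            self.pre_buffer.append(frame)

    def store(self, frame, quality):
        # Placeholder: a real device would encode at `quality` and write
        # to the storage medium here.
        self.stored.append((quality, frame))
```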
The black box 200 for a vehicle may include a black box camera 222, a black box microphone 224, and an attachment portion 281.
On the other hand, although fig. 12 shows the separately provided vehicle navigator 100 and vehicle black box 200 connected to each other by wired/wireless communication, the two need not be connected in this way. In that case, if a storage medium storing the images captured by the black box 200 is inserted into the electronic device 100, the electronic device 100 can still receive the captured images. Alternatively, the vehicle black box 200 may be provided with the functions of the vehicle navigator 100, or the vehicle navigator 100 may be provided with a camera and integrated into a single device. This is described in detail with reference to fig. 13.
Fig. 13 is a diagram showing an embodiment of the present invention in a case where the camera and the electronic apparatus are integrated. Referring to fig. 13, when the electronic device includes a camera function, the user can mount the electronic device so that its camera portion photographs the front of the vehicle while its display portion remains visible to the user. In this way, the system of an embodiment of the invention may be embodied.
Fig. 14 is a diagram showing an embodiment using a head-up display together with an electronic apparatus according to an embodiment of the present invention. Referring to fig. 14, the electronic device may be connected with the head-up display by wired/wireless communication and display an augmented reality guidance screen on the head-up display.
On the other hand, the control methods of the electronic apparatus according to the various embodiments of the present invention described above may be embodied as program code and provided to each server or device while stored on various non-transitory computer-readable media.
A non-transitory readable medium is not a medium that stores data for a short time, such as a register, cache, or memory, but a medium that stores data semi-permanently and is readable by a device. Specifically, the various applications or programs described above may be provided on a non-transitory readable medium such as a compact disc (CD), digital versatile disc (DVD), hard disk, Blu-ray disc, USB memory, memory card, or read-only memory (ROM).
While preferred embodiments of the present invention have been illustrated and described, the present invention is not limited to the specific embodiments described above; various modifications may be made by those skilled in the art without departing from the gist of the invention as claimed, and such modifications should not be construed as departing from the technical spirit or scope of the present invention.

Claims (16)

1. A control method of an electronic device, the control method comprising:
generating signal type information using image data of a signal area portion of a traffic light in driving-related image data of a vehicle; and
performing driving-related guidance of the vehicle using the generated signal type information,
wherein generating the signal type information comprises:
determining a region of interest including the traffic light in the driving-related image data;
converting the image data of the determined region of interest with reference to a preset pixel value;
generating region-of-interest image data in which a signal region portion and a non-signal region portion of the traffic light are distinguished according to the conversion;
calculating an area pixel value of a second region and an area pixel value of a first region from the image data of the signal region portion;
comparing a difference between the area pixel value of the second region and the area pixel value of the first region with a preset area pixel value; and
generating signal type information of the traffic light according to a result of the comparison,
wherein the first region has a first area, and the second region includes the first region and has a second area.
2. The control method according to claim 1, further comprising:
determining whether the vehicle is in a stopped state using the driving-related image data,
wherein the step of generating the signal type information is performed if it is determined that the vehicle is in the stopped state.
3. The control method according to claim 2, wherein the step of determining whether the vehicle is in the stopped state includes:
a step of generating grayscale image data of the driving-related image data; and
a step of comparing a plurality of frames included in the generated grayscale image data with one another to determine whether the vehicle is in the stopped state.
4. The control method according to claim 2, wherein the signal type information is information for respectively identifying a plurality of types of signals that can be displayed by the traffic light.
5. The control method according to claim 4, wherein the step of performing driving-related guidance of the vehicle includes the step of outputting signal guidance using the signal type information.
6. The control method according to claim 4, wherein the step of performing driving-related guidance of the vehicle includes the step of outputting signal change guidance using route information for navigation of the vehicle and the signal type information.
7. The control method according to claim 6, wherein the step of outputting the signal change guidance is performed when the vehicle remains in the stopped state for a predetermined time after the signal change time point of the traffic light.
8. The control method according to claim 7, wherein the step of outputting the signal change guidance comprises:
a step of generating an indicator for executing driving-related guidance; and
a step of outputting the generated indicator through augmented reality.
9. An electronic device, comprising:
a signal type information generating unit configured to generate signal type information using image data of a signal area portion of a traffic light in driving-related image data of a vehicle; and
a control unit configured to perform driving-related guidance of the vehicle using the generated signal type information,
wherein the signal type information generating unit is configured to:
determine a region of interest including the traffic light in the driving-related image data,
convert the image data of the determined region of interest with reference to a preset pixel value,
generate region-of-interest image data in which a signal region portion and a non-signal region portion of the traffic light are distinguished according to the conversion,
calculate an area pixel value of a second region and an area pixel value of a first region from the image data of the signal region portion,
compare a difference between the area pixel value of the second region and the area pixel value of the first region with a preset area pixel value, and
generate signal type information of the traffic light according to a result of the comparison,
wherein the first region has a first area, and the second region includes the first region and has a second area.
10. The electronic device of claim 9, further comprising:
a driving state determination unit configured to determine whether the vehicle is in a stopped state using the driving-related image data,
wherein the control unit controls the signal type information generating unit to generate the signal type information if it is determined that the vehicle is in the stopped state.
11. The electronic device according to claim 10, wherein the driving state determination unit generates grayscale image data of the driving-related image data and compares a plurality of frames included in the generated grayscale image data with one another to determine whether the vehicle is in the stopped state.
12. The electronic device according to claim 10, wherein the signal type information is information for respectively identifying a plurality of types of signals that can be displayed by the traffic light.
13. The electronic device according to claim 12, wherein the control unit controls an output unit to output signal guidance using the signal type information.
14. The electronic device according to claim 12, wherein the control unit controls the output unit to output signal change guidance using route information for navigation of the vehicle and the signal type information.
15. The electronic device according to claim 14, wherein the control unit controls the output unit to output the signal change guidance when a predetermined time elapses from the signal change time point of the traffic light while the vehicle maintains the stopped state.
16. The electronic device according to any one of claims 13 to 15, wherein the control unit generates an indicator for executing driving-related guidance and controls the output unit so that the generated indicator is output through augmented reality.
CN201810214077.7A 2014-06-16 2015-06-16 Electronic device and control method thereof Active CN108470162B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR10-2014-0073041 2014-06-16
KR1020140073041A KR102233391B1 (en) 2014-06-16 2014-06-16 Electronic apparatus, control method of electronic apparatus and computer readable recording medium
CN201510333358.0A CN105185137B (en) 2014-06-16 2015-06-16 Electronic device, the control method of electronic device and computer readable recording medium storing program for performing

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201510333358.0A Division CN105185137B (en) 2014-06-16 2015-06-16 Electronic device, the control method of electronic device and computer readable recording medium storing program for performing

Publications (2)

Publication Number Publication Date
CN108470162A CN108470162A (en) 2018-08-31
CN108470162B true CN108470162B (en) 2022-04-08

Family

ID=54836420

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810214077.7A Active CN108470162B (en) 2014-06-16 2015-06-16 Electronic device and control method thereof
CN201510333358.0A Active CN105185137B (en) 2014-06-16 2015-06-16 Electronic device, the control method of electronic device and computer readable recording medium storing program for performing

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201510333358.0A Active CN105185137B (en) 2014-06-16 2015-06-16 Electronic device, the control method of electronic device and computer readable recording medium storing program for performing

Country Status (3)

Country Link
US (2) US10269124B2 (en)
KR (1) KR102233391B1 (en)
CN (2) CN108470162B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102233391B1 (en) * 2014-06-16 2021-03-29 팅크웨어(주) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
US10652466B2 (en) * 2015-02-16 2020-05-12 Applications Solutions (Electronic and Vision) Ltd Method and device for stabilization of a surround view image
JP6171046B1 (en) * 2016-04-26 2017-07-26 京セラ株式会社 Electronic device, control method, and control program
US10410074B2 (en) * 2016-10-25 2019-09-10 Ford Global Technologies, Llc Systems and methods for locating target vehicles
CN106600569B (en) * 2016-11-28 2020-04-10 浙江宇视科技有限公司 Signal lamp color effect enhancement processing method and device
CN109472989A (en) * 2018-12-05 2019-03-15 斑马网络技术有限公司 Reminding method, device, equipment and the readable storage medium storing program for executing of traffic lights
CN110349415B (en) * 2019-06-26 2021-08-20 江西理工大学 Driving speed measuring method based on multi-scale transformation
CN110309033B (en) * 2019-07-15 2022-12-09 中国工商银行股份有限公司 Fault monitoring method, device and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1164172A (en) * 1997-08-22 1999-03-05 Nippon Denshi Kagaku Kk Drivers aid
CN101084137A (en) * 2004-12-20 2007-12-05 德·T·多恩 Apparatus for monitoring traffic signals and alerting drivers
CN102556043A (en) * 2011-12-12 2012-07-11 浙江吉利汽车研究院有限公司 Automobile control system and automobile control method based on traffic light recognition
CN103312973A (en) * 2012-03-09 2013-09-18 宏达国际电子股份有限公司 Electronic apparatus and adjustment method thereof

Family Cites Families (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5931888A (en) * 1994-09-22 1999-08-03 Aisin Aw Co., Ltd. Navigation system for vehicles with alternative route searching capabilities
JPH1186182A (en) * 1997-09-01 1999-03-30 Honda Motor Co Ltd Automatic driving control system
JPH11144185A (en) * 1997-09-03 1999-05-28 Honda Motor Co Ltd Automatic drive control guidance system
EP1220182A3 (en) * 2000-12-25 2005-08-17 Matsushita Electric Industrial Co., Ltd. Image detection apparatus, program, and recording medium
US7489802B2 (en) * 2002-09-10 2009-02-10 Zeev Smilansky Miniature autonomous agents for scene interpretation
JP4253271B2 (en) * 2003-08-11 2009-04-08 株式会社日立製作所 Image processing system and vehicle control system
TWI236901B (en) * 2004-06-11 2005-08-01 Oriental Inst Technology An apparatus and method for identifying surrounding environment by means of image processing and for outputting the resutls
US7466841B2 (en) * 2004-08-16 2008-12-16 Siemens Corporate Research, Inc. Method for traffic sign detection
JP4677226B2 (en) * 2004-12-17 2011-04-27 キヤノン株式会社 Image processing apparatus and method
JP4557288B2 (en) * 2005-01-28 2010-10-06 アイシン・エィ・ダブリュ株式会社 Image recognition device, image recognition method, position specifying device using the same, vehicle control device, and navigation device
US7804980B2 (en) * 2005-08-24 2010-09-28 Denso Corporation Environment recognition device
JP4631750B2 (en) * 2006-03-06 2011-02-16 トヨタ自動車株式会社 Image processing system
JP4825722B2 (en) * 2007-04-27 2011-11-30 アイシン・エィ・ダブリュ株式会社 Route guidance system, navigation device, and route guidance method
EP1988488A1 (en) * 2007-05-03 2008-11-05 Sony Deutschland Gmbh Method for detecting moving objects in a blind spot region of a vehicle and blind spot detection device
US8031062B2 (en) * 2008-01-04 2011-10-04 Smith Alexander E Method and apparatus to improve vehicle situational awareness at intersections
US8233662B2 (en) * 2008-07-31 2012-07-31 General Electric Company Method and system for detecting signal color from a moving video platform
JP5057166B2 (en) * 2008-10-30 2012-10-24 アイシン・エィ・ダブリュ株式会社 Safe driving evaluation system and safe driving evaluation program
KR20100055254A (en) * 2008-11-17 2010-05-26 엘지전자 주식회사 Method for providing poi information for mobile terminal and apparatus thereof
US9522817B2 (en) * 2008-12-04 2016-12-20 Crown Equipment Corporation Sensor configuration for a materials handling vehicle
KR101768101B1 (en) * 2009-10-30 2017-08-30 엘지전자 주식회사 Information displaying apparatus and method thereof
JP5462609B2 (en) * 2009-12-09 2014-04-02 富士重工業株式会社 Stop line recognition device
JP2011135253A (en) * 2009-12-24 2011-07-07 Fujitsu Ten Ltd Image processor, image processing system and image processing method
US8559673B2 (en) * 2010-01-22 2013-10-15 Google Inc. Traffic signal mapping and detection
CN101783964A (en) * 2010-03-18 2010-07-21 上海乐毅信息科技有限公司 Auxiliary driving system for achromate or tritanope based on image identification technology
WO2012114382A1 (en) * 2011-02-24 2012-08-30 三菱電機株式会社 Navigation device, advisory speed arithmetic device and advisory speed presentation device
CN102176287B (en) * 2011-02-28 2013-11-20 无锡中星微电子有限公司 Traffic signal lamp identifying system and method
CN102117546B (en) * 2011-03-10 2013-05-01 上海交通大学 On-vehicle traffic light assisting device
CN102679993A (en) * 2011-03-18 2012-09-19 阿尔派株式会社 Navigation device and driving guide method thereof
US9207091B2 (en) * 2011-04-21 2015-12-08 Mitsubishi Electric Corporation Drive assistance device
KR20120007781U (en) * 2011-05-04 2012-11-14 권정기 - Route guidance method using Augmented Reality and Head-up display
JP2013029451A (en) * 2011-07-29 2013-02-07 Ricoh Co Ltd Deposit detection device and deposit detection method
JP5803402B2 (en) * 2011-08-08 2015-11-04 株式会社ソシオネクスト Image processing apparatus, imaging apparatus, imaging system, and data processing method
JP5573803B2 (en) * 2011-09-21 2014-08-20 株式会社デンソー LIGHT DETECTING DEVICE, LIGHT DETECTING PROGRAM, AND LIGHT CONTROL DEVICE
CN103020623B (en) * 2011-09-23 2016-04-06 株式会社理光 Method for traffic sign detection and road traffic sign detection equipment
KR101331096B1 (en) * 2012-03-21 2013-11-19 주식회사 코아로직 Image recording apparatus and method for black box system for vehicle
JP5849006B2 (en) * 2012-03-29 2016-01-27 富士重工業株式会社 Vehicle driving support device
JP6062041B2 (en) * 2012-05-07 2017-01-18 本田技研工業株式会社 A method for generating a virtual display surface from a video image of a landscape based on a road
CN103345766B (en) * 2013-06-21 2016-03-30 东软集团股份有限公司 A kind of signal lamp recognition methods and device
CN103395391B (en) * 2013-07-03 2015-08-05 北京航空航天大学 A kind of vehicle lane-changing alarming device and change state identification method
JP5852637B2 (en) * 2013-12-27 2016-02-03 富士重工業株式会社 Arrow signal recognition device
US9767371B2 (en) * 2014-03-27 2017-09-19 Georgia Tech Research Corporation Systems and methods for identifying traffic control devices and testing the retroreflectivity of the same
US20150316387A1 (en) * 2014-04-30 2015-11-05 Toyota Motor Engineering & Manufacturing North America, Inc. Detailed map format for autonomous driving
KR102233391B1 (en) * 2014-06-16 2021-03-29 팅크웨어(주) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
US9586585B2 (en) * 2014-11-20 2017-03-07 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle detection of and response to traffic officer presence
KR101920186B1 (en) * 2015-07-08 2018-11-19 닛산 지도우샤 가부시키가이샤 Equalizer detection device and equalizer detection method
US20170068863A1 (en) * 2015-09-04 2017-03-09 Qualcomm Incorporated Occupancy detection using computer vision

Also Published As

Publication number Publication date
CN105185137B (en) 2018-04-13
US20150363652A1 (en) 2015-12-17
CN105185137A (en) 2015-12-23
US20170256064A1 (en) 2017-09-07
KR102233391B1 (en) 2021-03-29
US10269124B2 (en) 2019-04-23
CN108470162A (en) 2018-08-31
KR20150144201A (en) 2015-12-24
US10282848B2 (en) 2019-05-07

Similar Documents

Publication Publication Date Title
CN108680173B (en) Electronic device, control method of electronic device, and computer-readable recording medium
US20200333154A1 (en) Electronic apparatus and control method thereof
CN108470162B (en) Electronic device and control method thereof
CN110091798B (en) Electronic device, control method of electronic device, and computer-readable storage medium
CN109323708B (en) Electronic device and control method thereof
US11030816B2 (en) Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
US10147001B2 (en) Electronic apparatus and control method thereof
US9644987B2 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102406490B1 (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
KR102276082B1 (en) Navigation device, black-box and control method thereof
KR102158167B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102371620B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102299501B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102299499B1 (en) Electronic apparatus and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210819

Address after: Seoul, South Korea

Applicant after: Hyundai Motor Co.,Ltd.

Applicant after: Kia Co.,Ltd.

Address before: Gyeonggi Do, South Korea

Applicant before: THINKWARE SYSTEMS Corp.

GR01 Patent grant