CN105654778B - Electronic device and control method thereof - Google Patents


Info

Publication number: CN105654778B
Application number: CN201510867498.6A
Authority: CN (China)
Prior art keywords: vehicle, crosswalk, pedestrian, electronic device
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN105654778A (application publication)
Inventor: 趙浩亨
Current assignees: Hyundai Motor Co; Kia Corp (the listed assignees may be inaccurate)
Original assignee: 星克跃尔株式会社
Application filed by 星克跃尔株式会社
Priority claimed by later applications: CN202010541932.2A (CN111710189B), CN201811590078.8A (CN110091798B), CN202010541922.9A (CN111681455B)
Publication of application: CN105654778A
Application granted; publication of grant: CN105654778B

Classifications

    • G08G 1/16 — Anti-collision systems
    • G08G 1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G01C 21/3602 — Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles, using a camera
    • G01C 21/3632 — Guidance using simplified or iconic instructions, e.g. using arrows
    • G01C 21/3638 — Guidance using 3D or perspective road maps including 3D objects and buildings
    • G01C 21/3644 — Landmark guidance, e.g. using POIs or conspicuous other objects
    • G01C 21/3647 — Guidance involving output of stored or live camera images or video streams
    • G02B 30/00 — Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G07C 5/0816 — Indicating performance data, e.g. occurrence of a malfunction
    • H04N 7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

The invention discloses an electronic device, a control method of the electronic device, a computer program, and a computer-readable recording medium. The control method of the electronic device comprises the following steps: detecting a crosswalk from image data captured by a camera during operation of a vehicle; generating an object indicating the detected crosswalk; and outputting the generated object through augmented reality.

Description

Electronic device and control method thereof
Technical Field
The present invention relates to an electronic device, a method of controlling an electronic device, a computer program, and a computer-readable recording medium, and more particularly, to an electronic device, a method of controlling an electronic device, a computer program, and a computer-readable recording medium for providing guidance related to driving of a vehicle to a user in augmented reality.
Background
When a vehicle is traveling, safe driving and the prevention of traffic accidents are paramount. To this end, vehicles are equipped with various auxiliary devices that perform functions such as controlling the posture of the vehicle and controlling vehicle structural components, as well as safety devices such as seat belts and airbags.
In addition, devices such as black boxes installed in vehicles now store driving images of the vehicle and data transmitted from various sensors, so there is a tendency to install in vehicles a device for finding the cause of an accident when a traffic accident occurs. Black-box and navigation applications can also be mounted on portable terminals such as smartphones and tablet computers, and such terminals are thus used as the vehicle devices described above.
However, such vehicle devices make very little use of the driving image. More specifically, even though a driving image of the vehicle is currently obtained through a vision sensor such as a camera mounted on the vehicle, the vehicle's electronic device merely displays and transmits the image, or generates simple peripheral notifications such as whether the vehicle has departed from its lane line.
Further, head-up displays (HUD) and augmented reality interfaces have recently been proposed as new vehicle electronic devices attracting attention, but even in these devices the use of the vehicle's driving image remains limited to simple display or the generation of simple notification information.
Disclosure of Invention
The present invention has been made in view of the above-described need, and an object of the present invention is to provide an electronic device, a control method of the electronic device, a computer program, and a computer-readable recording medium that generate an object indicating a crosswalk and output the object by augmented reality when a vehicle traveling on a road stops at an intersection or the like.
A control method of an electronic apparatus according to an embodiment of the present invention for achieving the above object includes: detecting a crosswalk from image data captured by a camera during operation of a vehicle; generating an object indicating the detected crosswalk; and outputting the generated object through augmented reality.
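The three steps of the control method (detect a crosswalk in the camera image, generate an object for it, output the object in augmented reality) can be illustrated with a minimal Python sketch. This is not the patent's implementation: the stripe-counting detector, the threshold of 5, and the dictionary-based frame/object representations are all illustrative assumptions.

```python
def detect_crosswalk(scanline):
    """Toy stripe test: count bright/dark alternations along one scanline
    of the camera image; the painted bars of a crosswalk produce many
    regular alternations. A real detector would analyze the full frame
    (the threshold of 5 is an illustrative assumption)."""
    changes = sum(1 for a, b in zip(scanline, scanline[1:]) if a != b)
    return changes >= 5

def make_crosswalk_object(style="notice"):
    """Generate an object indicating the detected crosswalk."""
    return {"type": "crosswalk", "style": style}

def output_ar(frame, obj):
    """Stand-in for compositing the generated object onto the live frame."""
    out = dict(frame)
    out["overlays"] = out.get("overlays", []) + [obj]
    return out

stripes = [1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0]  # alternating painted bars
road = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]     # plain asphalt
if detect_crosswalk(stripes):
    frame = output_ar({"camera": "front"}, make_crosswalk_object())
```

The sketch only shows the control flow of the three claimed steps; each stub would be replaced by real image analysis and rendering.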
The method of controlling an electronic device according to the present invention may further include a step of determining whether or not the vehicle is in a stopped state, and in the step of generating an object indicating the detected crosswalk, if it is determined that the vehicle is in a stopped state, a first object for recognizing that the crosswalk is located in front of the vehicle may be generated.
The method for controlling an electronic device according to the present invention may further include a step of determining signal type information from the image data using image data of a signal area portion of a traffic light, wherein in the step of generating an object indicating the detected crosswalk, a first object for recognizing that the crosswalk is located ahead of the vehicle is generated when the vehicle maintains a stopped state in a state where the signal type information is a stop signal, and a second object for warning that the crosswalk is located ahead of the vehicle is generated when the vehicle departs in the state where the signal type information is the stop signal.
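The rule above (first object while the vehicle stays stopped at a stop signal, second object if it departs against the stop signal) amounts to a small decision function. The following Python sketch is illustrative only; the signal and object names are assumptions, not the patent's terminology.

```python
def crosswalk_object_kind(signal, vehicle_departing, crosswalk_ahead):
    """Decision rule from the text: with a crosswalk ahead and the light
    showing a stop signal, a vehicle that remains stopped gets the first
    object ("notice"); a vehicle departing against the stop signal gets
    the second, warning object. Names are illustrative assumptions."""
    if not crosswalk_ahead or signal != "stop":
        return None  # the rule only covers the stop-signal case
    return "warning" if vehicle_departing else "notice"
```

For example, `crosswalk_object_kind("stop", False, True)` selects the first (notice) object, while `crosswalk_object_kind("stop", True, True)` selects the second (warning) object.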
The first object and the second object may be distinguished by different colors.
The first and second objects may be embodied in a form including an alpha channel related to color transparency, and the first and second objects may include a transparent region according to the alpha channel.
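An alpha channel lets the real road surface show through the rendered object. As a rough illustration, this Python sketch alpha-composites a single pixel of an overlay onto the camera frame; 8-bit channels and the specific colors are assumptions.

```python
def blend_pixel(overlay_rgba, road_rgb):
    """Alpha-composite one pixel of a crosswalk object over the camera
    frame: alpha = 0 leaves the road untouched (a fully transparent
    region of the object), alpha = 255 fully covers it."""
    r, g, b, a = overlay_rgba
    alpha = a / 255.0
    return tuple(round(alpha * o + (1.0 - alpha) * c)
                 for o, c in zip((r, g, b), road_rgb))
```

A fully transparent green pixel `(0, 255, 0, 0)` over gray road `(100, 100, 100)` returns the road unchanged, while an opaque one replaces it entirely; intermediate alpha values tint the road, which is what makes the overlaid object readable without hiding the scene.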
The control method of the electronic device of the present invention may further include a step of determining whether a pedestrian is present on the crosswalk using the captured image data, and a step of generating an object indicating whether the pedestrian is present.
Further, the control method of the electronic apparatus of the present invention may further include the step of: when a vehicle ahead of the vehicle departs in a state where a pedestrian is present on the crosswalk, not executing the preceding vehicle departure guidance.
Furthermore, the control method of the electronic apparatus of the present invention may further include the step of: when the vehicle departs in a state where a pedestrian is present on the crosswalk, executing guidance warning that a pedestrian is present on the crosswalk.
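The two pedestrian-related rules above (suppress preceding-vehicle departure guidance while a pedestrian is on the crosswalk, and warn if the driver's own vehicle departs while the pedestrian is present) can be condensed into one decision function. This Python sketch is illustrative; the argument names and message strings are assumptions.

```python
def guidance(pedestrian_on_crosswalk, front_vehicle_departed, own_vehicle_departed):
    """Return the guidance messages to issue under the two rules:
    - preceding-vehicle departure guidance is suppressed while a
      pedestrian is on the crosswalk;
    - departing with a pedestrian present triggers a warning."""
    msgs = []
    if front_vehicle_departed and not pedestrian_on_crosswalk:
        msgs.append("front vehicle departed")
    if own_vehicle_departed and pedestrian_on_crosswalk:
        msgs.append("warning: pedestrian on crosswalk")
    return msgs
```

So with a pedestrian present, the departure of the vehicle ahead produces no guidance at all, while the driver's own departure produces only the pedestrian warning.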
The appearance positions of the first object and the second object may be regions where crosswalks are located in the augmented reality.
Further, the outputting step may include: a step of calculating camera parameters by performing Calibration (Calibration) on the camera; generating a virtual three-Dimensional (3-Dimensional) space of the image captured by the camera based on the camera parameters; and a step of positioning the generated object in the virtual three-dimensional space.
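Once calibration has supplied the camera parameters, positioning the object in the virtual 3D space and drawing it comes down to projecting 3D coordinates onto the image. The sketch below uses a plain pinhole model; the focal length, principal point, and the 10 m / 1.5 m geometry are assumed example values, not parameters from the patent.

```python
def project_to_image(point3d, focal, cx, cy):
    """Pinhole projection from the virtual 3D space onto the camera
    image, using intrinsics (focal length, principal point) that the
    calibration step would supply. Image y axis points down."""
    x, y, z = point3d
    return (focal * x / z + cx, focal * y / z + cy)

# A crosswalk object placed 10 m ahead on the road plane (1.5 m below
# the camera, so y = +1.5 with the y axis pointing down) lands here on
# a nominal 640x480 frame:
u, v = project_to_image((0.0, 1.5, 10.0), focal=500.0, cx=320.0, cy=240.0)
```

This is why the object appears anchored to the region of the road where the crosswalk lies: its screen position is derived from a fixed 3D location rather than a fixed screen location.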
On the other hand, an electronic apparatus according to an embodiment of the present invention for achieving the above object includes: a display unit for displaying a screen; an object generating unit that generates an object indicating a pedestrian crossing detected from image data captured by a camera during vehicle operation; and a control unit configured to control the display unit to output the generated object by augmented reality.
The control unit may determine whether or not the vehicle is in a stopped state, and if it is determined that the vehicle is in a stopped state, the control unit may control the object generation unit to generate the first object for recognizing that the crosswalk is located in front of the vehicle.
The control unit may determine the signal type information using image data of a signal area portion of a traffic signal in the image data, and the control unit may control the object generating unit to: the control unit generates a first object for recognizing that the crosswalk is positioned in front of the vehicle when the vehicle maintains a stopped state in a state where the signal type information is a stop signal, and generates a second object for warning that the crosswalk is positioned in front of the vehicle when the vehicle departs in the state where the signal type information is the stop signal.
The first object and the second object may be distinguished by different colors.
The first and second objects may be embodied in a form including an alpha channel related to color transparency, and the first and second objects may include a transparent region according to the alpha channel.
The control unit may determine whether or not a pedestrian is present on the crosswalk using the captured image data, and may control the object generation unit to generate an object indicating whether the pedestrian is present.
The control unit may perform the following control: if the vehicle ahead of the vehicle departs in a state where a pedestrian is present on the crosswalk, the preceding vehicle departure guidance is not executed.
The control unit may perform the following control: when the vehicle starts in a state where a pedestrian is present on the crosswalk, guidance for warning that a pedestrian is present on the crosswalk is executed.
The appearance positions of the first object and the second object may be regions where crosswalks are located in the augmented reality.
The control unit may perform the following control: the camera parameters are calculated by calibrating the camera, and a virtual three-dimensional space of the image taken by the camera is generated based on the camera parameters, so that the generated object is located in the virtual three-dimensional space.
On the other hand, a computer program stored in a recording medium of an embodiment of the present invention for achieving the above object may perform the following steps in combination with an electronic device: detecting a crosswalk from image data captured by a camera during operation of a vehicle; generating an object indicating the detected crosswalk; and outputting the generated object through augmented reality.
On the other hand, in a computer-readable recording medium storing a computer program for executing a control method of an electronic apparatus according to an embodiment of the present invention for achieving the above object, the control method includes: detecting a crosswalk from image data captured by a camera during operation of a vehicle; generating an object indicating the detected crosswalk; and outputting the generated object through augmented reality.
According to the various embodiments of the present invention, guidance information can be displayed dynamically by an augmented reality method in a section where a crosswalk exists, thereby providing guidance to the driver effectively, drawing the driver's attention, and promoting the driver's safe and convenient operation of the vehicle.
Further, according to the various embodiments of the present invention, guidance can be provided according to whether or not a pedestrian is present on the crosswalk, thereby achieving safe driving and convenience.
Drawings
Fig. 1 is a block diagram of an electronic device according to an embodiment of the invention.
Fig. 2 is a block diagram specifically showing an augmented reality providing unit according to an embodiment of the present invention.
Fig. 3 is a diagram for explaining a network of a system connected to an electronic apparatus according to an embodiment of the present invention.
Fig. 4 is a flowchart schematically illustrating a control method of an electronic device according to an embodiment of the invention.
Fig. 5 is a flowchart specifically illustrating a control method of an electronic device according to an embodiment of the invention.
Fig. 6 is a flowchart specifically illustrating a control method of an electronic device according to still another embodiment of the invention.
Fig. 7 is a flowchart specifically illustrating a control method of an electronic device according to another embodiment of the invention.
Fig. 8 is a diagram showing an augmented reality screen on which a crosswalk object appears according to an embodiment of the present invention.
Fig. 9 is a diagram showing a texture image of a crosswalk object according to an embodiment of the present invention.
Fig. 10 is a diagram showing an augmented reality screen on which a pedestrian notification object appears according to an embodiment of the present invention.
Fig. 11 is a diagram showing an embodiment of the present invention in a case where a camera and an electronic device are separated.
Fig. 12 is a diagram showing an embodiment of the present invention in a case where a camera and an electronic apparatus are integrated.
Fig. 13 is a diagram showing an embodiment of a head-up display and an electronic apparatus according to an embodiment of the present invention.
Detailed Description
The following merely illustrates the principles of the invention. Thus, those skilled in the art to which the invention pertains may devise various arrangements that, although not explicitly described or shown herein, embody the principles of the invention and are included within its concept and scope. In addition, all conditional terms and embodiments recited herein are, in principle, expressly intended only to aid understanding of the concept of the invention, and should be understood as not limiting the invention to the embodiments and states so recited.
Moreover, all detailed descriptions that recite principles, aspects, and embodiments of the present invention, as well as particular embodiments thereof, are to be understood as including structural and functional equivalents of such items. Such equivalents should be understood to include both currently known equivalents and equivalents to be developed in the future, that is, all elements invented to perform the same function, regardless of structure.
Thus, for example, the block diagrams of the present specification should be understood to represent conceptual views of illustrative circuitry embodying the principles of the invention. Similarly, all flowcharts, state transition diagrams, pseudocode, and the like should be understood to represent various processes that may be substantially embodied in a computer-readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
The functions of the various elements shown in the drawings, including a processor and functional blocks shown in a concept similar to a processor, may be provided not only as dedicated hardware but also as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
Also, explicit use of terms such as processing, control, or similar language should not be construed to exclude hardware having the ability to execute software, and should be construed in a non-limiting sense to implicitly include Digital Signal Processor (DSP) hardware, Read Only Memory (ROM) for storing software, Random Access Memory (RAM), and non-volatile storage. Other conventional hardware may also be included as is well known.
In the claims of the present specification, elements expressed as means for performing the functions described in the detailed description include, for example, combinations of circuit elements that perform those functions, or software in any form, including firmware and microcode, combined with appropriate circuitry for executing that software so as to perform those functions. Since the invention defined by such claims combines the functions provided by the variously enumerated means in the manner the claims require, any means capable of providing those functions should be understood as equivalent to the means comprehended from this specification.
The above objects, features and advantages will become more apparent from the following detailed description with reference to the accompanying drawings, and thus, it is possible for those skilled in the art to easily implement the technical idea of the present invention. In describing the present invention, if it is determined that detailed description of known techniques may obscure the gist of the present invention, detailed description thereof will be omitted.
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the invention. Referring to fig. 1, electronic device 100 includes all or a part of storage unit 110, input unit 120, output unit 130, crosswalk detection unit 140, signal type information determination unit 150, operation state determination unit 155, augmented reality provision unit 160, control unit 170, communication unit 180, detection unit 190, and power supply unit 195.
Here, the electronic device 100 may be embodied as a smart phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), smart glasses, augmented reality glasses, a navigator (navigation), a Black box (Black-box), and the like, which can provide driving-related guidance to a driver of a vehicle in a driving state, and the electronic device 100 may be provided in the vehicle.
Here, the operation state of the vehicle may include various states of the vehicle driven by the driver, such as a stopped state of the vehicle, a traveling state of the vehicle, a parked state of the vehicle, and the like.
The vehicle driving-related guidance may include various guidance for assisting the driver in driving the vehicle, such as navigation, lane line departure guidance, preceding vehicle departure guidance, signal light change guidance, collision prevention guidance with a preceding vehicle, lane change guidance, lane guidance, and the like.
Here, the navigation may include: augmented reality navigation, which performs navigation by combining various information, such as the user's position and direction, with an image captured ahead of the traveling vehicle; and two-dimensional (2D) or three-dimensional (3D) navigation, which performs navigation by combining such information with 2D or 3D map data. Here, navigation may be interpreted to include not only navigation while the user drives a vehicle but also navigation while the user moves by walking or running.
The lane line departure guide may be a guide for guiding whether or not the traveling vehicle departs from the lane line.
The preceding vehicle departure guidance may be guidance as to whether or not a vehicle located in front of the stopped vehicle has departed.
The traffic light change guidance may be guidance as to whether or not the signal of a traffic light located in front of the stopped vehicle has changed. For example, when a red light indicating a stop signal changes to a green light indicating a proceed signal, this change can be announced.
The guidance for preventing a collision with the preceding vehicle may be guidance for preventing a collision with the preceding vehicle if the distance between the vehicle in a stopped state or a traveling state and the vehicle located in front is within a predetermined distance.
The lane change guidance may be a guidance for guiding the vehicle to change from the lane where the vehicle is located to another lane in order to guide the route to the destination.
The lane guidance may be guidance for a lane on which the vehicle is currently located.
Such driving-related images that can provide various guidance can be photographed in real time by a camera placed toward the front of the vehicle. Here, the camera may be a camera that is integrally formed with the electronic device 100 placed in the vehicle and photographs the front of the vehicle. In this case, the camera may be integrated with the smartphone, the navigator or the black box, and the electronic device 100 may receive an image photographed by the integrated camera.
As another example, the camera may be a camera that is placed in a vehicle separately from the electronic device 100 and that captures an image of the front of the vehicle. In this case, the camera may be a separate black box placed toward the front of the vehicle, and the electronic device 100 may receive the image photographed by the separately placed black box through wired/wireless communication, or if a storage medium for storing the image photographed by the black box is inserted into the electronic device 100, the electronic device 100 may receive the image photographed by the black box.
Hereinafter, the electronic device 100 according to an embodiment of the invention will be described in more detail based on the above description.
The storage unit 110 performs a function of storing various data and applications necessary for the operation of the electronic apparatus 100. In particular, the storage unit 110 may store data required for the operation of the electronic device 100, such as an Operating System (OS), a route search application, map data, and the like. The storage unit 110 may store data generated by the operation of the electronic device 100, such as the searched route data and the received video.
The storage unit 110 may be embodied not only as a built-in storage element such as a Random Access Memory (RAM), a flash memory, a Read Only Memory (ROM), an Erasable Programmable ROM (EPROM), an Electrically Erasable Programmable ROM (EEPROM), a register, a hard disk, a removable disk, a memory card, or a Universal Subscriber Identity Module (USIM), but also as a removable storage element such as a Universal Serial Bus (USB) memory.
The input unit 120 performs a function of converting a physical input from the outside of the electronic device 100 into a specific electric signal. Wherein the input portion 120 may include all or a portion of the user input portion 121 and the microphone portion 123.
The user input unit 121 can receive user input such as touch and push operations. Here, the user input unit 121 may be embodied by at least one of various button configurations, a touch sensor that receives a touch input, and a proximity sensor that receives a proximity operation.
The microphone unit 123 can receive the voice of the user and the voice generated from the inside and outside of the vehicle.
The output unit 130 is a device for outputting data of the electronic device 100. Among them, the output part 130 may include all or a part of the display part 131 and the audio output part 133.
The display unit 131 is a device that outputs data from the electronic device 100 in a visually recognizable form. The display portion 131 may be embodied as a display provided on the front of the housing of the electronic device 100. The display unit 131 may be integrated with the electronic device 100 to output visual identification data, or may be provided separately from the electronic device 100, like a head-up display, to output visual identification data.
The audio output unit 133 is a device that outputs data from the electronic device 100 in an audibly recognizable form. The audio output unit 133 may announce the data that the electronic device 100 needs to notify to the user as sound through a speaker.
The communication unit 180 may be provided to communicate the electronic apparatus 100 with another device. The communication section 180 may include all or a part of the location data section 181, the wireless internet section 183, the broadcast transmitting/receiving section 185, the mobile communication section 186, the short-range communication section 187, and the wired communication section 189.
The position data unit 181 is a device that obtains position data through a Global Navigation Satellite System (GNSS). GNSS means a navigation system that can calculate the position of a receiving terminal using radio signals received from satellites. Specific examples of GNSS include, depending on the operator, the Global Positioning System (GPS), Galileo, the Global Orbiting Navigational Satellite System (GLONASS), COMPASS, the Indian Regional Navigation Satellite System (IRNSS), the Quasi-Zenith Satellite System (QZSS), and the like.
The wireless internet connectable through the wireless internet unit 183 may be a Wireless Local Area Network (WLAN), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
The broadcast systems that can be transmitted and received through the broadcast transmitting and receiving section 185 may be Digital Multimedia Broadcasting Terrestrial (DMBT), Digital Multimedia Broadcasting Satellite (DMBS), Media Forward Link Only (MediaFLO), Digital Video Broadcasting Handheld (DVBH), Integrated Services Digital Broadcasting Terrestrial (ISDBT), and the like. The broadcast signals transmitted and received through the broadcast transmitting and receiving section 185 may include traffic data, living data, and the like.
The mobile communication section 186 can access and communicate with a mobile communication network according to various mobile communication specifications such as the third generation mobile communication technology (3G), the 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), and the like.
The short-range communication unit 187 is a device for performing short-range communication. As described above, the short-range communication unit 187 can perform communication via Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), and the like.
The wired communication unit 189 is an interface device that can connect the electronic apparatus 100 to other devices in a wired manner. The wired communicator 189 may be a universal serial bus module that can communicate through a universal serial bus Port (USB Port).
The communication unit 180 can communicate with other devices using at least one of the location data unit 181, the wireless internet unit 183, the broadcast transmitting/receiving unit 185, the mobile communication unit 186, the short-range communication unit 187, and the wired communication unit 189.
For example, when the electronic device 100 does not include an imaging function, an image captured by a vehicle camera such as a black box can be received by at least one of the near field communication unit 187 and the wired communication unit 189.
As another example, when communicating with a plurality of apparatuses, one apparatus may communicate via the short-range communication unit 187, and the other apparatus may communicate via the wired communication unit 189.
The detection unit 190 is a device that can detect the current state of the electronic device 100. The detection section 190 may include all or a part of the motion detection section 191 and the light detection section 193.
The motion detector 191 may detect motion of the electronic device 100 in three-dimensional space. The motion detecting unit 191 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. By combining the motion data obtained by the motion detection unit 191 with the position data obtained by the position data unit 181, a more accurate trajectory of the vehicle to which the electronic device 100 is attached can be calculated.
The light detector 193 measures the ambient illuminance (illuminance) of the electronic device 100. Using the illuminance data obtained by the light detection unit 193, the brightness of the display unit 131 can be changed in accordance with the ambient brightness.
The power supply unit 195 supplies power necessary for the operation of the electronic apparatus 100 or the operation of another device connected to the electronic apparatus 100. The power supply unit 195 may receive power from a battery built into the electronic device 100 or from an external power source such as the vehicle. Also, depending on the form in which power is received, the power supply part 195 may be embodied as the wired communication module 189 or as a device that receives power wirelessly.
The crosswalk detection unit 140 can detect crosswalks from image data captured by the camera. Specifically, the crosswalk detection unit 140 may determine a region of interest containing a crosswalk in the image data by using the vanishing point (vanishing point) of the captured image. The vanishing point may be determined by extracting lane lines from image data captured by the camera while the vehicle is operating, and calculating the point where extensions of the extracted lane lines intersect. Since a crosswalk is formed on the road in the area below the vanishing point, the crosswalk detecting part 140 may determine the area below the vanishing point as the region of interest.
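The vanishing-point step described above can be sketched as follows: extend two detected lane lines, take their intersection, and treat the image band below that point as the crosswalk region of interest. The line representation, the intersection formula, and the ROI split are illustrative assumptions; the patent does not specify the algorithm.

```python
def line_intersection(l1, l2):
    """Each line is ((x1, y1), (x2, y2)); returns the intersection point,
    or None if the lines are parallel (no vanishing point)."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel lane lines
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
    return (px, py)

def crosswalk_roi(image_height, vanishing_point):
    """Region of interest: the image rows below the vanishing point."""
    _, vy = vanishing_point
    return (int(vy), image_height)  # (top_row, bottom_row)
```

For example, two lane lines converging from the bottom corners of a 640x480 frame toward the center yield a vanishing point near mid-frame, and everything below it becomes the crosswalk search region.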
On the other hand, the crosswalk detection unit 140 may determine whether or not the crosswalk is located in the region of interest by performing image processing on the video data of the identified region of interest.
However, unlike the above-described embodiment, according to another embodiment of the present invention, the crosswalk detection section 140 can detect the crosswalk without using image data captured by a camera. For example, the crosswalk detection unit 140 may determine whether the vehicle is currently located at a crosswalk using the current position information of the vehicle determined by the position data unit 181 and the crosswalk positions in the map data stored in the storage unit 110. Alternatively, for a more accurate determination, the crosswalk detection unit 140 may determine whether the vehicle is currently located at a crosswalk by taking the camera-captured image data, the map data, and the position data into consideration together.
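The map-data fallback just described amounts to comparing the vehicle's GNSS position against stored crosswalk coordinates. A minimal sketch, in which the 10 m threshold and the (latitude, longitude) data layout are assumptions rather than the patent's specification:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_crosswalk(vehicle_pos, crosswalks, threshold_m=10.0):
    """True if the vehicle is within threshold_m of any stored crosswalk."""
    lat, lon = vehicle_pos
    return any(haversine_m(lat, lon, clat, clon) <= threshold_m
               for clat, clon in crosswalks)
```

In practice such a check would be fused with the camera-based detection, as the text notes, since GNSS alone can be off by several meters in urban canyons.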
The signal type information determination unit 150 may determine the signal type information using image data of the signal area portion of the traffic light in the image data captured by the camera. Specifically, the signal type information determination unit 150 may specify a region of interest including a traffic light in the image data by using the vanishing point of the captured image, and generate region-of-interest image data by converting the image data of the specified region with reference to a preset pixel value. Since a traffic light is located in the area above the vanishing point, the signal type information determination unit 150 may determine the area above the determined vanishing point as the region of interest.
On the other hand, the signal type information determination unit 150 may detect the image data of the signal area portion of the traffic light from the specified region-of-interest image data, and may determine the signal type information based on the image data of that signal area portion.
The signal type information is information for identifying a plurality of signals that can be displayed on the signal lamp, and the signal type information may include stop signal information, straight signal information, left turn signal, right turn signal, and the like.
The operation state determination part 155 may determine the operation state of the vehicle, such as whether the vehicle is stopped, driving, or parked. Specifically, the driving state determination unit 155 may determine whether the vehicle is in a stopped state using image data captured by the camera. More specifically, the driving state determination unit 155 may generate grayscale image data from the image data, and sequentially compare a plurality of frames of the generated grayscale image data in time order to determine whether the vehicle is in a stopped state. However, the present invention is not limited thereto; the driving state determination unit 155 may determine whether the vehicle is in a stopped state based on the signal detected by the detection unit 190 and the motion data obtained by the position data unit 181, or may determine the driving state of the vehicle based on real-time speed information of the vehicle obtained via the vehicle's Controller Area Network (CAN) communication.
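The grayscale frame-comparison idea above can be sketched as follows: if consecutive frames barely change, the vehicle is assumed to be stopped. The mean-absolute-difference metric and the threshold value are illustrative assumptions; the patent only says frames are compared in time order.

```python
def is_stopped(frames, threshold=2.0):
    """frames: list of equal-size 2-D grayscale arrays (lists of lists).
    Returns True if no consecutive pair differs by more than `threshold`
    mean absolute pixel difference."""
    for prev, curr in zip(frames, frames[1:]):
        total = sum(abs(p - c)
                    for prow, crow in zip(prev, curr)
                    for p, c in zip(prow, crow))
        count = len(prev) * len(prev[0])
        if total / count > threshold:
            return False  # significant inter-frame motion detected
    return True
```

A production system would, as the text notes, cross-check this against motion-sensor data or CAN-bus speed, since scene motion (e.g., a passing truck) can mimic ego-motion.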
On the other hand, the electronic device 100 according to an embodiment of the present invention may include an augmented reality provider 160 for providing an augmented reality view mode. In this regard, a detailed description will be made with reference to fig. 2.
Fig. 2 is a block diagram specifically showing the augmented reality provider 160 according to an embodiment of the present invention. Referring to fig. 2, the augmented reality providing part 160 may include all or a part of the calibration part 161, the three-dimensional space generating part 162, the object generating part 163, and the mapping part 164.
The calibration unit 161 can perform calibration to estimate camera parameters corresponding to the camera from an image captured by the camera. The camera parameters are the parameters that form the camera matrix, and the camera matrix is the information that represents how real space is mapped onto a photograph.
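The camera-matrix relationship mentioned above can be illustrated with the standard pinhole model: a 3-D point in camera coordinates maps to a pixel through the intrinsic parameters (focal lengths and principal point). The specific parameter values below are made up for illustration; the patent does not disclose how the calibration unit estimates them.

```python
def project(point3d, fx, fy, cx, cy):
    """Project (X, Y, Z) in camera space to pixel (u, v) using the
    pinhole intrinsic matrix K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]."""
    x, y, z = point3d
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)
```

A point on the optical axis lands exactly on the principal point, and points further to the side or closer to the camera land proportionally further from it, which is precisely the "real space mapped onto a photograph" relationship the camera matrix encodes.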
The three-dimensional space generating unit 162 may generate a virtual three-dimensional space based on a captured image captured by a camera. Specifically, the three-dimensional space generating unit 162 may obtain depth information (depth information) from a video captured by the camera based on the camera parameters estimated by the calibrating unit 161, and generate a virtual three-dimensional space based on the obtained depth information and the captured video.
The object generating unit 163 may generate an object for guiding in augmented reality, for example, a navigation object, a lane change guide object, a lane line deviation guide object, a pedestrian crossing object, a pedestrian guide object, or the like. Wherein the object can be embodied as a three-dimensional object, a texture image or an art line, etc.
The mapping unit 164 may map the object generated by the object generating unit 163 on the virtual three-dimensional space generated by the three-dimensional space generating unit 162.
On the other hand, the control unit 170 controls the overall operation of the electronic device 100. Specifically, the control unit 170 may control all or a part of the storage unit 110, the input unit 120, the output unit 130, the crosswalk detection unit 140, the signal type information determination unit 150, the operation state determination unit 155, the augmented reality provision unit 160, the communication unit 180, the detection unit 190, and the power supply unit 195.
In particular, the control unit 170 controls the object generating unit 163 as follows: when a crosswalk is detected from image data captured by a camera while the vehicle is running, an object representing the detected crosswalk is generated, and the control unit 170 may control to display the generated object in augmented reality.
For example, when the vehicle is in the stopped state as a result of the determination by the driving state determination unit 155 and a crosswalk exists in front of the vehicle as a result of the determination by the crosswalk detection unit 140, the control unit 170 may control the object generation unit 163 to generate the first object indicating the crosswalk. Further, the control unit 170 may control to output the generated object by augmented reality. The first object may be an object for allowing the driver to recognize that a crosswalk exists in front of the vehicle.
As another example, when the determination result of the driving state determination unit 155 is that the vehicle is in a stopped state, the determination result of the signal type information determination unit 150 is that the signal is a stop signal, and the determination result of the crosswalk detection unit 140 is that a crosswalk is present in front of the vehicle, the control unit 170 may control the object generation unit 163 to generate the first object indicating the crosswalk while the vehicle maintains the stopped state. Further, the control unit 170 may perform control to output the generated first object through augmented reality. The first object may be an object for allowing the driver to recognize that a crosswalk exists in front of the vehicle.
However, when the vehicle starts moving from the stopped state while the stop signal is active, the control unit 170 may control the object generation unit 163 to generate the second object. Further, the control unit 170 may perform control to output the generated second object through augmented reality. Here, the second object may be an object for warning the driver that a crosswalk is present in front of the vehicle.
Wherein a first object for making the driver recognize that a crosswalk is present in front of the vehicle and a second object for warning the driver that a crosswalk is present in front of the vehicle are distinguishable from each other.
Specifically, the first object and the second object may be distinguished by colors different from each other. For example, the first object may appear on the augmented reality screen in white similar to the color of a crosswalk of the real world, and the second object may appear on the augmented reality screen in a color that causes the driver to recognize a dangerous state, for example, red.
Also, the first and second objects may be embodied in a form including an Alpha channel related to color transparency, for example, in RGBA (Red, Green, Blue, Alpha), in which case the first and second objects may include a transparent region according to the Alpha channel. That is, the first object and the second object may be embodied to include a region where colors appear and a transparent region to correspond to a pedestrian crossing of the real world.
The appearance positions of the first object and the second object may be areas where crosswalks are located in augmented reality. For example, when the crosswalk detection unit 140 detects a crosswalk from the image data captured by the camera, the control unit 170 may control the mapping unit 164 so that the first object and the second object appear at positions within the augmented reality screen corresponding to the positions where the crosswalk is detected. As such, according to an embodiment of the present invention, an object that can be used to represent a crosswalk appears to be located on the crosswalk of the augmented reality screen, whereby guidance can be provided to the driver in a more intuitive manner.
On the other hand, according to an embodiment of the present invention, the notification corresponding to the presence of the pedestrian on the crosswalk may be performed by determining whether or not the pedestrian is present.
Specifically, the control unit 170 may control the object generation unit 163 as follows: the presence or absence of a pedestrian on the crosswalk is determined by using the captured image data, and a third object indicating whether a pedestrian is present is generated. Further, the control unit 170 may perform control to output the generated third object through augmented reality. Here, the third object may be an object for making the driver recognize that a pedestrian is present on the crosswalk.
Further, if the vehicle ahead of the current vehicle starts while a pedestrian is present on the crosswalk, the control unit 170 may perform control not to execute the preceding-vehicle departure guidance. That is, if the preceding-vehicle departure guidance were executed while a pedestrian is present on the crosswalk, the current vehicle might collide with the pedestrian; in this case, the control unit 170 may perform control not to execute that guidance.
Further, if the current vehicle departs from the stopped state while a pedestrian is present on the crosswalk, the control unit 170 may perform control to execute guidance warning that a pedestrian is present on the crosswalk. That is, if the current vehicle departs from the stopped state while a pedestrian is present on the crosswalk, the current vehicle might collide with the pedestrian; in this case, the control unit 170 may perform control to execute guidance warning that the vehicle should not depart.
On the other hand, according to the above examples, the description has been given taking as an example the case where the guidance of the pedestrian crossing, the pedestrian, or the like is shown in the form of an image on the augmented reality screen, but the present invention is not limited to this. Therefore, according to another embodiment of the present invention, the control part 170 may control the audio output part 133 to output guidance in sound.
Fig. 3 is a diagram for explaining a network of a system connected to an electronic apparatus according to an embodiment of the present invention. Referring to fig. 3, the electronic device 100 according to an embodiment of the present invention may be embodied as a navigator, a black box, a smart phone, or other augmented reality interface providing device for a vehicle, which is disposed in the vehicle, and may be connected to various communication networks and other electronic devices 61, 62, 63, and 64.
The electronic device 100 can also calculate the current position and the current time by linking with the global positioning system based on the radio wave signal received from the artificial satellite 20.
Each of the satellites 20 may transmit L-band frequencies in different frequency bands. The electronic device 100 may calculate its current position based on the time required for the L-band frequency transmitted from each satellite 20 to reach the electronic device 100.
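The time-of-flight idea above reduces to a simple relation: the range to each satellite is the signal's travel time multiplied by the speed of light. This back-of-envelope sketch omits receiver clock bias and the multi-satellite trilateration solve that real GNSS receivers perform.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def pseudorange_m(travel_time_s):
    """Distance implied by an L-band signal's measured travel time."""
    return C * travel_time_s
```

A typical GPS satellite at roughly 20,000 km altitude implies a travel time on the order of 70 ms, so even a microsecond of timing error shifts the computed range by about 300 m, which is why satellite clocks are atomic and the receiver's clock offset is solved as a fourth unknown.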
On the other hand, the electronic device 100 can be wirelessly connected to the network 30 through the communication unit 180 by means of the control station (ACR) 40, the base station (RAS) 50, and the like. When the electronic apparatus 100 is connected to the network 30, it may also connect indirectly to other electronic devices 61 and 62 connected to the network 30 and exchange data with them.
On the other hand, the electronic apparatus 100 may also be connected to the network 30 in an indirect manner through the other device 63 having a communication function. For example, in the case where the electronic apparatus 100 does not have a module connectable to the network 30, communication with the other device 63 having a communication function can be performed by near field communication or the like.
Fig. 4 is a flowchart schematically illustrating a control method of an electronic device according to an embodiment of the invention. Referring to fig. 4, the electronic device 100 may detect a crosswalk from image data captured by a camera during operation of a vehicle (step S101). Here, the camera may be a camera integrally formed with the electronic device 100 mounted to the vehicle to photograph the front of the vehicle, or the camera may be a separate black box mounted toward the front of the vehicle.
Further, the electronic apparatus 100 may generate an object indicating the detected crosswalk (step S102). The object can be embodied in the form of a three-dimensional object, a texture image, an artistic line, or the like.
Also, the electronic apparatus 100 may output the generated object through augmented reality (step S103). The outputting step (step S103) may include: a step of calibrating the camera to calculate camera parameters; a step of generating a virtual three-dimensional space for the image captured by the camera on the basis of the camera parameters; and a step of locating the generated object in the virtual three-dimensional space.
Hereinafter, the control method of the electronic apparatus 100 will be described in more detail with reference to fig. 5 to 7.
Fig. 5 is a flowchart specifically illustrating a control method of an electronic device according to an embodiment of the invention. Referring to fig. 5, the electronic device 100 may determine whether the vehicle is in a parking state (step S201). The determination of whether the vehicle is in a parked state may be performed by the operating state determining unit 155.
When the vehicle is in the stopped state, the electronic device 100 may detect a crosswalk from the image data captured by the camera during vehicle operation (step S202). The detection of the crosswalk can be performed by the crosswalk detection unit 140.
If the crosswalk is detected, the electronic apparatus 100 may generate a first object for recognizing that the crosswalk is located in front of the vehicle (step S203).
Also, the electronic apparatus 100 may output the generated first object through augmented reality (step S204). In this case, the control unit 170 may control the mapping unit 164 so that the generated first object appears at a position close to the current vehicle in the augmented reality screen.
With this, the first object can appear at a position near the front of the current vehicle in the road area of the augmented reality screen, and the driver can easily recognize that there is a crosswalk near the front of the current vehicle.
Fig. 6 is a flowchart specifically illustrating a control method of an electronic device according to still another embodiment of the invention. Referring to fig. 6, the electronic device 100 may determine whether the vehicle is in a parking state (step S301). The determination of whether the vehicle is in a stopped state may be performed by the operating state determining unit 155.
If it is determined that the vehicle is in the parking state, the electronic device 100 may determine the signal type information using the image data of the signal area portion of the signal lamp in the image data (step S302). The signal type information determining unit 150 may be used to determine the signal type.
If the signal type information is determined to be the stop signal, the electronic device 100 may detect the crosswalk from the image data captured by the camera during the vehicle operation (step S303). The detection of the crosswalk can be performed by the crosswalk detection unit 140.
If the vehicle maintains the stopped state while the signal type information is the stop signal, the electronic device 100 may generate a first object for recognizing that the crosswalk is located in front of the vehicle (step S304).
When the vehicle starts in a state where the signal type information is the stop signal, the electronic device 100 may generate a second object for warning that the crosswalk is located in front of the vehicle (step S305).
Also, the electronic apparatus 100 may output the generated object through augmented reality (step S306). The first object and the second object generated may be displayed in a form that is distinguished from each other in order to provide different guidance to the driver.
Therefore, the driver can easily recognize not only the presence of a crosswalk in the vicinity of the front of the current vehicle but also whether the vehicle can start.
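The decision flow of fig. 6 can be condensed into a single function: a stop signal plus a detected crosswalk yields the informational first object while the vehicle stays stopped, or the warning second object if the vehicle starts moving. The return labels are illustrative names, not the patent's identifiers.

```python
def choose_object(signal_is_stop, crosswalk_detected, vehicle_moving):
    """Select which guidance object to render, per the Fig. 6 flow."""
    if not (signal_is_stop and crosswalk_detected):
        return None  # no crosswalk guidance applies
    if vehicle_moving:
        return "second_object_warning"  # started during a stop signal
    return "first_object_notice"        # stopped: informational only
```

Keeping the decision in one pure function makes each branch of the flowchart directly testable against the steps S301 through S306.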
Fig. 7 is a flowchart specifically illustrating a control method of an electronic device according to another embodiment of the invention. Referring to fig. 7, the electronic device 100 may determine whether the vehicle is in a parking state (step S401). The determination of whether the vehicle is in a parked state may be performed by the operating state determining unit 155.
If it is determined that the vehicle is in the stopped state, the electronic apparatus 100 may detect a crosswalk from image data captured by the camera while the vehicle is running (step S402). The detection of the crosswalk can be performed by the crosswalk detection unit 140.
When the crosswalk is detected, the electronic apparatus 100 may determine whether or not a pedestrian is present on the crosswalk using the captured image data (step S403).
Also, the electronic device 100 may generate an object indicating whether or not a pedestrian is present (step S404).
Also, the electronic apparatus 100 may output the generated object through augmented reality (step S405). Therefore, the driver can easily recognize that the pedestrian is walking near the front of the current vehicle.
On the other hand, according to the present invention, if the preceding vehicle of the current vehicle starts in a state where there is a pedestrian on the crosswalk, the electronic device 100 may control not to execute the preceding vehicle start guidance.
Also, according to the present invention, if the current vehicle departs from the parking state in a state where a pedestrian is present on the crosswalk, the electronic device 100 may be controlled to execute guidance for warning of the presence of a pedestrian on the crosswalk.
Fig. 8 is a diagram showing an augmented reality screen on which a crosswalk object appears according to an embodiment of the present invention. Part (a) of fig. 8 is a diagram showing an augmented reality screen in a case where the vehicle stops behind the crosswalk during the stop signal 810. Referring to part (a) of fig. 8, the electronic apparatus 100 generates a first object 801, the first object 801 representing a crosswalk located in front of a vehicle, and the electronic apparatus 100 may output the generated first object 801 through augmented reality. Here, the first object 801 may appear at a position near the front of the current vehicle in the road area of the augmented reality screen, whereby the driver can easily recognize that there is a crosswalk near the front of the current vehicle.
On the other hand, part (b) of fig. 8 is a diagram showing an augmented reality screen in a case where the vehicle moves from a stopped state during the stop signal 810. Referring to part (b) of fig. 8, the electronic device 100 generates a second object 802 for warning the driver of the presence of a crosswalk in front of the vehicle, and the electronic device 100 may output the second object 802 through augmented reality. Here, the second object 802 may appear at a position near the front of the current vehicle in the road area of the augmented reality screen, and may be embodied in a different color so as to be distinguished from the first object 801. Thus, the driver can easily recognize that the vehicle should not start now.
On the other hand, the first object and the second object can be embodied as texture images and are displayed through augmented reality. In this regard, it will be specifically explained with reference to fig. 9.
Referring to fig. 9, a first object 801 for allowing a driver to recognize the presence of a crosswalk in front of a vehicle may appear on an augmented reality screen in white similar to the color of a real-world crosswalk, and a second object 802 for warning the driver of the presence of a crosswalk in front of the vehicle may appear on the augmented reality screen in a color for allowing the driver to recognize a dangerous state, for example, in red.
Here, the first and second objects 801 and 802 may be embodied to include color regions 801-1 and 802-1 and transparent regions 801-2 and 802-2, so as to correspond to a real-world crosswalk. In this case, the transparency of the transparent regions 801-2 and 802-2 may be adjusted by changing the A value, i.e., the Alpha channel value, of RGBA (Red, Green, Blue, Alpha). In one embodiment of the present invention, the Alpha channel value is a value between 0.0 (fully transparent) and 1.0 (fully opaque). Although an embodiment of the present invention uses RGBA values to express colors, other color representations that include an alpha channel for transparency, such as HSLA (Hue, Saturation, Lightness, Alpha), may be used instead.
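The RGBA coloring just described can be sketched as alternating stripes: colored regions (white-ish for the first object, red for the second) interleaved with fully transparent gaps driven by an alpha value of 0.0, mimicking real crosswalk markings. The exact color and alpha values below are assumptions for illustration.

```python
FIRST_OBJECT_STRIPE = (1.0, 1.0, 1.0, 0.8)   # near-white, mostly opaque
SECOND_OBJECT_STRIPE = (1.0, 0.0, 0.0, 0.8)  # warning red, mostly opaque
TRANSPARENT_GAP = (0.0, 0.0, 0.0, 0.0)       # alpha 0.0 = fully transparent

def crosswalk_pattern(stripe_rgba, n_stripes=5):
    """Alternate colored stripes with transparent gaps, so the rendered
    object matches the zebra pattern of a real-world crosswalk."""
    pattern = []
    for i in range(2 * n_stripes - 1):
        pattern.append(stripe_rgba if i % 2 == 0 else TRANSPARENT_GAP)
    return pattern
```

Because the gaps are fully transparent, the live camera image shows through between stripes, which is what lets the overlay sit visually "on" the real road surface.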
Fig. 10 is a diagram showing an augmented reality screen on which a pedestrian notification object appears according to an embodiment of the present invention. Referring to fig. 10, the electronic device 100 may perform the following control: when a pedestrian 1002 is present on the crosswalk, a third object 1001 for guiding attention to the pedestrian is generated, and the generated third object 1001 is output through augmented reality. With this, the driver can recognize that a pedestrian is currently walking in front of the vehicle, and can easily recognize that the vehicle should not depart now.
Fig. 11 is a diagram showing an embodiment of a navigator device according to an embodiment of the present invention in a case where an imaging unit is not provided. Referring to fig. 11, the navigator device 100 for a vehicle and the black box 200 for a vehicle, which are separately provided, may constitute a system according to an embodiment of the present invention using a wired/wireless communication method.
The navigator 100 for a vehicle may include: a display unit 131 provided on the front surface of the housing 191 of the navigator; navigator operation keys 121; and a navigator microphone 123.
The black box 200 for a vehicle may include a black box camera 222, a black box microphone 224, and an attachment portion 281.
Fig. 12 is a diagram showing an embodiment of a navigator device according to an embodiment of the present invention in a case where an imaging unit is provided. Referring to fig. 12, in the case where the navigator device 100 includes the image pickup section 125, the user can set the navigator device 100 such that the image pickup section 125 of the navigator device 100 picks up an image of the front of the vehicle and the display section of the navigator device 100 can recognize the user. Thus, the system of an embodiment of the invention can be embodied.
Fig. 13 is a diagram showing an embodiment of a head-up display and an electronic apparatus according to an embodiment of the present invention. Referring to fig. 13, the head-up display may display an augmented reality guidance picture on the head-up display through wired/wireless communication with other devices.
For example, augmented reality can be provided by a head-up display using a front windshield of a vehicle, image superimposition using another image output device, or the like, and thus the augmented reality provider 160 can generate a real image, an interface image superimposed on the glass, or the like. Therefore, the augmented reality navigator or the vehicle infotainment system and the like can be embodied.
On the other hand, the control method of the electronic apparatus of the various embodiments of the present invention described above may be embodied in a program to be provided to a server or a device. In this way, each device can be connected to a server or an apparatus storing the program to download the program.
The control methods of the electronic apparatus according to the various embodiments of the present invention described above may be implemented as programs, and may be provided by being stored in various non-transitory readable media. The non-transitory readable medium does not mean a medium that stores data for a short time, such as a register, a cache, a memory, and the like, but means a medium that stores data semi-permanently and can be read (reading) by a device. Specifically, the above-mentioned various applications or programs can be provided by a non-transitory readable medium such as a Compact Disc (CD), a Digital Versatile Disc (DVD), a hard disk, a blu-ray disc, a universal serial bus, a memory card, a read only memory, and the like.
While the preferred embodiments of the present invention have been illustrated and described, the present invention is not limited to the specific embodiments described above, and various modifications can be made by those skilled in the art without departing from the spirit of the present invention as claimed, and these modifications should not be construed as departing from the technical spirit or the scope of the present invention.

Claims (15)

1. A method of controlling an electronic device, comprising:
detecting a crosswalk from image data captured by a camera during operation of a vehicle;
determining whether a pedestrian is present on the pedestrian crossing;
generating an object indicating the detected pedestrian crossing and indicating whether or not the pedestrian is present; and
a step of outputting the generated object through augmented reality,
wherein, the method further comprises:
a step of judging signal type information by using the image data of the signal area part of the signal lamp in the image data,
wherein, in the step of generating the object indicating the detected crosswalk and indicating whether the pedestrian is present,
a first object for making the driver recognize that the crosswalk is located in front of the vehicle is generated when the vehicle remains stopped while the signal type information is a stop signal,
a second object for warning the driver that the crosswalk is located in front of the vehicle is generated when the vehicle departs while the signal type information is a stop signal, and
a third object for making the driver recognize that the pedestrian is present on the crosswalk is generated,
wherein the first object, the second object, and the third object are displayed in forms distinguished from one another,
wherein the step of outputting the generated object through augmented reality includes:
a step of determining a mapping position of the generated object in a virtual three-dimensional space of an image captured by the camera; and
a step of displaying the object by mapping it, through augmented reality, to the virtual three-dimensional space based on the determined mapping position,
wherein the mapping positions of the first object and the second object are in the vicinity of the front of the vehicle so that the user recognizes that the crosswalk is in front of the vehicle,
wherein the method further comprises:
a step of not executing front-vehicle departure guidance when a vehicle ahead of the vehicle departs while the pedestrian is present on the crosswalk.
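Read as an algorithm, the object-generation conditions of claim 1 amount to a small decision table over the signal state, the vehicle's motion, and pedestrian presence. The sketch below only illustrates that logic; the names (`Overlay`, `select_overlays`) are hypothetical and not part of the claimed implementation:

```python
from enum import Enum, auto

class Overlay(Enum):
    FIRST = auto()   # crosswalk ahead, vehicle stopped at a stop signal
    SECOND = auto()  # warning: vehicle departing while the signal is a stop signal
    THIRD = auto()   # warning: a pedestrian is present on the crosswalk

def select_overlays(signal_is_stop, vehicle_departing, pedestrian_present,
                    front_vehicle_departing=False):
    """Return (overlays to generate, whether to run front-vehicle
    departure guidance), following the conditions recited in claim 1."""
    overlays = []
    if signal_is_stop:
        overlays.append(Overlay.SECOND if vehicle_departing else Overlay.FIRST)
    if pedestrian_present:
        overlays.append(Overlay.THIRD)
    # Last clause of claim 1: front-vehicle departure guidance is not
    # executed while a pedestrian is on the crosswalk.
    run_departure_guidance = front_vehicle_departing and not pedestrian_present
    return overlays, run_departure_guidance
```

For example, a stop signal with a departing vehicle and a pedestrian on the crosswalk yields the second and third objects and suppresses departure guidance.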
2. The method of controlling an electronic device according to claim 1,
further comprising a step of determining whether the vehicle is in a stopped state,
wherein, in the step of generating the object indicating the detected crosswalk, the first object for making the driver recognize that the crosswalk is located in front of the vehicle is generated if it is determined that the vehicle is in the stopped state.
3. The method of claim 1, wherein the first object and the second object are distinguished by different colors.
4. The method of controlling an electronic device according to claim 1, wherein
the first object and the second object are embodied in a form including an alpha channel related to color transparency,
the first object and the second object include a transparent region according to the alpha channel.
5. The method of controlling an electronic device according to claim 1, further comprising a step of controlling execution of guidance warning that a pedestrian is present on the crosswalk if the vehicle departs while the pedestrian is present on the crosswalk.
6. The method of controlling an electronic device according to claim 1, wherein the appearance positions of the first object and the second object are areas where the crosswalk is located in the augmented reality.
7. The method of controlling an electronic device according to claim 1, wherein the step of outputting the generated object through augmented reality further comprises:
a step of calculating camera parameters by calibrating the camera;
a step of generating the virtual three-dimensional space of the image captured by the camera based on the camera parameters; and
a step of positioning the generated object in the virtual three-dimensional space.
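The camera-parameter step in claim 7 corresponds to estimating pinhole-camera intrinsics and using them to map a point of the virtual three-dimensional space onto the image plane. A minimal sketch, assuming an already-calibrated intrinsic matrix (the values of `K` are made-up example intrinsics, not taken from the patent):

```python
import numpy as np

def project_to_image(K, point_cam):
    """Project a 3D point given in camera coordinates (X right, Y down,
    Z forward, in metres) onto the image plane of a pinhole camera."""
    X, Y, Z = point_cam
    if Z <= 0:
        raise ValueError("point must lie in front of the camera")
    u, v, w = K @ np.array([X, Y, Z])
    return u / w, v / w

# Hypothetical intrinsics: focal lengths fx = fy = 1000 px,
# principal point (640, 360) for a 1280x720 frame.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# A crosswalk overlay anchored about 10 m ahead of the vehicle and
# 1.2 m below the camera axis maps near the bottom-centre of the frame.
u, v = project_to_image(K, (0.0, 1.2, 10.0))
```

In a full pipeline the intrinsics would come from the calibration step and the anchor point from the crosswalk detection step; both are stubbed here for illustration.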
8. An electronic device, comprising:
a display unit for displaying a screen;
an object generating unit configured to generate an object indicating a detected crosswalk and indicating whether a pedestrian is present on the crosswalk, when the crosswalk is detected from image data captured by a camera during vehicle operation and it is determined whether the pedestrian is present on the crosswalk; and
a control unit configured to control the display unit to output the generated object by augmented reality,
wherein the control unit determines signal type information using image data of a signal area portion of a traffic light in the image data, and controls the object generating unit to: generate a first object for making the driver recognize that the crosswalk is located in front of the vehicle if the vehicle remains stopped while the signal type information is a stop signal, and generate a second object for warning the driver that the crosswalk is located in front of the vehicle and a third object for making the driver recognize that the pedestrian is present on the crosswalk if the vehicle departs while the signal type information is a stop signal,
wherein the first object, the second object, and the third object are displayed in forms distinguished from one another,
wherein the control unit determines a mapping position of the generated object in a virtual three-dimensional space of an image captured by the camera, and controls the display unit to display the object by mapping it, through augmented reality, to the virtual three-dimensional space based on the determined mapping position,
wherein the mapping positions of the first object and the second object are in the vicinity of the front of the vehicle so that the user recognizes that the crosswalk is in front of the vehicle,
wherein the control unit controls not to execute front-vehicle departure guidance when a vehicle ahead of the vehicle departs while the pedestrian is present on the crosswalk.
9. The electronic device of claim 8, wherein
the control unit determines whether the vehicle is in a stopped state, and
if it is determined that the vehicle is in the stopped state, the control unit controls the object generating unit to generate the first object for making the driver recognize that the crosswalk is located in front of the vehicle.
10. The electronic device of claim 8, wherein the first object and the second object are distinguished by different colors.
11. The electronic device of claim 8,
the first object and the second object are embodied in a form including an alpha channel related to color transparency,
the first object and the second object include a transparent region according to the alpha channel.
12. The electronic device according to claim 8, wherein the control unit determines whether a pedestrian is present on the crosswalk using the captured image data, and controls the object generating unit to generate an object indicating whether the pedestrian is present.
13. The electronic device according to claim 8, wherein the control unit controls execution of guidance warning that a pedestrian is present on the crosswalk when the vehicle departs while the pedestrian is present on the crosswalk.
14. The electronic device according to claim 8, wherein the appearance positions of the first and second objects are areas where crosswalks are located in the augmented reality.
15. The electronic device of claim 8, wherein
the control unit performs control to calculate camera parameters by calibrating the camera, generate the virtual three-dimensional space of the image captured by the camera based on the camera parameters, and position the generated object in the virtual three-dimensional space.
CN201510867498.6A 2014-12-01 2015-12-01 Electronic device and control method thereof Active CN105654778B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010541932.2A CN111710189B (en) 2014-12-01 2015-12-01 Control method for electronic device, and recording medium
CN201811590078.8A CN110091798B (en) 2014-12-01 2015-12-01 Electronic device, control method of electronic device, and computer-readable storage medium
CN202010541922.9A CN111681455B (en) 2014-12-01 2015-12-01 Control method of electronic device, and recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20140170051 2014-12-01
KR10-2014-0170051 2014-12-01
KR1020150035744A KR102383425B1 (en) 2014-12-01 2015-03-16 Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
KR10-2015-0035744 2015-03-16

Related Child Applications (3)

Application Number Title Priority Date Filing Date
CN201811590078.8A Division CN110091798B (en) 2014-12-01 2015-12-01 Electronic device, control method of electronic device, and computer-readable storage medium
CN202010541932.2A Division CN111710189B (en) 2014-12-01 2015-12-01 Control method for electronic device, and recording medium
CN202010541922.9A Division CN111681455B (en) 2014-12-01 2015-12-01 Control method of electronic device, and recording medium

Publications (2)

Publication Number Publication Date
CN105654778A CN105654778A (en) 2016-06-08
CN105654778B true CN105654778B (en) 2020-07-10

Family

ID=56138965

Family Applications (4)

Application Number Title Priority Date Filing Date
CN202010541922.9A Active CN111681455B (en) 2014-12-01 2015-12-01 Control method of electronic device, and recording medium
CN201811590078.8A Active CN110091798B (en) 2014-12-01 2015-12-01 Electronic device, control method of electronic device, and computer-readable storage medium
CN201510867498.6A Active CN105654778B (en) 2014-12-01 2015-12-01 Electronic device and control method thereof
CN202010541932.2A Active CN111710189B (en) 2014-12-01 2015-12-01 Control method for electronic device, and recording medium

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202010541922.9A Active CN111681455B (en) 2014-12-01 2015-12-01 Control method of electronic device, and recording medium
CN201811590078.8A Active CN110091798B (en) 2014-12-01 2015-12-01 Electronic device, control method of electronic device, and computer-readable storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010541932.2A Active CN111710189B (en) 2014-12-01 2015-12-01 Control method for electronic device, and recording medium

Country Status (2)

Country Link
KR (1) KR102383425B1 (en)
CN (4) CN111681455B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10150414B2 (en) * 2016-07-08 2018-12-11 Ford Global Technologies, Llc Pedestrian detection when a vehicle is reversing
CN107784864A (en) * 2016-08-26 2018-03-09 奥迪股份公司 Vehicle assistant drive method and system
KR20180084556A (en) * 2017-01-17 2018-07-25 팅크웨어(주) Method, apparatus, electronic apparatus, computer program and computer readable recording medium for providing driving guide using a photographed image of a camera
CN106971626A (en) * 2017-05-12 2017-07-21 南通海鑫信息科技有限公司 A kind of alarming method for power of pedestrian running red light
KR101966384B1 (en) * 2017-06-29 2019-08-13 라인 가부시키가이샤 Method and system for image processing
DE102017216100A1 (en) * 2017-09-12 2019-03-14 Volkswagen Aktiengesellschaft A method, apparatus and computer readable storage medium having instructions for controlling a display of an augmented reality display device for a motor vehicle
CA3068659A1 (en) * 2018-01-02 2019-07-11 Lumus Ltd. Augmented reality displays with active alignment and corresponding methods
CN110634324A (en) * 2018-06-22 2019-12-31 上海擎感智能科技有限公司 Vehicle-mounted terminal based reminding method and system for courtesy pedestrians and vehicle-mounted terminal
JP7345128B2 (en) * 2019-05-20 2023-09-15 パナソニックIpマネジメント株式会社 Pedestrian devices and traffic safety support methods
CN113978468A (en) * 2021-12-16 2022-01-28 诺博汽车系统有限公司 Vehicle speed control method, device, equipment and medium based on water accumulation environment

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09178505A (en) * 1995-12-27 1997-07-11 Pioneer Electron Corp Drive assist system
JP2004257979A (en) * 2003-02-27 2004-09-16 Sanyo Electric Co Ltd Navigation apparatus
WO2005088970A1 (en) * 2004-03-11 2005-09-22 Olympus Corporation Image generation device, image generation method, and image generation program
JP2006127055A (en) * 2004-10-27 2006-05-18 Denso Corp Information presentation device for vehicle
JP4321821B2 (en) * 2005-01-28 2009-08-26 アイシン・エィ・ダブリュ株式会社 Image recognition apparatus and image recognition method
JP2007193652A (en) * 2006-01-20 2007-08-02 Hitachi Ltd Navigation apparatus
JP4267657B2 (en) * 2006-10-31 2009-05-27 本田技研工業株式会社 Vehicle periphery monitoring device
JP2008143387A (en) * 2006-12-11 2008-06-26 Fujitsu Ten Ltd Surrounding area monitoring device and surrounding area monitoring method
US8384532B2 (en) * 2009-04-02 2013-02-26 GM Global Technology Operations LLC Lane of travel on windshield head-up display
WO2010125634A1 (en) * 2009-04-27 2010-11-04 トヨタ自動車株式会社 Drive assisting device
JP2011086097A (en) * 2009-10-15 2011-04-28 Daihatsu Motor Co Ltd Obstacle detection device
JP5462609B2 (en) * 2009-12-09 2014-04-02 富士重工業株式会社 Stop line recognition device
JP5577398B2 (en) * 2010-03-01 2014-08-20 本田技研工業株式会社 Vehicle periphery monitoring device
JP5035371B2 (en) * 2010-03-15 2012-09-26 アイシン精機株式会社 Crosswalk detection device, crosswalk detection system, crosswalk detection method and program
KR101759975B1 (en) * 2011-01-03 2017-07-24 팅크웨어(주) Navigation for vehicel and land departure warning method of navigation for vehicel
JP2012155655A (en) * 2011-01-28 2012-08-16 Sony Corp Information processing device, notification method, and program
CN102519475A (en) * 2011-12-12 2012-06-27 杨志远 Intelligent navigation method and equipment based on augmented reality technology
JP5893054B2 (en) * 2012-01-17 2016-03-23 パイオニア株式会社 Image processing apparatus, image processing server, image processing method, image processing program, and recording medium
JP5872923B2 (en) * 2012-02-22 2016-03-01 株式会社マイクロネット AR image processing apparatus and method
US9135754B2 (en) * 2012-05-07 2015-09-15 Honda Motor Co., Ltd. Method to generate virtual display surfaces from video imagery of road based scenery
CN202815590U (en) * 2012-07-30 2013-03-20 中国航天科工集团第三研究院第八三五七研究所 Control system for mini self-driving unmanned vehicle
CN102951089B (en) * 2012-08-20 2015-04-01 上海工程技术大学 Vehicle-mounted navigation and active safety system based on mobile equipment camera
CN103065470B (en) * 2012-12-18 2014-12-17 浙江工业大学 Detection device for behaviors of running red light of vehicle based on machine vision with single eye and multiple detection faces
CN103105174B (en) * 2013-01-29 2016-06-15 四川长虹佳华信息产品有限责任公司 A kind of vehicle-mounted outdoor scene safety navigation method based on AR augmented reality
US9047703B2 (en) * 2013-03-13 2015-06-02 Honda Motor Co., Ltd. Augmented reality heads up display (HUD) for left turn safety cues
CN104102678B (en) * 2013-04-15 2018-06-05 腾讯科技(深圳)有限公司 The implementation method and realization device of augmented reality
CN203651606U (en) * 2013-11-19 2014-06-18 浙江吉利汽车研究院有限公司 Vehicle display device preventing blind zones
KR101388872B1 (en) * 2014-03-17 2014-04-23 안병준 Traffic safety system for pedestrian crossing

Also Published As

Publication number Publication date
CN111710189A (en) 2020-09-25
CN111681455B (en) 2023-02-03
KR20160065722A (en) 2016-06-09
CN110091798A (en) 2019-08-06
CN110091798B (en) 2022-12-16
CN111681455A (en) 2020-09-18
KR102383425B1 (en) 2022-04-07
CN105654778A (en) 2016-06-08
CN111710189B (en) 2023-09-08

Similar Documents

Publication Publication Date Title
CN105654778B (en) Electronic device and control method thereof
KR102348127B1 (en) Electronic apparatus and control method thereof
CN108680173B (en) Electronic device, control method of electronic device, and computer-readable recording medium
US11543256B2 (en) Electronic apparatus and control method thereof
US11030816B2 (en) Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
CN108470162B (en) Electronic device and control method thereof
CN110260877B (en) Driving related guidance providing method and apparatus, and computer readable recording medium
KR20160065723A (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
KR102158167B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102276082B1 (en) Navigation device, black-box and control method thereof
KR102299501B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102371620B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
KR102299499B1 (en) Electronic apparatus and control method thereof
KR20200092197A (en) Image processing method, image processing apparatus, electronic device, computer program and computer readable recording medium for processing augmented reality image
KR102299500B1 (en) Electronic apparatus and control method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210818

Address after: Seoul, South Korea

Patentee after: Hyundai Motor Co.,Ltd.

Patentee after: Kia Co.,Ltd.

Address before: Gyeonggi Do, South Korea

Patentee before: THINKWARE SYSTEMS Corp.

TR01 Transfer of patent right