KR20120091492A - Electronic device and course guide method of electronic device - Google Patents

Electronic device and course guide method of electronic device

Info

Publication number
KR20120091492A
Authority
KR
South Korea
Prior art keywords
stereoscopic
image
navigation
objects
stereoscopic image
Prior art date
Application number
KR1020100133411A
Other languages
Korean (ko)
Inventor
고석필
Original Assignee
팅크웨어(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 팅크웨어(주) filed Critical 팅크웨어(주)
Priority to KR1020100133411A priority Critical patent/KR20120091492A/en
Publication of KR20120091492A publication Critical patent/KR20120091492A/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram

Abstract

PURPOSE: An electronic device and a navigation service method of the same are provided to offer a navigation service using 3D stereoscopic navigation images, thereby presenting guidance images that resemble the actual driving situation. CONSTITUTION: The electronic device comprises a display module (141) and a control unit (170). The control unit provides a navigation service using an image for stereoscopic vision displayed through the display module, and displays one or more objects related to the navigation service included in the stereoscopic image in a non-stereoscopic manner.

Description

Electronic device and navigation service method of electronic device {ELECTRONIC DEVICE AND COURSE GUIDE METHOD OF ELECTRONIC DEVICE}

The present invention relates to an electronic device and a navigation service method of the electronic device.

With the opening of the Internet network and the revision of laws relating to location data, industries based on location-based services (LBS) are growing. A representative device using such location-based services is the vehicle navigation device, which provides a navigation service that locates the current position of a vehicle or guides a route to a destination.

Recently, the movement to provide services using 3D stereoscopic images in various electronic devices has become more active. When a navigation service is provided using a 3D stereoscopic image, navigation information can be presented to the user more realistically. However, providing a navigation service using 3D stereoscopic images requires improvements to the structural and/or software aspects of the terminal.

The present invention provides an electronic device that provides a navigation service using a three-dimensional navigation image and a method of providing a navigation service of the electronic device.

As an aspect of the present invention for realizing the above object, an electronic device according to the present invention comprises: a display module; and a control unit that provides a navigation service using an image for stereoscopic vision displayed through the display module, and displays at least one object included in the stereoscopic image in a non-stereoscopic manner.

As another aspect of the present invention for realizing the above object, an electronic device comprises: a display module; and a controller configured to display an image for stereoscopic vision through the display module and to control a stereoscopic depth of the stereoscopic image based on at least one criterion.

As an aspect of the present invention for realizing the above object, a navigation service method of an electronic device according to the present invention comprises the steps of: obtaining location data; Obtaining a plurality of objects for displaying navigation information based on the position data; Setting a sense of distance according to stereoscopic vision of each of the plurality of objects based on at least one criterion; Generating a stereoscopic image based on a sense of distance according to stereoscopic vision of each of the plurality of objects; And displaying the stereoscopic image.

As one aspect of the present invention for realizing the above object, a computer-readable recording medium according to the present invention records a program that performs any one of the above methods.

According to the present invention, by providing a navigation service using a three-dimensional navigation image, it is possible to provide a navigation image similar to the actual driving situation. Accordingly, since the user may use navigation information that is more realistic than when using a 2D navigation image, the user can effectively receive the navigation information.

In addition, navigation information may be provided to the user more effectively by adjusting the three-dimensional effect of each object included in the navigation image according to the importance of the corresponding information and the user's preference.

In addition, by displaying some of the objects included in the navigation image as a two-dimensional object, it is possible to solve the problem that may occur when displaying the three-dimensional object.

1 is a structural diagram illustrating an electronic device according to an embodiment of the present disclosure.
2 illustrates an example of an electronic device 100 according to an embodiment of the present disclosure.
3 is a block diagram of a communication network including a vehicle navigation system according to an embodiment of the present invention.
4 and 5 are views for explaining a stereoscopic image display method using a binocular parallax associated with an embodiment of the present invention.
6 is a flowchart illustrating a navigation service method of a vehicle navigation apparatus 100 according to an embodiment of the present invention.
7 illustrates an example of a stereoscopic navigation image generated by the vehicle navigation apparatus 100 according to an embodiment of the present invention.
8 illustrates an example of generating a navigation image including both a 2D object and a 3D object in the electronic device 100 according to an embodiment of the present disclosure.
9 illustrates another example of generating a navigation image including both a 2D object and a 3D object in the electronic device 100 according to an embodiment of the present disclosure.

The above objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. It is to be understood, however, that the invention is not limited to the disclosed embodiments, but is intended to cover various modifications and equivalents. Like reference numerals designate like elements throughout the specification. In the following description, well-known functions or constructions are not described in detail, since doing so would obscure the invention in unnecessary detail. In addition, ordinal numbers (e.g., first, second, etc.) used in the description of the present invention are merely identifiers for distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

In addition, the suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

1 is a structural diagram illustrating an electronic device according to an embodiment of the present disclosure.

Hereinafter, a case in which the electronic device 100 is 'vehicle navigation' will be described as an example. However, the present invention is not limited thereto, and in the present invention, other types of electronic devices may be used to implement the navigation service method. For example, the electronic device 100 may be implemented as a mobile communication terminal, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like.

Referring to FIG. 1, the vehicle navigation apparatus 100 may include a communication unit 110, an input unit 120, a sensing unit 130, an output unit 140, a storage unit 150, a power supply unit 160, and a control unit 170. Since the components shown in FIG. 1 are not essential, an electronic device having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The communication unit 110 may include one or more modules that enable communication between the vehicle navigation apparatus 100 and a communication system, between the vehicle navigation apparatus 100 and a network in which it is located, or between the vehicle navigation apparatus 100 and other electronic devices. For example, the communication unit 110 may include a location data module 111, a wireless internet module 113, a broadcast transmission/reception module 115, a short range communication module 117, a wired communication module 119, and the like.

The position data module 111 is a module for checking or obtaining position data of the vehicle navigation apparatus 100. One method of obtaining location data by the location data module 111 is through a global navigation satellite system (GNSS). GNSS refers to a navigation system that can calculate the position of a receiver terminal using radio signals received from satellites. Specific examples of GNSS include, depending on the operating entity, the Global Positioning System (GPS), Galileo, the Global Orbiting Navigational Satellite System (GLONASS), COMPASS, the Indian Regional Navigational Satellite System (IRNSS), and the Quasi-Zenith Satellite System (QZSS). The location data module 111 of the vehicle navigation apparatus 100 according to an embodiment of the present disclosure may obtain location data by receiving a GNSS signal served in the area where the vehicle navigation apparatus 100 is used. The position data module 111 continuously calculates the current position of the vehicle navigation apparatus 100 in real time and calculates speed information using it.
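The patent does not specify how speed is derived from the successive position fixes; a minimal sketch, assuming timestamped latitude/longitude fixes and a spherical-Earth great-circle distance (`haversine_m` and `speed_kmh` are illustrative names, not from the patent):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two fixes (spherical approximation)."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def speed_kmh(fix_prev, fix_curr):
    """Speed in km/h from two timestamped (t_sec, lat, lon) position fixes."""
    t0, lat0, lon0 = fix_prev
    t1, lat1, lon1 = fix_curr
    return haversine_m(lat0, lon0, lat1, lon1) / (t1 - t0) * 3.6
```

In practice a receiver would filter fixes (e.g. discard jumps) before differencing, but the distance-over-time quotient is the core of the speed information mentioned above.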

The wireless internet module 113 is a device that accesses the wireless Internet to acquire or transmit data. Wireless Internet technologies accessible through the wireless internet module 113 include wireless LAN (WLAN), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The broadcast transmission/reception module 115 is an apparatus for receiving broadcast signals through various broadcast systems. Broadcast systems receivable through the broadcast transmission/reception module 115 include Digital Multimedia Broadcasting Terrestrial (DMB-T), Digital Multimedia Broadcasting Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast Handheld (DVB-H), and Integrated Services Digital Broadcast Terrestrial (ISDB-T). Broadcast signals received through the broadcast transmission/reception module 115 may include traffic data, living data, and the like.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The wired communication module 119 serves to provide an interface with other electronic devices connected to the vehicle navigation apparatus 100. For example, the wired communication module 119 may be a USB module capable of communicating through a USB port.

The input unit 120 is a module that generates input data for controlling the operation of the vehicle navigation apparatus 100. The input unit 120 may convert the physical input from the outside into a specific electrical signal to generate the input data. The input unit 120 may include a user input module 121, a microphone 123, and the like.

The user input module 121 receives a control input for controlling the operation of the vehicle navigation apparatus 100 from the user. The user input module 121 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like. For example, the user input module 121 may be implemented as a navigation operation key (193 of FIG. 2) provided outside the housing (191 of FIG. 2) of the vehicle navigation apparatus 100.

The microphone 123 is a device that receives a user's voice and an audio signal generated inside and outside the vehicle. The microphone 123 may be implemented as a navigation microphone 195 provided outside the housing (191 of FIG. 2) of the vehicle navigation apparatus 100.

The sensing unit 130 detects the current state of the vehicle navigation apparatus 100 and generates a sensing signal for controlling the operation of the vehicle navigation apparatus 100. The sensing unit 130 may include a motion sensing module 131, an optical sensing module 133, and the like.

The motion sensing module 131 may detect a movement in a three-dimensional space of the vehicle navigation apparatus 100. The motion sensing module 131 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. The motion data obtained through the motion sensing module 131 may be combined with the position data obtained through the position data module 111 to calculate a more accurate trajectory of the vehicle to which the vehicle navigation apparatus 100 is attached.

The light sensing module 133 is a device for measuring the ambient illuminance of the vehicle navigation apparatus 100. The brightness of the display unit 145 may be changed to correspond to the ambient brightness by using the illumination data acquired through the light sensing module 133.

The output unit 140 is a device in which the vehicle navigation apparatus 100 outputs data. The output unit 140 may include a display module 141, an audio output module 143, and the like.

The display module 141 is a device that outputs data that can be visually recognized by the vehicle navigation apparatus 100. The display module 141 may be implemented as a display unit 145 of FIG. 2 provided on the front surface of the housing (191 of FIG. 2) of the vehicle navigation apparatus 100. Meanwhile, as described above, when the display module 141 is a touch screen, the display module 141 may simultaneously serve as the output unit 140 and the input unit 120.

The audio output module 143 outputs audio data that can be recognized acoustically. The audio output module 143 outputs an audio signal related to a function (eg, a route guidance function) performed in the vehicle navigation apparatus 100. The audio output module 143 may include a receiver, a speaker, a buzzer, and the like.

The storage unit 150 may store programs for operating the vehicle navigation apparatus 100 and may temporarily store input/output data (e.g., route data and image data). The storage unit 150 may be embedded in or detachable from the vehicle navigation apparatus 100, and may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The vehicle navigation apparatus 100 may also operate in association with a web storage that performs the storage function of the storage unit 150 on the Internet.

The power supply unit 160 receives external power and internal power to supply power required for operation of each component of the vehicle navigation apparatus 100 or another device connected to the vehicle navigation apparatus 100.

The controller 170 typically controls the overall operation of the vehicle navigation apparatus 100. In addition, the controller 170 may output control signals for controlling other devices connected to the vehicle navigation apparatus 100.

2 illustrates an example of an electronic device 100 according to an embodiment of the present disclosure.

Referring to FIG. 2, the vehicle navigation apparatus 100 may include a display unit 145 provided on the front surface of the navigation housing 191, a navigation operation key 193, and a navigation microphone 195.

The navigation housing 191 forms the exterior of the vehicle navigation apparatus 100. The vehicle navigation apparatus 100 may be exposed to environments of various conditions, such as high or low temperatures due to seasonal factors or direct/indirect external shocks. The navigation housing 191 serves to protect the various electronic components inside the vehicle navigation apparatus 100 from changes in the external environment and to enhance the appearance of the apparatus. To this end, the navigation housing 191 may be formed by injection molding a material such as ABS, PC, or strength-reinforced engineering plastic.

The display unit 145 is a part that visually displays various data. The various data displayed on the display unit 145 may be map data combined with route data and the like, various broadcast screens including DMB broadcasts, or images stored in memory. The display unit 145 may be divided into several parts, physically or logically. A physically divided display unit 145 refers to a case in which two or more display units 145 are placed adjacent to each other. A logically divided display unit 145 refers to a case in which a plurality of independent screens are displayed on one physical display unit 145. For example, when a DMB broadcast is received and displayed, route data may be displayed on part of the display unit 145, or the DMB broadcast and the map screen may each be displayed in separate areas of the display unit 145. In accordance with the tendency for various functions to converge on the vehicle navigation apparatus 100, the display unit 145 is logically divided to display various data, and is gradually becoming larger in order to do so.

The entire surface or part of the display unit 145 may be a touch screen that receives a user's touch input. For example, a function may be activated by touching a function selection button displayed on the display unit 145. That is, the display unit 145 may serve as both an output unit (140 of FIG. 1) and an input unit (120 of FIG. 1) for images.

The navigation operation key 193 may be provided for executing various functions of the vehicle navigation apparatus 100 or for allowing a user to directly input necessary data. The convenience of use can be improved by mapping a frequently used specific function to the navigation operation key 193.

The navigation microphone 195 may be provided to receive a sound including voice and sound. For example, the specific function of the vehicle navigation apparatus 100 may be executed based on the voice signal received by the navigation microphone 195. In addition, the current state of the vehicle, such as the occurrence of an accident, may be detected based on the acoustic signal received by the navigation microphone 195.

3 is a block diagram of a communication network including a vehicle navigation system according to an embodiment of the present invention.

Referring to FIG. 3, the vehicle navigation apparatus 100 may be connected to various communication networks and other electronic devices 61 to 64.

The vehicle navigation apparatus 100 may calculate a current position using a radio wave signal received from the satellite 20. Each satellite 20 may transmit L band frequencies having different frequency bands. The vehicle navigation apparatus 100 may calculate a current position based on the time taken for the L-band frequency transmitted from each satellite 20 to reach the vehicle navigation apparatus 100.
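As a hedged illustration of the travel-time principle only (a real GNSS receiver solves for four unknowns, including its clock bias, from at least four satellites; the function names and the 2-D reduction are assumptions for this sketch), the range implied by a signal's travel time and a toy planar trilateration might look like:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_travel_time(t_transmit_s, t_receive_s):
    """Distance implied by a signal's travel time from satellite to receiver."""
    return C * (t_receive_s - t_transmit_s)

def trilaterate_2d(anchors, ranges):
    """Toy 2-D position fix from three known anchors and measured ranges.

    Linearizes the three circle equations by subtracting the first from
    the other two, leaving a 2x2 linear system in (x, y) solved by
    Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = anchors
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

The same subtraction trick extends to 3-D with four satellites, which is why a minimum of four visible satellites is normally required for a GNSS fix.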

The vehicle navigation apparatus 100 may connect wirelessly to the network 30 through a control station (ACR) 40, a base station (RAS) 50, and the like. When the vehicle navigation apparatus 100 is connected to the network 30, it can exchange data by indirectly connecting to the electronic devices 61 and 62 connected to the network 30.

The vehicle navigation apparatus 100 may also connect indirectly to the network 30 through another device 63 having a communication function. For example, when the vehicle navigation apparatus 100 has no module capable of connecting to the network 30, it can communicate with the other device 63 having a communication function via short-range communication.

4 and 5 are views for explaining a stereoscopic image display method using binocular parallax associated with an embodiment of the present invention: FIG. 4 shows a method using a lenticular lens array, and FIG. 5 shows a method using a parallax barrier.

Binocular parallax refers to the difference between what a person's left and right eyes see. When the brain synthesizes the image seen through the left eye with the image seen through the right eye, the synthesized image gives the person a sense of depth. In the following description, the phenomenon in which a person perceives depth through binocular parallax is referred to as 'stereoscopic vision', and an image causing stereoscopic vision is referred to as a 'stereoscopic image'. Also, when a specific object included in an image causes stereoscopic vision, that object is referred to as a 'stereoscopic object'.
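The relation between on-screen parallax and the distance at which a fused point is perceived can be sketched by similar triangles. This is a common thin-geometry approximation, not a formula from the patent; the parameter names are illustrative:

```python
def perceived_distance(eye_sep_mm, view_dist_mm, disparity_mm):
    """Distance from the viewer at which a fused point is perceived.

    By similar triangles, the rays from the two eyes through the left and
    right image points meet at Z = e * D / (e - d), where e is the eye
    separation, D the viewing distance, and d the screen disparity
    (right-image x minus left-image x). Positive (uncrossed) disparity
    places the point behind the screen, negative (crossed) disparity
    pulls it in front, and d = 0 puts it on the screen plane.
    """
    return eye_sep_mm * view_dist_mm / (eye_sep_mm - disparity_mm)
```

For example, with a 65 mm eye separation and a 600 mm viewing distance, a crossed disparity equal to the eye separation halves the perceived distance, which is why large disparities are used sparingly in practice.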

Stereoscopic image display methods based on binocular disparity are classified into glasses types, which require special glasses, and autostereoscopic (glasses-free) types, which do not. The glasses types include the anaglyph method, which uses glasses with wavelength selectivity; the polarization filter method, which uses a light-blocking effect based on differences in polarization; and the shutter-glasses method, which alternately presents left and right images within the afterimage time of the eye. There is also a method in which filters with different transmittances are mounted on the left and right sides, so that a stereoscopic effect for lateral movement is obtained from the time difference in the visual system caused by the difference in transmittance.

The autostereoscopic methods, in which the stereoscopic effect is generated at the image display surface rather than at the observer, include the parallax barrier method, the lenticular lens method, the microlens array method, and the like.

Referring to FIG. 4, the display module 141 includes a lenticular lens array 11a to display a stereoscopic image. The lenticular lens array 11a is positioned between the left and right eyes 12a and 12b and a display surface 13, on which pixels L to be input to the left eye 12a and pixels R to be input to the right eye 12b are alternately arranged along the horizontal direction, and provides optical directivity that discriminates between the pixels L for the left eye 12a and the pixels R for the right eye 12b. Accordingly, the image passing through the lenticular lens array 11a is observed separately by the left eye 12a and the right eye 12b, and the human brain synthesizes the image seen through the left eye 12a and the image seen through the right eye 12b to observe a stereoscopic image.

Referring to FIG. 5, the display module 141 includes a parallax barrier 11b in the shape of a vertical grid to display a stereoscopic image. The parallax barrier 11b is positioned between the left and right eyes 12a and 12b and a display surface 13, on which pixels L to be input to the left eye 12a and pixels R to be input to the right eye 12b are alternately arranged along the horizontal direction, and separates the image for the left eye 12a from the image for the right eye 12b through its vertical grid-shaped apertures. Therefore, the human brain synthesizes the image seen through the left eye 12a and the image seen through the right eye 12b to observe a stereoscopic image. The parallax barrier 11b is turned on to separate the incident light only when a three-dimensional image is to be displayed, and is turned off to pass the incident light unseparated when a planar image is to be displayed. Meanwhile, the stereoscopic image display methods described above are for explaining embodiments of the present invention, and the present invention is not limited thereto. The present invention can display stereoscopic images using binocular disparity by various methods other than those described above.
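Both autostereoscopic methods above depend on the display surface 13 carrying left-eye and right-eye pixel columns alternately along the horizontal direction. A minimal sketch of that column interleaving, with images represented as lists of rows (the even/odd parity assignment is an illustrative assumption; which parity reaches which eye depends on the lens or barrier geometry):

```python
def interleave_columns(left, right):
    """Column-interleave equal-sized left-eye and right-eye images.

    Even pixel columns come from the left image and odd columns from the
    right, matching the alternating L/R arrangement on the display
    surface behind a lenticular array or parallax barrier.
    """
    return [
        [row_l[x] if x % 2 == 0 else row_r[x] for x in range(len(row_l))]
        for row_l, row_r in zip(left, right)
    ]
```

Each eye therefore receives only half the horizontal resolution of the panel, a known trade-off of these spatial-multiplexing methods.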

An embodiment disclosed in this document may be implemented in the electronic device 100 described with reference to FIGS. 1 to 3. Hereinafter, an operation of the electronic device 100 for implementing an embodiment disclosed in this document will be described in more detail.

The storage unit 150 may store a map model used for the navigation service and objects related to the navigation function.

In addition, the controller 170 may generate a navigation image for stereoscopic vision and provide a navigation service to the user using it. To this end, the controller 170 may obtain, from the storage unit 150, a map model corresponding to the location data of the vehicle and objects related to the navigation function, and generate a stereoscopic navigation image from them. When rendering the stereoscopic navigation image, the controller 170 may control a stereoscopic sense of distance of the image based on at least one criterion. The controller 170 may also display at least one object included in the stereoscopic navigation image in a non-stereoscopic manner. The rendering method for generating the stereoscopic navigation image is described in detail with reference to FIGS. 6 to 9 below.

Hereinafter, the navigation service method of the vehicle navigation apparatus 100 according to an embodiment of the present invention and the operation of the vehicle navigation apparatus 100 for implementing the same will be described in detail with reference to FIGS. 6 to 9.

6 is a flowchart illustrating a navigation service method of a vehicle navigation apparatus 100 according to an embodiment of the present invention.

Referring to FIG. 6, the controller 170 acquires location data through the location data module 111 when a request to provide the navigation service is received, and generates a stereoscopic navigation image based on the acquired position data (S101). The navigation service is then provided by displaying the generated stereoscopic navigation image through the display module 141 (S102). Here, the navigation image may include a map model, and the map model may include objects modeled on actual roads, facilities, traffic lights, buildings, and the like to display road information. The navigation image may also include a plurality of objects related to the navigation function, which may include objects for displaying navigation information and objects for controlling the vehicle navigation apparatus 100. Navigation information displayed using these objects may include the route to the vehicle's destination, the vehicle's speed, and points of interest (POI). Each object included in the navigation image is stored in the storage unit 150; based on the position data obtained through the position data module 111, the controller 170 reads the necessary objects, such as the map model, from the storage unit 150 and generates the navigation image.

7 illustrates an example of a stereoscopic navigation image generated by the vehicle navigation apparatus 100 according to an embodiment of the present invention. Referring to FIG. 7, the stereoscopic navigation image 7 includes a left image 7a and a right image 7b. The controller 170 synthesizes the left image 7a and the right image 7b, or displays them alternately, to provide the user with a 3D navigation image based on binocular disparity.

The controller 170 may separately control the sense of distance according to stereoscopic vision of each object included in the 3D navigation image when generating the image. The controller 170 controls the sense of distance according to stereoscopic vision of each stereoscopic object by using the parallax between the left image and the right image included in the 3D navigation image. That is, the sense of distance of each stereoscopic object can be controlled by controlling the position at which the object is displayed in the left and right images, or by controlling the position at which the left and right images are combined. In addition, according to an embodiment of the present disclosure, the controller 170 may control the depth according to stereoscopic vision of at least one object included in the navigation image based on at least one criterion. Accordingly, even objects included in the same navigation image may be displayed with different senses of distance according to stereoscopic vision.
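Controlling each object's display position in the left and right images, as described above, can be sketched as a per-object horizontal shift of half the disparity in opposite directions in the two views. The object dictionary and the sign convention (positive disparity behind the screen plane, negative in front) are assumptions for illustration:

```python
def render_stereo(objects):
    """Place each object in a left and a right view with opposite shifts.

    Each object is a dict with an on-screen "x" and a per-object
    "disparity" (right-view x minus left-view x). Returns the horizontal
    position of each object in the left and right images; the parallax
    between the two views sets the object's apparent depth.
    """
    left, right = {}, {}
    for obj in objects:
        half = obj["disparity"] / 2.0
        left[obj["name"]] = obj["x"] - half
        right[obj["name"]] = obj["x"] + half
    return left, right
```

Shifting per object rather than shifting the whole combined image is what allows two objects in the same navigation image to sit at different apparent depths.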

First, the controller 170 may control the sense of distance according to stereoscopic vision of each object based on preset priorities. For example, the controller 170 may give objects such as traffic lights and POIs included in the map model a higher priority than other objects, and accordingly control the sense of distance according to stereoscopic vision of those objects so that they appear to protrude more than the others. The priority of each object may be set by the user.
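A priority-to-disparity mapping of the kind described might be sketched as follows; the 1-to-5 priority scale, the linear weighting, and the pop-out limit are all illustrative assumptions, not values from the patent:

```python
def disparity_for(obj, max_pop_out=-12.0):
    """Map an object's guidance priority to a crossed (negative) disparity.

    Priority 1 (e.g. a traffic light or POI) protrudes the most toward
    the viewer; priority 5 or lower sits on the screen plane.
    """
    priority = obj.get("priority", 5)       # default: least important
    weight = max(0, 5 - priority) / 4.0     # 1.0 at priority 1, 0.0 at 5+
    return max_pop_out * weight
```

Since the priority of each object is user-settable per the description above, the `priority` field would in practice come from the user's configuration rather than being hard-coded.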

In addition, the controller 170 may set the sense of distance of the navigation image according to a view mode. In this case, as the sense of distance of the navigation image changes with the view mode, the sense of distance of each object included in the image changes as well. View modes are distinguished by the position of the driver's viewpoint, the line-of-sight direction, and so on, and the viewpoint position used to determine the sense of distance of the map model or navigation image varies with the view mode. Therefore, the controller 170 may control the stereoscopic sense of distance of the navigation image so that it matches the current view mode.

In addition, the controller 170 may display some of the objects included in the navigation image as two-dimensional objects. For example, the controller 170 may display at least one object for receiving a control input, such as a button, an icon, or a selection indicator, as a 2D object. When all objects in the navigation image are displayed as 3D stereoscopic objects, the actual position at which an object is displayed in the display area may differ from the virtual position at which the user perceives it due to stereoscopic vision. Accordingly, when the user needs to provide a control input by touching the display area, it may be difficult to touch the position where the object is actually displayed. By contrast, when an object is displayed as a two-dimensional object, its actual display position and its perceived position coincide. Therefore, displaying objects that receive control inputs as two-dimensional objects resolves this difficulty of accurate operation.
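The touch-accuracy argument above can be made concrete with a small sketch. This is an assumption-laden illustration (the fusion model and tolerance are not from the patent): with nonzero parallax the fused, perceived position lies between the two drawn copies, while a touch lands on the physical panel, so the user aims where the object may not actually be drawn.

```python
# Illustrative sketch: why zero-parallax (2D) objects are easier to touch.

def perceived_x(x_left, x_right):
    """Fused (perceived) horizontal position of a stereo object,
    modeled as the midpoint of its left- and right-image copies."""
    return (x_left + x_right) / 2.0

def touch_hits(touch_x, drawn_x, tolerance=5.0):
    """A touch succeeds only near where a copy is physically drawn."""
    return abs(touch_x - drawn_x) <= tolerance

# Stereo button drawn at 95 (left copy) and 105 (right copy):
# it is perceived at 100, but neither copy is drawn exactly there.
print(perceived_x(95.0, 105.0))      # 100.0
# A 2D button has zero parallax, so perceived == drawn, and the
# touch aimed at the perceived position lands on the object:
print(touch_hits(perceived_x(100.0, 100.0), 100.0))  # True
```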

In addition, the controller 170 may display only some of the objects included in the navigation image as 3D stereoscopic objects. For example, the controller 170 may display only objects corresponding to important information as 3D stereoscopic objects, while the remaining objects, such as the map model, are displayed in two dimensions. Accordingly, the vehicle navigation apparatus 100 can convey important information to the user effectively by highlighting it as a three-dimensional stereoscopic object.

As described above, when only some of the objects included in the navigation image are displayed as 2D objects or as 3D stereoscopic objects, the objects to be displayed in 2D or in 3D may be selected by the user. In this case, the user may configure objects deemed important to be displayed as three-dimensional stereoscopic objects, or objects that receive control inputs to be displayed as two-dimensional objects. A customized stereoscopic image can thus be provided to the user.

Meanwhile, as described above, when the navigation image includes both 2D objects and 3D objects, the controller 170 may generate the navigation image in various ways.

For example, the controller 170 may generate a navigation image including both 2D and 3D objects by controlling the sense of distance of each object when converting a two-dimensional navigation image into a three-dimensional navigation image.

FIG. 8 illustrates an example of generating a navigation image including both 2D and 3D objects. Referring to FIG. 8, the controller 170 generates a 3D navigation image by rendering a 2D navigation image; that is, a left image and a right image to be used for displaying the 3D navigation image are generated from the 2D navigation image. At this time, the controller 170 controls the position of each object in the left image and the right image so that the sense of distance assigned to that object is produced.
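The FIG. 8 approach can be sketched as a single pass that shifts every object of a 2D scene horizontally by half its parallax in opposite directions. The scene representation, object names, and the `max_parallax_px` parameter are illustrative assumptions, not details disclosed in the patent.

```python
# Hypothetical sketch of the FIG. 8 approach: render a stereo pair
# directly from one 2D scene description, giving each object its own
# parallax so a 2D object (depth 0) stays identical in both images.

def render_stereo(objects, max_parallax_px=20.0):
    """objects: list of dicts with 'name', 'x', 'y', 'depth'.
    Returns (left_image, right_image) as lists of placed objects."""
    left, right = [], []
    for obj in objects:
        p = obj["depth"] * max_parallax_px
        left.append({"name": obj["name"], "x": obj["x"] - p / 2.0, "y": obj["y"]})
        right.append({"name": obj["name"], "x": obj["x"] + p / 2.0, "y": obj["y"]})
    return left, right

scene = [
    {"name": "map", "x": 0.0, "y": 0.0, "depth": 0.3},               # recedes
    {"name": "traffic_light", "x": 50.0, "y": 10.0, "depth": -0.5},  # pops out
    {"name": "menu_button", "x": 90.0, "y": 90.0, "depth": 0.0},     # flat/2D
]
left_img, right_img = render_stereo(scene)
```

Because `menu_button` has depth 0, it lands at the same position in both images, which is exactly the mixed 2D/3D result the text describes.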

Also, for example, the controller 170 may separately convert the objects to be displayed in 3D into 3D stereoscopic objects, and then composite the objects to be displayed in 2D, thereby generating a navigation image including both 2D and 3D objects.

FIG. 9 illustrates another example of generating a navigation image including both 2D and 3D objects. Referring to FIG. 9, the controller 170 divides the objects included in the navigation image into at least one first object 9a to be displayed in three dimensions and at least one second object 9b to be displayed in two dimensions. The at least one first object 9a is rendered separately to generate the three-dimensional stereoscopic object 9c; that is, a left image and a right image in which the three-dimensional stereoscopic object 9c is displayed are generated. Subsequently, the controller 170 composites the at least one second object 9b, to be displayed in two dimensions, into the left image and the right image in which the three-dimensional stereoscopic object 9c is displayed, and finally generates the stereoscopic navigation image 9d.
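The split-then-composite pipeline of FIG. 9 can be sketched as follows. The data layout and function names are assumptions for illustration; the two steps mirror the 9a→9c rendering and the 9b compositing described above.

```python
# Illustrative sketch of the FIG. 9 pipeline: split objects into
# stereoscopic and flat sets, render the stereo pair from the
# stereoscopic set only, then composite each flat object into both
# images at the same position.

def split_objects(objects):
    stereo = [o for o in objects if o.get("depth", 0.0) != 0.0]
    flat = [o for o in objects if o.get("depth", 0.0) == 0.0]
    return stereo, flat

def composite_stereo(objects, max_parallax_px=20.0):
    stereo, flat = split_objects(objects)
    left, right = [], []
    for o in stereo:                      # step 1: stereo pair from 9a
        p = o["depth"] * max_parallax_px
        left.append((o["name"], o["x"] - p / 2.0))
        right.append((o["name"], o["x"] + p / 2.0))
    for o in flat:                        # step 2: composite 9b into both
        left.append((o["name"], o["x"]))
        right.append((o["name"], o["x"]))
    return left, right

scene = [
    {"name": "route_arrow", "x": 40.0, "depth": -0.4},
    {"name": "menu_button", "x": 90.0, "depth": 0.0},
]
left_img, right_img = composite_stereo(scene)
# menu_button appears at x=90.0 in both images; route_arrow is offset.
```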

As described above, in one embodiment of the present invention, a navigation service is provided using a 3D navigation image for stereoscopic vision, so that a navigation image similar to the actual driving view can be presented to the driver. Accordingly, navigation information can be conveyed to the user more effectively.

In addition, according to an embodiment of the present invention, the sense of distance of each object included in the navigation image can be controlled individually. Accordingly, stereoscopic vision can be controlled so that each object protrudes or recedes relative to other objects according to its importance, thereby conveying information to the user more effectively.

In addition, according to an embodiment of the present invention, displaying some objects as two-dimensional objects where necessary resolves various problems that may occur when all objects are displayed only as three-dimensional stereoscopic objects, such as reduced accuracy of manipulation.

Embodiments of the present invention include a computer-readable medium containing program instructions for performing various computer-implemented operations. The medium records a program for executing the navigation service method described above, and may include program instructions, data files, data structures, and the like, alone or in combination. Examples of such media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CDs and DVDs; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter.

It is apparent to those skilled in the art that the present invention is not limited to the described embodiments, and that various modifications and changes can be made without departing from the spirit and scope of the present invention. Such modifications and variations therefore fall within the scope of the claims of the present invention.

Claims (21)

An electronic device comprising:
a display module; and
a controller configured to provide a stereoscopic image for display through the display module, and to display at least one navigation-related object included in the stereoscopic image in a non-stereoscopic manner.
The electronic device of claim 1, wherein the at least one object comprises at least one object for receiving a control input, or at least one object for displaying a map model or navigation information.
The electronic device of claim 2, wherein the at least one object for receiving the control input comprises a button, an icon, or a selection indicator.
The electronic device of claim 2, wherein the at least one object for displaying the navigation information includes at least one of destination information, a route to the destination, a moving speed of the vehicle, a moving direction, and a point of interest.
The electronic device of claim 1, wherein the controller generates a stereoscopic image based on at least one stereoscopic object included in the stereoscopic image, and generates the stereoscopic image to be displayed by combining the generated image with the at least one non-stereoscopic object.
The electronic device of claim 1, wherein the at least one object is selected by a user.
The electronic device of claim 1, wherein the controller controls a sense of distance of each of at least one stereoscopic object included in the stereoscopic image based on a preset priority.
The electronic device of claim 7, wherein the priority is set by a user.
The electronic device of claim 1, wherein the at least one criterion comprises a view mode.
The electronic device of claim 9, wherein the controller sets a sense of distance of the stereoscopic image based on a driver's viewpoint corresponding to a current view mode.
The electronic device of claim 1, wherein the controller controls a stereoscopic sense of distance by varying a parallax between a left image and a right image included in the stereoscopic image, for each stereoscopic object included in the stereoscopic image.
An electronic device comprising:
a display module; and
a controller configured to display a stereoscopic image through the display module, and to control a stereoscopic sense of distance of the stereoscopic image based on at least one criterion.
The electronic device of claim 12, wherein the at least one criterion comprises a view mode.
The electronic device of claim 13, wherein the controller sets a sense of distance of the stereoscopic image based on a driver's viewpoint corresponding to a current view mode.
The electronic device of claim 12, wherein the controller sets a stereoscopic sense of distance to a specific value so that at least one object for receiving a control input, among a plurality of objects included in the stereoscopic image, is displayed in a non-stereoscopic manner.
The electronic device of claim 12, wherein the controller controls a stereoscopic sense of distance by varying a parallax between a left image and a right image included in the stereoscopic image, for each object included in the stereoscopic image.
A navigation service method of an electronic device, the method comprising:
obtaining position data;
obtaining a plurality of objects for displaying navigation information based on the position data;
setting a sense of distance according to stereoscopic vision for each of the plurality of objects based on at least one criterion;
generating a stereoscopic image based on the sense of distance of each of the plurality of objects; and
displaying the stereoscopic image.
The method of claim 17, wherein the generating comprises:
generating a stereoscopic image including at least one object to be displayed stereoscopically among the plurality of objects; and
generating the stereoscopic image to be displayed by combining at least one object to be displayed non-stereoscopically among the plurality of objects with the stereoscopic image.
The method of claim 17, wherein the stereoscopic image includes a left image and a right image, and the sense of distance of each of the plurality of objects is controlled by varying the display position of the object in the left image and the right image.
The method of claim 17, wherein the at least one criterion includes a preset priority, a view mode, or a function corresponding to each of the plurality of objects.
A computer-readable recording medium having recorded thereon a program for performing the method of any one of claims 17 to 20.
KR1020100133411A 2010-12-23 2010-12-23 Electronic device and course guide method of electronic device KR20120091492A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100133411A KR20120091492A (en) 2010-12-23 2010-12-23 Electronic device and course guide method of electronic device


Publications (1)

Publication Number Publication Date
KR20120091492A true KR20120091492A (en) 2012-08-20

Family

ID=46883853

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100133411A KR20120091492A (en) 2010-12-23 2010-12-23 Electronic device and course guide method of electronic device

Country Status (1)

Country Link
KR (1) KR20120091492A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140080789A (en) * 2012-12-18 2014-07-01 두산인프라코어 주식회사 Method for display in vehicle using head up display and apparatus thereof
KR101439410B1 (en) * 2013-02-14 2014-10-30 나비스오토모티브시스템즈 주식회사 navigation system and method for displaying three-dimensional crossroad thereof
WO2015032766A1 (en) * 2013-09-03 2015-03-12 Jaguar Land Rover Limited Instruments 3d display system
CN105517834A (en) * 2013-09-03 2016-04-20 捷豹路虎有限公司 Instruments 3D display system
GB2517793B (en) * 2013-09-03 2017-06-14 Jaguar Land Rover Ltd Instrument display system
CN105517834B (en) * 2013-09-03 2018-08-28 捷豹路虎有限公司 Instrument 3d display system
KR20160124479A (en) * 2015-04-20 2016-10-28 삼성전자주식회사 Master device, slave device and control method thereof
US11024083B2 (en) 2015-04-20 2021-06-01 Samsung Electronics Co., Ltd. Server, user terminal device, and control method therefor
KR20200003291A (en) * 2020-01-02 2020-01-08 삼성전자주식회사 Master device, slave device and control method thereof


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination