KR20110056572A - Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device - Google Patents

Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device

Info

Publication number
KR20110056572A
Authority
KR
South Korea
Prior art keywords
menu
audio
screen
video
touch screen
Prior art date
Application number
KR1020090112956A
Other languages
Korean (ko)
Inventor
황창업
Original Assignee
현대모비스 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대모비스 주식회사 filed Critical 현대모비스 주식회사
Priority to KR1020090112956A priority Critical patent/KR20110056572A/en
Publication of KR20110056572A publication Critical patent/KR20110056572A/en

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00, specially adapted for navigation in a road network
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output
    • G06F3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Navigation (AREA)

Abstract

PURPOSE: An AVN (Audio Video Navigation) system with a proximity sensor for activating a menu screen of a touch screen device is provided, so that the menu selection screen of the touch screen device is automatically activated by the proximity sensor when an object approaches. CONSTITUTION: The AVN system (101) comprises a proximity sensor (110), a navigation device (120), an AV device (130), and a touch screen device (140). The proximity sensor senses that an object has approached a set area and outputs a proximity sensing signal. In response to the proximity sensing signal, the navigation device outputs a menu display control signal. The navigation device then outputs first video data corresponding to a first detailed menu screen, or performs an operation corresponding to the selected menu. Likewise, the AV device outputs second video data corresponding to a second detailed menu screen, or performs an operation corresponding to the selected menu.

Description

Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device

The present invention relates to an audio video navigation (AVN) system, and more particularly, to an AVN system including a proximity sensor.

In general, the input unit of an AVN system installed in a vehicle is designed so that a user can operate it easily without being distracted from driving. FIG. 1 shows the appearance of a conventional AVN system. The touch screen 11 includes a menu activation button image 12 for activating a menu selection screen of the touch screen device. The user may touch the menu activation button image 12 to make the touch screen device display a menu selection screen for controlling the operation of the AVN system.

Meanwhile, the conventional AVN system 10 includes physical button input units 13 and 14 for quick menu navigation, installed at both sides of the touch screen 11. However, when the physical button input units 13 and 14 are installed, the size of the touch screen 11 is reduced, making it difficult for a user to check the information displayed on it while driving. In addition, installing the physical button input units 13 and 14 requires a corresponding printed circuit board (PCB), light-emitting diodes (LEDs) for identifying the buttons at night, and button mechanisms, which increases the manufacturing cost of the AVN system 10.

Therefore, the technical problem to be solved by the present invention is to provide an AVN system that automatically activates the menu selection screen of the touch screen device when an external object approaches, by using a proximity sensor that detects the approach of an external object to a predetermined area, thereby simplifying menu operation of the AVN system, and that increases the size of the touch screen and reduces manufacturing cost by removing the physical button input units.

According to one aspect of the present invention, an AVN system includes a proximity sensor, a navigation device, an AV device, and a touch screen device. The proximity sensor detects an external object approaching a set area and outputs a proximity detection signal. The navigation device outputs a menu display control signal to the touch screen device in response to the proximity detection signal. In response to first menu selection information, the navigation device outputs first video data corresponding to a first detailed menu screen, or executes an operation corresponding to the selected menu.

In response to the second menu selection information, the AV device outputs second video data corresponding to the second detailed menu screen or executes an operation corresponding to the selected menu.

Third video data corresponding to a menu selection screen associated with the operations of the navigation device and the AV device is stored in advance in the touch screen device. The touch screen device displays the menu selection screen in response to the menu display control signal, recognizes the menu selected by a user's touch input, and, according to the recognition result, outputs the first menu selection information to the navigation device or the second menu selection information to the AV device. The touch screen device also displays the first detailed menu screen based on the first video data, or the second detailed menu screen based on the second video data.

According to another aspect of the present invention, an AVN system includes a proximity sensor, a navigation device, an AV device, and a touch screen device. The proximity sensor detects an external object approaching a set area and outputs a proximity detection signal. In response to first menu selection information, the navigation device outputs first video data corresponding to a first detailed menu screen, or executes an operation corresponding to the selected menu.

The AV device outputs a menu display control signal to the touch screen device in response to the proximity detection signal. In response to second menu selection information, the AV device outputs second video data corresponding to a second detailed menu screen, or executes an operation corresponding to the selected menu.

Third video data corresponding to a menu selection screen associated with the operations of the navigation device and the AV device is stored in advance in the touch screen device. The touch screen device displays the menu selection screen in response to the menu display control signal, recognizes the menu selected by a user's touch input, and, according to the recognition result, outputs the first menu selection information to the navigation device or the second menu selection information to the AV device. The touch screen device also displays the first detailed menu screen based on the first video data, or the second detailed menu screen based on the second video data.
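The menu routing common to both aspects can be modeled as a small dispatch function. The following Python sketch is purely illustrative: the menu names and the function itself are hypothetical, and only the MSEL1/MSEL2 destinations come from the text above.

```python
# Hypothetical model of the touch screen device's routing of a recognized
# menu selection; NAV_MENUS/AV_MENUS are illustrative groupings.
NAV_MENUS = {"map", "destination"}       # menus handled by the navigation device
AV_MENUS = {"radio", "audio", "video"}   # menus handled by the AV device

def route_menu_selection(menu: str) -> str:
    """Return which device receives the menu selection information."""
    if menu in NAV_MENUS:
        return "MSEL1 -> navigation device"
    if menu in AV_MENUS:
        return "MSEL2 -> AV device"
    raise ValueError(f"unknown menu: {menu}")
```

For example, `route_menu_selection("radio")` would route the selection to the AV device as second menu selection information.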

As described above, the AVN system according to the present invention automatically activates the menu selection screen of the touch screen device when an external object approaches, by using the proximity sensor, so that the user can operate the AVN system easily. In addition, since the AVN system according to the present invention does not include a physical button input unit, the size of the touch screen can be increased, and the PCB, lighting LEDs, and button fixtures for the physical button input unit can be removed, which reduces the manufacturing cost and the weight of the AVN system.

Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and can be implemented in various forms; these embodiments are provided only to make the disclosure of the present invention complete and to fully convey the invention to those skilled in the art.

FIG. 2 is a schematic block diagram of an AVN system according to an embodiment of the present invention. For simplicity of the drawing, FIG. 2 schematically illustrates only the parts related to the present invention and omits the transmission and reception signals between the components.

The AVN system 101 includes a proximity sensor 110, a navigation device 120, an audio and video (AV) device 130, a touch screen device 140, an audio processor 150, and a speaker 160. The proximity sensor 110 detects an external object approaching a set area (for example, within a set distance in front of the proximity sensor 110) and outputs a proximity detection signal SEN to the navigation device 120. As shown in FIG. 4, the proximity sensor 110 may be installed adjacent to the display screen 140a of the touch screen device 140.
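The proximity detection itself reduces to a distance threshold test. A minimal Python sketch, assuming a hypothetical centimeter-valued sensor reading and an illustrative threshold (the patent specifies only "within a set distance", not a value):

```python
def proximity_sensor(distance_cm: float, threshold_cm: float = 10.0) -> bool:
    """Assert SEN when an object is within the set distance in front of
    the sensor. The 10 cm default is illustrative, not from the patent."""
    return distance_cm <= threshold_cm
```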

The navigation device 120 includes a Global Positioning System (GPS) receiver 121, a map data DB 122, and a navigation controller 123. The GPS receiver 121 receives position data LDAT1 to LDATK (K is an integer) from GPS satellites (not shown) and outputs them to the navigation controller 123. The map data DB 122 stores map data MDAT including road information.

The navigation controller 123 outputs a menu display control signal DCTL to the touch screen device 140 in response to the proximity detection signal SEN received from the proximity sensor 110. Thereafter, in response to the menu selection information MSEL1 received from the touch screen device 140, the navigation controller 123 outputs video data VDAT1, corresponding to a first detailed menu screen that includes detailed menu images related to the operation of the navigation device 120, to the touch screen device 140, or executes an operation corresponding to the selected menu.

For example, when the selected menu is a map display menu, the navigation controller 123 calculates the current position based on the position data LDAT1 to LDATK received from the GPS receiver 121. The navigation controller 123 then outputs guide video data GVDAT, including a map screen indicating the current position or the route to a destination selected by the user, to the touch screen device 140, and outputs guide audio data GADAT to the audio processor 150.

The AV device 130 includes a tuner unit 131, an audio reproducing unit 132, a video reproducing unit 133, and an AV control unit 134. The tuner unit 131 receives one of a plurality of radio frequency signals in response to the broadcast selection signal BSEL received from the AV control unit 134, and outputs broadcast audio data BADAT, based on the received radio frequency signal, to the AV control unit 134.

The audio reproducing unit 132 operates in response to the mode control signal MCTL1 received from the AV control unit 134, or operates upon detecting that an audio storage medium has been inserted; it reads reproduction audio data RPDAT1 from the audio storage medium and outputs it to the AV control unit 134.

The video reproducing unit 133 operates in response to the mode control signal MCTL2 received from the AV control unit 134, or operates upon detecting that a video storage medium has been inserted; it reads reproduction audio data RPDAT2 and reproduction video data RPVDAT from the video storage medium and outputs them to the AV control unit 134.

In response to the menu selection information MSEL2 received from the touch screen device 140, the AV control unit 134 outputs video data VDAT2, corresponding to a second detailed menu screen that includes detailed menu images related to the operation of the AV device 130, to the touch screen device 140, or executes an operation corresponding to the selected menu.

The AV control unit 134 may operate in one of a radio mode, an audio mode, and a video mode according to the selected menu. In the radio mode, the AV control unit 134 outputs a broadcast selection signal BSEL to the tuner unit 131 according to the menu selected through the touch screen device 140, and outputs the broadcast audio data BADAT received from the tuner unit 131 to the audio processor 150.

In the audio mode, the AV control unit 134 outputs the mode control signal MCTL1 to the audio reproducing unit 132 and outputs the reproduction audio data RPDAT1 received from the audio reproducing unit 132 to the audio processor 150. In the video mode, the AV control unit 134 outputs the mode control signal MCTL2 to the video reproducing unit 133 and receives the reproduction audio data RPDAT2 and the reproduction video data RPVDAT from the video reproducing unit 133. The AV control unit 134 outputs the reproduction audio data RPDAT2 to the audio processor 150 and the reproduction video data RPVDAT to the touch screen device 140. In the radio mode or the audio mode, the AV control unit 134 may output video data VDAT indicating operation information of the corresponding mode to the touch screen device 140.
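The per-mode routing of the AV control unit can be summarized in a small dispatch sketch. This Python model is hypothetical; only the mode names and signal labels (BADAT, RPDAT1, RPDAT2, RPVDAT) come from the description above.

```python
def av_dispatch(mode: str):
    """Return (audio destination, video destination) for the given AV mode.
    Illustrative model of the AV control unit's routing; not the patent's code."""
    if mode == "radio":
        # BSEL selects a station on the tuner unit; broadcast audio goes out
        return ("BADAT -> audio processor", None)
    if mode == "audio":
        # MCTL1 starts the audio reproducing unit
        return ("RPDAT1 -> audio processor", None)
    if mode == "video":
        # MCTL2 starts the video reproducing unit; audio and video are split
        return ("RPDAT2 -> audio processor", "RPVDAT -> touch screen device")
    raise ValueError(f"unknown mode: {mode}")
```

Only the video mode produces a video stream for the touch screen device; the radio and audio modes may additionally display mode-information video data VDAT, which this sketch omits.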

The AV control unit 134 and the navigation controller 123 share the touch screen device 140 and the audio processor 150, and communicate with each other to inform each other of their operating states.

In the touch screen device 140, video data MVDAT corresponding to a menu selection screen related to operations of the navigation device 120 and the AV device 130 is stored in advance.

The touch screen device 140 displays the menu selection screen in response to the menu display control signal DCTL received from the navigation controller 123. Thereafter, it recognizes the menu selected by the user's touch input and, according to the recognition result, outputs the menu selection information MSEL1 to the navigation controller 123 or the menu selection information MSEL2 to the AV control unit 134.

The touch screen device 140 displays a map screen based on the guide video data GVDAT received from the navigation controller 123. The touch screen device 140 displays the first detailed menu screen based on the video data VDAT1 received from the navigation controller 123. The touch screen device 140 displays a second detailed menu screen based on the video data VDAT2 received from the AV controller 134.

In addition, the touch screen device 140 displays a playback video screen based on the playback video data RPVDAT received from the AV control unit 134. The touch screen device 140 displays a screen indicating operation information of a radio mode or an audio mode based on the video data VDAT received from the AV controller 134.

The audio processor 150 receives guide audio data GADAT, corresponding to a voice guiding the route to the selected destination, from the navigation controller 123, or receives one of the broadcast audio data BADAT, the reproduction audio data RPDAT1, and the reproduction audio data RPDAT2 from the AV control unit 134.

When the audio processor 150 receives the guide audio data GADAT, it converts it into a guide audio signal GAUD and outputs the signal to the speaker 160. Likewise, it converts the broadcast audio data BADAT into a broadcast audio signal BAAUD, the reproduction audio data RPDAT1 into a reproduction audio signal RPAUD1, and the reproduction audio data RPDAT2 into a reproduction audio signal RPAUD2, outputting each to the speaker 160 upon reception.
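The audio processor's behavior is a one-to-one mapping from received data type to output signal. A hypothetical Python table of that mapping (the signal names are from the text; the function and dictionary are illustrative):

```python
# Illustrative data-to-signal conversion table of the audio processor 150.
CONVERSIONS = {
    "GADAT": "GAUD",     # guide audio data -> guide audio signal
    "BADAT": "BAAUD",    # broadcast audio data -> broadcast audio signal
    "RPDAT1": "RPAUD1",  # reproduction audio data 1 -> reproduction signal 1
    "RPDAT2": "RPAUD2",  # reproduction audio data 2 -> reproduction signal 2
}

def audio_processor(data_kind: str) -> str:
    """Convert the named audio data into its signal and route it to the speaker."""
    return f"{CONVERSIONS[data_kind]} -> speaker"
```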

FIG. 3 is a flowchart illustrating the menu selection process of the AVN system illustrated in FIG. 2.

First, the navigation controller 123 determines whether the proximity detection signal SEN has been received from the proximity sensor 110 (step 1001). When the proximity detection signal SEN is received, the navigation controller 123 outputs the menu display control signal DCTL to the touch screen device 140, and the touch screen device 140 displays the menu selection screen in response (step 1002). Thereafter, the touch screen device 140 determines whether one of the menu images displayed on the menu selection screen has been selected by the user (step 1003). When no menu image (e.g., 140b; see FIG. 4) is selected, the touch screen device 140 keeps displaying the menu selection screen.

When a menu image is selected, the touch screen device 140 outputs the menu selection information MSEL1 to the navigation controller 123 or the menu selection information MSEL2 to the AV control unit 134, according to the selected menu. For example, when the touch screen device 140 outputs the menu selection information MSEL1 to the navigation controller 123, the navigation controller 123 checks the information of the selected menu (step 1004) and determines whether a detailed menu selection is necessary for the selected menu (step 1005).

Meanwhile, when the touch screen device 140 outputs the menu selection information MSEL2 to the AV control unit 134, the AV control unit 134 checks the information of the selected menu and determines whether a detailed menu selection is necessary for the selected menu.

If it is determined in step 1005 that no detailed menu selection is necessary, the navigation control unit 123 executes an operation corresponding to the selected menu (step 1006).

When it is determined in step 1005 that a detailed menu selection is necessary, the navigation controller 123 outputs the video data VDAT1 corresponding to the first detailed menu screen to the touch screen device 140, and the touch screen device 140 displays the first detailed menu screen based on the video data VDAT1 (step 1007). Thereafter, the AVN system 101 repeats steps 1003 to 1007.
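Steps 1001 through 1007 can be traced as a small pure function. This Python sketch is a hypothetical model of the FIG. 3 flowchart; the event list and the needs_submenu table are illustrative inputs, not part of the patent.

```python
def menu_flow(proximity_events, selections, needs_submenu):
    """Trace the FIG. 3 menu selection process (illustrative model)."""
    trace = []
    if not any(proximity_events):                # step 1001: no SEN received
        return trace
    trace.append("DCTL: display menu selection screen")  # step 1002
    for menu in selections:                      # step 1003: user selects a menu
        trace.append(f"check {menu}")            # step 1004: check selected menu
        if needs_submenu.get(menu):              # step 1005: submenu needed?
            trace.append(f"VDAT1: display submenu of {menu}")  # step 1007
        else:
            trace.append(f"execute {menu}")      # step 1006: run the operation
            break
    return trace
```

For instance, `menu_flow([True], ["audio", "fm"], {"audio": True, "fm": False})` first opens the menu, shows the audio submenu, then executes the fm selection.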

As described above, the AVN system 101 uses the proximity sensor 110 to make the touch screen device 140 automatically display the menu selection screen when a user's hand or the like comes close, so that the user can operate the AVN system 101 more easily. In addition, since the AVN system 101 does not include a physical button input unit, the size of the display screen 140a of the touch screen device 140 can be increased, and the PCB, illumination LEDs, and button mechanisms for the physical button input unit can be removed, which reduces the manufacturing cost and the weight of the AVN system.

FIG. 5 is a schematic block diagram of an AVN system according to another embodiment of the present invention. The configuration and detailed operation of the AVN system 102 are largely similar to those of the AVN system 101 described above; to avoid duplication, this embodiment is described in terms of the differences between the AVN systems 102 and 101.

The difference is that in the AVN system 102, the AV control unit 134 outputs the menu display control signal DCTL to the touch screen device 140 in response to the proximity detection signal SEN received from the proximity sensor 110, and the touch screen device 140 displays the menu selection screen in response to the menu display control signal DCTL received from the AV control unit 134.

The above embodiments illustrate the present invention; the present invention is not limited to them, and various embodiments are possible within its scope. Equivalent means, although not described, are also regarded as incorporated in the present invention. Therefore, the true scope of the present invention is defined by the claims below.

FIG. 1 is a front view showing the appearance of a conventional AVN system.

FIG. 2 is a schematic block diagram of an AVN system according to an embodiment of the present invention.

FIG. 3 is a flowchart illustrating the menu selection process of the AVN system illustrated in FIG. 2.

FIG. 4 is a perspective view illustrating the appearance of the AVN system illustrated in FIG. 2.

FIG. 5 is a schematic block diagram of an AVN system according to another embodiment of the present invention.

Description of Reference Numerals

101, 102: AVN system

110: proximity sensor

120: navigation device

130: AV device

140: touch screen device

150: audio processor

160: speaker

Claims (6)

1. An AVN system comprising: a proximity sensor that detects an external object approaching a predetermined area and outputs a proximity detection signal; a navigation device that outputs a menu display control signal in response to the proximity detection signal and, in response to first menu selection information, outputs first video data corresponding to a first detailed menu screen or executes an operation corresponding to the selected menu; an audio and video (AV) device that, in response to second menu selection information, outputs second video data corresponding to a second detailed menu screen or executes an operation corresponding to the selected menu; and a touch screen device in which third video data corresponding to a menu selection screen related to the operations of the navigation device and the AV device is stored in advance, and which displays the menu selection screen in response to the menu display control signal, recognizes the menu selected by a user's touch input, outputs the first menu selection information to the navigation device or the second menu selection information to the AV device according to the recognition result, and displays the first detailed menu screen based on the first video data or the second detailed menu screen based on the second video data.

2. The AVN system of claim 1, further comprising an audio processor that receives guide audio data, corresponding to a voice guiding a route to a selected destination, from the navigation device, or receives one of broadcast audio data, first reproduction audio data, and second reproduction audio data from the AV device; converts the guide audio data into a guide audio signal and outputs it to a speaker when the guide audio data is received; converts the broadcast audio data into a broadcast audio signal and outputs it to the speaker when the broadcast audio data is received; converts the first reproduction audio data into a first reproduction audio signal and outputs it to the speaker when the first reproduction audio data is received; and converts the second reproduction audio data into a second reproduction audio signal and outputs it to the speaker when the second reproduction audio data is received.

3. The AVN system of claim 2, wherein, when the selected menu is a map display menu, the navigation device calculates a current position based on position data received from Global Positioning System (GPS) satellites, outputs guide video data, including a map screen indicating the current position or a route to a destination selected by the user, to the touch screen device, and outputs the guide audio data to the audio processor; and the touch screen device displays the map screen based on the guide video data.

4. The AVN system of claim 2, wherein the AV device operates in one of a radio mode, an audio mode, and a video mode according to the selected menu, operates in the audio mode upon detecting that an audio storage medium has been inserted, and operates in the video mode upon detecting that a video storage medium has been inserted; receives one of a plurality of radio frequency signals in the radio mode and outputs the broadcast audio data, based on the received radio frequency signal, to the audio processor; reads the first reproduction audio data from the audio storage medium in the audio mode and outputs it to the audio processor; and reads the second reproduction audio data and reproduction video data from the video storage medium in the video mode, outputs the second reproduction audio data to the audio processor, and outputs the reproduction video data to the touch screen device; and wherein the touch screen device displays a reproduction video screen based on the reproduction video data.

5. The AVN system of claim 1, wherein the proximity sensor is installed adjacent to a display screen of the touch screen device.

6. An AVN system comprising: a proximity sensor that detects an external object approaching a predetermined area and outputs a proximity detection signal; a navigation device that, in response to first menu selection information, outputs first video data corresponding to a first detailed menu screen or executes an operation corresponding to the selected menu; an audio and video (AV) device that outputs a menu display control signal in response to the proximity detection signal and, in response to second menu selection information, outputs second video data corresponding to a second detailed menu screen or executes an operation corresponding to the selected menu; and a touch screen device in which third video data corresponding to a menu selection screen related to the operations of the navigation device and the AV device is stored in advance, and which displays the menu selection screen in response to the menu display control signal, recognizes the menu selected by a user's touch input, outputs the first menu selection information to the navigation device or the second menu selection information to the AV device according to the recognition result, and displays the first detailed menu screen based on the first video data or the second detailed menu screen based on the second video data.
KR1020090112956A 2009-11-23 2009-11-23 Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device KR20110056572A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090112956A KR20110056572A (en) 2009-11-23 2009-11-23 Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device


Publications (1)

Publication Number Publication Date
KR20110056572A true KR20110056572A (en) 2011-05-31

Family

ID=44364907

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090112956A KR20110056572A (en) 2009-11-23 2009-11-23 Audio video navigation system comprising a proximity sensor for activating a menu image of a touch screen device

Country Status (1)

Country Link
KR (1) KR20110056572A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013063372A1 (en) * 2011-10-26 2013-05-02 Google Inc. Detecting object moving toward or away from a computing device
KR101675830B1 (en) * 2015-09-21 2016-11-15 주식회사 아이콘트롤스 Wallpad for visually impaired person



Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application