WO2018230526A1 - Input system and input method - Google Patents

Input system and input method

Info

Publication number
WO2018230526A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
gesture
gesture recognition
recognition space
image
Prior art date
Application number
PCT/JP2018/022305
Other languages
English (en)
Japanese (ja)
Inventor
中井 潤
隆 大河平
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2018230526A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • This disclosure relates to an input system and an input method using gesture input in a vehicle.
  • This disclosure provides a technique that allows a gesture operation to be comfortably performed in a vehicle.
  • the input system includes a display, a sensor, a display device, and an optical plate.
  • the display presents information to passengers in the vehicle.
  • the sensor recognizes a passenger's gesture operation performed in a gesture recognition space set in the vicinity of the display.
  • the display device is installed outside the gesture recognition space.
  • the optical plate is installed between the gesture recognition space and the display device, and forms, in the gesture recognition space, the gesture guide image displayed on the display device.
  • the display displays information reflecting the gesture operation recognized by the sensor.
  • FIG. 1A is a diagram illustrating a configuration example of gesture input.
  • FIG. 1B is a diagram illustrating a configuration example of gesture input.
  • FIG. 2A is a diagram illustrating a configuration example in which gesture input and an aerial display are combined.
  • FIG. 2B is a diagram illustrating a configuration example in which gesture input and an aerial display are combined.
  • FIG. 3A is a diagram illustrating an installation example in the vehicle of the input system according to the embodiment of the present disclosure.
  • FIG. 3B is a diagram illustrating an installation example in the vehicle of the input system according to the embodiment of the present disclosure.
  • FIG. 3C is a diagram illustrating an installation example in the vehicle of the input system according to the embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of an installation method of the optical plate and the display.
  • FIG. 5 is a block diagram illustrating a configuration of the input system according to the embodiment of the present disclosure.
  • FIG. 6A is a diagram illustrating a specific example of the gesture operation using the input system according to the embodiment of the present disclosure.
  • FIG. 6B is a diagram illustrating a specific example of the gesture operation using the input system according to the embodiment of the present disclosure.
  • FIG. 6C is a diagram illustrating a specific example of the gesture operation using the input system according to the embodiment of the present disclosure.
  • FIG. 7 is a flowchart showing an operation of the input system according to the embodiment of the present disclosure.
  • FIG. 8A is a diagram illustrating an installation example of the input system according to the modification in the vehicle.
  • FIG. 8B is a diagram illustrating an installation example of the input system according to the modification in the vehicle.
  • FIG. 1A and FIG. 1B are diagrams showing a configuration example of gesture input.
  • FIG. 1A is a configuration example in which a camera is used as the gesture detection sensor 50.
  • a gesture detection camera is installed below the information display 30. The camera is aimed so that a space a predetermined distance away from the screen of the display 30 falls within its angle of view. The space in front of the screen that falls within the camera's angle of view is the gesture recognition space S1.
  • FIG. 1B is a configuration example in which a highly sensitive capacitive touch panel sensor is used as the gesture detection sensor 50.
  • a capacitive touch panel sensor is installed on the surface of the display 30.
  • a non-contact space close to the surface of the display 30 is a gesture recognition space S1.
  • the user can perform a hover operation in the gesture recognition space S1.
  • the higher the sensitivity of the touch panel sensor, the wider the gesture recognition space S1.
  • an infrared sensor, an ultrasonic sensor, or the like can be used instead of the camera or the capacitive touch panel sensor.
  • gesture input, particularly in the configuration example shown in FIG. 1A, has the following problems.
  • (1) It is difficult for the operator to perform a gesture operation in the range of the gesture recognition space S1.
  • the gesture recognition space S1 appears to the operator as empty space, so it is difficult for the operator to grasp its extent.
  • (2) Displaying a gesture operation guide on the display 30 is troublesome for non-operators.
  • in this disclosure, gesture input and an aerial display are therefore used in combination. Specifically, the aerial display is used to display the gesture operation guide video in the air within the gesture recognition space S1.
  • FIG. 2A and FIG. 2B are diagrams showing a configuration example in which gesture input and an aerial display are combined.
  • FIG. 2A is a configuration example in which a camera is used as the gesture detection sensor 50
  • FIG. 2B is a configuration example in which a capacitive touch panel sensor is used as the gesture detection sensor 50.
  • the aerial display is realized by the display 41 and the optical plate 42.
  • the optical plate 42, having undergone the special processing described above, can form the image displayed on the display device 41, installed on one side of the plate, at the line-symmetric position in space on the opposite side of the plate.
  • the installation positions of the optical plate 42 and the display device 41 are determined from this imaging relationship and the position at which the guide image I1 is to be formed in the gesture recognition space S1 set in front of the display 30.
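  • Because this imaging relationship is plane symmetry about the optical plate 42, the position of the aerial guide image can be found by reflecting each point of the display device 41 across the plane of the plate. The following Python sketch illustrates the calculation; the coordinate conventions, function name, and NumPy usage are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

def aerial_image_position(display_point, plate_point, plate_normal):
    """Reflect a point on the display device 41 across the plane of the
    optical plate 42; the aerial image forms at the mirrored position."""
    n = np.asarray(plate_normal, dtype=float)
    n /= np.linalg.norm(n)                      # unit normal of the plate plane
    p = np.asarray(display_point, dtype=float)
    p0 = np.asarray(plate_point, dtype=float)
    d = p - p0                                  # displacement from the plane
    return p0 + d - 2.0 * np.dot(d, n) * n      # line-symmetric position

# Plate lying in the z = 0 plane: a pixel 5 cm below the plate images
# 5 cm above it, inside the gesture recognition space S1.
print(aerial_image_position([0.10, 0.20, -0.05], [0.0, 0.0, 0.0], [0, 0, 1]))
# -> [0.1  0.2  0.05]
```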
  • the combination of gesture input and an aerial display has the following advantages. (1) By displaying an aerial gesture operation guide visible only to the operator, it becomes easy to perform a gesture operation within the range of the gesture recognition space S1. (2) For the same reason, the guide does not bother anyone other than the operator. (3) Since only the gesture operation guide is displayed, it causes little visual clutter for the operator. (4) Devices that are not in front of the operator's eyes can be operated as intuitively as the device that is. (5) Installing the aerial display for the gesture operation guide separately from the display 30 for normal information preserves the amount of information that can be presented.
  • the aerial display technology as of 2017 has lower luminance and lower resolution than general liquid crystal displays and organic EL (organic electro-luminescence) displays, and is not suited to displaying small characters and pictures.
  • the aerial display is therefore preferably used in combination with the normal display 30. Since the aerial display technology as of 2017 also has a narrower viewing angle than a general display, it has the advantage of presenting information only to a specific operator.
  • the above-described aerial display requires the display device 41 and the optical plate 42, and a space to house them is needed below the display 30.
  • FIGS. 3A to 3C are diagrams illustrating an installation example of the input system according to the embodiment of the present disclosure in a vehicle.
  • FIG. 3A is a schematic view of the vicinity of the driver's seat in the vehicle.
  • a display 30 for presenting information is installed on the dashboard 4.
  • a center display of a car navigation apparatus can be used as the information presentation display 30.
  • the display of a smartphone or tablet fixed to a holder on the dashboard 4 may also be used.
  • the gesture recognition space S1 is set in a space below the display surface of the display 30. Specifically, it is set over the inclined surface of the center console 5, which extends downward and forward from the installation position of the display 30 on the dashboard 4.
  • the gesture recognition space S1 is set on the upper side of the inclined surface, but may be set on the center or lower side of the inclined surface.
  • the steering wheel 3a is installed on the right side of the inclined surface of the center console 5, and the driver can easily reach the gesture recognition space S1 with the left hand.
  • FIG. 3B is a schematic diagram showing the positional relationship between the display 30 and the optical plate 42.
  • An optical plate 42 is installed in parallel with the inclined surface of the center console 5.
  • a guide image Ia with a left arrow is displayed in the air above the optical plate 42.
  • FIG. 3C is a schematic view of the positional relationship among the display 30, the optical plate 42, and the display device 41 viewed from the side surface direction. From the viewpoint E1 of the driver, the guide video Ia appears to appear on the inclined surface of the center console 5.
  • FIG. 4 is a diagram illustrating an example of an installation method of the optical plate 42 and the display device 41.
  • the display device 41 is housed and installed in a storage box 45.
  • the inside of the storage box 45 is subjected to low reflection processing.
  • a normal liquid crystal display module (LCM) can be used for the display device 41.
  • a light control film (LCF) 43 is attached to the display-surface side of the display device 41.
  • the light control film 43 is a film that suppresses diffused light and improves the parallelism of light, and can improve luminance and visibility when the display device 41 is viewed from the front.
  • the optical plate 42 is installed at the position of the upper lid of the storage box 45.
  • this keeps the periphery of the display device 41 dark while the image is formed in the air on the opposite side of the optical plate 42, improving the visibility of the guide video.
  • the storage box 45 shown in FIG. 4 is installed inside the inclined surface of the center console 5 shown in FIG. 3A, for example.
  • FIG. 5 is a block diagram illustrating a configuration of the input system 2 according to the embodiment of the present disclosure.
  • the input system 2 includes a control device 10, a display 30, an aerial image display device 40, and a gesture detection sensor 50.
  • the aerial image display device 40 includes a display 41 and an optical plate 42 as main members.
  • the light control film 43 shown in FIG. 4 is not essential and can be omitted.
  • the control device 10 includes a processing unit 11, an input / output unit (I / O unit) 12, and a recording unit 13.
  • the processing unit 11 includes a screen control unit 111, a guide control unit 112, a detection information acquisition unit 113, an operation content determination unit 114, and a device control unit 115.
  • the function of the processing unit 11 can be realized by cooperation of hardware resources and software resources.
  • as hardware resources, a CPU (central processing unit), GPU (graphics processing unit), DSP (digital signal processor), FPGA (field-programmable gate array), ROM (read-only memory), RAM (random-access memory), and other LSIs (large-scale integrated circuits) can be used.
  • as software resources, programs such as an operating system, applications, and firmware can be used.
  • the recording unit 13 is a nonvolatile memory, and includes a recording medium such as a NAND flash memory chip, an SSD (solid-state drive), and an HDD (hard disk drive).
  • the control device 10 may be mounted in a dedicated housing, or may be mounted in a head unit such as a car navigation device or display audio, sharing the head unit's existing hardware resources by time division.
  • the control device 10 may also use the hardware resources of information equipment brought in from outside the vehicle, such as a smartphone or tablet.
  • the display 30 is a display installed in the vehicle interior as described above, and a liquid crystal display or an organic EL display can be used.
  • the gesture detection sensor 50 is a sensor for recognizing a passenger's gesture operation performed in the gesture recognition space S1 set in the vicinity of the display 30 in the vehicle interior. As described above, a camera, a non-contact type touch panel, or the like can be used.
  • the I/O unit 12 outputs image signals supplied from the processing unit 11 to the display 30 and to the display device 41, and outputs detection signals supplied from the gesture detection sensor 50 to the processing unit 11.
  • the screen control unit 111 generates all image data to be displayed on the display 30, and outputs and displays the image data on the display 30.
  • the guide control unit 112 generates image data to be displayed in the air as a gesture guide video in the gesture recognition space S1, and outputs and displays the image data on the display device 41.
  • the guide control unit 112 displays a symbol image indicating the operation content in the air as a gesture guide image.
  • for example, graphic symbol marks such as circles, triangles, squares, x-marks, arrows, and crosses may be displayed in the air, or icons representing the operation contents may be displayed in the air.
  • the guide control unit 112 may display an image defining the range of the gesture recognition space S1 in the air as a gesture guide image.
  • for example, an image of the frame of the gesture recognition space S1 may be formed in the air.
  • a point image may be displayed in the air at the position of each vertex in the gesture recognition space S1.
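  • For instance, assuming a box-shaped gesture recognition space S1, its eight vertices can be enumerated from two opposite corners and a point image formed at each. A minimal Python sketch under that assumption (the coordinates are illustrative):

```python
from itertools import product

def box_vertices(corner_min, corner_max):
    """Eight vertices of an axis-aligned, box-shaped gesture recognition
    space S1, given two opposite corners (x, y, z) in metres."""
    return [
        tuple(hi if pick else lo
              for lo, hi, pick in zip(corner_min, corner_max, picks))
        for picks in product((0, 1), repeat=3)
    ]

# A 20 x 15 x 10 cm space: form a point image in the air at each vertex.
for vertex in box_vertices((0.0, 0.0, 0.0), (0.20, 0.15, 0.10)):
    print(vertex)
```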
  • the detection information acquisition unit 113 acquires detection information detected by the gesture detection sensor 50. For example, the image data of the gesture recognition space S1 photographed by the camera is acquired.
  • the operation content determination unit 114 determines the operation content based on the detection information acquired by the detection information acquisition unit 113. For example, a hand is detected as an object in the acquired image, and the movement of the detected hand is tracked.
  • the operation content determination unit 114 identifies the gesture operation content from the tracked hand movement; a sketch of this step follows below. Note that the hand search range in the image may be narrowed down to the region near where the guide video is displayed.
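  • As a concrete illustration of this determination step, the Python sketch below keeps only hand positions near the guide video and reads off a left or right sweep. The data format, region bounds, and threshold are hypothetical; the disclosure does not specify them.

```python
def determine_sweep(track, guide_region, min_travel=0.05):
    """Sketch of the operation content determination unit 114.

    track: time-ordered (t, x) hand positions in metres, taken from the
    detection information; guide_region: (x_min, x_max) bounds of the
    region around the aerial guide video used to narrow the search."""
    x_min, x_max = guide_region
    near = [(t, x) for (t, x) in track if x_min <= x <= x_max]
    if len(near) < 2:
        return None                    # no hand seen near the guide video
    travel = near[-1][1] - near[0][1]
    if abs(travel) < min_travel:
        return None                    # movement too small to be a gesture
    return "left" if travel < 0 else "right"
```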
  • the screen control unit 111 causes the display 30 to display an image reflecting the gesture operation determined by the operation content determination unit 114.
  • the screen control unit 111 displays an image (a mark, an icon, a pictogram, a symbol, or the like) indicating that the gesture operation has been accepted.
  • the screen control unit 111 displays an image (for example, an icon during processing and an icon indicating completion of processing) indicating the state of device operation corresponding to the accepted gesture operation.
  • the device control unit 115 executes, on a device in the vehicle, the operation content corresponding to the gesture operation determined by the operation content determination unit 114. For example, it operates the car navigation device, the display audio, the air conditioner, the power windows, or the on/off switching of the room lamp. Vehicle driving operations may also be executed, such as turning the blinkers on and off, shifting gears, sounding the horn, flashing the headlamps (passing), and starting and stopping the wipers.
  • FIGS. 6A to 6C are diagrams illustrating specific examples of the gesture operation using the input system 2 according to the embodiment of the present disclosure.
  • FIG. 6A is an example of executing a function that is not displayed on the device in front of the operator's eyes (that is, near where the gesture operation is performed).
  • the display 30 is the display of the car navigation device, and the volume of its voice guidance is changed by a gesture operation.
  • a left-arrow guide image Ia and a right-arrow guide image Ib are formed in the air.
  • the operator decreases the volume with a gesture that sweeps the left-arrow guide image Ia to the left, and increases it with a gesture that sweeps the right-arrow guide image Ib to the right.
  • in FIG. 6A, the volume bar 30a displayed on the display 30 shows the volume being lowered by a gesture sweeping the left-arrow guide image Ia to the left.
  • FIG. 6B shows an example of operating a device that is not in front of the operator.
  • a guide image Ic with an upward arrow is imaged in the air.
  • the right blinker starts blinking when the operator makes a gesture sweeping the up-arrow guide image Ic upward.
  • an icon 30b indicating that the right turn signal blinks is displayed on the screen of the display 30.
  • a down-arrow guide image is likewise formed in the air, and the right blinker is turned off by a gesture sweeping it downward.
  • FIG. 6C shows an example of operating a device that is in front of the operator but is difficult to reach.
  • the display 30 is a display of the car navigation apparatus, and the operator flicks or swipes the map displayed on the display 30.
  • a map 30c is displayed on the screen of the display 30, and a left guide image Ia and a right guide image Ib are formed in the air in the gesture recognition space S1 above the optical plate 42.
  • the operator flicks or swipes the map 30c to the left with a gesture sweeping the left-arrow guide image Ia to the left, and flicks or swipes it to the right with a gesture sweeping the right-arrow guide image Ib to the right.
  • for example, a flick operation is determined when the hand moves slower than a predetermined speed, and a swipe operation when it moves faster (see the sketch below).
  • in FIG. 6C, the map 30c is flicked to the left by sweeping the left-arrow guide image Ia to the left.
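  • A minimal sketch of the speed threshold described above, following the mapping stated in the text (slower than the threshold is a flick, faster is a swipe); the threshold value and data format are illustrative assumptions.

```python
FLICK_SWIPE_SPEED = 0.3   # m/s; illustrative, not specified in the disclosure

def flick_or_swipe(track):
    """Classify a recognized sweep as flick or swipe by average hand speed,
    using the first and last (t, x) samples of the tracked movement."""
    (t0, x0), (t1, x1) = track[0], track[-1]
    speed = abs(x1 - x0) / (t1 - t0)
    return "flick" if speed < FLICK_SWIPE_SPEED else "swipe"

# A 10 cm sweep over 0.5 s gives 0.2 m/s, below the threshold -> "flick".
print(flick_or_swipe([(0.0, 0.30), (0.5, 0.40)]))
```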
  • FIG. 7 is a flowchart showing the operation of the input system 2 according to the embodiment of the present disclosure.
  • the guide control unit 112 displays a predetermined guide image in the air in the gesture recognition space S1 (S10).
  • the detection information acquisition unit 113 acquires detection information based on an operator's gesture operation detected by the gesture detection sensor 50 (S11).
  • the operation content determination unit 114 identifies the operation content based on the acquired detection information (S12).
  • the screen control unit 111 causes the display 30 to display an image indicating completion of reception of the specified operation content (S13).
  • the device control unit 115 controls the device according to the specified operation content (S14).
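  • One pass of the S10 to S14 flow might look like the sketch below. The component interfaces are assumptions made for illustration; the disclosure defines the units functionally, not as an API.

```python
def input_loop_step(guide_ctrl, sensor, determiner, screen_ctrl, device_ctrl):
    """One pass of the FIG. 7 flow, with hypothetical component interfaces."""
    guide_ctrl.show_guide()                      # S10: aerial guide image
    detection = sensor.read()                    # S11: detection information
    operation = determiner.determine(detection)  # S12: identify operation
    if operation is not None:
        screen_ctrl.show_accepted(operation)     # S13: reception feedback
        device_ctrl.execute(operation)           # S14: control the device
```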
  • according to the embodiment, performing the gesture operation within the gesture recognition space S1 becomes easier, and the probability that a gesture operation fails to be recognized can be greatly reduced. A passenger in the vehicle can therefore perform gesture operations comfortably.
  • since the guide video is displayed in the air toward the driver, the narrow viewing-angle characteristic prevents the passenger in the passenger seat from seeing it, so it causes no visual annoyance to anyone other than the operator. Further, if the guide video is displayed in the air only during the gesture operation, it does not visually bother the operator either. Moreover, using the aerial display together with the existing information display secures the amount of information that can be presented; an aerial display alone would limit it.
  • FIGS. 8A and 8B are diagrams showing an installation example of the input system 2 according to the modification in the vehicle.
  • the gesture recognition space S1 is set on the center portion of the dashboard 4.
  • the aerial image display device 40 is installed inside the center of the dashboard 4.
  • the information presentation display 30 is installed on the inclined surface of the center console 5 at a position close to the gesture recognition space S1.
  • a display of a head unit such as display audio can be used as the information presentation display 30.
  • an aerial video display device 40 having a display 41 and an optical plate 42 is embedded in the upper part of the joint portion of the steering column 3b with the steering wheel 3a.
  • the guide image I1 is formed behind the steering wheel 3a (above the steering column 3b) as viewed from the driver.
  • the display in the instrument panel 6 can be used as a display for presenting information.
  • although FIG. 4 illustrates an example in which a liquid crystal display module is used for the display device 41, in applications where the displayed image is limited, the display device 41 may instead be built by mounting several light-emitting diodes at predetermined positions on a substrate.
  • for example, the display device 41 can be created simply by installing eight light-emitting diodes at predetermined positions on the substrate.
  • the guide control unit 112 may adjust the luminance of the light emitting diode according to the brightness in the vehicle.
  • the brightness in the vehicle is determined based on illuminance information detected by an illuminance sensor (not shown) installed in the vehicle.
  • the illuminance at the current position may be acquired from a server of the Japan Meteorological Agency or a private weather company via a wireless communication network.
  • the guide control unit 112 lowers the luminance of the light-emitting diodes as the illuminance in the vehicle decreases.
  • the luminance of the light emitting diode can be controlled by adjusting the drive current or the PWM (pulse width modulation) ratio.
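  • As an illustration of the PWM-based control, the sketch below maps cabin illuminance to an LED duty cycle so the guide image dims in a dark cabin. The linear mapping and the constants are assumptions, not values from the disclosure.

```python
def led_duty_from_illuminance(lux, lux_max=10000.0, duty_min=0.05):
    """Map cabin illuminance (lux) to a PWM duty cycle in [duty_min, 1.0].
    Lower illuminance yields a lower duty cycle, hence a dimmer LED."""
    ratio = max(0.0, min(lux / lux_max, 1.0))    # clamp and normalize
    return duty_min + (1.0 - duty_min) * ratio

print(led_duty_from_illuminance(200.0))   # dim cabin -> low duty (about 0.07)
```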
  • since the original display luminance is low, it may not be necessary to reduce the luminance at night. This does not, however, exclude luminance control when the liquid crystal display module is used.
  • the input system (2) includes a display (30), a sensor (50), a display device (41), and an optical plate (42).
  • the display (30) presents information to passengers in the vehicle.
  • the sensor (50) recognizes the occupant's gesture operation performed in the gesture recognition space (S1) set in the vicinity of the display (30).
  • the display device (41) is installed outside the gesture recognition space (S1).
  • the optical plate (42) is installed between the gesture recognition space (S1) and the display device (41), and forms, in the gesture recognition space (S1), the gesture guide image displayed on the display device (41).
  • the display (30) displays information reflecting the gesture operation recognized by the sensor (50).
  • the display device (41) may display a symbol image indicating the operation content as the gesture guide image, and the optical plate (42) forms the symbol image in the gesture recognition space (S1).
  • the display device (41) may display an image defining the range of the gesture recognition space (S1) as the gesture guide image, and the optical plate (42) forms that image in the gesture recognition space (S1).
  • the display (30) may be a center display (30) installed on the dashboard (4), with the gesture recognition space (S1) set in a space below and in front of the display surface of the center display (30).
  • the gesture recognition space (S1) may be set on the center portion of the dashboard (4), with the display (30) installed on the center console (5) at a position close to the gesture recognition space (S1).
  • when the gesture guide image is displayed at about the same height as the windshield, its visibility during driving can be improved.
  • the input method includes a step of recognizing a passenger's gesture operation performed in a gesture recognition space (S1) set in the vicinity of a display (30) that presents information to a passenger in the vehicle (1). The input method also includes a step of forming, in the gesture recognition space (S1), the gesture guide image displayed on a display device (41) installed outside the gesture recognition space (S1), using an optical plate (42) installed between the gesture recognition space (S1) and the display device (41). Further, the input method includes a step of displaying information reflecting the recognized gesture operation on the display (30).
  • the present disclosure relates to a technique capable of performing a gesture operation comfortably in a vehicle, and is particularly useful as an input system and an input method.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an input system comprising: a display unit; a sensor; a display device; and an optical plate. The display unit presents information to a passenger in a vehicle. The sensor recognizes a passenger's gesture performed in a gesture recognition space set in the vicinity of the display unit. The display device is installed outside the gesture recognition space. The optical plate is installed between the gesture recognition space and the display device, and forms, in the gesture recognition space, a gesture guide image displayed on the display device. The display unit displays information reflecting the gesture recognized by the sensor.
PCT/JP2018/022305 2017-06-13 2018-06-12 Input system and input method WO2018230526A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017116226A JP2020126282A (ja) 2017-06-13 2017-06-13 Input system and input method
JP2017-116226 2017-06-13

Publications (1)

Publication Number Publication Date
WO2018230526A1 (fr) 2018-12-20

Family

ID=64658632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/022305 WO2018230526A1 (fr) Input system and input method

Country Status (2)

Country Link
JP (1) JP2020126282A (fr)
WO (1) WO2018230526A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111984117A (zh) * 2020-08-12 2020-11-24 Shenzhen Skyworth-RGB Electronic Co., Ltd. Panoramic map control method, apparatus, device, and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7216925B2 (ja) * 2020-08-28 2023-02-02 Dai Nippon Printing Co., Ltd. Aerial imaging device, aerial input device, display device with aerial imaging device, moving body, and hologram imaging lens

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005138755A * 2003-11-07 2005-06-02 Denso Corp Virtual image display device and program
JP2007326409A * 2006-06-06 2007-12-20 Toyota Motor Corp Vehicle display device
JP5509391B1 * 2013-06-07 2014-06-04 Asukanet Co., Ltd. Method and device for detecting, without contact, the indicated position of a reproduced image
JP2016021082A * 2014-07-11 2016-02-04 Funai Electric Co., Ltd. Image display device
JP2017084136A * 2015-10-28 2017-05-18 Alpine Electronics, Inc. Gesture input device


Also Published As

Publication number Publication date
JP2020126282A (ja) 2020-08-20

Similar Documents

Publication Publication Date Title
TWI578021B (zh) Augmented reality interaction system and dynamic information interactive display method thereof
CN107351763B (zh) Control device for a vehicle
JP3979002B2 (ja) System for computer user interface and method for providing user interface
JP6413207B2 (ja) Vehicle display device
US10591723B2 (en) In-vehicle projection display system with dynamic display area
US9008904B2 (en) Graphical vehicle command system for autonomous vehicles on full windshield head-up display
US9057874B2 (en) Virtual cursor for road scene object selection on full windshield head-up display
KR20170141484A (ko) Vehicle control device and control method thereof
US9256325B2 (en) Curved display apparatus for vehicle
KR101610098B1 (ko) Curved display apparatus for vehicle
JP6331567B2 (ja) Display input device for vehicle
KR102051606B1 (ko) Electronic device for vehicle
KR20180053290A (ko) Vehicle control device and control method thereof
US20160124224A1 (en) Dashboard system for vehicle
WO2018230526A1 (fr) Système d'entrée et procédé d'entrée
US11068054B2 (en) Vehicle and control method thereof
US11828947B2 (en) Vehicle and control method thereof
TWM564749U (zh) Vehicle multi-screen control system
JP2005313722A (ja) Operation display device for on-vehicle equipment and operation display method thereof
JP2015019279A (ja) Electronic apparatus
JP6236211B2 (ja) Display device for transportation equipment
JP2018162023A (ja) Operation device
Adachi Chances and Challenges for Automotive Displays
WO2023248687A1 (fr) Virtual image display device
JP6146261B2 (ja) In-vehicle navigation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18817280

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18817280

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP