WO2017188098A1 - In-vehicle information processing system - Google Patents

In-vehicle information processing system

Info

Publication number
WO2017188098A1
Authority
WO
WIPO (PCT)
Prior art keywords
operator
display
screen
unit
control unit
Prior art date
Application number
PCT/JP2017/015774
Other languages
English (en)
Japanese (ja)
Inventor
佐藤 晴彦
吉富 輝雄
Original Assignee
カルソニックカンセイ株式会社
Priority date
Filing date
Publication date
Application filed by カルソニックカンセイ株式会社
Publication of WO2017188098A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04 Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for; electric constitutive elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to an in-vehicle information processing system mounted on a vehicle in order to control information related to the vehicle.
  • an operator can perform operations such as moving a pointer or cursor displayed on a screen, scrolling the screen, making selections, and entering input by performing touch operations on a touch pad installed on the center console.
  • the operation input device disclosed in Patent Literature 1 reduces the influence of disturbance factors, including vibration of the running vehicle, and can smooth the cursor movement on the display unit based on the touch operation on the touch pad.
  • the operating device disclosed in Patent Literature 2 allows a pointer or cursor displayed on the screen to be operated, and the screen to be scrolled, by an operator's touch operation on a touch pad installed in the center console.
  • the devices of Patent Documents 1 and 2 respond to the operator's operation by drawing a pointer, a cursor, a movement vector, or the like on a single screen.
  • an object of the present invention, made in view of the above, is to provide an in-vehicle information processing system in which each screen can be easily selected.
  • an in-vehicle information processing system includes: a display unit having a plurality of screens; an operation unit that detects a motion of at least a part of the operator's operating hand; and a control unit that selects the screen on which the display content is to be operated, based on the detected motion.
  • each screen can be easily selected.
  • FIG. 1 is a schematic diagram showing an entire in-vehicle information processing system 10 according to the present embodiment.
  • FIG. 2 is a functional block diagram showing a schematic configuration of the in-vehicle information processing system 10 of FIG.
  • the in-vehicle information processing system 10 includes a display unit 11, a touch operation unit 12, an imaging unit 13, an operation unit 14, a control unit 15, and a storage unit 16.
  • FIG. 3 is a schematic diagram illustrating an example of an image displayed on the display unit 11.
  • FIG. 3A shows an example of a menu screen
  • FIG. 3B shows an example of a map screen.
  • FIG. 4 is a diagram schematically illustrating a cross section when the touch operation unit 12 is viewed from the side surface direction.
  • FIG. 5 is a schematic diagram when the operation unit 14 is viewed from above.
  • the in-vehicle information processing system 10 associates the position coordinates in the operation area on the screen of the display unit 11 with the position coordinates in a predetermined area of the touch operation unit 12 and, based on the image captured by the imaging unit 13, superimposes the operator's operating hand on the screen. That is, based on the operator's touch operation on the touch operation unit 12, the operating hand superimposed on the screen virtually operates the screen at the corresponding position.
  • the in-vehicle information processing system 10 makes the movement of the operating hand superimposed on the screen correspond to the movement of the operator's actual operating hand captured by the imaging unit 13.
  • the operator is, for example, the driver of the vehicle or a passenger sitting in the passenger seat, and the operating hand is, for example, the driver's hand on the center console side or the passenger's own hand.
  • the display unit 11 has at least one screen.
  • the display unit 11 may be configured by an arbitrary display device such as a liquid crystal display.
  • the display unit 11 is disposed, for example, on an instrument panel.
  • the display device constituting the display unit 11 may be a touch-panel display or a display that does not accept touch operations. In the following description, the display unit 11 is assumed to be a display that does not accept touch operations.
  • the in-vehicle information processing system 10 may include a so-called head-up display type device in addition to or instead of the display unit 11.
  • the head-up display type device has a light emitting unit that generates display information as display light, reflects the generated display light toward an observer such as the driver, and displays a virtual image through the front windshield.
  • the observer is not limited to the driver but may be a passenger sitting in the passenger seat.
  • the display unit 11 displays information on the vehicle, function items for controlling the information, or a combination thereof.
  • the information about the vehicle includes, for example, information such as air conditioning, car navigation, audio, an image around the vehicle by an electronic mirror, a vehicle speed, a traveling position of the host vehicle in a plurality of lanes, or an inter-vehicle distance.
  • the function items for controlling the information include, for example, “return”, “forward”, “home”, “decision”, “various menus”, “temperature up/down”, “current location”, “volume up/down”, “enlargement/reduction”, “speed up/down”, “lane change”, or “following distance longer/shorter”.
  • the display unit 11 may display each item as a character or an icon.
  • the display unit 11 displays various menus on one screen as function items for controlling information related to the vehicle. Specifically, the display unit 11 displays “APPS” as an item for displaying various applications. The display unit 11 displays “TEL” as an item for using the telephone. The display unit 11 displays “A / C” as an item for controlling the air conditioner. The display unit 11 displays “NAVI” as a menu for using the car navigation. The display unit 11 displays “AUDIO” as a menu for using audio. The display unit 11 displays “HOME” as an item for returning to the home screen. The display unit 11 displays “RETURN” as an item for returning to the previous screen.
  • the display unit 11 displays map information that is a part of the car navigation system on one screen as information about the vehicle.
  • the display unit 11 displays function items such as “Destination setting”, “HOME”, and “RETURN” superimposed on the map information, as a combination of information about the vehicle and function items for controlling that information.
  • the display unit 11 displays the operator's operating hand so as to overlap the display content. As shown in FIG. 3, the display unit 11 renders the operating hand semi-transparent so that the display content behind it remains visible.
  • the display unit 11 is not limited to this, and may display the operating hand opaquely so that the display content behind it is temporarily hidden while the operating hand is superimposed.
  • the degree of semi-transparency, that is, the transmittance, is described here as constant regardless of the superimposed position, but it is not limited thereto and may be changed for each superimposed position.
  • for example, at a position overlapping a selectable item, the display unit 11 may raise the transmittance above a predetermined value so that the operator can clearly recognize which item to select. Conversely, at a position where only the background is displayed, the display unit 11 may lower the transmittance below the predetermined value.
  • the display unit 11 may display the operator's operating hand with gradation.
  • the gradations described herein may include any step change with respect to light and darkness, color, or transmittance, or a combination thereof.
  • the display unit 11 preferably applies the gradation by any method that lets the operator easily view the display content behind the hand. For example, the display unit 11 may display the operating hand so that, toward the fingertip of the superimposed hand, the brightness gradually increases, the color gradually lightens, or the transmittance gradually increases.
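  • as a minimal illustration of such a gradation, the opacity of the superimposed hand may be ramped linearly with distance from the fingertip, so that the hand grows more transparent toward the fingertip. The following Python sketch is illustrative only; the function name, the linear ramp, and the alpha limits are assumptions, not taken from the embodiment.

```python
def hand_alpha(dist_to_fingertip, max_dist, min_alpha=0.2, max_alpha=0.8):
    """Opacity for a pixel of the superimposed operating hand.

    Pixels near the fingertip get a lower alpha (higher transmittance)
    so the display content behind them stays visible; alpha rises
    linearly toward the wrist. All parameter values are illustrative.
    """
    # Normalized distance from the fingertip, clamped to [0, 1].
    t = min(max(dist_to_fingertip / max_dist, 0.0), 1.0)
    return min_alpha + t * (max_alpha - min_alpha)
```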
  • the display unit 11 displays each display content by any display method that lets the operator easily view the display content while preserving the realism of the superimposed operating hand.
  • although the display unit 11 has been described as superimposing the real-world operating hand on a virtual space in which the above display content is shown, the present invention is not limited to this.
  • the display unit 11 may instead superimpose the display content in front of the operating hand shown on the screen, as in so-called mixed reality.
  • the touch operation unit 12 is arranged, for example, on the center console.
  • the touch operation unit 12 includes a touch pad 121 and a tact switch 122, as shown in FIG. 4. The operator places his or her arm and wrist on the armrest and palm rest, respectively, and brings a part of the operating hand, for example a finger, into contact with the touch pad 121.
  • the touch pad 121 detects contact by a contact object, such as the operator's operating hand or a stylus pen, at the corresponding contact position.
  • the touch pad 121 detects contact with a part of the operator's operating hand, for example, a finger at a corresponding contact position.
  • the operator operates information displayed on each screen constituting the display unit 11 by performing a touch operation on the touch operation unit 12, particularly the touch pad 121.
  • the touch pad 121 is formed of, for example, transparent glass, and a touch sensor of any type, such as resistive film, capacitive, surface acoustic wave, infrared, or electromagnetic induction, can be used. In the following description, the touch pad 121 is assumed to be a capacitive touch pad.
  • the tact switch 122 is disposed immediately below the touch pad 121 and supported by the substrate.
  • the tact switch 122 is turned on when the operator presses down the touch pad 121.
  • the tact switch 122 is turned off when the operator releases the press and the touch pad 121 returns to its original position.
  • when the tact switch 122 is turned on, the operator obtains a click feeling.
  • one tact switch 122 is arranged at the central portion immediately below the touch pad 121.
  • the arrangement is not limited to this; any number of tact switches 122 may be arranged at any positions as long as pressing from the touch pad 121 can be detected.
  • for example, one tact switch 122 may be disposed at the outer peripheral portion immediately below the touch pad 121, or a plurality of tact switches 122 may be disposed at dispersed positions.
  • the touch operation unit 12 may be configured to detect pressing from the touch pad 121 for each predetermined area of the touch pad 121. That is, the touch operation unit 12 may be configured to detect at which position on the touch pad 121 the operator pressed when a plurality of the operator's fingers are in contact with the touch pad 121 at the same time.
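  • the idea of detecting which of several touching fingers performed the press can be sketched as follows, assuming the touch pad reports one pressure reading per predetermined area (the data layout and threshold value are illustrative assumptions):

```python
def find_pressed_areas(pressures, threshold):
    """Return the indices of touch-pad areas whose pressure reading
    exceeds the press threshold, identifying where the operator pushed
    even while several fingers rest on the pad at the same time."""
    return [i for i, p in enumerate(pressures) if p > threshold]
```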
  • the component arranged immediately below the touch pad 121 is not limited to the tact switch 122, and may be of any configuration capable of detecting pressing from the touch pad 121.
  • a pressure sensitive sensor such as a piezoelectric element may be disposed immediately below the touch pad 121.
  • the touch operation unit 12 may include a filter in order to remove an unnecessary detection signal of the touch pad 121 due to vibration during traveling of the vehicle.
  • the imaging unit 13 has at least one camera and is disposed on, for example, a roof panel.
  • the camera constituting the imaging unit 13 is arranged so that it can image the operator's operating hand on the touch operation unit 12.
  • the imaging unit 13 may image at least a part of the operator's operating hand, for example, only five fingers, but it is preferable to image the entire operator's operating hand including the back of the hand.
  • the entire operating hand refers to the whole region from the vicinity of the operator's wrist to the fingertips.
  • the imaging unit 13 is preferably arranged above the operator, such as on the roof panel, in order to easily image the operator's entire operating hand.
  • the imaging unit 13 is not limited to such an arrangement, and may be arranged at any location as long as at least a part of the operator's operating hand can be imaged.
  • the imaging unit 13 may be arranged directly below the transparent touch pad 121 and may image, from below, the part of the operating hand performing a touch operation on the touch pad 121. In this case, for example, by changing the palm rest portion in FIG. 4 to any transparent support member, the imaging unit 13 can also image the entire operating hand including the back of the hand.
  • the imaging unit 13 is preferably composed of a camera with a wide dynamic range so that the operator's hand can be clearly imaged both in the daytime bright state and in the nighttime dark state.
  • the image captured by the camera may be a black and white image or a color image.
  • the imaging unit 13 is not limited to a camera with a wide dynamic range, and may be configured with a camera capable of imaging only in bright daytime conditions. In this case, the operator's operating hand on the touch pad 121 may be illuminated with a spotlight from above so that it can be clearly imaged even at night.
  • when the vehicle is driving autonomously, the operator is assumed to recline the seat and lean back in a relaxed posture. At this time, if the position of the touch operation unit 12 is fixed, the operator has to stretch out an arm to perform touch operations while leaning back, which is inconvenient. Therefore, for example, by configuring the center console on which the touch operation unit 12 is disposed to be lowered rearward in conjunction with the movement of the reclining seat, the operator can easily perform touch operations without extending an arm. With such a configuration, the imaging unit 13 needs to image the operating hand on the touch pad 121 at each position of the touch operation unit 12 as it moves with the reclining seat.
  • the imaging unit 13 is preferably configured with a camera with a wide angle of view.
  • the imaging unit 13 is not limited to this, and may be configured such that the angle of the camera itself changes in conjunction with a change in the position of the touch operation unit 12 even if the camera has a narrow angle of view.
  • the imaging unit 13 may be configured such that the camera itself translates in conjunction with a change in the position of the touch operation unit 12.
  • when the position of the touch operation unit 12 that changes in conjunction with the movement of the reclining seat is limited to two positions, for example a position for manual driving and a position for automatic driving, a camera may be arranged for each of the two positions.
  • the operation unit 14 is arranged at a position where a part of the operator's operating hand is placed, for example, at the palm rest.
  • the part of the operating hand described here is, for example, the palm side of the wrist, that is, the wrist part.
  • the operation unit 14 has, for example, a four-way switch such as a cross key on the upper surface thereof.
  • the operator places a part of the operating hand on the upper surface of the operation unit 14, in particular on the surface of the four-way switch.
  • the operation unit 14 detects the motion of at least the part of the operator's operating hand touching the four-way switch. For example, when the operator pushes the four-way switch toward the far side with the wrist, the operation unit 14 detects the pushing motion toward the far side.
  • similarly, the operation unit 14 detects pushing toward the near side, the left side, or the right side. That is, the operation unit 14 can detect movements of the operator's wrist in four directions.
  • the operation unit 14 transmits a signal based on the detected operation to the control unit 15.
  • the operation unit 14 will be described below as having a four-way switch, but is not limited thereto.
  • the operation unit 14 may have an 8-way switch.
  • in this case, the operation unit 14 can detect motions of at least a part of the operator's operating hand in eight directions.
  • the operation unit 14 is not limited to a direction switch, and may have any configuration as long as it can detect the motion of at least a part of the operator's operating hand.
  • the control unit 15 is a processor that controls and manages the entire in-vehicle information processing system 10 including each functional block of the in-vehicle information processing system 10.
  • the control unit 15 includes a processor such as a CPU (Central Processing Unit) that executes a program that defines a control procedure. Such a program is stored in the storage unit 16, for example.
  • the control unit 15 acquires the contact information detected on the touch pad 121 from the touch operation unit 12 as an input signal. Specifically, the control unit 15 acquires detection information regarding a contact object, for example, a contact by an operator's finger and a corresponding contact position. The control unit 15 identifies accurate position coordinates on the touch pad 121 where the touch operation is performed based on the detection information regarding the corresponding contact position.
  • the control unit 15 acquires a signal related to the on or off state of the tact switch 122 from the touch operation unit 12. Specifically, when the operator depresses the tact switch 122 via the touch pad 121, the control unit 15 acquires an on-state signal. When the operator stops pressing the touch pad 121 and releases the pressing of the tact switch 122, the control unit 15 acquires an off-state signal. The control unit 15 identifies the on state or the off state of the tact switch 122 based on the acquired signal.
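  • the on/off identification described above can be modeled as a small state holder fed by press and release events (the event names below are illustrative assumptions; the actual signals are electrical):

```python
class TactSwitchState:
    """Track the tact switch on/off state from press/release events,
    as the control unit does when identifying the switch state."""

    def __init__(self):
        self.on = False  # the switch starts in the off state

    def handle(self, signal):
        if signal == "press":      # touch pad pushed down onto the switch
            self.on = True
        elif signal == "release":  # touch pad returns to its original position
            self.on = False
        return self.on
```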
  • the control unit 15 selects a corresponding item on the screen constituting the display unit 11 when the touch pad 121 detects contact by a part of the operator's operator's hand. At this time, the control unit 15 highlights the item. Highlighting is to display a predetermined item with emphasis.
  • the control unit 15 provides feedback to the operator that the above item is in a selected state by highlighting. For example, as illustrated in FIG. 3A, when contact with the operator's finger is detected at a corresponding position on the touch pad 121, the control unit 15 highlights the function item “NAVI” on the screen. At this time, the control unit 15 superimposes and displays the operator's operating hand at a corresponding position based on the image captured by the imaging unit 13.
  • the control unit 15 determines the selection of a predetermined item on the screen when the tact switch 122 is turned on by pressing the touch pad 121 by a part of the operator's operator.
  • the operation for confirming selection of a predetermined item on the screen is not limited to this, and may be an arbitrary operation such as a double tap on the touch pad 121, for example. In this case, the touch operation unit 12 may not have the tact switch 122.
  • for example, when the tact switch 122 is turned on by pressing with a part of the operator's operating hand, the control unit 15 confirms the selection of the item “NAVI” displayed on the screen. At this time, the control unit 15 causes the hand displayed on the screen to make the same movement as the part of the operating hand touching the touch pad 121, for example a press of the index finger or a double tap.
  • the control unit 15 causes the display unit 11 to display information on the vehicle, function items for controlling the information, or a combination thereof.
  • the control unit 15 causes at least a part of the operator's operating hand to be superimposed and displayed on the screen at a display magnification based on the size of the operation area on the screen constituting the display unit 11, by image processing described later.
  • the control unit 15 acquires a signal based on the motion of at least a part of the operator's operating hand, for example the wrist motion, detected by the operation unit 14. The control unit 15 identifies the direction of the motion based on the acquired signal, and selects the screen on which the operator is to operate the display content based on the detected motion.
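  • one possible policy is to cycle through the screens of the display unit 11 according to the detected wrist direction. The description only states that a screen is selected based on the detected motion, so the cycling rule below is purely an illustrative assumption:

```python
def select_screen(direction, screens, current):
    """Select the screen to operate from a four-way switch motion:
    'right' moves to the next screen, 'left' to the previous one,
    and other directions keep the current screen (illustrative policy)."""
    i = screens.index(current)
    if direction == "right":
        return screens[(i + 1) % len(screens)]
    if direction == "left":
        return screens[(i - 1) % len(screens)]
    return current
```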
  • the control unit 15 refers to various information stored in the storage unit 16. Specifically, the control unit 15 refers to information related to the vehicle, or information related to function items for controlling that information. The control unit 15 refers to information regarding the on or off state of the tact switch 122. The control unit 15 refers to the image information captured by the imaging unit 13. The control unit 15 refers to information on the image-processed operating hand that is finally superimposed on the display unit 11.
  • the storage unit 16 can be composed of a semiconductor memory, a magnetic memory, or the like, and stores the various information described above, a program for operating the in-vehicle information processing system 10, and the like.
  • the storage unit 16 also functions as a work memory.
  • the storage unit 16 stores information on the image-processed operating hand that is finally superimposed on the display unit 11.
  • FIG. 6 is a schematic diagram illustrating an example of a correspondence relationship between a predetermined area of the touch operation unit 12 and an operation area on the screen constituting the display unit 11.
  • FIG. 6A shows a predetermined region R1 of the touch operation unit 12.
  • FIG. 6B shows an operation area R2 on the screen constituting the display unit 11.
  • the control unit 15 sets a predetermined region R1 of the touch operation unit 12 and an operation region R2 on the screen constituting the display unit 11.
  • the predetermined area R1 of the touch operation unit 12 is an area in which the operator performs a touch operation with a part of the operating hand.
  • the predetermined region R1 of the touch operation unit 12 is a part or the entire region of the touch pad 121.
  • the operation area R2 on the screen constituting the display unit 11 is an area on the screen that can be virtually operated by the operator's operator hand superimposed on the screen.
  • the operation area R2 on the screen constituting the display unit 11 is a part or the entire area of the screen.
  • the predetermined region R1 of the touch operation unit 12 is set on the back side of the touch operation unit 12 so that the operator's operating hand is superimposed over the entire touch operation unit 12.
  • the back side of the touch operation unit 12 is, for example, the back side of the touch pad 121 constituting the touch operation unit 12. That is, as shown in FIGS. 4 and 6, the back side of the touch operation unit 12 is the area on the touch pad 121 farthest from the wrist when the operator's arm and wrist are placed on the armrest and palm rest, respectively.
  • the predetermined area R1 of the touch operation unit 12 is not limited to this, and may be an arbitrary partial area on the touch pad 121 or an entire area as described above.
  • when the predetermined area R1 of the touch operation unit 12 is an arbitrary partial area on the touch pad 121, the areas of the touch pad 121 other than the predetermined area R1 may be configured not to respond to touch operations.
  • the operation area R2 on the screen constituting the display unit 11 is set at the upper part of the screen so as to correspond when the predetermined area R1 is set on the back side of the touch pad 121. That is, the back side and the near side of the touch pad 121 correspond to the upper part and the lower part of the screen, respectively.
  • the correspondence relationship between the touch pad 121 and the screen constituting the display unit 11 is not limited to this. For example, the above correspondence relationship may be reversed, and the near side and the far side of the touch pad 121 may correspond to the upper part and the lower part of the screen, respectively.
  • alternatively, the operation region R2 on the screen constituting the display unit 11 may be set at the lower part of the screen while being associated with the predetermined region R1 on the back side of the touch operation unit 12.
  • the control unit 15 associates the position coordinates in the set predetermined region R1 of the touch operation unit 12 with the position coordinates in the operation region R2 on the screen constituting the display unit 11. For example, consider the case where the predetermined region R1 is a rectangular region covering part or all of the touch pad 121, and the operation region R2 is a rectangular region covering part or all of the screen. In this case, the control unit 15 associates the four vertices of the predetermined region R1 with the four vertices of the operation region R2. By identifying the correspondence between the position coordinates of the four vertices, the control unit 15 can determine the correspondence between the position coordinates of every point within the rectangles connecting those vertices.
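  • the vertex correspondence amounts to a linear interpolation between the two rectangles. A minimal Python sketch follows, assuming both regions are axis-aligned rectangles given as (x_min, y_min, x_max, y_max); the function name and rectangle representation are illustrative assumptions:

```python
def map_touch_to_screen(touch_xy, region_r1, region_r2):
    """Map a contact point in touch-pad region R1 to the corresponding
    point in on-screen operation region R2 by linear interpolation."""
    x, y = touch_xy
    r1x0, r1y0, r1x1, r1y1 = region_r1
    r2x0, r2y0, r2x1, r2y1 = region_r2
    # Normalize the contact position within R1, then scale it into R2.
    u = (x - r1x0) / (r1x1 - r1x0)
    v = (y - r1y0) / (r1y1 - r1y0)
    return (r2x0 + u * (r2x1 - r2x0), r2y0 + v * (r2y1 - r2y0))
```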
  • Such processing may be executed as calibration at an initial stage when the in-vehicle information processing system 10 is mounted on the vehicle, or may be executed as needed.
  • FIG. 7 is a schematic diagram showing a state of image processing performed by the in-vehicle information processing system 10.
  • FIG. 7A shows the operator's operating hand performing a touch operation on the touch operation unit 12.
  • FIG. 7B shows the operator's operating hand superimposed on the screen constituting the display unit 11.
  • the control unit 15 acquires image information captured by the camera from the imaging unit 13. As shown by the region R3 in FIG. 7, the captured image includes at least a part of the operating hand performing a touch operation on the touch operation unit 12, and the touch operation unit 12 itself, particularly the touch pad 121. That is, the imaging unit 13 images the positional relationship between the touch operation unit 12 and the operating hand. In addition, as described above, the control unit 15 associates the position coordinates in the predetermined region R1 with the position coordinates in the operation region R2, so that at least a part of the operating hand can be superimposed and displayed on the screen at the position corresponding to its position on the touch operation unit 12.
  • the control unit 15 performs image processing for extracting part or all of the operator's operating hand from the above image when superimposing the hand on the display unit 11. That is, the control unit 15 removes image information such as the background outside the contour of the operating hand. By surrounding the periphery of the touch pad 121 with a black edge, the control unit 15 can easily extract the operating hand from the captured image.
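  • exploiting the black edge, the extraction can be approximated by a simple brightness threshold on a grayscale frame. The sketch below uses NumPy and a fixed threshold, both illustrative assumptions; a real system would likely also use contour extraction:

```python
import numpy as np

def extract_hand_mask(gray_image, threshold=40):
    """Return a boolean mask of pixels assumed to belong to the
    operating hand: with the touch pad surrounded by a black edge,
    pixels brighter than the threshold are treated as hand pixels."""
    return gray_image > threshold
```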
  • The control unit 15 may or may not color the extracted hand by image processing. To further improve the realism of the operator's hand displayed on the display unit 11, the control unit 15 preferably colors it.
  • When the captured image is a color image, the control unit 15 preferably superimposes it on the screen as it is, matching the actual color and brightness of the operator's hand.
  • Without being limited to this, in order to make the display content behind the hand easier to see, the control unit 15 may instead remove the color and brightness of the hand and, for example, add a predetermined color by image processing.
  • The control unit 15 may also remove the color and brightness of the hand and render it completely colorless and transparent. In this case, the control unit 15 displays only the vicinity of the contour of the hand on the screen.
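The extraction and semi-transparent superimposition described above can be sketched as two pixel-level steps: pixels darker than the black edge around the touch pad are classified as background, and the remaining hand pixels are alpha-blended onto the screen image so that the display content behind stays visible. The threshold value, the grayscale pixel format, and the 50% opacity are illustrative assumptions, not values from the embodiment.

```python
def extract_hand_mask(gray, threshold=40):
    """Return a boolean mask that is True where a pixel is bright enough
    to belong to the hand; the black edge around the touch pad and the
    dark background fall below the threshold."""
    return [[px > threshold for px in row] for row in gray]

def overlay_hand(screen, hand, mask, alpha=0.5):
    """Alpha-blend hand pixels onto the screen image where mask is True.
    Images are grids of grayscale values for simplicity."""
    out = [row[:] for row in screen]
    for y, row in enumerate(mask):
        for x, is_hand in enumerate(row):
            if is_hand:
                out[y][x] = round((1 - alpha) * screen[y][x] + alpha * hand[y][x])
    return out

gray = [[10, 200], [35, 180]]           # captured image: dark border + bright hand
mask = extract_hand_mask(gray)          # [[False, True], [False, True]]
screen = [[100, 100], [100, 100]]
print(overlay_hand(screen, gray, mask)) # [[100, 150], [100, 140]]
```

Rendering only the pixels near the mask boundary would give the contour-only, fully transparent variant mentioned above.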
  • In the following description, the image captured by the imaging unit 13 is assumed to be a color image, and the control unit 15 is assumed to superimpose it on the screen as it is, matching the actual color and brightness of the operator's hand. That is, the description assumes that the control unit 15 need not perform image processing relating to color, brightness, and the like.
  • The control unit 15 determines the display magnification of the operator's hand based on the ratio between the size of the predetermined region R1 of the touch operation unit 12 in the captured image and that of the operation region R2 on the screen constituting the display unit 11. For example, consider the case where the predetermined region R1 is a rectangular region covering part or all of the touch pad 121 and the operation region R2 is a rectangular region covering part or all of the screen. In this case, the control unit 15 calculates the ratio between the length of each side of the predetermined region R1 and the length of the corresponding side of the operation region R2, and based on this ratio determines the display magnification of the operator's hand to be superimposed on the display unit 11.
  • Based on the determined display magnification, the control unit 15 superimposes the imaged operator's hand on the display unit 11 enlarged, reduced, or at its original size.
  • The control unit 15 may set the display magnification equal to the above ratio, or to a different value derived from it.
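The side-by-side ratio calculation can be sketched as follows; the example dimensions are assumptions for illustration only.

```python
def display_magnification(region_r1, region_r2):
    """Return the ratio of each side of the screen operation region R2
    to the corresponding side of the touch-pad region R1 as
    (horizontal magnification, vertical magnification)."""
    r1w, r1h = region_r1
    r2w, r2h = region_r2
    return (r2w / r1w, r2h / r1h)

# Example: a 100x60 touch-pad region and a 400x240 operation region
# give a fourfold magnification on both axes.
print(display_magnification((100, 60), (400, 240)))  # (4.0, 4.0)
```

As the text notes, the control unit may use this ratio directly as the display magnification or derive a different value from it (for example, per screen, per operator, or per display content).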
  • The display magnification determination process may be executed together with the above-described calibration when the in-vehicle information processing system 10 is first mounted on the vehicle, or may be executed as needed.
  • The control unit 15 may fix the display magnification or change it according to the situation. For example, the control unit 15 may use different display magnifications for daytime and nighttime, or may change the magnification according to the operator's settings.
  • When the operation regions R2 of the screens differ in size, the control unit 15 may change the display magnification of the superimposed hand according to the size of the operation region R2 of each screen. Without being limited to this, the control unit 15 may, for example, derive an average value from the sizes of the operation regions R2 of the screens and keep the display magnification constant based on that average.
  • The control unit 15 may change the display magnification of the operator's hand according not only to the size of the operation region R2 but also to the content displayed on the screen. For example, when the display unit 11 displays a map or function items and the operator operates them, the control unit 15 may superimpose the hand at a display magnification lower than usual so that the operation becomes easier.
  • The control unit 15 may change the display magnification for each operator, for example to make the superimposed hands of operators with differently sized hands appear the same size on the display unit 11.
  • Alternatively, the control unit 15 may keep the display magnification constant among operators with differently sized hands, superimposing each hand according to its actual size.
  • The control unit 15 performs the image processing on the image captured by the imaging unit 13 within a predetermined time.
  • Here, the predetermined time means a delay between the timing of the movement of the operator's actual hand and that of the hand superimposed on the screen which is short enough for the operator not to notice. That is, the control unit 15 preferably completes the image processing in a time sufficiently shorter than the delay at which, given human reaction speed and perception, the operator would feel the operation to be out of sync. For this reason, the control unit 15 preferably limits the image processing to the above-described extraction of the imaged hand and adjustment of the display magnification.
  • The position coordinates in the predetermined region R1 of the touch operation unit 12 at which the touch operation is detected are preferably identified not by image processing of the image captured by the imaging unit 13 but, as described above, based on the detection information from the touch operation unit 12, in particular the touch pad 121.
  • In the present embodiment, the control unit 15 is described as performing the two image processes described above, but it is not limited to this and may perform three or more image processes within the predetermined time.
  • Alternatively, the position coordinates in the predetermined region R1 at which the touch operation is detected may be identified by image processing of the image captured by the imaging unit 13.
  • When performing the above image processing, the control unit 15 refers to information stored in the storage unit 16 about the predetermined region R1 of the touch operation unit 12 and the operation region R2 on the screen constituting the display unit 11. That is, the control unit 15 refers to information on the position coordinates in the predetermined region R1 of the touch pad 121 corresponding to the detection information, to information on the position coordinates in the operation region R2 of each screen constituting the display unit 11, and to information on the display magnification of the superimposed hand determined by calibration or the like.
  • FIG. 8 is an enlarged view of the display unit 11 shown in FIG.
  • the display unit 11 has four screens 111, 112, 113, and 114.
  • the screen 111 has two display layers 1111 and 1112 that are different in the depth direction.
  • the display layer 1111 of the screen 111 displays an operation screen.
  • the display layer 1112 of the screen 111 displays information about the vehicle such as the vehicle speed, the traveling position of the host vehicle in a plurality of lanes, or the inter-vehicle distance.
  • the screens 112 and 113 display an image around the vehicle by an electronic mirror.
  • The screen 114 displays map information for car navigation and function items for controlling vehicle-related information.
  • In the present embodiment, the number of screens constituting the display unit 11 is four and the number of display layers of the screen 111 is two. However, the display unit 11 may be configured with any number of screens, and the screen 111 may be configured with any number of display layers. Although only the screen 111 is described as having a plurality of display layers, the present invention is not limited to this, and the other screens constituting the display unit 11 may also have a plurality of display layers.
  • the content displayed on each screen is not limited to the above, and any screen may display any content.
  • The control unit 15 makes the movement direction from the currently selected screen to the next selected screen correspond to the direction of movement of at least a part of the detected operator's hand (the wrist portion). For example, when the operator pushes the four-way switch to the left with the wrist while the screen 111 is selected, the control unit 15 selects the screen 112 installed to the left of the screen 111. Similarly, when the operator pushes the four-way switch to the right with the wrist while the screen 111 is selected, the control unit 15 selects the screen 113 installed to the right of the screen 111. When the operator pushes the four-way switch to the right again in this state, the control unit 15 selects the screen 114 installed to the right of the screen 113.
  • When the operator pushes the four-way switch further to the left while the leftmost screen 112 is selected, the control unit 15 may wrap around to the right end and select the screen 114, or may leave the selection unchanged on the screen 112.
  • The control unit 15 may perform the same control when the operator pushes the four-way switch further to the right while the rightmost screen 114 is selected.
  • Likewise, the control unit 15 makes the movement direction from the currently selected display layer to the next selected display layer correspond to the direction of movement of at least a part of the detected operator's hand. For example, when the operator pushes the four-way switch toward the back with the wrist while the display layer 1111 is selected, the control unit 15 selects the display layer 1112 arranged behind the display layer 1111. Conversely, when the operator pushes the four-way switch toward the front with the wrist while the display layer 1112 is selected, the control unit 15 selects the display layer 1111 arranged in front of the display layer 1112.
  • The target selected by the above operation is not limited to each screen and each display layer; it may also be the front windshield, on which a virtual image is displayed.
  • When the operator pushes the four-way switch further toward the back while the display layer 1112 is selected, the control unit 15 selects the front windshield arranged behind the display layer 1112. That is, the control unit 15 treats the display layers 1111 and 1112 and the front windshield as a single hierarchical structure.
  • When the operator pushes the four-way switch further toward the back while the display layer 1112 or the front windshield arranged at the back end of the display unit 11 is selected, the control unit 15 may return to the front end and select the display layer 1111.
  • Alternatively, the control unit 15 may leave the selection unchanged on the display layer 1112 or the front windshield.
  • The control unit 15 may perform the same control when the operator pushes the four-way switch further toward the front while the screen 111 arranged at the front end of the display unit 11 is selected.
  • In that state, when the operator pushes the four-way switch to the left, the control unit 15 selects the screen 112 installed on the left side; when the operator pushes it to the right, the control unit 15 selects the screen 113 installed on the right side.
  • When the operator pushes the four-way switch in the direction of selecting the screen 111 while the screen 112 or 113 is selected, the control unit 15 may return the selection to a specific display layer or the front windshield (for example, the display layer 1111 at the front end), or may return the selection to the display layer or front windshield that was selected immediately before.
  • The method of selecting a screen, a display layer, or the front windshield is not limited to the above; any method may be used as long as it corresponds to the direction of movement of at least a part of the detected operator's hand.
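The selection behaviour described above can be sketched as a small state machine: left/right moves among the side-by-side screens, while back/front moves through the depth hierarchy of the screen 111 (display layer 1111, display layer 1112, front windshield). The screen order, the wrap-around option, and the rule that only the screen 111 has depth layers are taken from the examples in the text; the class and method names are illustrative assumptions.

```python
SCREENS = ["screen112", "screen111", "screen113", "screen114"]  # left to right
DEPTH = ["layer1111", "layer1112", "windshield"]                # front to back

class Selector:
    def __init__(self):
        self.col = 1      # start on the screen 111
        self.depth = 0    # front display layer

    def push(self, direction, wrap=False):
        """Move the selection in response to a four-way switch press."""
        if direction in ("left", "right"):
            step = -1 if direction == "left" else 1
            new = self.col + step
            if wrap:
                self.col = new % len(SCREENS)          # wrap to the other end
            else:
                self.col = min(max(new, 0), len(SCREENS) - 1)  # stay at the end
            self.depth = 0  # moving sideways returns to the front layer
        elif SCREENS[self.col] == "screen111":  # only the screen 111 has layers
            step = 1 if direction == "back" else -1
            self.depth = min(max(self.depth + step, 0), len(DEPTH) - 1)

    def selected(self):
        name = SCREENS[self.col]
        return DEPTH[self.depth] if name == "screen111" else name

s = Selector()
s.push("back")              # layer 1111 -> layer 1112
s.push("back")              # layer 1112 -> front windshield
print(s.selected())         # windshield
s.push("right"); s.push("right")
print(s.selected())         # screen114
```

With `wrap=True` the selection returns to the opposite end, matching the wrap-around variant described above; with the default clamping, the selection stays on the end screen or layer.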
  • The control unit 15 may or may not suppress the display degree of the display layer that is not selected on the screen 111. In view of the operator's visibility, the control unit 15 preferably suppresses it.
  • For example, the control unit 15 may reduce the luminance of the non-selected display layer, or may suppress its display degree by graying it out.
  • The control unit 15 may remove all of the RGB components of the non-selected display layer, or only one or two of them.
  • The control unit 15 may blur the non-selected display layer, or may not display it at all. Without being limited to these methods, the control unit 15 may use any method that relatively improves the visibility of the selected display layer by reducing the visibility of the non-selected layers.
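Two of the suppression methods mentioned above, luminance reduction and graying out, can be sketched as per-pixel operations. The dimming factor and the simple channel-averaging grayout are illustrative assumptions; an actual implementation could equally drop individual RGB components or blur the layer.

```python
def dim_layer(pixels, method="luminance", factor=0.3):
    """Suppress the display degree of a non-selected layer.
    pixels: grid of (r, g, b) tuples."""
    def grayout(rgb):
        g = round(sum(rgb) / 3)   # collapse the channels to gray
        return (g, g, g)
    def darken(rgb):
        return tuple(round(c * factor) for c in rgb)  # lower the luminance
    op = grayout if method == "grayout" else darken
    return [[op(px) for px in row] for row in pixels]

layer = [[(200, 100, 0)]]
print(dim_layer(layer))                    # [[(60, 30, 0)]]
print(dim_layer(layer, method="grayout"))  # [[(100, 100, 100)]]
```

Either operation lowers the visibility of the non-selected layer, which relatively raises that of the selected layer.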
  • FIG. 9 is a flowchart showing an example of the operation of the in-vehicle information processing system 10.
  • First, the control unit 15 performs calibration. That is, the control unit 15 associates the position coordinates set in the predetermined region R1 of the touch operation unit 12 with the position coordinates in the operation region R2 on the screen constituting the display unit 11 (step S10).
  • The control unit 15 determines, by calibration or the like, the display magnification of the operator's hand to be superimposed on the display unit 11 (step S11).
  • The control unit 15 determines, based on the image captured by the imaging unit 13, whether the operator's hand is placed over the touch operation unit 12 (step S12).
  • When the control unit 15 determines that the hand is over the touch operation unit 12, it proceeds to step S13; otherwise, it returns to step S12 and waits until the hand is placed over the unit.
  • When it determines that the operator's hand is over the touch operation unit 12, the control unit 15 performs image processing to extract a part or the whole of the hand (step S13).
  • The control unit 15 superimposes the imaged operator's hand on the display unit 11 based on the display magnification determined in step S11 (step S14).
  • The control unit 15 determines whether detection information concerning an operation by at least a part of the operator's hand has been acquired from the operation unit 14 (step S15). When the detection information has been acquired, the control unit 15 proceeds to step S16; otherwise, it proceeds to step S18.
  • When it acquires the detection information, the control unit 15 newly selects the screen corresponding to the identified operation direction (step S16).
  • The control unit 15 then superimposes the imaged operator's hand based on the display magnification adapted to the newly selected screen (step S17).
  • The control unit 15 determines whether detection information relating to a touch operation has been acquired from the touch operation unit 12 (step S18). When the detection information has been acquired, the control unit 15 proceeds to step S19; otherwise, it returns to step S18 and waits until the detection information is acquired.
  • When it acquires the detection information, the control unit 15 executes, on the currently selected screen, the operation corresponding to the touch operation on the touch operation unit 12 (step S19).
  • the control unit 15 ends the flow.
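The flow of FIG. 9 can be sketched as a control loop. Every method name on `system` is a hypothetical stand-in for the operation performed at the corresponding step; no such API is defined in the embodiment, and the stub below exists only to trace the order of the steps.

```python
def run(system):
    """Sketch of the control flow of FIG. 9 (steps S10-S19)."""
    system.calibrate()                        # S10: associate R1 with R2
    scale = system.determine_magnification()  # S11: superimposition scale
    while not system.hand_over_touchpad():    # S12: wait for the hand
        pass
    hand = system.extract_hand()              # S13: extract the hand image
    system.superimpose(hand, scale)           # S14: overlay on the display
    if system.operation_detected():           # S15: four-way switch pressed?
        screen = system.select_screen()       # S16: screen in that direction
        system.superimpose(hand, scale)       # S17: re-overlay for new screen
    while not system.touch_detected():        # S18: wait for a touch
        pass
    system.apply_touch()                      # S19: operate selected screen

class DemoSystem:
    """Minimal stand-in that only records which steps were executed."""
    def __init__(self): self.log = []
    def calibrate(self): self.log.append("S10")
    def determine_magnification(self): self.log.append("S11"); return 1.0
    def hand_over_touchpad(self): return True   # hand already present
    def extract_hand(self): self.log.append("S13"); return "hand"
    def superimpose(self, hand, scale): self.log.append("S14")
    def operation_detected(self): return False  # no switch press in this run
    def touch_detected(self): return True       # touch arrives immediately
    def apply_touch(self): self.log.append("S19")

demo = DemoSystem()
run(demo)
print(demo.log)  # ['S10', 'S11', 'S13', 'S14', 'S19']
```

In this trace no four-way switch operation is detected at step S15, so the flow skips steps S16 and S17 and proceeds directly to the touch-operation wait at step S18.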
  • As described above, the operator can virtually operate the information on the screen by means of the operator's hand displayed on the display unit 11. That is, the operator can access the screen with a feeling closer to reality and can intuitively recognize the relationship between the actual position of the hand and its position on the screen. The in-vehicle information processing system 10 can therefore reduce the time the operator spends watching the screen compared with a conventional device that displays a pointer or the like.
  • Because the in-vehicle information processing system 10 limits the time spent on image processing, it can superimpose the operator's hand with minimal delay. That is, the system can reduce the temporal shift between the movement of the actual hand and that of the hand superimposed on the screen, so the operator can operate the displayed information with less discomfort.
  • Because the in-vehicle information processing system 10 performs image processing that extracts the operator's hand imaged by the imaging unit 13, it can faithfully superimpose the hand on the screen. As a result, the operator can intuitively recognize the superimposed hand as his or her own.
  • Because the in-vehicle information processing system 10 performs image processing that changes the display magnification of the imaged hand, it can superimpose the hand at the optimum size for each screen. As a result, the operator can easily see the display content behind the hand while still perceiving the superimposed hand as real.
  • Compared with identifying the position coordinates in the predetermined region R1 at which the touch operation is detected by image processing, the in-vehicle information processing system 10 can superimpose the operator's hand with less delay.
  • Because the in-vehicle information processing system 10 identifies the position coordinates directly with the touch operation unit 12 rather than indirectly from the image captured by the imaging unit 13, it can identify them accurately. That is, since the touch operation unit 12 detects the position the operator is actually touching, a malfunction is less likely to occur when the operator selects a function item displayed on the screen.
  • Because the imaging unit 13 captures the operator's entire hand, the operator can easily recognize that the hand superimposed on the screen is his or her own.
  • The operator can easily recognize from the display on the screen which part of the hand is moving. Because the operator accurately grasps the relationship between the actual position of the hand and its position on the screen, the operator can easily judge the amount of movement of the hand and its movable area on the screen.
  • By imaging the entire hand with the imaging unit 13, the in-vehicle information processing system 10 can accurately identify the operator's hand on the touch pad 121, that is, accurately recognize it as a human hand.
  • The in-vehicle information processing system 10 can also identify each part of the hand with higher accuracy; for example, it can accurately identify which finger each finger in the captured image corresponds to.
  • By capturing the entire hand, the in-vehicle information processing system 10 can accurately identify the size of the whole hand and the proportion of each part within it.
  • The in-vehicle information processing system 10 can thus accurately match the amount of movement and the movable area of the hand on the screen with the movement of the actual hand on the touch pad 121.
  • Because the in-vehicle information processing system 10 sets the predetermined region R1 on the back side of the touch pad 121 and the operation region R2 on the upper part of the screen, the operator inevitably places his or her whole hand over the touch operation unit 12, so the imaging unit 13 inevitably captures the entire hand. By making the region of the touch pad 121 outside the predetermined region R1 unresponsive to touch operations, the operator's attention is concentrated on the predetermined region R1, and the system can capture the entire hand even more reliably.
  • By highlighting the item selected by the operator's contact, the in-vehicle information processing system 10 lets the operator clearly recognize, as visual information, which finger is in contact with the touch pad 121.
  • The operator can thus easily see at which position on the screen the superimposed hand is touching and which item is selected.
  • The in-vehicle information processing system 10 gives the operator a click feeling when the tact switch 122 is turned on. The operator therefore receives tactile feedback from his or her own operation, enabling a more intuitive operation.
  • Because the tact switch 122 is used to confirm a selection, the operator can confirm the selected item with a natural action.
  • The reaction force acting on the finger makes the hand superimposed on the screen feel even more real; the operator can easily feel as if his or her actual hand were touching the screen directly.
  • Because the superimposed hand is semi-transparent, the in-vehicle information processing system 10 can preserve its realism while letting the operator easily see the information displayed on the screen. That is, the operator can operate the displayed information more intuitively.
  • By installing the imaging unit 13 above the touch operation unit 12, the in-vehicle information processing system 10 can capture the operator's entire hand more easily.
  • Each screen can be selected easily. That is, the operator can select each screen based on the visual information obtained while viewing the screens of the display unit 11, without shifting his or her line of sight to the operating hand. The in-vehicle information processing system 10 therefore makes the screen selection operation easier and improves convenience for the operator.
  • Because the in-vehicle information processing system 10 makes the movement direction from the currently selected screen to the next selected screen correspond to the detected direction of operation, it enables an intuitive screen selection operation. That is, to select the next screen, the operator can move at least a part of the hand with a natural feeling based on the visually identified position of each screen.
  • Because the in-vehicle information processing system 10 includes a screen having a plurality of display layers differing in the depth direction, more information can be shown on one screen. That is, by dividing the display layers, the system can simultaneously display different kinds of information on one screen.
  • Because the in-vehicle information processing system 10 makes the movement direction from the currently selected display layer to the next selected display layer correspond to the detected direction of operation, it likewise enables an intuitive display layer selection operation based on the visually identified position of each display layer.
  • By suppressing the display degree of the display layers that are not selected, the in-vehicle information processing system 10 can improve the visibility of the selected display layer.
  • By suppressing the display degree through reduced luminance, the system can display the selected display layer clearly. That is, lowering the luminance of the non-selected display layers relatively raises that of the selected layer, enabling a clear display.
  • By suppressing the display degree of the non-selected display layers through graying out, the system can relatively improve the visibility of the selected display layer.
  • When a non-selected display layer is displayed by any of the above methods rather than hidden entirely, the operator's visibility of the display layer structure improves. That is, the operator can easily see how many display layers are provided and at which positions on the screen.
  • Because the in-vehicle information processing system 10 includes the operation unit 14, which is arranged on the palm rest and includes a four-way switch, the operator can select each screen or display layer with a simple operation. That is, the operator can select each screen simply by pushing the four-way switch in the corresponding direction, based on the visually identified back, front, left, and right arrangement of the screens and display layers.
  • Because the in-vehicle information processing system 10 associates the back, front, left, and right directions of the screens and display layers with the back, front, left, and right of the four-way switch, respectively, the operator can perform the selection operation intuitively.
  • The in-vehicle information processing system 10 may respond to the operator's operation with a pointer, cursor, or the like instead of displaying the operator's hand on the screen.


Abstract

The invention relates to an in-vehicle information processing system with which screens can be selected easily. The in-vehicle information processing system (10) includes: a display unit (11) having a plurality of screens; an operation unit (14) that detects an action performed by at least a part of an operator's operating hand; and a control unit (15) that, based on the detected action, selects the screen whose displayed content the operator is to operate.
PCT/JP2017/015774 2016-04-27 2017-04-19 In-vehicle information processing system WO2017188098A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-089489 2016-04-27
JP2016089489A JP2017197016A (ja) In-vehicle information processing system

Publications (1)

Publication Number Publication Date
WO2017188098A1 (fr) 2017-11-02

Family

ID=60161585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015774 WO2017188098A1 (fr) In-vehicle information processing system

Country Status (2)

Country Link
JP (1) JP2017197016A (fr)
WO (1) WO2017188098A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7338624B2 (ja) * 2018-06-18 2023-09-05 日本精機株式会社 Vehicle display device, control method for vehicle display device, and control program for vehicle display device
JP2020071641A (ja) * 2018-10-31 2020-05-07 株式会社デンソー Input operation device and user interface system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293601A * 2005-04-08 2006-10-26 Nissan Motor Co Ltd Information operation device
WO2007088942A1 * 2006-02-03 2007-08-09 Matsushita Electric Industrial Co., Ltd. Input device and method thereof
JP2014056462A * 2012-09-13 2014-03-27 Toshiba Alpine Automotive Technology Corp Operating device
JP2015149022A * 2014-02-07 2015-08-20 東日本電信電話株式会社 Information providing device and information providing method
JP2015152964A * 2014-02-10 2015-08-24 トヨタ自動車株式会社 Vehicle information display device


Also Published As

Publication number Publication date
JP2017197016A (ja) 2017-11-02

Similar Documents

Publication Publication Date Title
CN110045825B (zh) Gesture recognition system for vehicle interaction control
JP4351599B2 (ja) Input device
KR101367593B1 (ko) Interactive operating device and method for operating the interactive operating device
US8085243B2 (en) Input device and its method
US9604542B2 (en) I/O device for a vehicle and method for interacting with an I/O device
US9956878B2 (en) User interface and method for signaling a 3D-position of an input means in the detection of gestures
CN108108042B (zh) Vehicle display device and control method thereof
US20140210795A1 (en) Control Assembly for a Motor Vehicle and Method for Operating the Control Assembly for a Motor Vehicle
US11112873B2 (en) Method for operating a display device for a motor vehicle and motor vehicle
CN109643219A (zh) Method for interacting with image content presented on a display device in a vehicle
US20180157324A1 (en) Method and Device for Interacting with a Graphical User Interface
WO2014162697A1 (fr) Input device
JPWO2012026402A1 (ja) Vehicle operation device
KR101806172B1 (ko) Vehicle terminal operation system and method
CN111032414A (zh) Control system for the main display of an autonomous vehicle
WO2017138545A1 (fr) Information processing system, information processing device, control method, and program
JP2017197015A (ja) In-vehicle information processing system
JP2018136616A (ja) Display operation system
WO2017188098A1 (fr) In-vehicle information processing system
US20190291578A1 (en) Vehicular display device
JP2018195134A (ja) In-vehicle information processing system
JP2018010472A (ja) In-vehicle electronic device operating device and in-vehicle electronic device operating method
TWM564749U (zh) Vehicle multi-screen control system
WO2017175666A1 (fr) On-board vehicle information processing system
JP2009184551A (ja) In-vehicle device input device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17789382

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17789382

Country of ref document: EP

Kind code of ref document: A1