WO2012067193A1 - Display scene creation system - Google Patents

Display scene creation system

Info

Publication number
WO2012067193A1
WO2012067193A1 (application PCT/JP2011/076542)
Authority
WO
WIPO (PCT)
Prior art keywords
display
scene
gesture
touch panel
setting
Prior art date
Application number
PCT/JP2011/076542
Other languages
French (fr)
Japanese (ja)
Inventor
Kazuhiko Yoda (依田 和彦)
Original Assignee
Sharp Corporation (シャープ株式会社)
Priority date
Filing date
Publication date
Application filed by Sharp Corporation
Publication of WO2012067193A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10: Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20: Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21: Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/215: Output arrangements using visual output characterised by the combination of multiple visual outputs, e.g. combined instruments with analogue meters and additional displays
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using specific features of the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • B60K2360/00: Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11: Instrument graphical user interfaces or menu aspects
    • B60K2360/122: Instrument input devices with reconfigurable control functions, e.g. reconfigurable menus
    • B60K2360/141: Activation of instrument input devices by approaching fingers or pens
    • B60K2360/143: Touch sensitive instrument input devices
    • B60K2360/1438: Touch screens
    • B60K2360/1442: Emulation of input devices
    • B60K2360/146: Instrument input by gesture

Definitions

  • The present invention relates to technology for a display device with a touch panel, and more specifically to a display scene creation system, a display scene creation program, and a display system with a touch panel that transition the image presented to the user in response to gesture input on the touch panel.
  • Display devices with touch panels are widely used in fields such as game machines, mobile phones, PDAs, vending machines, and information boards. Because the display on the touch panel is associated with gesture input from the touch panel, the user can operate the device intuitively.
  • However, a display system with a touch panel has no general-purpose mechanism for associating the touch panel with gestures; display scene transitions are therefore realized by a processing program that associates the touch panel with each gesture.
  • Such a processing program must be created for each display scene, and its development takes considerable time and effort. For example, when different gestures can be input in the same area, or when buttons and menus overlap in the same area, the program needed to recognize gestures correctly becomes complicated and the man-hours become enormous. Moreover, a sophisticated program is required to improve recognition accuracy, making development within a limited time impractical.
  • The present invention has been made in view of the above problems. Its objective is to provide a display scene creation system, a display scene creation program, and a display system with a touch panel that can transition a display scene in response to gesture input on the touch panel, without requiring a processing program that associates the touch panel with gestures.
  • A display scene creation system includes: a display scene design setting unit that sets the design of a display scene displayed on a touch panel; a display component setting unit that sets one or more display components displayed within the design set by the display scene design setting unit; a gesture setting unit that sets the gesture whose input on a display component set by the display component setting unit causes the display scene to transition; and a transition display scene table that stores each gesture set by the gesture setting unit in association with its transition destination display scene.
  • The display component setting unit can set a plurality of display components on a plurality of layers such that they at least partially overlap.
  • The gesture setting unit can set different gestures for the plurality of display components that are set to at least partially overlap.
  • This makes it possible to provide a display scene creation system that can transition a display scene in response to gesture input on the touch panel without requiring a processing program that associates the touch panel with gestures.
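Taken together, the units above amount to a small data model: display components defined by rectangular areas on layers, gestures assigned per component, and a transition table keyed by component and gesture. A minimal sketch in Python (all names and values are hypothetical; the patent does not prescribe an implementation):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DisplayComponent:
    name: str          # display area name, e.g. "NaviButton"
    x: int             # top-left corner of the rectangular area
    y: int
    width: int
    height: int
    layer: int         # overlapping components may sit on different layers

# Transition display scene table: (component name, gesture) -> destination scene.
# Overlapping components can carry different gestures, so the pair is the key.
transition_table: dict[tuple[str, str], str] = {}

def set_gesture(component: DisplayComponent, gesture: str, destination_scene: str) -> None:
    """Gesture setting unit: associate a gesture on a component with a scene."""
    transition_table[(component.name, gesture)] = destination_scene

# Example: two partially overlapping components on different layers,
# each reacting to a different gesture.
navi = DisplayComponent("NaviButton", 272, 96, 200, 150, layer=0)
navi_lower = DisplayComponent("NaviLower1", 262, 278, 200, 40, layer=1)
set_gesture(navi, "tap", "Navi")
set_gesture(navi_lower, "swipe_down", "Navi1")
```

Because the table is plain data, adding or changing a transition is an edit to the table rather than a change to a per-scene processing program, which is the point of the scheme.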
  • FIG. 1 is a block diagram showing the overall configuration of a display scene creation system according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing the flow of a scene design creation process for creating a scene design.
  • FIG. 3 is a diagram illustrating a registration example of the still image item on the screen 1.
  • FIG. 4 is a diagram illustrating a registration example of the still image item on the screen 2.
  • FIG. 5 is a diagram illustrating a registration example of the sub-event item on the screen 2.
  • FIG. 6 is a diagram illustrating a screen example of the scene design initial.
  • FIG. 7 is a diagram illustrating a gesture table.
  • FIG. 8 is a flowchart showing a flow of scene design transition information creation processing for creating scene design transition information.
  • FIG. 9 is a diagram illustrating an example of scene design transition information.
  • FIG. 10 is a block diagram showing the overall configuration of the display system with a touch panel according to the embodiment of the present invention.
  • FIG. 11 is a flowchart showing the flow of the display process in which the scene design on the touch panel transitions.
  • FIG. 12 is a diagram illustrating a screen example of the transitioned scene design Navi.
  • FIG. 13 is a diagram illustrating a screen example of the transitioned scene design Meter.
  • A display scene creation system includes: a display scene design setting unit that sets the design of a display scene displayed on a touch panel; a display component setting unit that sets one or more display components within the set display scene design; a gesture setting unit that sets the gesture whose input on a set display component causes the display scene to transition; and a transition display scene table that stores each set gesture in association with its transition destination display scene.
  • The display component setting unit can set a plurality of display components on a plurality of layers such that they at least partially overlap.
  • The gesture setting unit can set different gestures for the plurality of display components set to at least partially overlap.
  • This makes it possible to provide a display scene creation system that can transition a display scene in response to gesture input on the touch panel without requiring a processing program that associates the touch panel with gestures. In addition, appropriate input operations can be performed using the plurality of display components set on different layers.
  • Preferably, the display component setting unit sets display components each defined by a rectangular area represented by coordinates within the display scene.
  • Because the transition destination display scene can be read out based on the rectangular area containing the coordinate string input to the touch panel and the gesture indicated by that coordinate string within the rectangular area, it is possible to provide a display scene creation system that can transition display scenes without requiring a processing program that associates the touch panel with gestures.
  • The display scene creation program causes a computer to execute the following steps: a step of setting the design of a display scene displayed on a touch panel; a step of setting one or more display components displayed within the set design; a step of setting the gesture whose input on a set display component causes the display scene to transition; and a step of associating each set gesture with its transition destination display scene.
  • In the step of setting the display components, a plurality of display components can be set on a plurality of layers such that they at least partially overlap.
  • In the step of setting the gesture, different gestures can be set for the plurality of display components set to at least partially overlap.
  • In the step of setting the display components, it is preferable to set display components each defined by a rectangular area represented by coordinates within the display scene.
  • Because the transition destination display scene can be read out based on the rectangular area containing the coordinate string input to the touch panel and the gesture indicated by that coordinate string within the rectangular area, it is possible to provide a display scene creation program that can transition display scenes without requiring a processing program that associates the touch panel with gestures.
  • a display system with a touch panel includes a display device and a touch panel having a detection area for detecting a user's contact over the entire display area of the display device.
  • In the display scene displayed by the display device, a plurality of display components can be set on a plurality of layers such that they at least partially overlap, and different gestures can be set for the plurality of display components set in this overlapping state.
  • The display system includes a display control unit that, when the touch panel detects a user's contact, displays the transition destination display scene in the display area of the display device based on the display component, the layer of the display component, and the gesture that was input.
  • Preferably, each display component is defined by a rectangular area represented by coordinates within the display area of the display device. When the touch panel detects the user's contact, the display control unit determines whether the rectangular region containing the coordinate sequence where contact was detected matches a display component's area of the displayed scene, and whether the gesture indicated by that coordinate sequence matches the gesture associated with that display component's area; when both match, it displays the transition destination display scene in the display area of the display device.
  • Because the transition destination display scene can be read out by determining whether both of these match, a display system with a touch panel that can transition a display scene without requiring a processing program associating the touch panel with gestures can be provided.
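The matching rule described in these claims can be sketched as a two-part check: the contact's coordinate sequence must lie inside a component's rectangular area, and the gesture it traces must equal the gesture registered for that component; overlapping components on different layers are disambiguated because only one of them carries the matching gesture. A hedged sketch, with gesture classification reduced to a trivial stand-in:

```python
def contains(rect, points):
    """True if every point of the coordinate sequence lies in the rectangle."""
    x, y, w, h = rect
    return all(x <= px < x + w and y <= py < y + h for px, py in points)

def classify(points):
    """Trivial stand-in for the gesture table: tap vs. horizontal swipe."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    if abs(x1 - x0) < 10 and abs(y1 - y0) < 10:
        return "tap"
    return "swipe_right" if x1 > x0 else "swipe_left"

def transition_scene(components, points):
    """components: list of (name, rect, layer, gesture, destination scene).
    Checked top layer first so overlapping components disambiguate."""
    gesture = classify(points)
    for name, rect, layer, comp_gesture, dest in sorted(
            components, key=lambda c: -c[2]):
        if contains(rect, points) and gesture == comp_gesture:
            return dest
    return None  # no match: stay on the current scene

# Two overlapping components (rects chosen for illustration): a tap in the
# overlap transitions to "Navi"; a rightward swipe transitions to "Navi1".
comps = [
    ("NaviButton", (272, 96, 200, 150), 0, "tap", "Navi"),
    ("NaviLower1", (262, 100, 220, 150), 1, "swipe_right", "Navi1"),
]
```

The loop plus table lookup replaces the per-scene processing program the background section complains about: the same generic code serves every scene design.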
  • the display device is preferably a liquid crystal display device.
  • A cockpit module is attached around the cockpit of a moving body and includes the display system with a touch panel according to any one of the above configurations.
  • A moving body includes the display system with a touch panel according to any one of the above configurations, with the display device mounted at least at a position visible from the cockpit.
  • Preferably, the moving body is an automobile, and the display system with a touch panel is connected to the ECUs (Electronic Control Units) of each part of the automobile by a CAN (Controller Area Network).
  • ECU: Electronic Control Unit
  • CAN: Controller Area Network
  • As described above, it is possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body that can transition a display scene in response to gesture input on the touch panel without requiring a processing program that associates the touch panel with gestures.
  • FIG. 1 is a block diagram showing the overall configuration of a display scene creation system 100 according to an embodiment of the present invention.
  • the display scene creation system 100 includes an instrument panel development support tool 110 and a scene design director 120.
  • the user uses the instrument panel development support tool 110 and the scene design director 120 to create a display scene in advance on a terminal such as a personal computer.
  • Hereinafter, the display scene is referred to as a “scene design”, and a display component within the scene design is referred to as an “item”.
  • A plurality of items can be set on a plurality of layers; that is, items can be set so as to overlap one another. Overlapping items are displayed such that the lower layer shows through, using alpha blending (translucent composition).
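Translucent composition of overlapping layers is ordinary alpha blending: each pixel of the upper item is weighted by an opacity α and the lower layer shows through with weight 1 − α. A minimal per-pixel sketch (the colour values are illustrative, not from the patent):

```python
def alpha_blend(upper, lower, alpha):
    """Composite one RGB pixel: upper over lower with opacity alpha (0..1)."""
    return tuple(round(alpha * u + (1 - alpha) * l) for u, l in zip(upper, lower))

# An item on layer 1 drawn half-transparently over the layer-0 background,
# so the lower layer can be seen through the overlapping region.
upper_px = (255, 0, 0)   # red button pixel
lower_px = (0, 0, 255)   # blue background pixel
blended = alpha_blend(upper_px, lower_px, alpha=0.5)
```

With α = 1 the upper item fully covers the lower one; with α = 0 it is invisible; intermediate values give the translucent see-through effect described for overlapping items.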
  • the instrument panel development support tool 110 is a tool for creating a scene design
  • the scene design director 120 is a tool for creating scene design transition information.
  • the instrument panel development support tool 110 includes a scene design setting unit 111 (display scene design setting unit), an item table 112, and an item setting unit 113 (display component setting unit).
  • the user sets a scene design using the scene design setting unit 111.
  • The item table 112 stores the items displayed in scene designs, each defined by a rectangular area represented by coordinates within the scene design.
  • Using the item setting unit 113, the user reads one or more items from the item table 112 and sets them in the scene design set by the scene design setting unit 111.
  • the user creates a scene design using the instrument panel development support tool 110 configured as described above.
  • When displayed on a display device with a touch panel, the scene design “Initial” is composed of a screen 1 and a screen 2; in the present embodiment, screen 2 is described as the screen corresponding to the touch panel.
  • the user inputs the name “Initial” of the scene design through the scene design setting unit 111 (step S201).
  • the user uses the scene design setting unit 111 to select a screen for registering an item among the screens 1 and 2 of the scene design “Initial” (step S202).
  • screen 1 is first selected.
  • the user registers a still image item on the selected screen 1 (step S203).
  • the user refers to the item table 112, selects the image file name of the still image item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
  • For example, the following still image items are registered: the file name “AC-Under2.png” with the display area name “AC” and the coordinate value (0, 416); the file name “Temp240.png” with the display area name “DriverTemp” and the coordinate value (280, 440); and the file name “U04-07.png” with the display area name “DriverFuture7” and the coordinate value (392, 440).
  • the user registers the digital meter item in the selected screen 1 (step S204).
  • the user sets the font of each digit of the digital meter using the item setting unit 113, and inputs and registers the name of the digital meter, the display area name, and the coordinate value.
  • For example, a date meter with the display area name “Date2” and the coordinate value (600, 424), and a time meter with the display area name “Time” and the coordinate value (680, 456), are registered as digital meter items.
  • Next, the user frames the still image items and digital meters registered on the selected screen 1 (step S205).
  • the user registers the moving image / NTSC item on the selected screen 1 (step S206).
  • The user inputs and registers a display area name for displaying moving images from a preset device such as a navigation system. For example, the display area name “Map” is registered.
  • Next, the user uses the scene design setting unit 111 to select screen 2, from the screens 1 and 2 of the scene design “Initial”, as the next screen for registering items (step S207).
  • the user registers a still image item on the selected screen 2 (step S208).
  • the user refers to the item table 112, selects the image file name of the still image item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
  • the following nine items are registered as still image items.
  • The following registration information is merely an example; the item registration method is arbitrary.
  • As the first still image item, a file name “BlackBack.png”, a display area name “Back”, a coordinate value (0, 0), and a layer “0” are registered. This is the item 61 displayed as the background screen on the screen 2 shown in FIG. 6.
  • As the second still image item, a file name “TitleMainMenu.png”, a display area name “TitleMainMenu”, a coordinate value (500, 0), and a layer “0” are registered. This is the item 62 displayed as the screen title (Main Menu) on the screen 2 shown in FIG. 6.
  • As the third still image item, a file name “Navi-ButtonOff.png”, a display area name “Navi-ButtonOff”, a coordinate value (272, 96), and a layer “0” are registered. This is the item 63 displayed as the navigation control button on the screen 2 shown in FIG. 6.
  • The fourth still image item is the item 64 displayed as the control button of the air conditioner on the screen 2 shown in FIG. 6.
  • As the fifth still image item, a file name “AudioButtonOff.png”, a display area name “AudioButtonOff”, a coordinate value (8, 288), and a layer “0” are registered. This is the item 65 displayed as the audio control button on the screen 2 shown in FIG. 6.
  • As the sixth still image item, a file name “CameraButtonOff.png”, a display area name “CameraButtonOff”, a coordinate value (272, 288), and a layer “0” are registered. This is the item 66 displayed as the camera control button on the screen 2 shown in FIG. 6.
  • As the seventh still image item, a file name “MeterButtonOff.png”, a display area name “MeterButtonOff”, a coordinate value (536, 288), and a layer “0” are registered. This is the item 67 displayed as the meter control button on the screen 2 shown in FIG. 6.
  • As the eighth still image item, a file name “NaviLower1Off.png”, a display area name “NaviLower1Off”, a coordinate value (262, 278), and a layer “1” are registered. This is the item 68, which partially overlaps the item 63 and is displayed below it on the screen 2 shown in FIG. 6. Although not shown in FIG. 6, the item 68 is displayed in a state where it can be seen through the item 63. An α parameter for alpha-blending the overlapping region of the item 68 and the item 63 may also be set here.
  • As the ninth still image item, a file name “NaviLower2Off.png”, a display area name “NaviLower2Off”, a coordinate value (252, 268), and a layer “2” are registered. This is the item 69, which partially overlaps the items 63 and 68 and is displayed below them on the screen 2 shown in FIG. 6. Although not shown in FIG. 6, the item 69 is displayed in a state where it can be seen under the items 63 and 68. An α parameter for alpha-blending the overlapping region of the items 69, 63, and 68 may also be set here.
  • the user registers a sub event item in the selected screen 2 (step S209).
  • the user refers to the item table 112, selects the image file name of the sub event item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
  • the file name “Navi-ButtonOn.png”, the sub-event name “NaviButtonOn”, the display area name “NaviButton”, and the coordinate value (272, 96) are registered as sub-event items.
  • Screen 1 and screen 2 in which items have been registered using the instrument panel development support tool 110, that is, the scene design “Initial”, appear as shown in FIG. 6.
  • the user creates scene design transition information using the scene design director 120 linked with the instrument panel development support tool 110.
  • the scene design director 120 includes a gesture table 121, a gesture setting unit 122, and a scene design transition table 123 (transition display scene table).
  • the gesture table 121 is a table that stores gesture patterns.
  • the gesture table 121 according to the specific example illustrated in FIG. 7 stores 15 types of gesture patterns.
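A gesture table of this kind maps an input coordinate string to one of a fixed set of pattern identifiers. The patent does not enumerate its 15 patterns, so the sketch below substitutes five assumed patterns (a tap and four stroke directions) purely to illustrate the lookup:

```python
def gesture_id(points):
    """Map a coordinate string to a small numeric pattern id.
    The ids and patterns here are illustrative, not the patent's actual 15."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < 10 and abs(dy) < 10:
        return 0                      # tap
    if abs(dx) >= abs(dy):
        return 1 if dx > 0 else 2     # rightward / leftward stroke
    return 3 if dy > 0 else 4         # downward / upward stroke

# Gesture table: pattern id -> pattern name (assumed entries).
gesture_table = {0: "tap", 1: "right", 2: "left", 3: "down", 4: "up"}
```

A richer implementation would match the whole stroke shape against stored templates, but the table-of-patterns structure is the same: classification yields an id, and the id is what the transition information refers to (compare the gesture “14” set for “NaviLower2On” later in the text).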
  • The user refers to the gesture table 121 and uses the gesture setting unit 122 to set, for an item set by the item setting unit 113 of the instrument panel development support tool 110, the gesture pattern to which that item reacts.
  • the scene design transition table 123 is a table that stores transition information in which a gesture set by the user using the gesture setting unit 122 and a transition destination scene design are associated with each other.
  • the user creates scene design transition information using the scene design director 120 configured as described above.
  • a scene design transition information creation process in which the user creates scene design transition information using the scene design director 120 will be described with reference to the flowchart of FIG.
  • a case where the scene design director 120 is used to create transition information of the scene design “Initial” will be described as an example.
  • the user selects and registers the variable “TouchPanel”, which is a sub-event execution condition, using the gesture setting unit 122 (step S801).
  • the user uses the gesture setting unit 122 to select and register a scene design in which the sub-event is displayed (step S802).
  • the scene design “Initial” is selected.
  • The user uses the gesture setting unit 122 to display thumbnails of the sub-events displayed in the selected scene design “Initial”, and selects a sub-event to register from among the sub-events displayed as thumbnails (step S803).
  • the user refers to the gesture table 121 storing the 15 types of gesture patterns by the gesture setting unit 122, and selects and registers the gesture pattern to which the selected sub-event reacts (step S804).
  • The user inputs and registers, via the gesture setting unit 122, the name of the sub-event to be executed when a gesture to which the selected sub-event reacts is input (step S805).
  • the user uses the gesture setting unit 122 to make a transition setting for making a transition to the designated scene design after a designated time (step S806).
  • For example, it is assumed that the following settings are made: the transition scene name “Navi” is set; for the scene design “Initial”, the sub-event “AirconButtonOn” with the gesture “all” and the sub-event “AirconButton” to be executed are set; for the scene design “Initial”, the sub-event “AudioButtonOn” with the gesture “all” and the sub-event “AudioButton” to be executed are set; for the scene design “Initial”, the sub-event “MeterButtonOn” with the gesture “all”, the sub-event “MeterButton” to be executed, a transition time of 100 ms, and the transition scene name “Meter” are set; an entry with a transition time of 100 ms and the transition scene name “Navi1” is set; and for the scene design “Initial”, the sub-event “NaviLower2On” with the gesture “14”, the sub-event “NaviLower2” to be executed, a transition time of 100 ms, and the transition scene name “Navi2” are set.
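The registrations above correspond to rows of the scene design transition table 123. Reconstructed as data (the source passage is partly garbled, so the rows below are a best-effort reading; fields that could not be recovered are left as None, and all field names are hypothetical):

```python
# Each row: (scene design, sub-event, gesture, sub-event to execute,
#            transition time in ms, transition destination scene).
# None marks fields that are not recoverable from the source text.
scene_design_transition_table = [
    ("Initial", "NaviButtonOn",   "all", "NaviButton",   None, "Navi"),
    ("Initial", "AirconButtonOn", "all", "AirconButton", None, None),
    ("Initial", "AudioButtonOn",  "all", "AudioButton",  None, None),
    ("Initial", "MeterButtonOn",  "all", "MeterButton",  100,  "Meter"),
    ("Initial", "NaviLower1On",   None,  "NaviLower1",   100,  "Navi1"),
    ("Initial", "NaviLower2On",   "14",  "NaviLower2",   100,  "Navi2"),
]

def destination(scene, sub_event):
    """Look up the transition destination scene for a sub-event in a scene."""
    for row in scene_design_transition_table:
        if row[0] == scene and row[1] == sub_event:
            return row[5]
    return None
```

At runtime the display system needs only this table plus generic matching code; no per-scene program is written, which is the effect the invention claims.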
  • the scene design director 120 associates the scene design registered with the instrument panel development support tool 110 with the scene design registered with the scene design director 120.
  • the user downloads and uses the scene design created by the instrument panel development support tool 110 as described above and the transition information of the scene design created by the scene design director 120 to the display system 200 with a touch panel described in detail below.
  • The application of the present invention is not limited to automobiles.
  • In addition to automobiles, the present invention is applicable to various vehicles (means of movement or transport) such as motorcycles, motor tricycles, special-purpose vehicles, railway vehicles, other road vehicles, amphibious vehicles, aircraft, and ships.
  • The present invention can be applied not only to vehicles whose main purpose is movement or transport as described above, but also to simulators that simulate such vehicles.
  • the vehicles, simulators, and the like as described above are collectively referred to as “moving bodies”.
  • an automotive cockpit module (driver's seat module) incorporating the display system 200 with a touch panel according to the present embodiment is provided with a liquid crystal display device 210 that displays a composite image of the automotive instrument panel, in place of conventional analog instruments such as a speedometer and tachometer and indicator lamps composed of LEDs.
  • the liquid crystal display device 210 is not a segment-type liquid crystal display device of the kind often used in conventional automobiles, but a dot-matrix liquid crystal panel display device. Since it can display an image of an arbitrary pattern, the liquid crystal display device 210 functions as an automobile information display device by displaying a composite image that combines various element images such as instruments and indicator lamps.
  • the liquid crystal display device 210 can display not only the image of the instrument panel but also, together with it, images captured by in-vehicle cameras installed at the rear or sides of the automobile, navigation images, received television broadcasts, images reproduced by an in-vehicle DVD player, and the like.
  • the liquid crystal display device 210 is attached to an instrument panel (not shown) which is a frame of a cockpit module (not shown) so as to be positioned behind the steering wheel (not shown).
  • the cockpit module includes an air conditioning unit (not shown), an air conditioning duct (not shown) for introducing air from the air conditioning unit into the vehicle, an audio module (not shown), a lamp switch (not shown), A steering mechanism (not shown), an airbag module (not shown), and the like are included.
  • the liquid crystal display device 210 may be arranged in the center of the instrument panel, that is, between the driver seat and the passenger seat.
  • FIG. 10 is a block diagram showing an example of the overall configuration of the display system with a touch panel 200 according to the present embodiment.
  • the display system 200 with a touch panel includes a liquid crystal display device 210 (210a, 210b), a touch panel 220, a flash ROM (scene design storage unit 230 and scene design transition information storage unit 240), a video processing LSI, a DPF-ECU 250 (display control unit), a CAN microcomputer, a CPU I/F, and a RAM.
  • a touch panel 220 having a detection area for detecting a user's contact is installed on the entire display area of the liquid crystal display device 210.
  • in the scene design storage unit 230, the scene design created by the instrument panel development support tool 110 is downloaded and stored.
  • in the scene design transition information storage unit 240, the scene design transition information created by the scene design director 120 is downloaded and stored.
  • the scene design displayed on the liquid crystal display device 210 is controlled by the DPF-ECU 250.
  • the DPF-ECU 250 is connected to various ECUs provided in each part of the automobile via the in-vehicle LAN.
  • the DPF-ECU 250 acquires information indicating the state of each part of the vehicle (state information; hereinafter collectively referred to as state information D unless distinction is required) from each ECU via the in-vehicle LAN at a predetermined period.
  • the “predetermined cycle” is set to an arbitrary length according to the specifications of the in-vehicle LAN.
  • the transmission cycle of the status information D from each ECU may be different from each other.
  • the sampling period of the state information D in the DPF-ECU 250 may be matched with the transmission period of each state information.
  • the in-vehicle LAN interface standard to which the present invention can be applied is not limited to CAN.
  • any in-vehicle network conforming to various in-vehicle LAN interface standards such as LIN (Local Interconnect Network), MOST (Media Oriented Systems Transport), FlexRay, etc. can be applied to the present invention.
  • the DPF-ECU 250 reflects the acquired vehicle state information on the scene design created by the instrument panel development support tool 110 of the display scene creation system 100, and displays the reflected scene design in the display area of the liquid crystal display device 210.
  • the “state information” is information representing the state of each part of the automobile.
  • information on the mechanical operation state of each part of the automobile, for example, traveling speed, engine speed, etc.
  • other information, for example, fuel remaining amount, room temperature, etc.
  • the state information includes, for example, the engine speed, traveling speed, select position, shift position, operating state of the direction indicators, lighting state of the lights, open/closed state of the doors and trunk, door lock state, tire state, presence or absence of an airbag abnormality, seat belt wearing state, outlet temperature of the air conditioner, room temperature, outside temperature, state of in-vehicle AV equipment, setting state of an automatic steering function, wiper operating state, fuel remaining amount, battery remaining amount, engine/battery dependency (in the case of a hybrid vehicle), oil remaining amount, radiator temperature, engine temperature, and the like; these are merely examples for passenger cars and do not limit the present invention.
  • the DPF-ECU 250 acquires moving images such as navigation images from a moving image generating device (not shown), such as a navigation system, provided in the automobile, reflects the acquired moving images in the scene design created by the instrument panel development support tool 110 of the display scene creation system 100, and displays the reflected scene design in the display area of the liquid crystal display device 210.
  • when the touch panel 220 detects a user's contact, the DPF-ECU 250 identifies the rectangular area in which the coordinate sequence of the detected contact exists, refers to the scene design transition information storage unit 240, reads the corresponding transition-destination scene design from the scene design storage unit 230, and displays it in the display area of the liquid crystal display device 210.
  • the DPF-ECU 250 determines whether or not the touch panel 220 has detected a user's contact (step S1101). If it is determined in step S1101 that no contact has been detected, the DPF-ECU 250 ends the process. On the other hand, if it is determined in step S1101 that contact has been detected, the DPF-ECU 250 identifies the rectangular area in which the coordinate sequence that detected the user's contact exists (step S1102), and further identifies the gesture indicated by that coordinate sequence (step S1103).
  • the CAN microcomputer performs area determination to identify a rectangular area.
  • the area is determined from the first value of the X/Y coordinate sequence coming from the touch panel 220 and the image information registered in the scene design storage unit 230 (the upper-left XY coordinates and the vertical and horizontal lengths of each image); if a matching rectangle exists, the process proceeds to the next step.
  • a gesture is determined from the sequence of X/Y coordinate points coming from the touch panel, and it is determined whether an event matching both the identified rectangle and the gesture exists. That is, the area is first determined by the CAN microcomputer from the image information registered in the scene design storage unit 230 (the upper-left XY coordinates and the vertical and horizontal lengths of each image) to identify the rectangular area, and then the gesture registered in the scene design transition information storage unit 240 is determined.
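The two-stage determination described above (first the rectangle containing the leading touch coordinate, then the gesture) can be sketched as follows. The rectangle records mirror the registered image information (upper-left XY coordinates plus vertical and horizontal lengths); the function names and the crude gesture classification are illustrative assumptions, not the actual CAN microcomputer logic:

```python
def find_rectangle(rects, x, y):
    """Area determination: return the name of the first registered rectangle
    that contains the leading touch coordinate, or None if no rectangle does.
    Each rect is (name, left, top, width, height)."""
    for name, left, top, w, h in rects:
        if left <= x < left + w and top <= y < top + h:
            return name
    return None

def classify_gesture(points):
    """Crude gesture determination from the X/Y coordinate sequence coming
    from the touch panel: a single point is a tap, a run of points is a
    stroke whose dominant axis picks the label.  A real implementation
    would match the registered gesture shapes instead."""
    if len(points) <= 1:
        return "tap"
    (x0, y0), (xn, yn) = points[0], points[-1]
    if abs(xn - x0) >= abs(yn - y0):
        return "stroke-horizontal"
    return "stroke-vertical"
```

Only if both stages succeed does the system go on to look for a matching registered event.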
  • the DPF-ECU 250 determines whether or not the rectangular area where the specified coordinate sequence exists matches the sub-event (step S1104).
  • if it is determined in step S1104 that they do not match, the DPF-ECU 250 ends the process. On the other hand, if it is determined in step S1104 that they match, the DPF-ECU 250 determines whether or not the gesture indicated by the identified coordinate sequence matches the gesture associated with the sub-event (step S1105).
  • if it is determined in step S1105 that the gestures match, the DPF-ECU 250 determines whether or not to perform a scene design transition process (step S1106). If it is determined in step S1106 that the scene design transition process is to be performed, the DPF-ECU 250 refers to the scene design transition information storage unit 240, displays the sub-event blinking for the set transition time, and then displays the transition-destination scene design read from the scene design storage unit 230 in the display area of the liquid crystal display device 210 (step S1107). On the other hand, if it is determined in step S1106 that the scene design transition process is not to be performed, the DPF-ECU 250 performs display switching based on the sub-event (step S1108).
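The decision flow of steps S1101 to S1108 can be sketched roughly as a single self-contained function. The function name, the simplistic tap/drag gesture classification, the "all" wildcard fallback, and the dictionary layouts are illustrative assumptions, not the actual DPF-ECU 250 implementation:

```python
def handle_touch(points, scene, rects, transitions):
    """Sketch of steps S1101-S1108.  rects maps sub-event name to
    (left, top, width, height); transitions maps
    (scene, sub-event, gesture) to (transition time in ms, next scene)."""
    if not points:                                   # S1101: no contact detected
        return scene
    x, y = points[0]
    sub_event = None                                 # S1102: area determination
    for name, (l, t, w, h) in rects.items():
        if l <= x < l + w and t <= y < t + h:
            sub_event = name
            break
    if sub_event is None:                            # S1104: no matching sub-event
        return scene
    gesture = "tap" if len(points) == 1 else "drag"  # S1103: crude gesture
    entry = (transitions.get((scene, sub_event, gesture))
             or transitions.get((scene, sub_event, "all")))
    if entry is None:                                # S1105: gesture mismatch
        return scene
    time_ms, next_scene = entry                      # S1106: transition or not
    if next_scene is not None:
        # S1107: the real system blinks the sub-event for time_ms first
        return next_scene
    return scene                                     # S1108: display switch only
```

For example, with a registered entry for ("Initial", "MeterButtonOn", "all"), a tap inside the MeterButtonOn rectangle returns the next scene, while a touch outside any rectangle leaves the scene unchanged.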
  • in the “Main Menu” of the scene design “Initial”, when an input is made on the touch panel 220 with any gesture for the sub-event “NaviButtonOn”, the sub-event “NaviButtonOn” is displayed blinking for 100 ms, after which the transition-destination scene design “Navi” is displayed in the display area of the liquid crystal display device 210 as shown in FIG.
  • likewise, in the “Main Menu” of the scene design “Initial”, when an input is made on the touch panel 220 with any gesture for the sub-event “MeterButtonOn”, the sub-event “MeterButtonOn” is displayed blinking for 100 ms, after which the transition-destination scene design “Meter” is displayed in the display area of the liquid crystal display device 210 as shown in FIG.
  • as described above, it is possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body that can transition a scene design by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with the gesture.
  • the item is defined by a rectangular area represented by coordinates in the display area of the display device. By determining whether both the rectangular area in which the coordinate sequence input to the touch panel exists and the gesture indicated by that coordinate sequence match both the sub-event's area and the gesture associated with the sub-event, the transition-destination scene design can be read out without requiring a processing program that associates the touch panel with the gesture. It is therefore possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body capable of transitioning the scene design.
  • the system can thus be realized easily and at low cost.
  • the display system with a touch panel can display not only the state of a moving body such as a vehicle but also, for example, images capturing the scene outside the vehicle, video stored in a storage medium provided in the vehicle, video obtained by communication with the outside, and other arbitrary images (still or moving), together with additional information such as character information.
  • although a liquid crystal display device is used in the above-described embodiment, the application of the present invention is not limited to display systems with a touch panel that use a liquid crystal display device; any display device can be used as long as at least the scene design display portion is a dot-matrix display device.
  • likewise, the application of the present invention is not limited to a display system with a touch panel mounted on an instrument panel as described above.
  • the present invention can be applied to any display system with a touch panel that has a function of transitioning a display scene in accordance with an input gesture, whatever its use or hardware configuration; the above are merely examples.
  • the object of the present invention is also achieved when a software program (in the embodiment, a program corresponding to the flowcharts shown in the figures) is supplied to an apparatus and the computer of the apparatus reads and executes the supplied program. Therefore, the program itself installed in the computer to implement the functional processing of the present invention on the computer also implements the present invention. That is, the present invention includes the program for realizing its functional processing.
  • the present invention is industrially applicable as a touch panel having a function of transitioning a display scene in accordance with an input gesture, a display system including such a touch panel, and a system for creating display scenes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided is a display scene creation system that enables a display scene to make a transition by inputting a gesture on a touch panel without requiring a processing program for associating the gesture and the touch panel. A display scene creation system (100) is provided with: a display scene design setting unit (111) for setting the design of a display scene; a display component setting unit (113) for setting at least one display component to be displayed in the design of the display scene; a gesture setting unit (122) for setting the gesture that causes the display scene to make a transition; and a transition display scene table (123) for associating and storing the gesture and the transition destination display scene. The display component setting unit (113) makes it possible for a plurality of display components to be set in a plurality of layers in a state where at least part of the display components overlap. The gesture setting unit (122) makes it possible to set a different gesture for each of the plurality of display components.

Description

Display scene creation system
 The present invention relates to technology for display devices with touch panels, and more specifically to a display scene creation system, a display scene creation program, and a display system with a touch panel that transition the image presented to the user when a gesture is input on the touch panel.
 In recent years, display devices with touch panels have been widely used as a type of user interface in various fields such as game machines, mobile phones, PDAs, vending machines, and information boards. Because the display on the touch panel is associated with gestures input from the touch panel, the user can operate such a device intuitively.
 For example, Japanese Patent Application Laid-Open No. 2007-279860 proposes, for a portable terminal having a touch panel display, a technique in which, when a gesture is input on the touch panel display, the function assigned to that gesture is executed and the display scene transitions according to the execution result.
 Japanese Patent Application Laid-Open No. 2008-259915 proposes, for a game system using touch panel input, a technique in which, when a gesture is input on the touch panel display, an attack corresponding to the figure indicated by the gesture is performed on an enemy character and the display scene transitions according to the result of the attack.
 Conventionally, however, display systems with touch panels have had no general-purpose mechanism for associating the touch panel with gestures; display scene transitions have therefore been realized by processing programs that associate the touch panel with gestures. Such a processing program must be created for each display scene, making program development laborious. For example, when different gestures may be input in the same area, or when overlapping buttons and menus exist in the same area, the program for correctly recognizing gestures becomes complicated and the man-hours enormous. Moreover, improving recognition accuracy requires sophisticated programs, making development within a limited time impossible.
 The present invention has been made in view of the above problems. Its object is to provide a display scene creation system, a display scene creation program, and a display system with a touch panel that can transition a display scene by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with gestures.
 To achieve the above object, a display scene creation system according to the present invention includes: a display scene design setting unit that sets the design of a display scene displayed on a touch panel; a display component setting unit that sets one or more display components displayed within the display scene design set by the display scene design setting unit; a gesture setting unit that sets the gesture whose input on a display component set by the display component setting unit causes the display scene to transition; and a transition display scene table that stores the gesture set by the gesture setting unit in association with the transition-destination display scene. The display component setting unit can set a plurality of display components in a plurality of layers in a state where they at least partially overlap. The gesture setting unit can set mutually different gestures for the plurality of display components set in an at least partially overlapping state.
 According to the above configuration, it is possible to provide a display scene creation system that can transition a display scene by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with gestures. In addition, gestures can be recognized correctly even when at least part of the areas of a plurality of display components overlap.
FIG. 1 is a block diagram showing the overall configuration of a display scene creation system according to an embodiment of the present invention.
FIG. 2 is a flowchart showing the flow of a scene design creation process for creating a scene design.
FIG. 3 is a diagram illustrating a registration example of still image items on screen 1.
FIG. 4 is a diagram illustrating a registration example of still image items on screen 2.
FIG. 5 is a diagram illustrating a registration example of sub-event items on screen 2.
FIG. 6 is a diagram illustrating a screen example of the scene design “Initial”.
FIG. 7 is a diagram illustrating the gesture table.
FIG. 8 is a flowchart showing the flow of a scene design transition information creation process for creating scene design transition information.
FIG. 9 is a diagram illustrating an example of scene design transition information.
FIG. 10 is a block diagram showing the overall configuration of the display system with a touch panel according to the embodiment of the present invention.
FIG. 11 is a flowchart showing the flow of the touch panel and display processing in which the scene design transitions.
FIG. 12 is a diagram illustrating a screen example of the transitioned scene design “Navi”.
FIG. 13 is a diagram illustrating a screen example of the transitioned scene design “Meter”.
 A display scene creation system according to one embodiment of the present invention includes: a display scene design setting unit that sets the design of a display scene displayed on a touch panel; a display component setting unit that sets one or more display components displayed within the display scene design set by the display scene design setting unit; a gesture setting unit that sets the gesture whose input on a display component set by the display component setting unit causes the display scene to transition; and a transition display scene table that stores the gesture set by the gesture setting unit in association with the transition-destination display scene. The display component setting unit can set a plurality of display components in a plurality of layers in a state where they at least partially overlap, and the gesture setting unit can set mutually different gestures for the plurality of display components set in an at least partially overlapping state.
 According to the above configuration, it is possible to provide a display scene creation system that can transition a display scene by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with gestures. In addition, since mutually different gestures can be set for the plurality of display components set in an at least partially overlapping state, appropriate input operations can be performed on display components set in different layers by using the gestures selectively.
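The overlapping-layer behavior described here can be illustrated with a small sketch: two components may occupy overlapping rectangles as long as their registered gestures differ, so the input gesture alone disambiguates them. The data model, names, and gesture labels below are hypothetical:

```python
# Hypothetical model of display components stacked in layers.  Each entry is
# (name, layer, rect(left, top, width, height), registered gesture, next scene).
components = [
    ("MapPan",    0, (0, 0, 320, 240), "drag", "MapMoved"),
    ("MenuPopup", 1, (0, 0, 320, 240), "tap",  "Menu"),
]

def resolve(components, x, y, gesture):
    """Among the components whose rectangle contains (x, y) and whose
    registered gesture matches the input, pick the topmost layer and
    return its transition destination (None if nothing matches)."""
    hits = [c for c in components
            if c[2][0] <= x < c[2][0] + c[2][2]
            and c[2][1] <= y < c[2][1] + c[2][3]
            and c[3] == gesture]
    if not hits:
        return None
    return max(hits, key=lambda c: c[1])[4]
```

With this model, a tap on the overlapping area reaches the popup component while a drag reaches the map underneath, which is the effect the configuration above aims at.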
 In the display scene creation system according to one embodiment of the present invention, the display component setting unit preferably sets display components defined by rectangular areas represented by coordinates within the display scene.
 According to this configuration, the transition-destination display scene can be read out based on the rectangular area in which the coordinate sequence input to the touch panel exists and the gesture indicated by the coordinate sequence in that rectangular area; it is therefore possible to provide a display scene creation system that can transition the display scene without requiring a processing program that associates the touch panel with gestures.
 A display scene creation program according to one embodiment of the present invention causes a computer to execute the following steps: setting the design of a display scene displayed on a touch panel; setting one or more display components displayed within the display scene design set in the design-setting step; setting the gesture whose input on a display component set in the component-setting step causes the display scene to transition; and associating the gesture set in the gesture-setting step with the transition-destination display scene. In the component-setting step, a plurality of display components can be set in a plurality of layers in an at least partially overlapping state. In the gesture-setting step, mutually different gestures can be set for the plurality of display components set in an at least partially overlapping state.
 According to the above program, a display scene that can be transitioned by inputting a gesture on the touch panel can be created without requiring a processing program that associates the touch panel with gestures.
 In the above display scene creation program, the component-setting step preferably sets display components defined by rectangular areas represented by coordinates within the display scene.
 This allows the transition-destination display scene to be read out based on the rectangular area in which the coordinate sequence input to the touch panel exists and the gesture indicated by that coordinate sequence, so a display scene creation program that can transition display scenes can be provided without requiring a processing program that associates the touch panel with gestures.
 To achieve the above object, a display system with a touch panel according to one embodiment of the present invention includes a display device and a touch panel having, over the entire display area of the display device, a detection area for detecting a user's contact. In the display scene displayed by the display device, a plurality of display components can be set in a plurality of layers in an at least partially overlapping state, and mutually different gestures can be set for those overlapping display components. The display system includes a display control unit that, when the touch panel detects a user's contact in the display scene displayed in the display area of the display device, displays the transition-destination display scene in the display area based on the display component at which the contact was detected, the layer of that display component, and the gesture input on the display component.
 According to the above configuration, it is possible to provide a display system with a touch panel that can transition a display scene by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with gestures.
 In the above display system with a touch panel, the display component is preferably defined by a rectangular area represented by coordinates in the display area of the display device, and the display control unit preferably displays the transition-destination display scene in the display area of the display device when, upon the touch panel detecting the user's contact in the displayed display scene, both the rectangular area in which the coordinate sequence of the detected contact exists and the gesture indicated by that coordinate sequence match the display component's area and the gesture associated with that area, respectively.
 According to this configuration, the transition-destination display scene can be read out by determining whether the rectangular area in which the coordinate sequence input to the touch panel exists and the gesture indicated by that coordinate sequence both match the display component's area and the gesture associated with it; a display system with a touch panel that can transition display scenes can thus be provided without requiring a processing program that associates the touch panel with gestures.
In the above display system with a touch panel, the display device is preferably a liquid crystal display device.
A cockpit module according to an embodiment of the present invention is a cockpit module mounted around the cockpit of a moving body, and includes the display system with a touch panel according to any of the above configurations.
Furthermore, a moving body according to an embodiment of the present invention includes the display system with a touch panel according to any of the above configurations, and the display device is mounted at a position visible at least from the cockpit.
In the moving body according to an embodiment of the present invention, it is preferable that the moving body is an automobile and that the display system with a touch panel is connected to the ECUs (Electronic Control Units) of the respective parts of the automobile via a CAN (Controller Area Network).
As described above, it is possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body that can transition between display scenes when a gesture is input on the touch panel, without requiring a processing program that associates the touch panel with gestures.
[Embodiment]
Hereinafter, a more specific embodiment of the display scene creation system according to the present invention will be described in detail with reference to the drawings. In this embodiment, an in-vehicle display system with a touch panel is described as a specific example, but the application of the display system with a touch panel according to the present invention is not limited to in-vehicle use.
FIG. 1 is a block diagram showing the overall configuration of a display scene creation system 100 according to an embodiment of the present invention. The display scene creation system 100 is composed of an instrument panel development support tool 110 and a scene design director 120. Using these tools, the user creates display scenes in advance on a terminal such as a personal computer. In the following description, a display scene is referred to as a scene design, and a display component within a scene design is referred to as an item. Within one scene design, a plurality of items can be assigned to a plurality of layers; that is, a plurality of items can be set so as to overlap one another. Overlapping items are displayed by alpha blending (translucent composition) so that the lower layers show through. The instrument panel development support tool 110 is a tool for creating scene designs, and the scene design director 120 is a tool for creating scene design transition information.
The instrument panel development support tool 110 includes a scene design setting unit 111 (display scene design setting unit), an item table 112, and an item setting unit 113 (display component setting unit). The user sets a scene design with the scene design setting unit 111. The item table 112 stores the items that can be displayed in a scene design, each defined by a rectangular region represented by coordinates within the scene design. Using the item setting unit 113, the user reads one or more items from the item table 112 and places them in the scene design set with the scene design setting unit 111.
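As a concrete illustration, the relationship between the item table and item registration can be sketched as plain data structures. This is a minimal sketch; the schema, helper name, and the image size for `AC-Under2.png` are assumptions, not the tool's actual format:

```python
# Hypothetical item table: image assets keyed by file name, with assumed sizes.
item_table = {
    "AC-Under2.png": {"width": 280, "height": 64},  # size is an assumption
}

# A scene design holding its two screens.
scene = {"name": "Initial", "screens": {1: [], 2: []}}

def register_item(scene, screen, file_name, area_name, x, y, layer=0):
    """Place an item from the item table into a scene design screen.
    The item's rectangular region is derived from its coordinates and size."""
    size = item_table[file_name]
    scene["screens"][screen].append({
        "file": file_name,
        "area": area_name,
        "rect": (x, y, size["width"], size["height"]),
        "layer": layer,
    })

# Mirrors one registration from FIG. 3: file "AC-Under2.png", area "AC", (0, 416).
register_item(scene, 1, "AC-Under2.png", "AC", 0, 416)
print(scene["screens"][1][0]["rect"])
```

Each registered item thus carries the coordinate-defined rectangle that the display control later uses for region determination.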
The user creates a scene design using the instrument panel development support tool 110 configured as described above.
The scene design creation process in which the user creates a scene design with the instrument panel development support tool 110 is described below with reference to the flowchart of FIG. 2, taking the creation of a scene design named "Initial" as an example. When displayed on a display device with a touch panel, the scene design "Initial" consists of a screen 1 and a screen 2; in this embodiment, screen 2 is the screen that corresponds to the touch panel.
First, the user enters the scene design name "Initial" in the scene design setting unit 111 (step S201).
Next, the user uses the scene design setting unit 111 to select the screen on which items are to be registered, from screen 1 and screen 2 of the scene design "Initial" (step S202). Here, it is assumed that screen 1 is selected first.
The user registers still image items on the selected screen 1 (step S203). Referring to the item table 112, the user selects the image file name of each still image item with the item setting unit 113 and enters and registers its display area name and coordinate values. Here, as shown in FIG. 3, the following still image items are registered: file name "AC-Under2.png", display area name "AC", coordinates (0, 416); file name "Temp240.png", display area name "DriverTemp", coordinates (280, 440); file name "U04-07.png", display area name "DriverFuuryou7", coordinates (392, 440); file name "U03-01.png", display area name "DriverFukidasi1", coordinates (488, 424); file name "Temp220.png", display area name "PassengerTemp", coordinates (8, 440); file name "U04-07.png", display area name "PassengerFuuryou7", coordinates (112, 440); and file name "U03-01.png", display area name "PassengerFukidasi1", coordinates (208, 424).
The user registers digital meter items on the selected screen 1 (step S204). With the item setting unit 113, the user sets the font of each digit of the digital meter and enters and registers the name, display area name, and coordinate values of the meter. Here, a date meter is registered with display area name "Date2" at coordinates (600, 424), and a time meter is registered with display area name "Time" at coordinates (680, 456).
Next, the user groups the still image items and digital meters registered on the selected screen 1 into a frame (step S205).
The user also registers a video/NTSC item on the selected screen 1 (step S206). The user enters and registers the name of the display area in which video from a preconfigured device such as a navigation unit is displayed. Here, the display area name "Map" is registered.
Next, the user uses the scene design setting unit 111 to select screen 2, the next screen on which items are to be registered, from screen 1 and screen 2 of the scene design "Initial" (step S207).
Next, the user registers still image items on the selected screen 2 (step S208). Referring to the item table 112, the user selects the image file name of each still image item with the item setting unit 113 and enters and registers its display area name and coordinate values. Here, as shown in FIG. 4, the following nine still image items are registered. The registration information below is merely an example; items may be registered in any manner.
As the first still image item, file name "BlackBack.png", display area name "Back", coordinates (0, 0), and layer "0" are registered. This is the item 61 displayed as the background of screen 2 shown in FIG. 6.
As the second still image item, file name "TitleMainMenu.png", display area name "TitleMainMenu", coordinates (500, 0), and layer "0" are registered. This is the item 62 displayed as the screen title ("Main Menu") on screen 2 shown in FIG. 6.
As the third still image item, file name "Navi-ButtonOff.png", display area name "Navi-ButtonOff", coordinates (272, 96), and layer "0" are registered. This is the item 63 displayed as the navigation control button on screen 2 shown in FIG. 6.
As the fourth still image item, file name "AirConButtonOff.png", display area name "AirConButtonOff", coordinates (536, 96), and layer "0" are registered. This is the item 64 displayed as the air conditioner control button on screen 2 shown in FIG. 6.
As the fifth still image item, file name "AudioButtonOff.png", display area name "AudioButtonOff", coordinates (8, 288), and layer "0" are registered. This is the item 65 displayed as the audio control button on screen 2 shown in FIG. 6.
As the sixth still image item, file name "CameraButtonOff.png", display area name "CameraButtonOff", coordinates (272, 288), and layer "0" are registered. This is the item 66 displayed as the camera control button on screen 2 shown in FIG. 6.
As the seventh still image item, file name "MeterButtonOff.png", display area name "MeterButtonOff", coordinates (536, 288), and layer "0" are registered. This is the item 67 displayed as the meter control button on screen 2 shown in FIG. 6.
As the eighth still image item, file name "NaviLower1Off.png", display area name "NaviLower1Off", coordinates (262, 278), and layer "1" are registered. This is the item 68, which partially overlaps item 63 and is displayed on a layer below it on screen 2 shown in FIG. 6. Although omitted from FIG. 6, item 68 is displayed so that it shows through beneath item 63. An alpha parameter for alpha blending the overlapping region of item 68 and item 63 may also be set here.
As the ninth still image item, file name "NaviLower2Off.png", display area name "NaviLower2Off", coordinates (252, 268), and layer "2" are registered. This is the item 69, which partially overlaps items 63 and 68 and is displayed on a layer below both of them on screen 2 shown in FIG. 6. Although omitted from FIG. 6, item 69 is displayed so that it shows through beneath items 63 and 68. An alpha parameter for alpha blending the overlapping region of item 69 with items 63 and 68 may also be set here.
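The alpha blending (translucent composition) used for these overlapping layers combines each pixel of an upper layer with the pixel beneath it. A minimal per-pixel sketch; representing the alpha parameter as a float in [0, 1] is an assumption about its encoding:

```python
def alpha_blend(upper, lower, alpha):
    """Blend one RGB pixel of an upper layer over a lower layer.
    alpha = 1.0 shows only the upper layer; alpha = 0.0 only the lower,
    so intermediate values let the lower layer show through."""
    return tuple(round(alpha * u + (1 - alpha) * l)
                 for u, l in zip(upper, lower))

# E.g. item 63 (upper) rendered half-transparent over item 68 (lower):
print(alpha_blend((255, 0, 0), (0, 0, 255), 0.5))
```

Applied per layer from bottom to top, this is what makes item 68 visible through item 63 in the overlapping region.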
Next, the user registers sub-event items on the selected screen 2 (step S209). Referring to the item table 112, the user selects the image file name of each sub-event item with the item setting unit 113 and enters and registers its display area name and coordinate values. Here, as shown in FIG. 5, the following sub-event items are registered: file name "Navi-ButtonOn.png", sub-event name "NaviButtonOn", display area name "NaviButton", coordinates (272, 96); file name "AirConButtonOn.png", sub-event name "AirconButtonOn", display area name "AirConButton", coordinates (536, 96); file name "AudioButtonOn.png", sub-event name "AudioButtonOn", display area name "AudioButton", coordinates (8, 288); file name "CameraButtonOn.png", sub-event name "CameraButtonOn", display area name "CameraButton", coordinates (272, 288); and file name "MeterButtonOn.png", sub-event name "MeterButtonOn", display area name "MeterButton", coordinates (536, 288). In addition, file name "NaviLower1On.png", sub-event name "NaviLower1On", display area name "NaviLower1", coordinates (262, 278) and file name "NaviLower2On.png", sub-event name "NaviLower2On", display area name "NaviLower2", coordinates (252, 268) are registered.
As described above, screen 1 and screen 2 on which items have been registered using the instrument panel development support tool 110, that is, the scene design "Initial", appear as shown in FIG. 6.
For the scene design "Initial" created in this way, the user creates scene design transition information using the scene design director 120 in cooperation with the instrument panel development support tool 110.
The scene design director 120 includes a gesture table 121, a gesture setting unit 122, and a scene design transition table 123 (transition display scene table). The gesture table 121 stores gesture patterns; for example, the gesture table 121 in the specific example shown in FIG. 7 stores 15 gesture patterns. Referring to the gesture table 121, the user uses the gesture setting unit 122 to set the gesture pattern to which an item set with the item setting unit 113 of the instrument panel development support tool 110 responds. The scene design transition table 123 stores transition information that associates the gestures set with the gesture setting unit 122 with the transition destination scene designs.
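Conceptually, associating an item's sub-event with gesture numbers and a destination produces one row of the transition table. A sketch under an assumed schema (field names, the numeric gesture keys, and the builder function are all hypothetical):

```python
# Hypothetical gesture table: the 15 patterns of FIG. 7 keyed by number;
# the textual descriptions are placeholders, not the patented patterns.
gesture_table = {n: f"pattern-{n}" for n in range(1, 16)}

def make_transition(scene, sub_event, gesture_ids, execute, ms, next_scene):
    """Build one row of the scene design transition table, validating that
    every referenced gesture number exists in the gesture table."""
    gesture_ids = set(gesture_ids)
    assert gesture_ids <= set(gesture_table), "unknown gesture number"
    return {"scene": scene, "sub_event": sub_event, "gestures": gesture_ids,
            "execute": execute, "transition_ms": ms, "next": next_scene}

# One of the settings made later for the scene design "Initial":
row = make_transition("Initial", "NaviButtonOn", range(1, 12),
                      "NaviButton", 100, "Navi")
print(row["next"])
```

The table stores only data, which is why no per-gesture handler program is needed at display time.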
The user creates scene design transition information using the scene design director 120 configured as described above. The scene design transition information creation process is described below with reference to the flowchart of FIG. 8, taking the creation of transition information for the scene design "Initial" as an example.
First, the user selects and registers the variable "TouchPanel", which is the execution condition of the sub-events, with the gesture setting unit 122 (step S801).
Next, the user selects and registers the scene design in which the sub-events are displayed with the gesture setting unit 122 (step S802). Here, it is assumed that the scene design "Initial" is selected.
Next, the user uses the gesture setting unit 122 to display thumbnails of the sub-events shown in the selected scene design "Initial", and selects the sub-event to be registered from among them (step S803).
Next, referring to the gesture table 121, which stores the 15 gesture patterns, the user selects and registers the gesture pattern to which the selected sub-event responds with the gesture setting unit 122 (step S804).
Next, the user enters and registers with the gesture setting unit 122 the name of the sub-event to be executed when a gesture to which the sub-event responds is input (step S805).
Next, the user performs a transition setting with the gesture setting unit 122 so that, after the sub-event with the registered name has been executed, a transition is made to the specified scene design after a specified time (step S806).
Here, as shown in FIG. 9, the following scene design transition information is set: scene design "Initial", sub-event "NaviButtonOn", gestures "1-11", sub-event to execute "NaviButton", transition time 100 ms, transition scene name "Navi"; scene design "Initial", sub-event "AirconButtonOn", gestures "all", sub-event to execute "AirconButton"; scene design "Initial", sub-event "AudioButtonOn", gestures "all", sub-event to execute "AudioButton"; and scene design "Initial", sub-event "MeterButtonOn", gestures "all", sub-event to execute "MeterButton", transition time 100 ms, transition scene name "Meter".
In addition, scene design "Initial", sub-event "NaviLower1On", gestures "12-13", sub-event to execute "NaviLower1", transition time 100 ms, and transition scene name "Navi1" are set. Similarly, scene design "Initial", sub-event "NaviLower2On", gesture "14", sub-event to execute "NaviLower2", transition time 100 ms, and transition scene name "Navi2" are set.
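Taken together, the settings above amount to a small lookup table. A sketch of those FIG. 9 rows and a matching lookup, under an assumed tuple schema (the `lookup` helper is hypothetical):

```python
# (scene, sub_event, gestures, sub_event_to_execute, transition_ms, next_scene)
transition_table = [
    ("Initial", "NaviButtonOn",   range(1, 12),  "NaviButton",   100,  "Navi"),
    ("Initial", "AirconButtonOn", "all",         "AirconButton", None, None),
    ("Initial", "AudioButtonOn",  "all",         "AudioButton",  None, None),
    ("Initial", "MeterButtonOn",  "all",         "MeterButton",  100,  "Meter"),
    ("Initial", "NaviLower1On",   range(12, 14), "NaviLower1",   100,  "Navi1"),
    ("Initial", "NaviLower2On",   (14,),         "NaviLower2",   100,  "Navi2"),
]

def lookup(scene, sub_event, gesture_id):
    """Find the table entry matching a scene, a sub-event, and a gesture
    number; 'all' entries accept any gesture."""
    for s, ev, gestures, execute, ms, nxt in transition_table:
        if s == scene and ev == sub_event and (
                gestures == "all" or gesture_id in gestures):
            return execute, ms, nxt
    return None

print(lookup("Initial", "NaviLower2On", 14))
```

A row with no transition time or scene name (such as "AirconButtonOn") executes its sub-event without a scene transition.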
Note that different gestures must be set for sub-events corresponding to display areas (items) that are set to overlap one another. That is, by setting overlapping items to respond to mutually different gestures, correct input can be accepted without malfunction even though the display areas of the items overlap.
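This disjoint-gesture rule can be checked mechanically. A sketch using the gesture numbers set above for items 63, 68, and 69 (the `dispatch` helper is hypothetical):

```python
# Overlapping items 63, 68, 69 are disambiguated by gesture, not by position.
# Gesture numbers follow the settings made for the scene design "Initial".
overlapping = [
    ("NaviButton", set(range(1, 12))),  # item 63 reacts to gestures 1-11
    ("NaviLower1", {12, 13}),           # item 68 reacts to gestures 12-13
    ("NaviLower2", {14}),               # item 69 reacts to gesture 14
]

def dispatch(gesture_id):
    """Even though the three rectangles overlap, at most one sub-event
    fires, because the gesture sets are disjoint."""
    hits = [name for name, gestures in overlapping if gesture_id in gestures]
    assert len(hits) <= 1, "overlapping items must use disjoint gestures"
    return hits[0] if hits else None

print(dispatch(13))
```

If two overlapping items ever shared a gesture number, the assertion would flag the ambiguity at configuration time rather than letting it surface as a runtime malfunction.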
The scene design director 120 then associates the scene designs registered with the instrument panel development support tool 110 with the scene designs registered with the scene design director 120.
The user downloads the scene designs created with the instrument panel development support tool 110 and the scene design transition information created with the scene design director 120 to the display system with a touch panel 200, described in detail below, and uses them there.
An embodiment of the present invention applied to an automobile (passenger car) will now be described in detail with reference to the drawings. The application of the present invention is not limited to automobiles. Besides automobiles, the present invention can be applied to various vehicles (means of movement or transport) such as motorcycles, motor tricycles, special-purpose vehicles, railway vehicles and other surface vehicles, amphibious vehicles, aircraft, and ships. Furthermore, the present invention is applicable not only to vehicles whose main purpose is movement or transport, but also to simulators that provide a simulated experience of operating the various vehicles described above. In the present application, such vehicles, simulators, and the like are collectively referred to as "moving bodies".
An automotive cockpit module incorporating the display system with a touch panel 200 according to the present embodiment includes, in place of a conventional automotive instrument panel with analog instruments such as a speedometer and tachometer and indicator lamps composed of LEDs and the like, a liquid crystal display device 210 that displays a composite image of an automotive instrument panel.
The liquid crystal display device 210 is not a segment-type liquid crystal display of the kind widely used in conventional automobiles, but a dot matrix liquid crystal panel display device. Because it can display images of arbitrary patterns, the liquid crystal display device 210 functions as an automotive information display device by displaying composite images that combine various element images such as instruments and indicator lamps. In addition to the instrument panel image, the liquid crystal display device 210 can also display images captured by an on-board camera installed at the rear or side of the automobile, navigation images, received television broadcast images, playback images from an on-board DVD player, and the like.
The liquid crystal display device 210 is attached to an instrument panel (not shown), which is the frame of a cockpit module (not shown), so as to be positioned behind the steering wheel (not shown). In addition to the liquid crystal display device 210, the cockpit module includes an air conditioning unit (not shown), air conditioning ducts (not shown) that introduce air from the air conditioning unit into the vehicle, an audio module (not shown), lamp switches (not shown), a steering mechanism (not shown), an airbag module (not shown), and the like. The liquid crystal display device 210 may alternatively be arranged in the center of the instrument panel, that is, between the driver's seat and the passenger seat.
FIG. 10 is a block diagram showing an example of the overall configuration of the display system with a touch panel 200 according to the present embodiment. The display system with a touch panel 200 includes liquid crystal display devices 210 (210a, 210b), a touch panel 220, a flash ROM (scene design storage unit 230 and scene design transition information storage unit 240), a video processing LSI, a DPF-ECU 250 (display control unit), a CAN microcomputer, a CPU I/F, and a RAM.
A touch panel 220 having a detection area that detects the user's contact is installed over the entire display area of the liquid crystal display device 210. The scene designs created with the instrument panel development support tool 110 are downloaded to and stored in the scene design storage unit 230, and the scene design transition information created with the scene design director 120 is downloaded to and stored in the scene design transition information storage unit 240.
The scene design displayed on the liquid crystal display device 210 is controlled by the DPF-ECU 250, which is connected via an in-vehicle LAN to the various ECUs provided in each part of the automobile. Via the in-vehicle LAN, the DPF-ECU 250 acquires information representing the state of each part of the automobile (state information, hereinafter collectively referred to as state information D except where a distinction is necessary) from each ECU at a predetermined cycle. The "predetermined cycle" is set to an arbitrary length according to the specifications of the in-vehicle LAN. The transmission cycles of the state information D from the respective ECUs may also differ from one another; in that case, the sampling cycle of the state information D in the DPF-ECU 250 may be matched to the transmission cycle of each piece of state information. The in-vehicle LAN interface standard to which the present invention is applicable is not limited to CAN; any in-vehicle network conforming to various in-vehicle LAN interface standards such as LIN (Local Interconnect Network), MOST (Media Oriented Systems Transport), or FlexRay can also be used with the present invention.
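The periodic acquisition of state information D can be sketched as a polling loop that always keeps the newest values. The bus mock, field names, and values below are illustrative assumptions, not part of the CAN protocol or the patented system:

```python
import time

class MockCanBus:
    """Stands in for the in-vehicle LAN interface; in a real system this
    would read CAN frames sent by the various ECUs."""
    def read_latest(self):
        # Hypothetical decoded state information D
        return {"engine_rpm": 2400, "speed_kmh": 62, "fuel_pct": 48}

def sample_state(bus, cycles, period_s=0.0):
    """Poll the bus at a fixed cycle, keeping only the newest state info.
    period_s models the 'predetermined cycle' of the in-vehicle LAN."""
    state = {}
    for _ in range(cycles):
        state.update(bus.read_latest())
        time.sleep(period_s)
    return state

print(sample_state(MockCanBus(), cycles=3))
```

Matching the sampling cycle to each ECU's transmission cycle, as the text suggests, would simply mean running one such loop per source with its own `period_s`.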
The DPF-ECU 250 reflects the acquired state information of the automobile in the scene designs created with the instrument panel development support tool 110 of the display scene creation system 100, and displays the resulting scene designs in the display area of the liquid crystal display device 210.
As described above, "state information" is information representing the state of each part of the automobile. Besides information on the mechanical operating state of each part (for example, travel speed and engine speed), it may include various information on states not directly related to mechanical operation (for example, remaining fuel and cabin temperature). The following are merely examples for a passenger car and do not limit the present invention: engine speed, travel speed, select position, shift position, operating state of the direction indicators, lighting state of the lamps, open/closed state of the doors and trunk, door lock state, tire state, presence or absence of airbag abnormalities, seat belt fastening state, outlet temperature of the air conditioner, cabin temperature, outside air temperature, state of on-board AV equipment, setting state of the automatic driving function, operating state of the wipers, remaining fuel, remaining battery charge, degree of dependence between engine and battery (in the case of a hybrid vehicle), remaining oil, radiator temperature, engine temperature, and the like.
The DPF-ECU 250 also acquires moving images, such as navigation images, from a moving-image generating device (not shown) such as a navigation unit installed in the vehicle, reflects the acquired moving images in the scene design created with the instrument panel development support tool 110 of the display scene creation system 100, and displays the resulting scene design in the display area of the liquid crystal display device 210.
Furthermore, when the touch panel 220 detects a user's touch while a scene design is displayed in the display area of the liquid crystal display device 210, the DPF-ECU 250 checks whether both the rectangular region containing the coordinate sequence of the detected touch and the gesture indicated by that coordinate sequence match a sub-event and the gesture associated with that sub-event, respectively. If both match, the DPF-ECU 250 refers to the scene design transition information storage unit 240, reads the corresponding next transition-destination scene design from the scene design storage unit 230, and displays it in the display area of the liquid crystal display device 210.
The touch panel and display processing by which the scene design displayed on the liquid crystal display device 210 transitions will now be described with reference to the flowchart of FIG. 11.
First, the DPF-ECU 250 determines whether the touch panel 220 has detected a user's touch (step S1101). If it is determined in step S1101 that no touch was detected, the DPF-ECU 250 ends the process. If it is determined that a touch was detected, the DPF-ECU 250 identifies the rectangular region containing the coordinate sequence of the detected touch (step S1102), and then identifies the gesture indicated by that coordinate sequence (step S1103). Here, the CAN microcomputer performs the region determination based on the X and Y coordinate values supplied by the touch panel 220 and the information registered in the scene design transition information storage unit 240, and thereby identifies the rectangular region. That is, taking the first value of the sequence of X, Y coordinate values supplied by the touch panel 220, the region determination is performed against the image information registered in the scene design storage unit 230 (the upper-left XY coordinates and the width and height of each image); if a matching rectangle is found, processing proceeds. The gesture is then determined from the sequence of X, Y coordinate values supplied by the touch panel. Finally, from the matching rectangle and the gesture, it is determined whether a matching event exists. In short, the CAN microcomputer first performs the region determination using the image information registered in the scene design storage unit 230, and then performs the determination against the gestures registered in the scene design transition information storage unit 240.
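Steps S1102 and S1103 amount to a point-in-rectangle test on the first coordinate of the touch sequence, followed by classification of the whole sequence as a gesture. The following is a minimal sketch of that logic: the rectangle format (upper-left XY coordinates plus width and height) follows the description above, while the gesture classifier and its threshold are assumptions, not taken from the patent.

```python
def find_rectangle(coords, rectangles):
    """Return the first rectangle (x, y, w, h) containing the first
    touch coordinate, mirroring the region determination of step S1102."""
    x0, y0 = coords[0]
    for rect in rectangles:
        x, y, w, h = rect
        if x <= x0 <= x + w and y <= y0 <= y + h:
            return rect
    return None

def classify_gesture(coords, threshold=30):
    """Rough stand-in for step S1103: label the coordinate sequence as
    a tap or a horizontal/vertical swipe from its first and last points."""
    (x0, y0), (x1, y1) = coords[0], coords[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < threshold and abs(dy) < threshold:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

# Two stacked button regions and a rightward drag inside the lower one:
rects = [(0, 0, 100, 50), (0, 60, 100, 50)]
touch = [(10, 70), (60, 72)]
```

With this sketch, `find_rectangle(touch, rects)` selects the lower rectangle and `classify_gesture(touch)` yields a rightward swipe, which together form the (region, gesture) pair checked in the subsequent steps.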
Next, the DPF-ECU 250 determines whether the identified rectangular region containing the coordinate sequence matches a sub-event (step S1104).
If it is determined in step S1104 that they do not match, the DPF-ECU 250 ends the process. If it is determined in step S1104 that they match, the DPF-ECU 250 determines whether the gesture indicated by the identified coordinate sequence matches the gesture associated with the sub-event (step S1105).
If it is determined in step S1105 that they do not match, the DPF-ECU 250 ends the process. If it is determined in step S1105 that they match, the DPF-ECU 250 determines whether to perform a scene design transition process (step S1106). If it is determined in step S1106 that the scene design transition process is to be performed, the DPF-ECU 250 refers to the scene design transition information storage unit 240, displays the sub-event blinking for the set transition time, and then displays the transition-destination scene design read from the scene design storage unit 230 in the display area of the liquid crystal display device 210 (step S1107). If it is determined in step S1106 that the scene design transition process is not to be performed, the DPF-ECU 250 performs display switching based on the sub-event (step S1108).
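The decisions of steps S1104 through S1108 reduce to a lookup keyed on the matched sub-event and gesture. Below is a hypothetical sketch of that dispatch; the table layout and all entries are invented for illustration (in the real system this information lives in the scene design transition information storage unit 240).

```python
# Hypothetical transition table: (sub-event, gesture) -> action.
# An action is either a scene transition with a blink time (step S1107)
# or a sub-event display switch within the current scene (step S1108).
TRANSITIONS = {
    ("NaviButtonOn", "tap"): {"kind": "scene", "target": "Navi", "blink_ms": 100},
    ("MeterButtonOn", "tap"): {"kind": "scene", "target": "Meter", "blink_ms": 100},
    ("VolumeSlider", "swipe_right"): {"kind": "subevent", "target": "VolumeUp"},
}

def dispatch(subevent, gesture, current_scene):
    """Emulate steps S1104-S1108: end the process on a mismatch,
    otherwise transition scenes or switch the display in place."""
    action = TRANSITIONS.get((subevent, gesture))
    if action is None:              # S1104/S1105: no matching event
        return current_scene
    if action["kind"] == "scene":   # S1106 -> S1107: blink, then transition
        # (blink the sub-event for action["blink_ms"] ms before switching)
        return action["target"]
    # S1106 -> S1108: display switch by sub-event; scene is unchanged
    return current_scene
```

For example, `dispatch("NaviButtonOn", "tap", "Initial")` would return `"Navi"`, while a gesture with no table entry would leave the current scene unchanged.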
For example, in "MainMenu" of the scene design "Initial", when an input by some gesture is made on the touch panel 220 for the sub-event "NaviButtonOn", the sub-event "NaviButtonOn" blinks for 100 ms, and then the transition-destination scene design "Navi" is displayed in the display area of the liquid crystal display device 210 as shown in FIG. 12. Similarly, in "MainMenu" of the scene design "Initial", when an input by some gesture is made on the touch panel 220 for the sub-event "MeterButtonOn", the sub-event "MeterButtonOn" blinks for 100 ms, and then the transition-destination scene design "Meter" is displayed in the display area of the liquid crystal display device 210 as shown in FIG. 13.
As described above, according to the present invention, it is possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body in which a scene design can be transitioned by inputting a gesture on the touch panel, without requiring a processing program that associates the touch panel with gestures.
Further, since each item is defined as a rectangular region represented by coordinates in the display area of the display device, the transition-destination scene design can be read out by determining whether the rectangular region containing the coordinate sequence input to the touch panel and the gesture indicated by that coordinate sequence match the region of a sub-event and the gesture associated with that sub-event, respectively. This makes it possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body in which a scene design can be transitioned without requiring a processing program that associates the touch panel with gestures.
In addition, since the scene design can be transitioned by the DPF-ECU 250 without requiring a processing program that associates the touch panel with gestures, the system can be realized simply and at low cost.
In addition to displaying the state of a moving body such as a vehicle, the display system with a touch panel according to the present embodiment can also display any other images (still or moving) and additional information such as character information, for example, video of the scenery outside the vehicle, video stored on a storage medium installed in the vehicle, and video obtained by communication with the outside.
Furthermore, although a liquid crystal display device is used in the above-described embodiment, the present invention is not limited to a display system with a touch panel that uses a liquid crystal display device. Any display device can be used as long as at least the portion that displays the scene design is a dot-matrix display device.
The present invention is also not limited to an in-vehicle display system with a touch panel incorporated in an instrument panel as described above. The present invention is applicable to any display system with a touch panel that has a function of transitioning a display scene according to an input gesture, and its uses and hardware configurations are diverse. To give only a few examples, it is applicable to game machines, mobile phones, portable music players, PDAs (Personal Digital Assistants), vending machines, interactive guidance display boards, search terminal devices, intercoms, liquid crystal photo frames, and so on.
The present invention also covers the case in which a software program implementing the above-described embodiment (in the embodiment, a program corresponding to the flowchart shown in the figure) is supplied to an apparatus, and the computer of that apparatus reads and executes the supplied program. Therefore, the program itself, installed on a computer in order to realize the functional processing of the present invention on that computer, also realizes the present invention. That is, the present invention also includes a program for realizing the functional processing of the present invention.
The configuration described in the above embodiment merely shows a specific example and does not limit the technical scope of the present invention. Any configuration can be adopted within the scope in which the effects of the present invention are obtained.
The present invention is industrially applicable as a touch panel having a function of transitioning a display scene according to an input gesture, a display system including the touch panel, and a system for creating display scenes.

Claims (10)

  1.  A display scene creation system comprising:
     a display scene design setting unit for setting a design of a display scene displayed on a touch panel;
     a display component setting unit for setting one or more display components displayed within the display scene design set by the display scene design setting unit;
     a gesture setting unit for setting a gesture whose input on a display component set by the display component setting unit causes the display scene to transition; and
     a transition display scene table for storing, in association with each other, the gesture set by the gesture setting unit and the transition-destination display scene,
     wherein the display component setting unit can set a plurality of display components on a plurality of layers in an at least partially overlapping state, and
     the gesture setting unit can set mutually different gestures for the plurality of display components set in the at least partially overlapping state.
  2.  The display scene creation system according to claim 1, wherein the display component setting unit sets a display component defined as a rectangular region represented by coordinates within the display scene.
  3.  A program for causing a computer to execute a process of creating a display scene displayed on a touch panel, the program causing the computer to execute:
     a step of setting a design of the display scene;
     a step of setting one or more display components displayed within the display scene design set in the step of setting the design of the display scene;
     a step of setting a gesture whose input on a display component set in the step of setting the display components causes the display scene to transition; and
     a step of associating the gesture set in the step of setting the gesture with a transition-destination display scene,
     wherein, in the step of setting the display components, a plurality of display components can be set on a plurality of layers in an at least partially overlapping state, and
     in the step of setting the gesture, mutually different gestures can be set for the plurality of display components set in the at least partially overlapping state.
  4.  The display scene creation program according to claim 3, wherein, in the step of setting the display components, a display component defined as a rectangular region represented by coordinates within the display scene is set.
  5.  A display system with a touch panel, comprising: a display device; and a touch panel having, over the entire display area of the display device, a detection area for detecting a user's touch,
     wherein, in a display scene displayed by the display device, a plurality of display components can be set on a plurality of layers in an at least partially overlapping state, and mutually different gestures can be set for the plurality of display components set in the at least partially overlapping state, and
     the display system comprises a display control unit that, when the touch panel detects a user's touch while the display scene is displayed in the display area of the display device, displays a transition-destination display scene in the display area of the display device based on the display component on which the user's touch was detected, the layer of that display component, and the gesture input on that display component.
  6.  The display system with a touch panel according to claim 5, wherein the display components are defined as rectangular regions represented by coordinates in the display area of the display device, and
     when the touch panel detects the user's touch while the display scene is displayed in the display area of the display device, the display control unit displays the transition-destination display scene in the display area of the display device if both the rectangular region containing the coordinate sequence of the detected touch and the gesture indicated by that coordinate sequence match the region of a display component and the gesture associated with the region of that display component, respectively.
  7.  The display system with a touch panel according to claim 5 or 6, wherein the display device is a liquid crystal display device.
  8.  A cockpit module to be mounted around the cockpit of a moving body, comprising the display system with a touch panel according to any one of claims 5 to 7.
  9.  A moving body comprising the display system with a touch panel according to any one of claims 5 to 7, wherein the display device is mounted at a position visible at least from the cockpit.
  10.  The moving body according to claim 9, wherein the moving body is an automobile, and the display system with a touch panel is connected to an ECU (Electronic Control Unit) of each part of the automobile by a CAN (Controller Area Network).
PCT/JP2011/076542 2010-11-19 2011-11-17 Display scene creation system WO2012067193A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010259247 2010-11-19
JP2010-259247 2010-11-19

Publications (1)

Publication Number Publication Date
WO2012067193A1 true WO2012067193A1 (en) 2012-05-24

Family

ID=46084114

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/076542 WO2012067193A1 (en) 2010-11-19 2011-11-17 Display scene creation system

Country Status (1)

Country Link
WO (1) WO2012067193A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07152356A (en) * 1993-11-26 1995-06-16 Toppan Printing Co Ltd Display controller
WO2010113350A1 (en) * 2009-03-31 2010-10-07 シャープ株式会社 Display scene creation system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11842143

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11842143

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP