WO2010113350A1 - Display Scene Creation System - Google Patents
Display Scene Creation System
- Publication number
- WO2010113350A1 (application PCT/JP2009/068994)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- scene
- gesture
- touch panel
- design
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/65—Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
- B60K35/654—Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive the user being the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
Definitions
- the present invention relates to technology for a display device with a touch panel, and more specifically to a display scene creation system, a display scene creation program, and a display system with a touch panel in which the image presented to a user transitions in response to a gesture input on the touch panel.
- display devices with touch panels are widely used in various fields such as game machines, mobile phones, PDAs, vending machines, and information boards. Because the display on the touch panel is associated with gesture input from the touch panel, the user can operate the device intuitively.
- Patent Document 1 proposes a technique for a portable terminal having a touch panel display in which, when a gesture is input on the touch panel display, a function assigned to the gesture is executed and the display scene transitions according to the execution result.
- Patent Document 2 proposes a technique for a game system using touch panel input in which, when a gesture is input on the touch panel display, an attack corresponding to the figure drawn by the gesture is performed on an enemy character, and the display scene changes according to the result of the attack.
- a conventional display system with a touch panel, however, has no general-purpose mechanism for associating the touch panel with gestures. Display scene transitions are therefore realized by a processing program that associates the touch panel with each gesture.
- such a processing program must be created for each display scene, which takes considerable time and effort to develop. For example, when different gestures may be input in the same area, the program becomes complicated and the man-hours enormous. Moreover, a sophisticated program is required to achieve good recognition accuracy, so development within a limited time can be impractical.
- the present invention has been made in view of the above problems. Its object is to provide a display scene creation system, a display scene creation program, and a display system with a touch panel that can transition a display scene in response to a gesture input on the touch panel without requiring a processing program that associates the touch panel with the gesture.
- a display scene creation system according to the present invention includes:
- a display scene design setting unit that sets a display scene design;
- a display component setting unit that sets one or more display components displayed within the display scene design set by the display scene design setting unit;
- a gesture setting unit that sets a gesture in response to which the display scene transitions when the gesture is input to a display component set by the display component setting unit; and
- a transition display scene table that stores the gesture set by the gesture setting unit in association with the transition destination display scene.
- this configuration provides a display scene creation system capable of transitioning a display scene by inputting a gesture to the touch panel without requiring a processing program that associates the touch panel with the gesture.
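as a rough illustration of the transition display scene table described above, the mapping from display component and gesture to a transition destination scene can be held as plain data that a generic runtime looks up, rather than as per-scene program logic. The component names and gesture labels below are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of a transition display scene table: transitions are
# data looked up by a generic runtime, not a per-scene processing program.
transition_table = {
    # (display component, gesture) -> transition destination scene
    ("NaviButton", "tap"): "Navi",
    ("AirconButton", "tap"): "Aircon",
    ("AudioButton", "tap"): "Audio",
}

def next_scene(component, gesture):
    """Return the transition destination scene, or None if no entry matches."""
    return transition_table.get((component, gesture))
```

because the table is data, adding a new transition requires only a new entry, not new code.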
- the display scene creation system is characterized in that the display component setting unit sets a display component defined by a rectangular area represented by coordinates in the display scene.
- since the transition destination display scene can be read out based on the rectangular area in which the coordinate string input to the touch panel lies and the gesture indicated by that coordinate string within the rectangular area, it is possible to provide a display scene creation system capable of transitioning display scenes without requiring a processing program that associates the touch panel with the gesture.
- the display scene design setting unit assigns one layer for each display scene and sets the design of the display scene.
- a display scene creation program according to the present invention causes a computer to execute: a step of setting a display scene design; a step of setting one or more display components displayed within the set display scene design; a step of setting a gesture in response to which the display scene transitions when the gesture is input to a display component set in the display component setting step; and a step of associating the gesture set in the gesture setting step with the transition destination display scene.
- this provides a display scene creation program capable of transitioning a display scene by inputting a gesture to the touch panel without requiring a processing program that associates the touch panel with the gesture.
- a display component defined by a rectangular area represented by coordinates in the display scene is set.
- since the transition destination display scene can be read out based on the rectangular area in which the coordinate string input to the touch panel lies and the gesture indicated by that coordinate string within the rectangular area, it is possible to provide a display scene creation program capable of transitioning a display scene without requiring a processing program that associates the touch panel with the gesture.
- one layer is assigned to each display scene and the design of the display scene is set.
- a display system with a touch panel according to the present invention has a display device and a touch panel whose detection region for detecting a user's contact covers the entire display area of the display device.
- the display system includes a display control unit that, when the touch panel detects a user's contact within the display scene displayed in the display area of the display device, displays the transition destination display scene in the display area of the display device based on the display component at which the contact was detected and the gesture input to that display component.
- preferably, the display component is defined by a rectangular area represented by coordinates in the display area of the display device, and when the touch panel detects the user's contact in the displayed display scene, the display control unit displays the transition destination display scene in the display area of the display device if both the rectangular region in which the coordinate sequence of the detected contact lies and the gesture indicated by that coordinate sequence within the rectangular region match the display component's area and the gesture associated with that area.
- since the transition destination display scene can thus be read out simply by determining whether the two match, a display system with a touch panel capable of transitioning a display scene without requiring a processing program that associates the touch panel with a gesture can be provided.
- the display device is preferably a liquid crystal display device.
- a cockpit module according to the present invention is a module for a cockpit seat attached around the cockpit of a moving body, and includes the display system with a touch panel according to any one of the above configurations.
- a moving body according to the present invention includes the display system with a touch panel according to any one of the above configurations, and the display device is attached so as to be visible at least from the cockpit.
- the moving body is a car
- preferably, the display system with a touch panel is connected to the ECUs (Electronic Control Units) of each part of the car by a CAN (Controller Area Network).
- ECU: Electronic Control Unit
- CAN: Controller Area Network
- according to the present invention, a display scene creation system, a display scene creation program, and a display system with a touch panel that can change a display scene by inputting a gesture to the touch panel without requiring a processing program that associates the touch panel with a gesture, as well as a cockpit module and a moving body, can be provided.
- FIG. 1 is a block diagram showing the overall configuration of a display scene creation system according to an embodiment of the present invention.
- Flow diagram showing the flow of the scene design creation process for creating a scene design
- Diagram showing an example of registration of still image items on screen 1
- Diagram showing an example of registration of still image items on screen 2
- Diagram showing an example of registration of sub-event items on screen 2
- Diagram showing a screen example of the scene design "Initial"
- Diagram showing the gesture table
- Flow diagram showing the flow of the scene design transition information creation process for creating scene design transition information
- Diagram showing an example of transition information of a scene design
- FIG. 1 is a block diagram showing the overall configuration of a display scene creation system 100 according to an embodiment of the present invention.
- the display scene creation system 100 includes an instrument panel development support tool 110 and a scene design director 120. The user uses the instrument panel development support tool 110 and the scene design director 120 to create a display scene in advance on a terminal such as a personal computer.
- the display scene is referred to as a scene design
- the display component in the scene design is referred to as an item.
- One layer is assigned to one scene design.
- the instrument panel development support tool 110 is a tool for creating a scene design
- the scene design director 120 is a tool for creating transition information of the scene design.
- the instrument panel development support tool 110 includes a scene design setting unit 111 (display scene design setting unit), an item table 112, and an item setting unit 113 (display component setting unit).
- the user sets a scene design using the scene design setting unit 111.
- the item table 112 stores items that are displayed in the scene design, each defined by a rectangular area represented by coordinates in the scene design.
- the user reads and sets one or more items from the item table 112 by the item setting unit 113 in the scene design set by the scene design setting unit 111.
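an item of the kind stored in the item table can be pictured as a record pairing a display area name with a coordinate-defined rectangle. The following is a minimal sketch under that assumption; the field names and the hit test are hypothetical illustrations, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Item:
    """A display component defined by a rectangular area in the scene design."""
    name: str    # display area name, e.g. "NaviButton"
    x: int       # upper-left X coordinate
    y: int       # upper-left Y coordinate
    width: int
    height: int

    def contains(self, px, py):
        """True if the touch point (px, py) lies inside this item's rectangle."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)
```

a runtime that knows each item's rectangle can decide which item a touch belongs to without any scene-specific code.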
- the user creates a scene design using the instrument panel development support tool 110 configured as described above.
- a scene design creation process in which a user creates a scene design using the instrument panel development support tool 110 will be described with reference to the flowchart of FIG.
- a case where a scene design “Initial” is created using the instrument panel development support tool 110 will be described as an example.
- the scene design “Initial” is displayed on a display device with a touch panel
- the scene design “Initial” is composed of a screen 1 and a screen 2
- the screen 2 is described as a screen corresponding to the touch panel.
- the user inputs the name “Initial” of the scene design through the scene design setting unit 111 (step S201).
- the user uses the scene design setting unit 111 to select a screen for registering an item from the screen 1 and the screen 2 of the scene design “Initial” (step S202).
- screen 1 is first selected.
- the user registers a still image item on the selected screen 1 (step S203).
- the user refers to the item table 112, selects the image file name of the still image item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
- for example, the file name “AC-Under2.png”, display area name “AC”, and coordinate value (0,416) are registered as a still image item, and a still image item with the file name “Temp240.png” is likewise registered.
- the user registers the digital meter item in the selected screen 1 (step S204).
- the user sets the font of each digit of the digital meter using the item setting unit 113, and inputs and registers the name of the digital meter, the display area name, and the coordinate value.
- for example, a date meter with the display area name “Date2” at coordinate value (600,424) and a time meter with the display area name “Time” at coordinate value (680,456) are registered as digital meter items.
- in step S205, the user applies frames to the still image items and digital meters registered on the selected screen 1.
- the user registers the moving image / NTSC item on the selected screen 1 (step S206).
- the user inputs and registers a display area name for displaying a moving image from a preset device such as a navigation system.
- for example, the display area name “Map” is registered.
- the user uses the scene design setting unit 111 to select the screen 2 for registering an item next from the screen 1 and the screen 2 of the scene design “Initial” (step S207).
- the user registers a still image item on the selected screen 2 (step S208).
- the user refers to the item table 112, selects the image file name of the still image item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
- for example, the following still image items are registered on screen 2:
- file name “BlackBack.png”, display area name “Back”, coordinate value (0,0)
- file name “TitleMainMenu.png”, display area name “TitleMainMenu”, coordinate value (0,0)
- file name “Navi-ButtonOff.png”, display area name “Navi-ButtonOff”, coordinate value (272,96)
- file name “AirConOff.png”, display area name “AirConButtonOff”, coordinate value (536,96)
- file name “AudioButtonOff.png”, display area name “AudioButtonOff”, coordinate value (8,288)
- file name “CameraButtonOff.png”, display area name “CameraButtonOff”, coordinate value (272,288)
- file name “MeterButtonOff.png”, display area name “MeterButtonOff”, coordinate value (536,288)
- the user registers a sub event item in the selected screen 2 (step S209).
- the user refers to the item table 112, selects the image file name of the sub event item by the item setting unit 113, and inputs and registers the display area name and the coordinate value.
- the file name “Navi-ButtonOn.png”, the sub-event name “NaviButtonOn”, the display area name “NaviButton”, and the coordinate value (272,96) are registered as sub-event items.
- the screen 1 and the screen 2 on which items have been registered using the instrument panel development support tool 110, that is, the scene design “Initial”, appear as shown in the figure.
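the registered still image items on screen 2 amount to a small table of image files, display area names, and upper-left coordinates. As a hedged sketch, using the values from the example above (the data layout itself is a hypothetical illustration):

```python
# Still image items registered on screen 2 of scene design "Initial":
# (file name, display area name, upper-left coordinate).
screen2_items = [
    ("BlackBack.png", "Back", (0, 0)),
    ("TitleMainMenu.png", "TitleMainMenu", (0, 0)),
    ("Navi-ButtonOff.png", "Navi-ButtonOff", (272, 96)),
    ("AirConOff.png", "AirConButtonOff", (536, 96)),
    ("AudioButtonOff.png", "AudioButtonOff", (8, 288)),
    ("CameraButtonOff.png", "CameraButtonOff", (272, 288)),
    ("MeterButtonOff.png", "MeterButtonOff", (536, 288)),
]

def lookup_coordinates(area_name):
    """Return the upper-left coordinate registered for a display area name."""
    for _file_name, name, coord in screen2_items:
        if name == area_name:
            return coord
    return None
```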
- the user creates scene design transition information using the scene design director 120 linked with the instrument panel development support tool 110.
- the scene design director 120 includes a gesture table 121, a gesture setting unit 122, and a scene design transition table 123 (transition display scene table).
- the gesture table 121 is a table that stores gesture patterns.
- the gesture table 121 according to the specific example illustrated in FIG. 7 stores 15 types of gesture patterns.
- the user refers to the gesture table 121, and uses the gesture setting unit 122 to set a gesture pattern that reacts to an item set by the item setting unit 113 of the instrument panel development support tool 110.
- the scene design transition table 123 is a table that stores transition information in which a gesture set by the user using the gesture setting unit 122 and a transition destination scene design are associated with each other.
- the user creates scene design transition information using the scene design director 120 configured as described above.
- a scene design transition information creation process in which the user creates scene design transition information using the scene design director 120 will be described with reference to the flowchart of FIG.
- the case where the transition information of the scene design “Initial” is created using the scene design director 120 will be described as an example.
- the user selects and registers a variable “TouchPanel”, which is a sub-event execution condition, using the gesture setting unit 122 (step S801).
- the user uses the gesture setting unit 122 to select and register a scene design in which the sub-event is displayed (step S802).
- the scene design “Initial” is selected.
- the user uses the gesture setting unit 122 to display thumbnails of the sub-events in the selected scene design “Initial”, and selects a sub-event to register from the sub-events displayed as thumbnails (step S803).
- the user refers to the gesture table 121 storing the 15 types of gesture patterns by the gesture setting unit 122, and selects and registers the gesture pattern to which the selected sub-event reacts (step S804).
- in step S805, the user uses the gesture setting unit 122 to input and register the name of the sub-event to be executed when a gesture to which the sub-event reacts is input.
- the user uses the gesture setting unit 122 to make a transition setting for making a transition to the designated scene design after a designated time (step S806).
- as shown in FIG. 9, the transition information of the scene design is assumed to be set as follows:
- scene design “Initial”, sub-event “NaviButtonOn”, gesture “all”, sub-event to execute “NaviButton”, transition time 100 ms, transition scene name “Navi”
- scene design “Initial”, sub-event “AirconButtonOn”, gesture “all”, sub-event to execute “AirconButton”
- scene design “Initial”, sub-event “AudioButtonOn”, gesture “all”, sub-event to execute “AudioButton”
- scene design “Initial”, sub-event “MeterButtonOn”, gesture “all”, sub-event to execute “MeterButton”, transition time 100 ms, transition scene name “Meter”
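the transition information above can be read as a list of records. A hedged sketch of how a runtime might store and resolve them follows; the record layout and `resolve` helper are hypothetical, and entries without a transition scene only execute a sub-event:

```python
# Scene design transition information from the example above. Each entry:
# (scene, sub-event, gesture, sub-event to execute, transition time in ms,
#  transition scene name); the last two are None when no scene change occurs.
transitions = [
    ("Initial", "NaviButtonOn", "all", "NaviButton", 100, "Navi"),
    ("Initial", "AirconButtonOn", "all", "AirconButton", None, None),
    ("Initial", "AudioButtonOn", "all", "AudioButton", None, None),
    ("Initial", "MeterButtonOn", "all", "MeterButton", 100, "Meter"),
]

def resolve(scene, sub_event):
    """Return (sub-event to execute, delay ms, next scene), or None."""
    for s, ev, _gesture, execute, delay, next_scene in transitions:
        if s == scene and ev == sub_event:
            return execute, delay, next_scene
    return None
```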
- the scene design director 120 associates the scene design registered with the instrument panel development support tool 110 with the scene design registered with the scene design director 120.
- the user downloads and uses the scene design created by the instrument panel development support tool 110 as described above and the transition information of the scene design created by the scene design director 120 to the display system 200 with a touch panel described in detail below.
- the application of the present invention is not limited to automobiles.
- the present invention can be applied to various vehicles (means of movement or transfer) such as motorcycles, motor tricycles, special vehicles, railway vehicles, other road vehicles, amphibious vehicles, aircraft, or ships in addition to automobiles.
- the present invention can be applied not only to vehicles whose main purpose is movement or transfer as described above, but also to simulators that simulate the various types of vehicles mentioned above.
- the vehicles, simulators, and the like as described above are collectively referred to as “moving bodies”.
- an automotive cockpit module (driver's seat module) incorporating the display system 200 with a touch panel according to the present embodiment is provided with a liquid crystal display device 210 that displays a composite image of the automotive instrument panel in place of conventional analog instruments such as a speedometer and tachometer and indicator lamps configured with LEDs.
- the liquid crystal display device 210 is not the segment type liquid crystal display device often used in conventional automobiles, but a dot matrix type liquid crystal panel display device. Since it can display an image of an arbitrary pattern, the liquid crystal display device 210 functions as an automobile information display device by displaying a composite image combining various element images such as instruments and indicator lamps.
- the liquid crystal display device 210 can also display not only the image of the instrument panel but also images taken by in-vehicle cameras installed at the rear or side of the automobile, navigation images, television broadcast reception images, images reproduced by an in-vehicle DVD player, and the like.
- the liquid crystal display device 210 is attached to an instrument panel (not shown) which is a frame of a cockpit module (not shown) so as to be positioned behind the steering wheel (not shown).
- the cockpit module includes an air conditioning unit (not shown), an air conditioning duct (not shown) for introducing air from the air conditioning unit into the vehicle, an audio module (not shown), a lamp switch (not shown), A steering mechanism (not shown), an airbag module (not shown), and the like are included.
- the liquid crystal display device 210 may be arranged in the center of the instrument panel, that is, between the driver seat and the passenger seat.
- FIG. 10 is a block diagram showing an example of the overall configuration of the display system with a touch panel 200 according to the present embodiment.
- the display system with a touch panel 200 includes a liquid crystal display device 210 (210a, 210b), a touch panel 220, flash ROM (a scene design storage unit 230 and a scene design transition information storage unit 240), a video processing LSI, a DPF-ECU 250 (display control unit), a CAN microcomputer, a CPU I/F, and RAM.
- a touch panel 220 having a detection area for detecting a user's contact is installed on the entire display area of the liquid crystal display device 210.
- the scene design created by the instrument panel development support tool 110 is downloaded to and stored in the scene design storage unit 230.
- the transition information of the scene design created by the scene design director 120 is downloaded to and stored in the scene design transition information storage unit 240.
- the scene design displayed on the liquid crystal display device 210 is controlled by the DPF-ECU 250.
- the DPF-ECU 250 is connected to various ECUs provided in each part of the automobile via the in-vehicle LAN.
- the DPF-ECU 250 acquires information indicating the state of each part of the vehicle (state information; hereinafter collectively referred to as state information D unless a distinction is required) from each ECU via the in-vehicle LAN at a predetermined period.
- the “predetermined cycle” is set to an arbitrary length according to the specifications of the in-vehicle LAN.
- the transmission cycle of the status information D from each ECU may be different from each other.
- the sampling period of the state information D in the DPF-ECU 250 may be matched with the transmission period of each state information.
- the in-vehicle LAN interface standard to which the present invention can be applied is not limited to CAN.
- any in-vehicle network conforming to various in-vehicle LAN interface standards such as LIN (Local Interconnect Network), MOST (Media Oriented Systems Transport), FlexRay, etc. can be applied to the present invention.
- the DPF-ECU 250 reflects the acquired vehicle state information on the scene design created by the instrument panel development support tool 110 of the display scene creation system 100, and displays the reflected scene design in the display area of the liquid crystal display device 210.
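as a hedged sketch of this reflection step, the display control unit can be thought of as substituting the latest state information D into the value-bearing items of the current scene design before each redraw. The function and field names below are hypothetical, not the patent's implementation:

```python
# Hypothetical sketch: reflect acquired vehicle state information D onto the
# items of the current scene design before it is drawn.
def reflect_state(scene_items, state_info):
    """scene_items: list of (display area name, state signal key or None).
    Returns (display area name, value) pairs, filling each value-bearing
    item from the state information; unbound items carry None."""
    drawn = []
    for name, source_key in scene_items:
        value = state_info.get(source_key) if source_key else None
        drawn.append((name, value))
    return drawn
```

for example, an item bound to the travel speed signal is redrawn with the newest speed each period, while purely decorative items pass through unchanged.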
- the “state information” is information representing the state of each part of the automobile.
- the state information includes information on the mechanical operation state of each part of the automobile (for example, traveling speed, engine speed, etc.) and other information (for example, fuel remaining amount, room temperature, etc.).
- these are only examples for passenger cars and do not limit the present invention; the state information may include, for example, engine speed, traveling speed, select position, shift position, operating state of the direction indicators, lighting state of the lights, door/trunk opening/closing state, door locking state, tire state, presence/absence of an airbag abnormality, seat belt wearing state, blowing temperature from the air conditioner, room temperature, outside temperature, state of in-vehicle AV equipment, automatic steering function setting state, wiper operating state, fuel remaining amount, battery remaining amount, engine/battery dependency (in the case of a hybrid vehicle), oil remaining amount, radiator temperature, engine temperature, and the like.
- the DPF-ECU 250 acquires a moving image, such as a navigation image, from a moving image generating device (not shown) such as a navigation system provided in the automobile, reflects the acquired moving image in the scene design created by the instrument panel development support tool 110 of the display scene creation system 100, and displays the reflected scene design in the display area of the liquid crystal display device 210.
- the DPF-ECU 250 identifies the rectangular area in which the coordinate sequence that detected the user's contact exists, refers to the scene design transition information storage unit 240, reads the corresponding next transition destination scene design from the scene design storage unit 230, and displays it in the display area of the liquid crystal display device 210.
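The lookup described above, from a touched sub-event and its gesture to the next scene, can be sketched as a table-driven mapping. The sketch below is a minimal illustration, not the actual ECU code: the names `TRANSITIONS` and `next_scene` are hypothetical, and the dictionary merely stands in for the scene design transition information storage unit 240.

```python
# Illustrative stand-in for the scene design transition information
# storage unit 240: it maps (sub-event name, gesture) to a transition
# destination scene design and a transition (blink) time in milliseconds.
# All names here are hypothetical assumptions, not the patent's API.

TRANSITIONS = {
    ("NaviButtonOn", "tap"): ("Navi", 100),
    ("AudioButtonOn", "tap"): ("Audio", 100),
}

def next_scene(sub_event, gesture):
    """Return (destination scene, transition time in ms), or None if no match."""
    return TRANSITIONS.get((sub_event, gesture))
```

Because the mapping is pure data, adding a new transition means adding a table entry rather than writing a processing program that ties the touch panel to the gesture, which is the point the description makes.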
- the DPF-ECU 250 determines whether or not the touch panel 220 has detected a user's contact (step S1101). If it is determined in step S1101 that no contact has been detected, the DPF-ECU 250 ends the process. On the other hand, if it is determined in step S1101 that contact has been detected, the DPF-ECU 250 identifies the rectangular area in which the coordinate sequence that detected the user's contact exists (step S1102), and further identifies the gesture indicated by the coordinate sequence that detected the contact (step S1103).
- the CAN microcomputer performs area determination to identify a rectangular area.
- the area is determined from the first value of the X and Y coordinate sequence reported by the touch panel 220 and the image information (the upper-left XY coordinates and the height and width of the image) registered in the scene design storage unit 230; if a matching rectangle exists, the process proceeds to the next step.
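The area determination just described amounts to a point-in-rectangle test against the registered image information (upper-left XY coordinates plus width and height). A minimal sketch, with hypothetical data structures:

```python
# Sketch of the area determination: find which registered rectangle
# contains the first coordinate reported by the touch panel.
# Each rectangle is (name, x, y, width, height), mirroring the image
# information (upper-left XY coordinates and size) registered in the
# scene design storage unit 230. The tuple layout is an assumption.

def hit_rectangle(rects, x, y):
    """Return the name of the first rectangle containing (x, y), else None."""
    for name, rx, ry, w, h in rects:
        if rx <= x < rx + w and ry <= y < ry + h:
            return name
    return None  # no matching rectangle: the process does not proceed
```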
- a gesture is determined from the sequence of X and Y coordinate points coming from the touch panel, and it is then determined whether an event exists that matches both the matched rectangle and the gesture. That is, the CAN microcomputer first determines the area from the image information registered in the scene design storage unit 230 (the upper-left XY coordinates and the height and width of the image) to specify the rectangular area, and then determines whether the gesture matches one registered in the scene design transition information storage unit 240.
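Determining the gesture from the row of X and Y coordinate points can be sketched as classifying the overall displacement of the sequence. The patent does not fix the classification algorithm, so the scheme below (a tap versus four flick directions, with an assumed threshold) is only one plausible illustration:

```python
def classify_gesture(points, tap_threshold=10):
    """Classify a touch coordinate sequence as a tap or a flick.

    points: list of (x, y) tuples from touch-down to touch-up.
    The threshold value and the gesture names are illustrative
    assumptions, not taken from the patent.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < tap_threshold and abs(dy) < tap_threshold:
        return "tap"                     # barely moved: treat as a tap
    if abs(dx) >= abs(dy):               # dominant axis decides direction
        return "flick_right" if dx > 0 else "flick_left"
    return "flick_down" if dy > 0 else "flick_up"
```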
- the DPF-ECU 250 determines whether or not the rectangular area where the specified coordinate sequence exists matches the sub-event (step S1104).
- if it is determined in step S1104 that they do not match, the DPF-ECU 250 ends the process. On the other hand, if it is determined in step S1104 that they match, the DPF-ECU 250 determines whether or not the gesture indicated by the identified coordinate sequence matches the gesture associated with the sub-event (step S1105).
- if it is determined in step S1105 that the gestures match, the DPF-ECU 250 determines whether or not to perform a scene design transition process (step S1106). If it is determined in step S1106 that the scene design transition process is to be performed, the DPF-ECU 250 refers to the scene design transition information storage unit 240, blinks the display of the sub-event for the set transition time, and then displays the transition destination scene design read from the scene design storage unit 230 in the display area of the liquid crystal display device 210 (step S1107). On the other hand, if it is determined in step S1106 that the scene design transition process is not to be performed, the DPF-ECU 250 performs display switching based on the sub-event (step S1108).
- for example, the sub-event “NaviButtonOn” is blinked for 100 ms, and then the transition destination scene design “Navi” is displayed in the display area of the liquid crystal display device 210 as shown in FIG.
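The branching of steps S1101 through S1108, applied to the “NaviButtonOn” example above, can be sketched in one routine. Everything below is a hypothetical stand-in for the ECU logic: the dictionary layouts, the crude gesture stand-in, and the `show` callback are assumptions introduced only to make the control flow concrete.

```python
def handle_touch(points, sub_events, transitions, show):
    """Sketch of the branching in steps S1101-S1108 (illustrative only).

    points:      coordinate sequence from the touch panel; empty means
                 no contact was detected (S1101)
    sub_events:  {name: (x, y, w, h, expected_gesture)}, standing in for
                 the rectangle and gesture registered for each sub-event
    transitions: {name: (destination_scene, blink_ms)}, standing in for
                 the scene design transition information storage unit 240
    show:        callback that displays a scene or sub-event by name
    """
    if not points:                                      # S1101: no contact
        return
    x, y = points[0]
    gesture = "tap" if len(points) < 3 else "flick"     # S1103 (crude stand-in)
    for name, (rx, ry, w, h, expected) in sub_events.items():
        inside = rx <= x < rx + w and ry <= y < ry + h  # S1102/S1104: area match
        if inside and gesture == expected:              # S1105: gesture match
            if name in transitions:                     # S1106: transition?
                dest, blink_ms = transitions[name]      # S1107: blink blink_ms,
                show(dest)                              #        then show destination
            else:
                show(name)                              # S1108: display switching
            return
```

With the example data, touching inside the “NaviButtonOn” rectangle with a tap makes the sketch display the “Navi” scene, mirroring the transition described in the text.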
- as described above, it is possible to provide a display scene creation system, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body that can transition a scene design by inputting a gesture to the touch panel without requiring a processing program that associates the touch panel with the gesture.
- the item is defined by a rectangular area represented by coordinates in the display area of the display device. By determining whether both the rectangular area in which the coordinate sequence input to the touch panel exists and the gesture indicated by that coordinate sequence in the rectangular area match both the area of the sub-event and the gesture associated with the sub-event, the transition destination scene design can be read without requiring a processing program that relates the touch panel to the gesture. It is therefore possible to provide a display scene creation system capable of transitioning the scene design, a display scene creation program, a display system with a touch panel, a cockpit module, and a moving body.
- the system can be realized at low cost and easily.
- the display system with a touch panel displays the state of a moving body such as a vehicle, and can also display together other arbitrary images (still images or moving images), such as an image capturing the scenery outside the vehicle, an image stored in a storage medium provided in the vehicle, or video obtained by communication with the outside, as well as additional information such as character information.
- although a liquid crystal display device is used in the above-described embodiment, the application target of the present invention is not limited to a display system with a touch panel using a liquid crystal display device. Any display device can be used as long as at least the scene design display portion is a dot-matrix display device.
- the application target of the present invention is not limited to the above-described display system with a touch panel mounted on an instrument panel.
- the present invention can be applied to any display system with a touch panel having a function of transitioning a display scene in accordance with an input gesture, whatever its use or hardware configuration; the configurations described above are merely examples.
- the present invention can also be achieved by supplying a software program (in the embodiment, a program corresponding to the flowchart shown in the figure) to an apparatus and having the computer of the apparatus read and execute the supplied program. Therefore, the program itself installed in the computer to implement the functional processing of the present invention also implements the present invention. That is, the present invention also includes a program for realizing the functional processing of the present invention.
- DESCRIPTION OF SYMBOLS
- 100 Display scene creation system
- 110 Instrument panel development support tool
- 111 Scene design setting unit
- 112 Item table
- 113 Item setting unit
- 120 Scene design director
- 121 Gesture table
- 122 Gesture setting unit
- 123 Scene design transition table
- 200 Display system with a touch panel
- 210 Liquid crystal display device
- 220 Touch panel
- 230 Scene design storage unit
- 240 Scene design transition information storage unit
- 250 DPF-ECU
Landscapes
- Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Claims (12)
- 1. A display scene creation system comprising: a display scene design setting unit that sets a design of a display scene; a display component setting unit that sets one or more display components displayed within the design of the display scene set by the display scene design setting unit; a gesture setting unit that sets a gesture whose input to a display component set by the display component setting unit causes the display scene to transition; and a transition display scene table that stores the gesture set by the gesture setting unit in association with a transition destination display scene.
- 2. The display scene creation system according to claim 1, wherein the display component setting unit sets a display component defined by a rectangular area represented by coordinates within the display scene.
- 3. The display scene creation system according to claim 1 or 2, wherein the display scene design setting unit assigns one layer to each display scene and sets the design of the display scene.
- 4. A display scene creation program that causes a computer to execute: a step of setting a design of a display scene; a step of setting one or more display components displayed within the design of the display scene set in the step of setting the design of the display scene; a step of setting a gesture whose input to a display component set in the step of setting the display components causes the display scene to transition; and a step of associating the gesture set in the step of setting the gesture with a transition destination display scene.
- 5. The display scene creation program according to claim 4, wherein in the step of setting the display components, a display component defined by a rectangular area represented by coordinates within the display scene is set.
- 6. The display scene creation program according to claim 4 or 5, wherein in the step of setting the design of the display scene, one layer is assigned to each display scene and the design of the display scene is set.
- 7. A display system with a touch panel comprising a display device and a touch panel having, over the entire display area of the display device, a detection area for detecting a user's contact, the display system comprising: a display control unit that, when the touch panel detects the user's contact in a display scene displayed in the display area of the display device, displays a transition destination display scene in the display area of the display device based on the display component at which the user's contact was detected and the gesture input to that display component.
- 8. The display system with a touch panel according to claim 7, wherein the display component is defined by a rectangular area represented by coordinates in the display area of the display device, and the display control unit displays the transition destination display scene in the display area of the display device when, in a display scene displayed in the display area of the display device, the touch panel detects the user's contact and both the rectangular area in which the coordinate sequence that detected the user's contact exists and the gesture indicated by that coordinate sequence in the rectangular area match both the area of the display component and the gesture associated with the area of the display component.
- 9. The display system with a touch panel according to claim 7 or 8, wherein the display device is a liquid crystal display device.
- 10. A cockpit module to be mounted around the cockpit of a moving body, comprising the display system with a touch panel according to any one of claims 7 to 9.
- 11. A moving body comprising the display system with a touch panel according to any one of claims 7 to 9, wherein the display device is mounted at a position visible at least from the cockpit.
- 12. The moving body according to claim 11, wherein the moving body is an automobile, and the display system with a touch panel is connected to ECUs (Electronic Control Units) of respective parts of the automobile via a CAN (Controller Area Network).
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200980158406XA CN102365614A (zh) | 2009-03-31 | 2009-11-06 | 显示场景制作系统 |
US13/138,749 US20120030633A1 (en) | 2009-03-31 | 2009-11-06 | Display scene creation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009085341 | 2009-03-31 | ||
JP2009-085341 | 2009-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010113350A1 true WO2010113350A1 (ja) | 2010-10-07 |
Family
ID=42827673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/068994 WO2010113350A1 (ja) | 2009-03-31 | 2009-11-06 | 表示シーン作成システム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20120030633A1 (ja) |
CN (1) | CN102365614A (ja) |
WO (1) | WO2010113350A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012067193A1 (ja) * | 2010-11-19 | 2012-05-24 | シャープ株式会社 | 表示シーン作成システム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014205653A1 (de) * | 2014-03-26 | 2015-10-01 | Continental Automotive Gmbh | Steuersystem |
CN109062643A (zh) * | 2018-07-06 | 2018-12-21 | 佛山市灏金赢科技有限公司 | 一种显示界面调整方法、装置及终端 |
US10891048B2 (en) * | 2018-07-19 | 2021-01-12 | Nio Usa, Inc. | Method and system for user interface layer invocation |
US10901416B2 (en) * | 2018-07-19 | 2021-01-26 | Honda Motor Co., Ltd. | Scene creation system for autonomous vehicles and methods thereof |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001147751A (ja) * | 1999-11-24 | 2001-05-29 | Sharp Corp | 情報端末及びその制御方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1639439A2 (en) * | 2003-06-13 | 2006-03-29 | The University Of Lancaster | User interface |
JP4026071B2 (ja) * | 2003-09-25 | 2007-12-26 | ソニー株式会社 | 車載装置及びコンテンツ提供方法 |
KR101510469B1 (ko) * | 2008-08-08 | 2015-04-08 | 엘지전자 주식회사 | 텔레매틱스 단말기와 그에 따른 개인 차량 운행 정보 파일 업로드 및 다운로드 방법 |
KR101071843B1 (ko) * | 2009-06-12 | 2011-10-11 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
-
2009
- 2009-11-06 US US13/138,749 patent/US20120030633A1/en not_active Abandoned
- 2009-11-06 CN CN200980158406XA patent/CN102365614A/zh active Pending
- 2009-11-06 WO PCT/JP2009/068994 patent/WO2010113350A1/ja active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001147751A (ja) * | 1999-11-24 | 2001-05-29 | Sharp Corp | 情報端末及びその制御方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012067193A1 (ja) * | 2010-11-19 | 2012-05-24 | シャープ株式会社 | 表示シーン作成システム |
Also Published As
Publication number | Publication date |
---|---|
US20120030633A1 (en) | 2012-02-02 |
CN102365614A (zh) | 2012-02-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190394097A1 (en) | Vehicle application store for console | |
CN107351763A (zh) | 用于车辆的控制装置 | |
US20140109080A1 (en) | Self-configuring vehicle console application store | |
US8082077B2 (en) | Steerable vehicle information display system, as well as cockpit module and steerable vehicle incorporating the system | |
CN103917398B (zh) | 用于显示车辆的设备的运行状态的方法和装置 | |
US10650787B2 (en) | Vehicle and controlling method thereof | |
US20200376960A1 (en) | Display device for a motor vehicle, method for operating a display device, control unit, and motor vehicle | |
WO2010113350A1 (ja) | 表示シーン作成システム | |
JP5886172B2 (ja) | 車両用情報表示システム及び車両用情報表示制御装置 | |
US8228179B2 (en) | Information generating device, control device provided with the same, information providing system for mobile body, module for driver's seat, and mobile body | |
KR102082555B1 (ko) | 목록에서 객체를 선택하기 위한 방법 및 장치 | |
US20150227492A1 (en) | Systems and methods for selection and layout of mobile content on in-vehicle displays | |
CN105398388B (zh) | 车辆安全系统、车载屏幕显示方法及装置 | |
JP2016097928A (ja) | 車両用表示制御装置 | |
CN105760096A (zh) | 一种支持盲操作的汽车中控台方位手势操控方法及装置 | |
JP6350271B2 (ja) | 表示装置および表示方法 | |
US11099715B2 (en) | Method and device for providing a user interface in a vehicle | |
JP6561716B2 (ja) | 車両用情報提供装置 | |
Knoll | Some pictures of the history of automotive instrumentation | |
EP2221221A1 (en) | Display control device, reproduction device, information display system for mobile object, module for driver's seat, and mobile object | |
WO2012067193A1 (ja) | 表示シーン作成システム | |
CN111469663A (zh) | 用于车辆的操控系统 | |
US11670187B2 (en) | Information processing device, information processing system, program, and vehicle | |
US8482396B2 (en) | Image information generation device | |
US10618407B2 (en) | Terminal apparatus, vehicle, and method of controlling the terminal apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980158406.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09842701 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13138749 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09842701 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |