WO2020084836A1 - Display control method and display control system

Display control method and display control system

Info

Publication number
WO2020084836A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
display
control
layer
diagram showing
Prior art date
Application number
PCT/JP2019/026380
Other languages
English (en)
Japanese (ja)
Inventor
茉那 中川
亮太 内田
貴生 小川
かほる 居相
賢二 中北
力也 増田
今出 昌宏
明子 藤瀬
敬市郎 山縣
昌史 森光
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Priority to JP2020552518A (JPWO2020084836A1)
Publication of WO2020084836A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a display control method and a display control system for displaying the status of residential IoT (Internet of Things) equipment such as residential equipment and home appliances.
  • residential IoT (Internet of Things)
  • Echonet Lite (registered trademark)
  • KNX (registered trademark)
  • the present invention provides a display control method and a display control system capable of notifying a user of the status of a device.
  • A display control method according to one aspect of the present invention is a display control method executed by a computer, including acquiring information about a first touch on a touch screen performed while a first display layer, which has a display object of a first category that is a user interface for notifying a state of a device, is displayed on the touch screen, the information about the first touch including movement of the touch position in a first direction.
  • a program according to one aspect of the present invention is a program for causing a computer to execute the display control method.
  • A display control system according to one aspect of the present invention includes an acquisition unit that acquires information regarding a first contact on a touch screen performed while a first display layer having display objects of a first category, which is a user interface for notifying a state of a device, is displayed on the touch screen, and a control unit that, when the information regarding the first contact includes movement of the contact position in a first direction, shifts from the first display layer to a second display layer having a display object of a second category different from the first category. The control unit acquires the state of the device, and when the state of the device changes while the second display layer is displayed on the touch screen, displays, on the second display layer, a display object that notifies the change.
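  • As a rough illustration of this behaviour, the sketch below summarises the layer shift on a swipe and the change notification on the second display layer. This is a minimal sketch only; all type and function names are hypothetical and are not taken from the patent.

```typescript
// Hypothetical sketch of the claimed display control flow (names are illustrative only).
type LayerId = "push" | "control";

interface TouchInfo {
  deltaY: number; // positive = downward movement of the contact position
}

interface DeviceState {
  deviceId: string;
  isOn: boolean;
}

class DisplayController {
  private currentLayer: LayerId = "push"; // first display layer (state notification UI)

  // Called with information about the first contact acquired from the touch screen.
  onTouch(info: TouchInfo): void {
    // If the contact position moved in the first direction (assumed here to be upward),
    // shift from the first display layer to the second display layer.
    if (this.currentLayer === "push" && info.deltaY < 0) {
      this.currentLayer = "control";
    }
  }

  // Called whenever the acquired device state changes.
  onDeviceStateChanged(state: DeviceState): void {
    // While the second display layer is shown, display an object notifying the change.
    if (this.currentLayer === "control") {
      this.showChangeNotification(state);
    }
  }

  private showChangeNotification(state: DeviceState): void {
    console.log(`Device ${state.deviceId} is now ${state.isOn ? "on" : "off"}`);
  }
}
```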
  • the display control method and display control system of the present invention can notify the user of the status of the device.
  • FIG. 1 is a front view of a display device according to an embodiment.
  • FIG. 2 is a perspective view of the display device according to the embodiment.
  • FIG. 3 is a diagram showing an example of a display screen of a display device installed at the entrance.
  • FIG. 4 is a diagram showing an example of a display screen of a display device installed in a dining room.
  • FIG. 5 is a diagram showing an example of a display screen of a display device installed in the bedroom.
  • FIG. 6 is a diagram showing an example of a display screen for the user to intuitively control the air conditioning in the entire building.
  • FIG. 7 is a diagram showing an example of a display screen when the notification is given.
  • FIG. 8 is a diagram showing a first example of the notified contents.
  • FIG. 9 is a diagram showing a second example of the notified contents.
  • FIG. 10 is a diagram showing a third example of the notified contents.
  • FIG. 11 is a diagram showing a fourth example of the notified contents.
  • FIG. 12 is a diagram showing an example of a display screen for performing scene control when waking up.
  • FIG. 13 is a diagram showing an example of a display screen for performing scene control when returning home.
  • FIG. 14 is a diagram showing an example of a display screen for performing scene control when working from home.
  • FIG. 15 is a diagram showing an example of a display screen for performing scene control before going to bed.
  • FIG. 16 is a diagram showing an example of an application selection screen.
  • FIG. 17 is a diagram showing an example of a screen showing the air quality of each room.
  • FIG. 18 is a diagram showing an example of a detailed air quality screen of the living room.
  • FIG. 19 is a diagram showing a first example of a screen showing the amount of electricity used.
  • FIG. 20 is a diagram showing a second example of the screen showing the amount of electricity used.
  • FIG. 21 is a diagram showing a first example of a screen on which the home messenger application is being executed.
  • FIG. 22 is a diagram showing a second example of the screen on which the home messenger application is being executed.
  • FIG. 23 is a diagram showing a first example of a screen during execution of the doorphone application (display screen of image of visitor imaged by doorphone).
  • FIG. 24 is a diagram showing a second example (doorphone / camera selection screen) of the screen during execution of the doorphone application.
  • FIG. 25 is a diagram showing a first example (handwriting input screen) of a screen on which the message application is being executed.
  • FIG. 26 is a diagram showing a second example of a screen on which a message application is being executed (a screen for selecting a plurality of messages).
  • FIG. 27 is a diagram showing a first example of a screen on which a moving picture application is being executed.
  • FIG. 28 is a diagram showing a second example of a screen on which a moving picture application is being executed.
  • FIG. 29 is a diagram showing a third example of the screen on which the moving picture application is being executed.
  • FIG. 30 is a diagram showing a first example of a screen on which a music application is being executed.
  • FIG. 31 is a diagram showing a second example of a screen on which a music application is being executed.
  • FIG. 32 is a diagram showing a third example of the screen on which the music application is being executed.
  • FIG. 33 is a diagram showing a first example of a screen on which a photograph application is being executed.
  • FIG. 34 is a diagram showing a second example of the screen on which the photograph application is being executed.
  • FIG. 35 is a diagram showing a third example of the screen on which the photo application is being executed.
  • FIG. 36 is a diagram for explaining an example of presenting information by using an operation on a remote controller of an air conditioner as a trigger.
  • FIG. 37 is a diagram for explaining an example of presenting information by using an operation on a wall switch for turning on and off a lighting device as a trigger.
  • FIG. 38 is a diagram for explaining an example of presenting information by using an operation on a water heater as a trigger.
  • FIG. 39 is a diagram showing an example of a display screen of other information.
  • FIG. 40 is a diagram showing an example of a display screen of a weather warning.
  • FIG. 41 is a diagram showing an example of a notification screen for preventing forgetting to lock the key.
  • FIG. 42 is a diagram showing an example of a display screen recommending the control of the air purifier most suitable for pollen removal.
  • FIG. 43 is a diagram illustrating an example of a display screen that proposes improvement of a sleep state.
  • FIG. 44 is a diagram showing an example of a notification screen for the laundry timing.
  • FIG. 45 is a diagram showing an example of a display screen for making a proposal regarding clothes.
  • FIG. 46 is a diagram showing an example of a display screen for calling attention to ultraviolet rays.
  • FIG. 47 is a diagram showing an example of a display screen that presents programs that are likely to suit the user's preference.
  • FIG. 48 is a diagram showing a first example of a display screen for producing a Christmas effect.
  • FIG. 49 is a diagram showing a second example of the display screen for performing the Christmas effect.
  • FIG. 50 is a diagram showing a configuration of a home screen of the display device according to the embodiment.
  • FIG. 51 is a diagram showing an example of a push layer screen when the user is not near the display device.
  • FIG. 52 is a diagram showing an example of a push layer screen including a plurality of icons.
  • FIG. 53 is a diagram showing an example of a push layer screen in the night mode.
  • FIG. 54 is a diagram showing an example of a screen of the control layer when the device is in the off state.
  • FIG. 55 is a diagram showing an example of a screen for moving and deleting icons of devices.
  • FIG. 56 is a diagram for explaining the change of the display mode of the push layer screen.
  • FIG. 57 is a diagram for explaining a notification performed when the application layer screen is displayed.
  • FIG. 58 is a diagram for explaining an example in which the tutorial of the control layer is displayed when the screen of the control layer is first displayed.
  • FIG. 59 is a diagram showing an example of an icon for performing air conditioning in the entire building, which is displayed on the screen of the control layer.
  • FIG. 60 is a diagram showing screen transitions when the icon for the air conditioning in the entire building is operated.
  • FIG. 61 is a diagram showing an example of a control screen for the entire building air conditioning during the automatic save operation.
  • FIG. 62 is a diagram showing an example of a control screen for the entire building air conditioning in the outing mode.
  • FIG. 63 is a diagram showing an example of an icon for controlling an air conditioner displayed on the screen of the control layer.
  • FIG. 64 is a diagram showing screen transitions when the icon of the air conditioner is long-tap operated.
  • FIG. 65 is a diagram showing an example of an icon for controlling the heat pump water heater displayed on the screen of the control layer.
  • FIG. 66 is a first diagram showing a screen transition when a long tap operation is performed on the icon of the heat pump water heater.
  • FIG. 67 is a second diagram showing a screen transition when a long tap operation is performed on the icon of the heat pump water heater.
  • FIG. 68 is a diagram showing an example of icons displayed on the screen of the control layer for controlling the water heater provided in the fuel cell system.
  • FIG. 69 is a diagram showing screen transitions when a long tap operation is performed on the icon of the fuel cell system.
  • FIG. 70 is a diagram showing an example of an icon for controlling the electric shutter, which is displayed on the screen of the control layer.
  • FIG. 71 is a diagram showing a screen transition when the icon of the electric shutter is long-tap operated.
  • FIG. 72 is a diagram showing an example of an icon for controlling the electric lock displayed on the screen of the control layer.
  • FIG. 73 is a diagram showing screen transitions when a long tap operation is performed on the electric lock icon.
  • FIG. 74 is a diagram showing an example of an icon for controlling the air purifier displayed on the screen of the control layer.
  • FIG. 75 is a diagram showing screen transitions when the icon of the air purifier is long-tap operated.
  • FIG. 76 is a diagram showing screen transitions when the device falls into an irregular state.
  • FIG. 77 is a diagram showing an initial setting screen and screen transition from the initial setting screen.
  • FIG. 78 is a block diagram showing a functional configuration of the display control system according to the embodiment.
  • FIG. 79 is a flowchart of Operation Example 1 of the display control system according to the embodiment.
  • FIG. 80 is a flowchart of Operation Example 2 of the display control system according to the embodiment.
  • each diagram is a schematic diagram and is not necessarily an exact illustration. Further, in each drawing, the same reference numerals are given to substantially the same configurations, and overlapping description may be omitted or simplified.
  • FIG. 1 is a front view of a display device according to an embodiment.
  • FIG. 2 is a perspective view of the display device according to the embodiment.
  • the display device 20 is a dedicated terminal used for a system (for example, a display control system described later) that provides services related to a new lifestyle centered on people.
  • the display device 20 can provide the user with a beautifully integrated, new form of engagement with home appliances and appliances. Further, the display device 20 can provide the user with a value tailored to the person who lives in cooperation with the server device.
  • the display device 20 is installed inside a building such as a house (for example, a wall).
  • the display device 20 includes a touch screen 21 and is used as an integrated remote controller for the user to control the devices installed in the building. According to the display device 20, since it is not necessary to install a remote controller for each device, it is possible to remove complexity from the building.
  • The user can confirm living activities or the settings of devices in the building from anywhere inside it.
  • FIG. 3 is a diagram showing an example of the display screen of the display device 20 installed at the entrance.
  • FIG. 4 is a diagram showing an example of the display screen of the display device 20 installed in the dining room.
  • FIG. 5 is a diagram showing an example of the display screen of the display device 20 installed in the bedroom.
  • the user can close the entrance key or the electric shutter in the living room from the bedroom, and turn off the air conditioners and lighting devices in the house from the entrance.
  • the display device 20 eliminates unnecessary information from a large amount of information and provides only necessary information to the user.
  • the device control function of the display device 20 eliminates the complicated function that is often present in the conventional remote controller or switch.
  • the display device 20 intuitively provides only necessary functions in daily life.
  • FIG. 6 is a diagram showing an example of a display screen for the user to intuitively control the air conditioning in the whole building (in other words, to control a plurality of air conditioning devices).
  • the display device 20 is configured so that the information desired by the user and the notification the user does not want to miss can be accessed with one tap.
  • The home screen of the display device 20 displays the status of, and notifications from, the devices associated with the display device 20.
  • FIG. 7 is a diagram showing an example of a display screen when such a notification is given. As shown in FIG. 7, for example, the user can confirm a new recording from the display device 20 installed at the entrance immediately after returning home, or confirm a visitor on the intercom from the display device 20 installed in the study.
  • FIG. 8 to 11 are diagrams showing examples of notified contents.
  • Immediately after returning home, the user can check the recordings made while away and watch newly arrived recordings from the entrance or any other place.
  • the display device 20 enables the user to respond to the intercom device and confirm the history of absence from anywhere in the building.
  • the user can confirm completion of hot water in the bathtub from anywhere in the building.
  • the user can also check the message left by someone in the family.
  • Display function: scene control
  • the user can freely switch the environment in the building by operating a plurality of devices through the display device 20.
  • the control for switching the environment in the building to the state desired by the user with one tap or a timer is also referred to as scene control.
  • In scene control, as long as a device is compatible with the display device 20, devices installed in any space (room) in the building can be controlled at once. For example, it is possible to put all the electric shutters, lighting equipment, and air conditioning equipment installed in a building into a state suitable for comfortable awakening, or to set the necessary equipment at once when returning home.
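  • As a minimal sketch of how such a preset might be represented, a scene can be modelled as a named list of device commands issued together on one tap or by a timer. All identifiers, device names, and command values below are illustrative assumptions, not taken from the patent.

```typescript
// Hypothetical sketch of scene control: one tap (or a timer) applies a preset
// set of device commands at once.
interface DeviceCommand {
  deviceId: string;          // e.g. "bedroom-shutter"
  action: "on" | "off" | "open" | "close" | "setTemperature";
  value?: number;            // e.g. target temperature in degrees Celsius
}

interface Scene {
  name: string;              // e.g. "wake-up", "coming-home"
  commands: DeviceCommand[];
}

const wakeUpScene: Scene = {
  name: "wake-up",
  commands: [
    { deviceId: "bedroom-shutter", action: "open" },
    { deviceId: "bedroom-light", action: "on" },
    { deviceId: "living-aircon", action: "setTemperature", value: 22 },
  ],
};

// Executing a scene simply issues every command in sequence.
async function executeScene(scene: Scene, send: (c: DeviceCommand) => Promise<void>) {
  for (const command of scene.commands) {
    await send(command);
  }
}
```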
  • FIG. 12 is a diagram showing an example of a display screen for performing scene control when waking up.
  • For example, when waking up, control such as sounding an alarm, opening an electric shutter, and turning on a lighting device is realized by a timer.
  • the scene control when waking up provides a comfortable wake-up for the user and makes the inside of the building an environment suitable for the morning hours.
  • FIG. 13 is a diagram showing an example of a display screen for performing scene control when returning home.
  • When returning home, control such as opening an electric shutter, turning on a lighting device, or turning on an air conditioner is realized with one tap. In other words, what is needed when coming home is realized with one tap.
  • scene control is prepared in multiple preset types in advance, but the user can newly create and register it according to their preferences. For example, the user can newly create and register scene control for working from home and scene control before going to bed.
  • FIG. 14 is a diagram showing an example of a display screen for performing scene control when working from home.
  • the lighting device is dimmed and the set temperature of the air conditioning device is low.
  • FIG. 15 is a diagram showing an example of a display screen for performing scene control before going to bed. The lighting equipment is dimmed, the electric shutter is closed, and the set temperature of the air conditioning equipment is set low, which makes it possible to create an environment in the building that comfortably induces sleep.
  • the user can instruct scene control before going to bed not only from the display device 20 installed in the bedroom but also from the display device 20 installed in the living room or dining room.
  • FIG. 16 is a diagram showing an example of an application selection screen. If various applications are installed, the user can confirm the state of the air in the building and the state of the amount of electricity used in the building by selecting the application to be executed. The user can also check the television programs currently being broadcast and recorded programs. The display device 20 can also receive a new application provided from the server device.
  • FIG. 17 is a diagram showing an example of a screen showing the air quality of each room
  • FIG. 18 is a diagram showing an example of a detailed screen of the air quality of the living room. If the screen as shown in FIG. 17 is displayed on the touch screen 21, the user can grasp the air quality of each room at a glance. If the screen as shown in FIG. 18 is displayed on the touch screen 21, the user can confirm the daily (or monthly) change in the air quality of the specific room.
  • the display device 20 can manage the air quality, for example, by acquiring a signal indicating the sensing result from a sensor that measures the air quality.
  • FIG. 19 is a diagram showing a first example of the screen showing the electricity usage amount
  • FIG. 20 is a diagram showing a second example of the screen showing the electricity usage amount.
  • The display device 20 manages the current or past amount of electricity used in the building for each branch circuit, for example by acquiring a signal indicating the amount of electricity used from a distribution board or the like with a communication function installed in the building, and can provide it to the user as an image.
  • The display device 20 acquires information indicating the electricity usage of other users (which may be a typical electricity usage) from the server device; thus, as shown in FIG. 20, the user's electricity usage and the electricity usage of other users can be displayed side by side.
  • FIG. 21 is a diagram showing a first example of a screen on which a home messenger application is being executed
  • FIG. 22 is a diagram showing a second example of a screen on which the home messenger application is being executed. If the home messenger application is executed, the kitchen can tell the children's room that "the meal is ready", or the entrance can tell the dining room "I'm home". In other words, the possibilities for communication between rooms expand.
  • FIG. 23 is a diagram showing a first example of a screen during execution of the doorphone application (display screen of image of visitor imaged by doorphone). Further, if a camera for imaging the outside of the building is communicatively connected to the display device 20, the user can visually recognize the outside of the building through the display device 20.
  • FIG. 24 is a diagram showing a second example (doorphone / camera selection screen) of the screen during execution of the doorphone application.
  • FIG. 25 is a diagram showing a first example (handwriting input screen) of a screen on which the message application is being executed.
  • FIG. 26 is a diagram showing a second example of a screen on which a message application is being executed (a screen for selecting a plurality of messages).
  • the message can be stored (that is, recorded) as a moving image or a still image, or can be stored (that is, recorded) as a voice.
  • By selecting the "movie" icon on the selection screen of FIG. 16, the user can display the currently broadcast television programs received by a tuner device communicatively connected to the display device 20, and can also check recorded programs provided from a moving image content storage device communicatively connected to the display device 20.
  • FIGS. 27 to 29 are diagrams showing examples of screens during execution of the moving image application. The user can check new recordings on the display device 20 installed at the entrance immediately after returning home, or check the morning news on the display device 20 installed in the bedroom while changing clothes. In other words, the range of ways to interact with information in daily life expands.
  • the user can enjoy the music provided from the music content storage device communicatively connected to the display device 20 by selecting the “music” icon on the selection screen of FIG. 16.
  • FIGS. 30 to 32 are diagrams showing examples of screens during execution of the music application. The user can enjoy music in any room where the display device 20 is installed. Also, if a separate speaker device is communicatively connected to the display device 20, music can be output from the speaker device.
  • The user can enjoy photos provided from the still image content storage device communicatively connected to the display device 20 by selecting the "photo" icon on the selection screen of FIG. 16. FIGS. 33 to 35 are diagrams showing examples of screens during execution of the photo application.
  • the user can enjoy the photograph in any room where the display device 20 is installed.
  • The display device 20 can present an action that leads to a better life, or an optimal experience or the like (specifically, content or a function) suited to the situation, triggered by everyday actions performed by the user in daily life, such as "turning on the lighting device", "turning off the power of the air conditioning device", and "heating the bath". That is, the display device 20 can present information to the user, triggered by the user's operation on a device.
  • FIG. 36 is a diagram for explaining an example of presenting information by using an operation on a remote controller of an air conditioner as a trigger.
  • When the user operates the remote controller of the air conditioner, the display device 20 uses this operation as a trigger to display advice on how to use the air conditioner so as to reduce power consumption (for example, that if the user leaves the room for only about one hour, power consumption is lower if the air conditioner is left on) ((b) in FIG. 36).
  • FIG. 37 is a diagram for explaining an example of presenting information by using an operation on a wall switch for turning on and off the lighting device as a trigger.
  • When the user operates the wall switch of the lighting device, the display device 20 uses this as a trigger to display a dimming setting suitable for the late-night time zone ((b) in FIG. 37).
  • FIG. 38 is a diagram for explaining an example of presenting information by using an operation on the water heater as a trigger.
  • When the user operates the water heater, the display device 20 uses this as a trigger to display an image recommending bathing goods that enrich bath time.
  • the display device 20 can present various information triggered by the user's operation on the device.
  • FIGS. 36 to 38 show examples of information presentation.
  • Various information can be presented by combining the devices connected to the display device 20 and various services.
  • The display device 20 can also combine device usage logs, date and time data, weather information provided from a server device (cloud server), and the like to protect the safety and health of the family, to give advice that helps with housework, and to provide recommended contents matching the family's tastes.
  • FIG. 39 is a diagram showing an example of a display screen of such other information. Hereinafter, a specific example of information that the display device 20 can present will be described.
  • For example, by being provided with meteorological information (in other words, weather information) from the server device, the display device 20 can display that stormy weather is expected and display a weather warning.
  • FIG. 40 is a diagram showing an example of a display screen of a weather warning.
  • the display device 20 can support the peace of mind of the family by displaying stormy weather and displaying a weather warning.
  • the display device 20 can display a notification screen for preventing forgetting to lock the key, based on the state of the electric lock and the time zone.
  • FIG. 41 is a diagram showing an example of a notification screen for preventing forgetting to lock the key.
  • the display device 20 can support the peace of mind of the family by giving a notification for preventing forgetting to lock the lock.
  • the display device 20 can recommend the control of the air purifier that is most suitable for pollen removal, depending on the pollen information and the home-based situation.
  • FIG. 42 is a diagram showing an example of a display screen recommending the control of the air purifier most suitable for pollen removal.
  • the display device 20 can support the health of the family by recommending control of the air purifier that is most suitable for pollen removal.
  • the display device 20 can estimate the sleep condition of the user from the usage log of the lighting device and the electric shutter, and can make a proposal regarding improvement of the sleep state.
  • FIG. 43 is a diagram illustrating an example of a display screen that proposes improvement of a sleep state.
  • the display device 20 can support the health of the family by making a proposal regarding improvement of the sleep state.
  • the display device 20 can notify the best washing timing, for example, by receiving weather information (in other words, information of weather API (Application Programming Interface)) from the server device.
  • FIG. 44 is a diagram showing an example of a notification screen for the laundry timing.
  • The display device 20 can also recommend drying laundry indoors in the season when pollen is abundant.
  • the display device 20 can support housework by notifying the laundry timing.
  • the display device 20 can make a proposal regarding clothes (thick or light clothing) based on the difference between the average temperature of the week included in the weather information provided by the server device and the temperature of the day.
  • FIG. 45 is a diagram showing an example of a display screen for making a proposal regarding clothes.
  • the display device 20 can also suggest carrying a rain gear on a rainy day.
  • the display device 20 can alert the user about ultraviolet rays during the day (morning to daytime) based on the ultraviolet ray index included in the weather information provided from the server device.
  • FIG. 46 is a diagram showing an example of a display screen for calling attention to ultraviolet rays.
  • the display device 20 can support the beauty of the user by calling attention to ultraviolet rays.
  • the display device 20 can record and present a program that is likely to suit the user's preference from the viewing history of the television and the recording log of the moving image content storage device.
  • FIG. 47 is a diagram showing an example of a display screen that presents programs that are likely to suit the user's preference.
  • the display device 20 can perform special device control on an anniversary or the like.
  • the display device 20 displays a display screen for performing a Christmas effect using the device at Christmas.
  • FIGS. 48 and 49 are diagrams each showing an example of a display screen for performing a Christmas effect. When the user's approval is obtained through such a display screen, a Christmas effect is performed by the speaker (music) and the lighting device (light) in the room where the display device 20 is installed.
  • FIG. 50 is a diagram showing the configuration of the home screen of the display device 20.
  • The home screen of the display device 20 has a three-layer structure consisting of (a) the "Apps Layer" (application layer), (b) the "Push Layer" (push layer), and (c) the "Control Layer" (control layer).
  • With the push layer as the base point, when a top-to-bottom swipe operation is performed on the touch screen 21 while the push layer screen is displayed, the screen transitions to the application layer screen.
  • When the touch screen 21 is swiped from bottom to top while the push layer screen is displayed, the screen changes to the control layer screen.
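  • The navigation between the three layers can be summarised with the minimal sketch below. The function and type names are hypothetical; only the two transitions described above (push layer to application layer on a downward swipe, push layer to control layer on an upward swipe) are modelled, and return transitions are omitted.

```typescript
// Hypothetical sketch of the three-layer home screen navigation from the push layer.
type Layer = "apps" | "push" | "control";

function nextLayer(current: Layer, swipe: "up" | "down"): Layer {
  if (current === "push") {
    // Top-to-bottom swipe -> application layer; bottom-to-top swipe -> control layer.
    return swipe === "down" ? "apps" : "control";
  }
  // Transitions back to the push layer are not covered by this sketch.
  return current;
}
```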
  • The push layer is a push-type interaction layer that forms the basis of the display device 20, handling recommendations, notifications, and the like.
  • On the push layer, information such as notifications or recommendations originating from the devices communicatively connected to the display device 20 is displayed.
  • The display device 20 detects the presence or absence of a user around the display device 20 with a human presence sensor or the like. When the user is around the display device 20 (that is, near the display device 20), information such as notifications or recommendations is displayed on the screen. On the other hand, when the user is not around the display device 20 (that is, not near it), the display device 20 does not display such information on the screen.
  • FIG. 51 is a diagram showing an example of a push layer screen when the user is not near the display device 20.
  • FIG. 52 is a diagram showing an example of a push layer screen including a plurality of icons.
  • The plurality of icons are arranged in a line in the left-right direction in the order of the time at which the information was generated; for example, the icon whose information was generated most recently is on the left.
  • When a horizontal swipe operation (in other words, a scroll operation) is performed, the icons are scrolled, while the display areas for the time, date, and weather are maintained (fixed).
  • When an icon is selected, the screen changes to the application layer screen or the control layer screen related to the information indicated by that icon.
  • FIG. 53 is a diagram showing an example of a push layer screen in such a night mode.
  • the control layer is a layer for individually controlling the devices grouped for each device category and place (room), such as turning off lighting devices in the kitchen and setting air conditioning in the whole building.
  • The user can turn a device on and off without a screen transition by tapping the device's icon included in the control layer screen; that is, the user can perform simple control of the device. The user can also perform detailed control of the device by long-tapping the device's icon.
  • The icons of the plurality of devices are displayed in different modes (for example, in different colors) according to their on and off states so that the on and off state of each device can be seen at a glance: when a device is on, its icon is displayed in color, and when it is off, its icon is displayed in monochrome (grayscale, washed out).
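  • The tap / long-tap behaviour and the state-dependent icon rendering can be sketched as follows. This is a minimal sketch under the assumptions stated above; all names are illustrative and not taken from the patent.

```typescript
// Hypothetical sketch of control-layer icon behaviour: a tap toggles the device
// on/off without a screen transition, a long tap opens detailed control, and the
// icon is rendered in colour when the device is on and in grayscale when it is off.
interface Device {
  id: string;
  isOn: boolean;
}

function iconStyle(device: Device): { saturation: number } {
  // On: full colour. Off: monochrome (grayscale / washed out).
  return { saturation: device.isOn ? 1.0 : 0.0 };
}

function onIconTap(device: Device): void {
  device.isOn = !device.isOn; // simple on/off control, no screen transition
}

function onIconLongTap(device: Device, openDetailScreen: (d: Device) => void): void {
  openDetailScreen(device); // detailed control of the device
}
```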
  • FIG. 54 is a diagram showing an example of a screen of the control layer when the device is in the off state.
  • the device icon may include characters indicating the operating state of the device.
  • The device icons are arranged in a line in the left-right direction, and when a left-right swipe operation (in other words, a scroll operation) is performed on the icon display area, the icons are scrolled.
  • The icon located at the rightmost position on the touch screen 21 in the default state is displayed with part of it cut off, so that the user can tell that the icons of the plurality of devices can be scrolled.
  • the icons of multiple devices are arranged in the order according to the place (room) where the devices are installed.
  • Next to an icon such as a scene control icon, for example, the icons of devices installed in a particular room (for example, the living room) are placed, followed by the icons of devices installed in the room closest to the living room.
  • The icon of a device installed in a room closer to the room in which the display device 20 is installed is placed further to the left (in other words, in a position that is easier to reach in the default (initial) state). That is, the icons of the plurality of devices are arranged in an order based on the installation positions of the devices (in other words, an order related to the installation positions).
  • Among the icons of the plurality of devices, those relatively on the left are on the screen in the default state, while those relatively on the right are not on the screen in the default state and appear on the screen only after scrolling.
  • a jump shortcut (characters indicating a place (space) in the building such as “all rooms”, “living room”, and “kitchen”) is provided at the lower left of the screen of the control layer. By tapping the jump shortcut, the user can position the icon of the device installed at the location indicated by the tapped character on the screen (a plurality of icons automatically scroll).
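  • The ordering by installation position and the jump shortcut can be sketched as below. The distance function, room names, and identifiers are assumptions introduced for illustration only.

```typescript
// Hypothetical sketch of control-layer icon ordering: icons are sorted so that
// devices installed closer to the room containing the display device appear
// further to the left, and a "jump shortcut" scrolls directly to a room's icons.
interface DeviceIcon {
  deviceId: string;
  room: string; // e.g. "living", "kitchen"
}

function sortIcons(
  icons: DeviceIcon[],
  distanceFromHere: (room: string) => number
): DeviceIcon[] {
  // Closer rooms first (i.e. placed on the left, reachable without scrolling).
  return [...icons].sort((a, b) => distanceFromHere(a.room) - distanceFromHere(b.room));
}

function jumpToRoom(icons: DeviceIcon[], room: string): number {
  // Returns the scroll index of the first icon belonging to the tapped room.
  const index = icons.findIndex((icon) => icon.room === room);
  return index >= 0 ? index : 0;
}
```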
  • FIG. 55 is a diagram showing an example of a screen for moving and deleting icons of devices.
  • the user can change the arrangement (left-right positional relationship) of the device icon by performing a drag-and-drop operation on the device icon on the device icon movement / deletion screen.
  • the application layer is used to check the power consumption in the building and watch a video.
  • In the application layer, multiple icons are categorized according to the action or experience the user wants.
  • new applications provided by the developer and new services provided in the future are added to the application layer.
  • the application program described in FIGS. 16 to 35 can be executed from the application layer screen.
  • What can be executed from the application layer and what can be executed from the control layer are classified as follows, for example.
  • What can be executed from the application layer is, for example, control of devices that are easy to add or change, such as so-called home appliances (devices that are not fixed to the building), and control of devices, including equipment fixed to the building such as lighting and air conditioning equipment, that operate by referring to information. What can be executed from the control layer is control of equipment. In some cases, the same control can be executed from both the application layer and the control layer.
  • the display mode of the push layer screen may be changed according to the distance between the display device 20 and a person standing in front of the display device 20. That is, the display device 20 may automatically switch the display according to the distance to the user.
  • FIG. 56 is a diagram for explaining the change of the display mode of the push layer screen.
  • The push layer screen is displayed in the normal state when the user is located within 1 m of the display device 20 (for example, while the user is operating the display device 20 or standing in front of it).
  • The push layer screen is displayed in the ambient state when the user is located 1 m or more and less than 3 m from the display device 20 (when the user is in the room where the display device 20 is installed).
  • The push layer screen is displayed in the sleep state (for example, a black image) when the user is located 3 m or more from the display device 20 (for example, when the user is not in the room where the display device 20 is installed).
  • Ambient state means a state in which the presence or absence of a notification can be confirmed even from a distant position while taking care not to insist on the notification too much.
  • the push layer screen in the ambient state is realized, for example, by displaying the time and the notification icon on a white background.
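  • The distance-dependent switching of the push layer display mode can be summarised with the minimal sketch below, assuming the 1 m and 3 m thresholds described above; the function and type names are illustrative only.

```typescript
// Hypothetical sketch of distance-based display switching for the push layer:
// normal within 1 m, ambient from 1 m to under 3 m, sleep at 3 m or more.
type PushLayerMode = "normal" | "ambient" | "sleep";

function pushLayerMode(distanceToUserMeters: number): PushLayerMode {
  if (distanceToUserMeters < 1) return "normal";   // user operating / standing in front
  if (distanceToUserMeters < 3) return "ambient";  // user somewhere in the room
  return "sleep";                                  // user not in the room (e.g. black image)
}
```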
  • the information that the user always wants to display may be changed by the user setting.
  • The information that the user always wants to display specifically includes the time, the weather, the outdoor temperature and humidity, the indoor temperature and humidity, the air quality (in other words, the degree of air contamination, such as odor and PM2.5 concentration), the power consumption, the amount of power generated by the solar power generation system, the family schedule, what day today is, and so on.
  • the information that the user always wants to display may be considered such that the user can visually recognize it even from a slightly distant position, for example, by enlarging the characters or the display.
  • FIG. 57 is a diagram for explaining a notification performed when the application layer screen is displayed. Although not shown, the same applies to the notification given when the screen of the control layer is displayed.
  • The content of the notification is superimposed and displayed on the upper part of the application layer screen. While an application is being executed, the notification is not displayed on the application execution screen (the screens shown in FIGS. 17 to 35); when the execution screen returns to the application layer screen, the notification content is superimposed on the upper part of the screen. When the area on which the notification content is superimposed is tapped, for example, the notification ends and the state returns to the state of (a) of FIG. 57.
  • a screen for controlling the device is displayed.
  • Notifications normally performed as push notifications include, for example, notification of the anniversary of the installation of the display device 20, notification of the birthday of the user or a family member, notification of an important message, notification of abnormal temperature and humidity, notification of the opening of the electric shutter, notification of the time, a notification when the electric lock is opened, and a notification regarding the delivery box.
  • The forced priority push notification is a notification that requires an immediate response from the user, like an incoming call; in other words, it is a notification that should reach the user immediately regardless of what operation is being performed (a notification more important than a normal push notification). As shown in (a) and (c) of FIG. 57, when a forced priority push notification is required, the entire screen is switched to the notification screen even during an operation.
  • Notifications performed as forced priority push notifications include, for example, a power outage (cooperation / non-cooperation) notification, a power overuse notification (peak alarm), a notification indicating a visitor detected by the intercom device ((c) in FIG. 57), and calls from other rooms (that is, from other display devices 20).
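  • The distinction between the two notification classes can be sketched as follows, assuming a banner presentation for normal push notifications and a full-screen switch for forced priority ones; all names are illustrative only.

```typescript
// Hypothetical sketch of the two notification classes: a normal push notification
// is overlaid on the upper part of the layer screen, while a forced-priority push
// notification (e.g. an intercom visitor or a call from another room) switches the
// entire screen to the notification even during an operation.
type NotificationPriority = "normal" | "forcedPriority";

interface Notification {
  priority: NotificationPriority;
  message: string;
}

interface NotificationUi {
  showBanner(message: string): void;     // superimposed on the upper part of the screen
  showFullScreen(message: string): void; // whole screen switches to the notification
}

function present(n: Notification, ui: NotificationUi): void {
  if (n.priority === "forcedPriority") {
    ui.showFullScreen(n.message);
  } else {
    ui.showBanner(n.message);
  }
}
```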
  • the display device 20 displays a tutorial (specifically, an outline of what can be done by the application and typical operations) when the application or the like is activated for the first time.
  • FIG. 58 is a diagram for explaining an example in which the tutorial of the control layer is displayed ((c) of FIG. 58) when the screen of the control layer is displayed for the first time ((a) and (b) of FIG. 58).
  • Animation (in other words, video) is used to display the tutorial.
  • a balloon or a character is displayed on the screen, or an animation for predicting an operation is displayed on the screen. Accordingly, the display device 20 can present the operation method to the user in an easy-to-understand manner.
  • By keeping the amount of information in the tutorial small (words, contents, and so on), it is possible to prevent users from feeling stress.
  • the tutorial is configured to advance to the next item by a user operation such as a swipe operation so that the user can quickly access a necessary portion, and this can prevent the user from feeling annoying.
  • Detailed functions and operating instructions are described in the online manual, and the display device 20 displays guidance so that the user can smoothly reach the place where the online manual is described.
  • FIG. 59 is a diagram showing an example of an icon for performing the whole-building air conditioning (hereinafter also simply referred to as the whole-building air conditioning icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the whole-building air conditioning.
  • FIG. 60 is a diagram showing screen transitions when the icon for the air conditioning in the entire building is operated.
  • the entire building air conditioning icon is not grouped (in other words, associated) with the "all rooms" tab on the control layer screen, for example.
  • The whole-building air conditioning is basically always on and is not suited to a single-tap on/off operation. Therefore, whether a single-tap operation or a long-tap operation is performed on the whole-building air conditioning icon, the modal window shown in FIG. 60 is displayed.
  • the entire building air conditioning may have a specification that cannot be turned off from the display device 20.
  • FIG. 61 is a diagram showing an example of the control screen for the entire building air conditioning during the automatic save operation
  • FIG. 62 is a diagram showing an example of the control screen for the whole building air conditioning in the outing mode (during outing). These control screens are displayed only when an operation is performed from the original remote controller of the air conditioning system in the whole building.
  • FIG. 63 is a diagram showing an example of an icon for controlling the air conditioner (hereinafter also simply referred to as the air conditioner icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the air conditioner.
  • FIG. 64 is a diagram showing screen transitions when the icon of the air conditioner is long-tap operated.
  • FIG. 65 is a diagram showing an example of an icon for controlling the heat pump water heater (hereinafter also simply referred to as the heat pump water heater icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the heat pump water heater.
  • FIG. 66 is a first diagram showing a screen transition when a long tap operation is performed on the icon of the heat pump water heater.
  • FIG. 67 is a second diagram showing a screen transition when a long tap operation is performed on the icon of the heat pump water heater. Similar icons are used for the gas water heater, and similar screen transitions are performed.
  • FIG. 68 is a diagram showing an example of an icon for controlling the water heater provided in the fuel cell system (hereinafter also simply referred to as the fuel cell system icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the water heater provided in the fuel cell system.
  • FIG. 69 is a diagram showing screen transitions when a long tap operation is performed on the icon of the fuel cell system.
  • FIG. 70 is a diagram showing an example of an icon for controlling the electric shutter (hereinafter also simply referred to as the electric shutter icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the electric shutter. As shown in FIG. 70, while the electric shutter is operating, the icon is animated to present the state of the electric shutter to the user in an easy-to-understand manner. A similar icon (with the title changed from "shutter" to "blind") is also used for the electric blind.
  • FIG. 71 is a diagram showing a screen transition when the icon of the electric shutter (or the electric blind) is long-tap operated.
  • a long-tap operation is performed on the electric shutter icon grouped (in other words, associated) with the “all rooms” tab on the control layer screen, a plurality of electric shutters in the building are collectively controlled.
  • the icon of the electric shutter grouped in each room tab on the screen of the control layer is long-tap operated, the electric shutter installed in the room is controlled.
  • FIG. 72 is a diagram showing an example of an icon for controlling the electric lock (hereinafter also simply referred to as the electric lock icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the electric lock.
  • FIG. 73 is a diagram showing screen transitions when a long tap operation is performed on the electric lock icon.
  • the electric lock icon grouped in the "All rooms" tab on the control layer screen is long-tapped, a plurality of electric locks in the building are collectively controlled.
  • the electric lock icon grouped in each room tab on the screen of the control layer is long-tap operated, the electric lock installed in the room is controlled.
  • FIG. 74 is a diagram showing an example of an icon for controlling the air purifier (hereinafter also simply referred to as the air purifier icon) displayed on the screen of the control layer, specifically, an example of how the icon changes according to the state of the air purifier.
  • FIG. 75 is a diagram showing screen transitions when the icon of the air purifier is long-tap operated.
  • FIG. 76 is a diagram showing screen transitions when the device falls into an irregular state.
  • FIG. 76 (a) is a diagram showing an example of a screen when the system is abnormal.
  • FIG. 76 (b) is a diagram showing an example of a screen when the device is offline.
  • irregular states are classified into, for example, a detached state, an indefinite state, and a failure state.
  • The detached state means a state in which the communication connection between the display device 20 and the device is lost.
  • The indefinite state is a state in which the display device 20 and the device are communicatively connected but it takes time to obtain a value (for example, a starting-up state).
  • the failure state means that the device itself is out of order.
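  • As a minimal sketch, classifying a device into these irregular states could look like the code below; the precedence between the states and all names are assumptions made for illustration, not taken from the patent.

```typescript
// Hypothetical sketch of classifying a device's irregular states as described above:
// detached (communication lost), indefinite (connected but values not yet available,
// e.g. while starting up), and failure (the device itself is out of order).
type IrregularState = "detached" | "indefinite" | "failure" | null;

interface DeviceStatus {
  connected: boolean;
  valuesAvailable: boolean;
  faultReported: boolean;
}

function classify(status: DeviceStatus): IrregularState {
  if (!status.connected) return "detached";
  if (status.faultReported) return "failure";
  if (!status.valuesAvailable) return "indefinite";
  return null; // device is operating normally
}
```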
  • FIG. 77 is a diagram showing such an initial setting screen and screen transition from the initial setting screen. As shown in FIG. 77, when inputting the name of the installation place on the initial setting screen ((a) of FIG. 77), the screen changes to the input screen ((b) of FIG. 77).
  • the screen ((c) in Fig. 77) notifying that the initial settings are completed is displayed.
  • When a predetermined operation (in the example of FIG. 77, an upward swipe operation) is performed, the screen changes to the push layer screen ((d) of FIG. 77).
  • the tutorial screen is automatically displayed as a modal window as described above ((e) in FIG. 77).
  • FIG. 78 is a block diagram showing a functional configuration of the display control system according to the embodiment.
  • the display control system 10 is a system that displays an image for controlling the control target devices (the first control target device 41 and the second control target device 42).
  • the display control system 10 includes a display device 20, a control device 30, a first control target device 41, a second control target device 42, a sensor 43, and a server device 50.
  • the display device 20, the control device 30, the first control target device 41, the second control target device 42, and the sensor 43 are installed in, for example, a building such as a house or the site of the building.
  • the display device 20 is a device that functions as a user interface device related to device control.
  • the display device 20 is directly or indirectly communicatively connected to a plurality of devices installed in a building.
  • the display device 20 is directly or indirectly connected to a control target device including a home appliance, a home electric appliance, or the like via the control device.
  • the display device 20 is installed, for example, on a wall inside a building.
  • the display device 20 includes a touch screen 21, a control unit 22, a storage unit 23, and a communication unit 24.
  • the touch screen 21 includes a touch panel 21a that receives a user operation and a display unit 21b that displays an image (moving image or still image).
  • the touch panel 21a is an example of an acquisition unit.
  • the touch panel 21a is, for example, a capacitance type touch panel, but may be a resistive film type touch panel.
  • the display unit 21b is, for example, a display panel such as a liquid crystal panel or an organic EL (Electro Luminescence) panel.
  • the control unit 22 performs control for operating the control target device according to the operation received by the touch panel 21a and control for displaying an image on the display unit 21b.
  • the control unit 22 is realized by, for example, a microcomputer, but may be realized by a processor.
  • the storage unit 23 is a storage device in which a control program executed by the control unit 22 is stored.
  • the storage unit 23 is realized by, for example, a semiconductor memory.
  • the communication unit 24 is a communication circuit (in other words, a communication module) for the display device 20 to communicate with the control device 30, the second controlled device 42, and the server device 50.
  • the communication unit 24 transmits a control request for the control target device to the control device 30 based on the control of the control unit 22, for example. Further, the communication unit 24 transmits a control signal for controlling the second control target device 42 to the second control target device 42 based on the control of the control unit 22.
  • the communication unit 24 receives, for example, a signal indicating the sensing result from the sensor.
  • the communication unit 24 performs wireless communication, for example, but may perform wired communication.
  • the communication standard of communication performed by the communication unit 24 is not particularly limited.
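  • The functional configuration of the display device 20 described above (touch panel as the acquisition unit, display unit, control unit, storage unit, and communication unit) could be outlined with interfaces such as the following. This is a rough sketch only; the interface and method names are assumptions, not the patent's terminology.

```typescript
// Hypothetical sketch of the functional configuration of the display device 20.
interface TouchPanel {          // acquisition unit: receives user operations
  onTouch(handler: (x: number, y: number) => void): void;
}

interface DisplayUnit {         // e.g. an LCD panel or organic EL panel
  render(imageId: string): void;
}

interface StorageUnit {         // holds the control program, e.g. semiconductor memory
  loadProgram(): Uint8Array;
}

interface CommunicationUnit {   // communicates with the control device, devices, and server
  sendControlRequest(targetDeviceId: string, payload: object): Promise<void>;
}

interface ControlUnit {         // controls devices per operations and draws the screen
  handleOperation(x: number, y: number): void;
}
```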
  • the control device 30 is, for example, a HEMS (Home Energy Management System) controller, and controls the first control target device 41 based on a control request transmitted by the communication unit 24.
  • the control device 30 is not limited to the HEMS controller having the energy management function, and may be another home controller having no energy management function or a gateway device.
  • the first control target device 41 is a device controlled by the control device 30 and is indirectly communicably connected to the display device 20. Specifically, the first control target device 41 operates based on a control signal transmitted from the control device 30.
  • the first control target device 41 includes, for example, a storage battery system, a fuel cell system, a solar power generation system, a lighting device, a heat pump water heater, an electric shutter, or a device such as an electric lock.
  • between the control device 30 and the first control target device 41, for example, wireless communication based on a communication standard such as Echonet Lite (registered trademark) is performed; a sketch of this indirect control path is given below.
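  • as a minimal illustration of the indirect control path just described, the TypeScript sketch below shows a control request received from the display device being forwarded to a first control target device; the class, method, and message names are assumptions made for illustration only, and no actual Echonet Lite API is implied.

```typescript
// Illustrative sketch only: a home controller relaying a control request
// from the display device to a first control target device.
// Names and message formats are assumptions, not the disclosed implementation.

interface ControlRequest {
  deviceId: string;   // which first control target device to operate
  command: string;    // e.g. "turnOn"
}

class HomeController {
  // Called when a control request arrives from the display device (20).
  handleControlRequest(request: ControlRequest): void {
    const frame = JSON.stringify({ command: request.command });
    this.transmit(request.deviceId, frame);   // placeholder for wireless transmission
  }

  private transmit(deviceId: string, frame: string): void {
    console.log(`send to ${deviceId}: ${frame}`);
  }
}

// Example: the display device asks the controller to switch a light on.
new HomeController().handleControlRequest({ deviceId: "lighting-1", command: "turnOn" });
```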
  • the second control target device 42 is a device controlled by the display device 20, and is directly communicably connected to the display device 20.
  • specifically, the second control target device 42 operates based on a control signal transmitted from the display device 20.
  • the second control target device 42 includes, for example, a device such as an intercom device or an air conditioning device.
  • the device described as the first control target device 41 may be included in the second control target device 42, and the device described as the second control target device 42 may be included in the first control target device 41.
  • the equipment to be controlled may include household equipment or household appliances.
  • the sensor 43 senses the environment inside the building and transmits a signal indicating the sensing result to the display device 20. Accordingly, the display device 20 can display the environment inside the building. In addition, the display device 20 can also control the control target device based on the environment in the building.
  • the sensor 43 includes a pressure sensor, a temperature sensor, a humidity sensor, an illuminance sensor, or the like.
  • the server device 50 is a cloud server. Various services are realized by the cooperation of the server device 50 and the display device 20.
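  • to summarize the functional configuration described above, the following TypeScript sketch models the display device (touch screen, control unit, storage unit, communication unit) and its counterparts as interfaces; every name here is an illustrative assumption, not part of the disclosure.

```typescript
// Illustrative sketch of the functional configuration; all names are assumptions.

interface ContactInfo { startX: number; startY: number; endX: number; endY: number; }
interface DisplayObject { label: string; }
interface DisplayLayer { name: "push" | "control" | "usage"; objects: DisplayObject[]; }

// Touch screen 21 = touch panel 21a (acquisition unit) + display unit 21b.
interface TouchPanel {
  onContact(handler: (contact: ContactInfo) => void): void;
}
interface DisplayUnit {
  show(layer: DisplayLayer): void;
  overlay(notification: DisplayObject): void;
}

// Communication unit 24: requests to the control device 30 (for first control
// target devices), direct signals to second control target devices, sensor input.
interface CommunicationUnit {
  sendControlRequest(deviceId: string, command: string): void;
  sendControlSignal(deviceId: string, signal: string): void;
  onSensingResult(handler: (value: number) => void): void;
}

// Storage unit 23 holds the control program executed by the control unit 22.
interface StorageUnit { controlProgram: string; }

// Control unit 22 ties input, display, storage, and communication together.
interface ControlUnit {
  touchPanel: TouchPanel;
  displayUnit: DisplayUnit;
  storage: StorageUnit;
  communication: CommunicationUnit;
}
```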
  • FIG. 79 is a flowchart of the operation example 1 of the display control system 10.
  • the operation in FIG. 79 corresponds to the normal push notification case of FIG. 57 based on the screens of FIG. 50, and is described below with reference to FIGS. 50, 57, and 79.
  • the control unit 22 of the display device 20 displays the first display layer on the touch screen 21 (more specifically, the display unit 21b) (S11).
  • the first display layer corresponds to, for example, the push layer in (b) of FIG. 50.
  • the first display layer has display objects belonging to the first category.
  • the display object belonging to the first category is a rectangular icon displayed on the push layer screen in FIG. 50 (b).
  • the first category is, for example, a user interface for notifying the user of the respective states of the control target devices (in other words, a plurality of devices).
  • the touch panel 21a acquires information regarding the first touch on the touch screen 21 performed while the first display layer is displayed on the touch screen 21 (S12).
  • the control unit 22 determines whether the information regarding the first contact includes movement of the contact position in the first direction (S13).
  • the movement of the contact position in the first direction is, for example, movement of the contact position based on an upward swipe operation.
  • when the control unit 22 determines that the information regarding the first contact includes movement of the contact position in the first direction (Yes in S13), the control unit 22 switches the image displayed on the display unit 21b from the first display layer to the second display layer (S14).
  • the second display layer corresponds to the control layer in (c) of FIG. 50, for example.
  • the second display layer has display objects that belong to a second category different from the first category. Specifically, the display object belonging to the second category is a rectangular icon displayed on the screen of the control layer in (c) of FIG. 50.
  • the second category is a user interface for controlling each state of the control target device (a plurality of devices).
  • the control unit 22 acquires the state of the control target device via the communication unit 24 (S15), and determines whether the state of the control target device has changed (S16).
  • when the control unit 22 determines that the state of the control target device has changed (Yes in S16), the control unit 22 notifies the user of the change in the state of the device (S17).
  • the display unit 21b superimposes a display object notifying the change in the state of the device on a part of the second display layer being displayed. That is, when the state of the control target device changes while the second display layer is displayed on the touch screen 21, the control unit 22 displays a display object notifying the change on the second display layer.
  • if it is determined in step S13 that the information regarding the first contact does not include movement of the contact position in the first direction (No in S13), the processes from step S14 onward are not performed. Likewise, if it is determined in step S16 that the state of the control target device has not changed (No in S16), the process of step S17 is not performed.
  • the display control method as described above can notify the user of the device status.
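  • as a concrete illustration of the flow of S11 to S17 described above, the TypeScript sketch below switches from the first (push) layer to the second (control) layer on an upward swipe and overlays a notification when a device state change is detected; the class, type, and function names are assumptions for illustration, not the disclosed implementation.

```typescript
// Sketch of operation example 1 (S11-S17); all names are illustrative assumptions.

type Layer = "first" | "second";

interface Contact { startY: number; endY: number; }
interface DeviceState { id: string; value: string; }

class DisplayController {
  private currentLayer: Layer = "first";              // S11: first display layer shown
  private lastStates = new Map<string, string>();

  // S12-S14: switch layers when the contact moves in the first direction (upward swipe).
  handleContact(contact: Contact): void {
    const movedUp = contact.endY < contact.startY;    // first direction = upward
    if (this.currentLayer === "first" && movedUp) {
      this.currentLayer = "second";                   // S14: show the second display layer
    }
  }

  // S15-S17: when a device state changes while the second layer is shown, overlay a notification.
  handleStateUpdate(state: DeviceState): void {
    const changed = this.lastStates.get(state.id) !== state.value;   // S16
    this.lastStates.set(state.id, state.value);
    if (changed && this.currentLayer === "second") {
      this.overlayNotification(`${state.id}: ${state.value}`);       // S17
    }
  }

  private overlayNotification(text: string): void {
    console.log(`notify on second display layer: ${text}`);
  }
}
```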
  • the control unit 22 may shift from the second display layer to the usage screen of the control target device when the state of the control target device changes while the second display layer is displayed on the touch screen 21.
  • FIG. 80 is a flowchart of an operation example 2 of such a display control system 10.
  • the operation example 2 is an operation corresponding to the case of the forced priority push notification in FIG.
  • when the control unit 22 determines that the state of the control target device has changed (Yes in S16), the control unit 22 shifts to the use screen of the control target device whose state has changed (S18).
  • the control unit 22 shifts to the intercom use screen when, for example, the state of the intercom changes.
  • the use screen of the intercom is displayed on the full screen, for example.
  • when an immediate response by the user is required, such as for an incoming intercom call, the display control method described above can display the usage screen and prompt the user to respond immediately.
  • operation example 1 and operation example 2 may be switched depending on what kind of device the control target device whose state has changed is (or what kind of state change has occurred), as in the sketch below.
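  • the following TypeScript sketch shows one possible way to choose between the two behaviors; the device classification used here (treating the intercom as requiring an immediate response) is an illustrative assumption.

```typescript
// Sketch of switching between operation examples 1 and 2 by device kind;
// the classification rule below is an illustrative assumption.

type NotificationPolicy = "overlayNotification" | "openUsageScreen";

function policyFor(deviceKind: string): NotificationPolicy {
  // Devices that demand an immediate response (e.g. an intercom call) jump
  // straight to the usage screen (operation example 2); other devices only
  // get an overlaid notification (operation example 1).
  return deviceKind === "intercom" ? "openUsageScreen" : "overlayNotification";
}

function onStateChange(deviceKind: string): void {
  if (policyFor(deviceKind) === "openUsageScreen") {
    console.log("shift to the full-screen usage screen (S18)");
  } else {
    console.log("overlay a state-change notification (S17)");
  }
}

onStateChange("intercom");        // -> usage screen
onStateChange("air-conditioner"); // -> overlay notification
```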
  • “while the second display layer is displayed on the touch screen 21” includes the time when the second display layer is actually displayed on the touch screen 21 in a manner that can be visually recognized by the user.
  • it also includes the case where the sleep state (a black image) is set while the second display layer is displayed on the touch screen 21 (that is, while the second display layer is displayed on the touch screen 21 in a manner invisible to the user). For example, the touch screen 21 enters the sleep state while the user is out of the house.
  • the display object notifying the change of the state of the intercom is superimposed on a part of the second display layer.
  • the display object that notifies the change of the intercom state may be superimposed on the screen in the sleep state, or may be actually displayed when the user returns home and performs an operation such as canceling the sleep state.
  • a display object for notifying the change of the state of the intercom may be superimposed on the second display layer.
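  • one way to realize this sleep-state behavior is sketched below in TypeScript: a state-change notification is either superimposed immediately or queued until the sleep state is canceled; whether to defer is an illustrative design choice, and the names are assumptions.

```typescript
// Sketch of handling a state change while the touch screen is in the sleep state;
// immediate versus deferred display is an illustrative choice, not the disclosed behavior.

class SleepAwareNotifier {
  private sleeping = false;
  private pending: string[] = [];

  setSleep(on: boolean): void {
    this.sleeping = on;
    if (!on) {
      // When the sleep state is canceled (e.g. the user returns home),
      // show the notifications that were queued while asleep.
      this.pending.forEach(text => this.showOnSecondLayer(text));
      this.pending = [];
    }
  }

  notify(text: string): void {
    if (this.sleeping) {
      this.pending.push(text);        // keep the notification until wake-up
    } else {
      this.showOnSecondLayer(text);   // superimpose on the second display layer now
    }
  }

  private showOnSecondLayer(text: string): void {
    console.log(`overlay on second display layer: ${text}`);
  }
}
```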
  • the processing executed by a specific processing unit may be executed by another processing unit.
  • part or all of the processing performed by the display device may be performed by the control device or the server device.
  • the communication method between the devices in the above embodiment is not particularly limited. Wireless communication may be performed between devices, or wired communication may be performed. In addition, wireless communication and wired communication may be combined between the devices. Further, in the above embodiment, when two devices communicate with each other, a relay device (not shown) may be interposed between the two devices.
  • each component may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory.
  • each component may be realized by hardware.
  • each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole or may be separate circuits. Further, each of these circuits may be a general-purpose circuit or a dedicated circuit.
  • the present invention may be realized as a system, a device, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM.
  • the system, the device, the method, the integrated circuit, the computer program, and the recording medium may be implemented in any combination.
  • the present invention may be realized as a display control method executed by a computer such as a display control system, or may be realized as a program for causing a computer to execute such a display control method. Further, the present invention may be realized as a computer-readable non-transitory recording medium in which such a program is recorded.
  • the display control system is realized by a plurality of devices, but it may be realized as a single device.
  • the display control system may be realized, for example, as a single device corresponding to the display device, a single device corresponding to the control device, or a single device corresponding to the server device.
  • the components included in the display control system may be distributed to the plurality of devices in any way.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display control method includes: a step (S12) of acquiring information relating to a first contact on a touch screen made while a first display layer is displayed on the touch screen; a step (S14) of switching from the first display layer to a second display layer if the information relating to the first contact includes movement of the contact position in a first direction; a step (S15) of acquiring the state of a device; and a step (S17) in which, if the state of the device changes while the second display layer is displayed on the touch screen, a display object notifying the change is displayed in the second display layer.
PCT/JP2019/026380 2018-10-25 2019-07-02 Procédé de commande d'affichage et système de commande d'affichage WO2020084836A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020552518A JPWO2020084836A1 (ja) 2018-10-25 2019-07-02 表示制御方法、及び、表示制御システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862750525P 2018-10-25 2018-10-25
US62/750,525 2018-10-25

Publications (1)

Publication Number Publication Date
WO2020084836A1 true WO2020084836A1 (fr) 2020-04-30

Family

ID=70331501

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026380 WO2020084836A1 (fr) 2018-10-25 2019-07-02 Procédé de commande d'affichage et système de commande d'affichage

Country Status (2)

Country Link
JP (1) JPWO2020084836A1 (fr)
WO (1) WO2020084836A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102613548B1 (ko) * 2022-12-14 2023-12-13 송현민 창문 대체형 디스플레이 장치

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014067123A (ja) * 2012-09-25 2014-04-17 Sharp Corp 情報処理装置
JP2015530664A (ja) * 2012-09-07 2015-10-15 グーグル インコーポレイテッド 電子デバイス上のスタッカブル・ワークスペース
JP2017123564A (ja) * 2016-01-07 2017-07-13 ソニー株式会社 制御装置、表示装置、方法及びプログラム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2940568A4 (fr) * 2012-12-28 2016-08-24 Sony Corp Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
JP6244957B2 (ja) * 2014-02-10 2017-12-13 凸版印刷株式会社 表示制御装置、表示制御方法及びプログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015530664A (ja) * 2012-09-07 2015-10-15 グーグル インコーポレイテッド 電子デバイス上のスタッカブル・ワークスペース
JP2014067123A (ja) * 2012-09-25 2014-04-17 Sharp Corp 情報処理装置
JP2017123564A (ja) * 2016-01-07 2017-07-13 ソニー株式会社 制御装置、表示装置、方法及びプログラム

Also Published As

Publication number Publication date
JPWO2020084836A1 (ja) 2021-09-30

Similar Documents

Publication Publication Date Title
US11372433B2 (en) Thermostat user interface
US9952573B2 (en) Systems and methods for a graphical user interface of a controller for an energy-consuming system having spatially related discrete display elements
US10175668B2 (en) Systems and methods for energy-efficient control of an energy-consuming system
US10778461B2 (en) Appliance control system, home controller, remote control method, and recording medium
CN113203168A (zh) 建筑物自动化控制系统
US11372530B2 (en) Using a wireless mobile device and photographic image of a building space to commission and operate devices servicing the building space
JP5882391B2 (ja) エネルギーマネジメントコントローラ、エネルギーマネジメントシステム、エネルギーマネジメント方法、及び、プログラム
JP7122615B2 (ja) 表示制御方法、及び、表示制御システム
WO2020084836A1 (fr) Procédé de commande d'affichage et système de commande d'affichage
JP7042454B2 (ja) 情報端末、及び、情報端末の操作プログラム
JP7033724B2 (ja) 情報端末、及び、操作支援プログラム
WO2020084835A1 (fr) Procédé de commande d'affichage et système de commande d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19875626

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020552518

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19875626

Country of ref document: EP

Kind code of ref document: A1