CN109835784B - Elevator system - Google Patents

Elevator system

Info

Publication number
CN109835784B
CN109835784B (application CN201810549553.0A)
Authority
CN
China
Prior art keywords
touch
control unit
touch panel
panel
operation panel
Prior art date
Legal status
Active
Application number
CN201810549553.0A
Other languages
Chinese (zh)
Other versions
CN109835784A (en)
Inventor
菊山贤一
日野原隆正
小野田洋
Current Assignee
Toshiba Elevator and Building Systems Corp
Original Assignee
Toshiba Elevator Co Ltd
Priority date
Filing date
Publication date
Application filed by Toshiba Elevator Co Ltd
Publication of CN109835784A
Application granted
Publication of CN109835784B

Landscapes

  • Elevator Control (AREA)
  • Indicating And Signalling Devices For Elevators (AREA)

Abstract

The invention provides an elevator system having a touch-panel operation panel that can be used by both sighted and visually impaired persons. One embodiment relates to an elevator system including: a touch-panel operation panel that displays a destination floor button for each floor; a control unit that controls the operation panel; and a speaker that outputs sound under the control of the control unit. When a 1st touch by the user is detected in a 1st area on the touch panel, the control unit outputs from the speaker a sound that guides the touch position of the 1st touch toward a 2nd area in which a predetermined destination floor can be registered, and when a predetermined action by the user is detected in the 2nd area, registers the floor corresponding to the 2nd area as the destination floor.

Description

Elevator system
The present application is based on and claims priority from Japanese Patent Application No. 2017-227216 (filed November 27, 2017), which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to an elevator system including a touch panel type operation panel in a car.
Background
Generally, an operation panel for moving the elevator to a desired floor is provided in the car of an elevator. The user moves the elevator to a target floor by pressing the destination floor button for that floor on the operation panel. In recent years, touch-panel operation panels have made greater operability possible than arrangements of physical buttons.
Documents of the prior art
Patent document
Patent document 1: japanese patent No. 5115554
Patent document 2: japanese patent laid-open No. 2001-250134
Disclosure of Invention
Technical problem to be solved by the invention
As described above, when the operation panel is of the touch-panel type, its operating surface is a smooth plane, which makes it difficult for, say, a visually impaired person to operate. One conceivable remedy is to install a dedicated operation panel or device operable by visually impaired persons, thereby ensuring their operability: for example, a separate keypad with numeric keys 0 to 9 for entering numbers, or a separate handset with numeric keys 0 to 9 that allows a destination floor to be entered through voice guidance and keypad operation.
However, space in the car is often limited, making it difficult to install a separate device. Moreover, providing two kinds of operation panels increases installation cost. It is therefore preferable to accommodate visually impaired users with a single touch-panel operation panel.
The present invention has been made to solve the above problems, and its object is to provide an elevator system with a touch-panel operation panel usable by both sighted and visually impaired persons.
Means for solving the problems
One embodiment relates to an elevator system including: a touch-panel operation panel that displays a destination floor button for each floor; a control unit that controls the operation panel; and a speaker that outputs sound under the control of the control unit. When a 1st touch by the user is detected in a 1st area on the touch panel, the control unit outputs from the speaker a sound that guides the touch position of the 1st touch toward a 2nd area in which a predetermined destination floor can be registered, and when a predetermined action by the user is detected in the 2nd area, registers the floor corresponding to the 2nd area as the destination floor.
Drawings
Fig. 1 is a diagram showing an example of the configuration of an elevator system according to embodiment 1.
Fig. 2 is a diagram showing an example 1 of the configuration of the operation panel according to the embodiment.
Fig. 3 is a flowchart showing an example of the processing of the operation control unit according to the embodiment.
Fig. 4 is a diagram showing an example 2 of the configuration of the operation panel according to the embodiment.
Fig. 5 is a diagram showing an example of the configuration of an elevator system according to embodiment 2.
Fig. 6 is a flowchart showing an example of the processing of the operation control unit according to the embodiment.
Fig. 7 is a diagram showing an example 2 of the configuration of the operation panel according to the embodiment.
Fig. 8 is a diagram showing an example of the configuration of the operation panel according to embodiment 3.
Detailed Description
Visually impaired people often grasp their surroundings and search for a target object by touch, feeling the objects around them. Observation of visually impaired persons riding elevators shows that most first locate the operation panel by groping along the wall from the door side, then find the destination floor buttons on the panel and press the button for their target floor.
Therefore, to enable operation by a visually impaired person even when the operation panel is of the touch-panel type, a mechanism is needed that guides such a person so that, while touching the touch panel of the operation panel, he or she can find the position of the target floor button displayed on the panel's display device.
Further, because a visually impaired person searches for the operation panel by sweeping a hand along the wall, a fast or overly large sweep may carry the hand onto the panel and press one of the destination floor buttons displayed on the touch panel. In that case, a floor the person does not intend to visit may be registered as the destination floor. This is especially likely when, for example, destination floor buttons cover the entire screen of the touch panel.
The embodiments described below present a touch-panel operation panel, and an elevator system equipped with it, that prevent a visually impaired person from registering an unintended destination floor as described above without impairing convenience for sighted users. The embodiments are described with reference to the drawings.
In each embodiment, "user" is a generic term for both sighted and visually impaired persons using the elevator.
(embodiment 1)
Fig. 1 is a diagram showing a configuration of an elevator system according to embodiment 1. The description assumes a single-car elevator system, but the invention is also applicable to, for example, a multi-car elevator system with a plurality of cars in one hoistway, a group control system with a plurality of cars, and the like.
In the elevator system of the present embodiment, the car door 2 is openably and closably attached to the entrance of the car 1. Beside the car door 2, a touch panel type operation panel 3 is provided. The operation panel 3 is composed of a display device and a transparent touch panel mounted on the display device. The operation panel 3 displays the destination layer buttons 3a corresponding to the respective floors of the building so that the destination layer buttons 3a can be touch-operated. A touch panel is a pointing device capable of detecting a position on a screen touched by an object (e.g., a finger of a user).
Examples of the touch panel include a resistive film type and a capacitive type, and the touch panel is not particularly limited to these. The Display device is configured by, for example, an LCD (Liquid Crystal Display), an Organic EL (Organic Electro-luminescence), or the like, but is not particularly limited thereto.
The operation panel 3 is connected to an elevator control device (EL control device) 9 via a touch panel control device (T/P control device) 7. When a user in the car 1 operates (touches) the destination button 3a on an arbitrary floor on the operation panel 3, an operation signal of the destination button 3a is sent to the elevator control device 9 via the touch panel control device 7.
In addition to the destination floor buttons 3a, the operation panel 3 may be provided with a door-open button, a door-close button, an emergency call button, and the like (not shown). The door-open button instructs the opening of the car door 2; the door-close button instructs its closing. The emergency call button is used to communicate with the outside (a building monitoring room, a remote elevator monitoring center, or the like). Because these buttons matter in an emergency and must operate reliably, they are preferably physical push buttons provided outside the touch-panel operation panel 3.
A display 6 and a speaker 5 are provided in the vicinity of the operation panel 3 in the car 1. The display 6 and the speaker 5 are connected to an elevator control device 9.
The display 6 includes a display device. On the display device, various messages including the traveling direction, the current position, and the like are displayed as an in-car indicator.
The speaker 5 is used to broadcast various voice messages to users in the car 1. The speaker 5 need not be installed in the car; it may, for example, be a device worn by the user. The speaker 5 outputs sound according to instructions from the sound output unit 93b. Unlike in fig. 1, the speaker 5 may be mounted on the same surface as, or near, the operation panel 3; it can be installed anywhere in the car 1, such as on the ceiling, floor, or side walls. Preferably, the speaker 5 is positioned so that its sound appears to the user to come from the operation panel 3, allowing a visually impaired person to locate the panel by ear.
The touch panel control device 7 can detect, for example, the touch position (coordinates) and the pressed state when the user touches the operation panel 3 with an object, and can detect the object separating (releasing) from the touched operation panel 3. It may also be able to detect the speed, intensity (pressure), and the like of the touch.
The elevator control device 9 is configured by a computer having a CPU, ROM, RAM, and the like, and performs control of the entire elevator including operation control of the car 1. The elevator control device 9 includes, for example, an operation control unit 91, a registration control unit 92, an operation control unit 93, and the like as functions related to the touch panel type operation panel 3.
The operation control unit 91 controls the operation of the car 1 based on information on a destination floor (car call) registered in a destination floor management table T1 described later.
The registration control unit 92 holds the destination floor management table T1. When the user operates the destination floor button 3a of any floor, the registration control unit 92 registers the corresponding destination floor in table T1. When the user performs an operation to cancel a registered destination floor, the registration control unit 92 cancels that registration; it first determines whether cancellation is possible based on, for example, the number of already registered floors and the timing of the car 1's leveling control, and only when cancellation is possible removes the floor from table T1.
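Purely as an illustration of the registration flow just described, a minimal sketch of the destination floor management table T1 and the registration and cancellation logic might look as follows; the class name, the API, and the cancellation policy shown here are assumptions, since the patent specifies the behavior only in prose:

```python
# Illustrative sketch of registration control unit 92 and table T1.
# All names and the cancellation condition are hypothetical.

class RegistrationControl:
    def __init__(self, floors):
        self.floors = set(floors)   # floors served by the car
        self.table = set()          # destination floor management table T1

    def register(self, floor):
        """Register a destination floor (car call) if the floor exists."""
        if floor not in self.floors:
            return False
        self.table.add(floor)
        return True

    def cancel(self, floor, car_is_leveling=False):
        """Cancel a registered floor; refused while the car is already
        leveling to it (one plausible reading of the timing check)."""
        if car_is_leveling or floor not in self.table:
            return False
        self.table.remove(floor)
        return True
```

A cancellation requested while the car is leveling to the floor is simply refused here, which is only one possible interpretation of the timing condition described above.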
The operation control unit 93 recognizes, for example, a user operation on a touch panel or the like of the operation panel 3, and controls, based on the operation, the display contents of the display device of the operation panel 3 or the display 6, the contents of the sound output from the speaker 5, and the like.
The operation control unit 93 includes, for example, a display control unit 93a, an audio output unit 93b, and a touch panel (T/P) control unit 93 c.
The display control section 93a performs display control of the display device provided in the operation panel 3. More specifically, the display control unit 93a determines, for example, a display area of the destination layer button 3a on an arbitrary layer on the operation panel 3, and displays each destination layer button 3a based on the display area.
The audio output unit 93b transmits audio data for guiding the user to the speaker 5 based on the operation state of the operation panel 3 by the user. The sound output unit 93b may generate the sound data based on the operation state of the operation panel 3, or may select sound data stored in advance in a memory in the elevator control device 9. The sound data may be data representing human voice, or may be short sound having a meaning specific to the visually impaired person, a combination of sounds, or the like.
The touch panel control unit 93c detects or recognizes the operation of the operation panel 3 by the user based on data or the like received from the touch panel control device 7. More specifically, the touch panel control unit 93c detects which region of the operation panel 3 has been touched by the user (for example, whether or not the region is within the destination layer button 3a) based on the data indicating the touched position received from the touch panel control device 7, the display position of the destination layer button 3a obtained from the display control unit 93a, and the like. The touch panel control unit 93c detects the presence or absence of a user operation on the touch panel based on data indicating the intensity of touch, a signal indicating pressing and releasing, and the like received from the touch panel control device 7. In addition to the above, the touch panel control unit 93c is preferably capable of recognizing various touch operations performed by the user on the touch panel via the touch panel control device 7.
It is preferable that data, commands, and the like can be freely transmitted and received between the processing units included in the operation control unit 93.
In the present embodiment, the touch panel control unit 93c is included in the operation control unit 93 for explanation, but the touch panel control unit 93c may be included in the touch panel control device 7. In this case, the elevator control device 9 may receive information of the destination floor determined by the touch panel control device 7, and perform registration of the destination floor, display control of the display device, audio output processing, and the like based on the information.
Fig. 2 is a diagram showing an example 1 of the configuration of the operation panel 3 according to embodiment 1.
The display control section 93a of the operation control section 93 displays the destination floor buttons 3a on the display device of the operation panel 3 as described above. In the example of fig. 2, the destination floor buttons 3a consist of buttons 3a_A1 to 3a_A5 representing floors 1 to 5.
The touch panel control unit 93c handles the touch panel of the operation panel 3 by dividing the touch panel into a registrable area Y in which the user can register the destination layer and an unregisterable area N in which the user cannot register the destination layer. When the user performs a registration operation in the registrable area Y, the registration control unit 92 can register a layer corresponding to the destination layer button displayed at the position where the operation is performed as the destination layer. Further, when the user touches the unregisterable area, or performs a stroking operation on the touch panel, for example, the operation control unit 93 executes a process of urging (guiding) the visually impaired person to register the destination layer.
In the example of fig. 2, the destination floor buttons 3a_A1 to 3a_A5 are each registrable areas Y. A sighted user, who can see the buttons, registers a target floor by performing the registration operation directly on the corresponding button. A visually impaired user, by contrast, first touches the unregisterable region N while groping from the edge of the operation panel 3, is guided by voice to one of the buttons 3a_A1 to 3a_A5 (the registrable area Y), and then performs the registration operation on that button to register the target floor.
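The division of the touch panel into registrable area Y (the circular buttons of fig. 2) and unregisterable area N amounts to a point-in-circle test; the coordinates and radii below are made-up example values, not taken from the patent:

```python
# Illustrative classification of a touch point for the fig. 2 layout:
# circular destination floor buttons form area Y, the rest is area N.
import math

# floor -> (center_x, center_y, radius); hypothetical geometry, with
# floor 5 at the top of a single vertical column.
BUTTONS = {floor: (60, 40 + 70 * (5 - floor), 25) for floor in range(1, 6)}

def classify_touch(x, y):
    """Return ('Y', floor) inside a button, or ('N', None) elsewhere."""
    for floor, (cx, cy, r) in BUTTONS.items():
        if math.hypot(x - cx, y - cy) <= r:
            return ("Y", floor)
    return ("N", None)
```

A touch in area N would start the guidance flow of fig. 3, while a touch in area Y would be treated as a sighted user's registration operation.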
Note that, although fig. 2 shows an example in which the destination layer buttons are arranged in a single row in the vertical direction, the same configuration can be applied to a case in which the destination layer buttons are arranged in a plurality of rows in the vertical direction.
Fig. 3 is a flowchart showing an example of processing of the operation control unit 93 according to the present embodiment.
In step S101, the touch panel control unit 93c of the operation control unit 93 determines whether the registrable area Y has been touched on the touch panel of the operation panel 3. If so, the touch panel control unit 93c treats the touch as an operation by a sighted user (that is, a sighted user's destination floor registration operation) and advances to step S108.
In step S108, the touch panel control unit 93c determines the layer corresponding to the destination layer button 3a displayed at the position of the touched registrable area Y, and causes the registration control unit 92 to register the layer.
On the other hand, when it is determined in step S101 that the registrable area Y is not touched, the touch panel control section 93c recognizes the touch operation as an operation performed by the visually impaired person.
In step S102, the sound output unit 93b outputs the 1st voice guidance, which informs the user, for example, that the operation panel 3 is of the touch-panel type.
In step S103, the audio output unit 93b outputs the 2nd voice guidance, which guides the user toward the registrable area Y (the destination floor buttons 3a) based on the touched position. More specifically, when the touched position enters any destination floor button 3a, the audio output unit 93b outputs the corresponding floor name. When the current touch position is far from any destination floor button 3a, the sound output unit 93b may output a sound telling the user in which direction from the current position a destination floor button 3a lies. Besides floor names, the sound output unit 93b may also announce the stores, counters, services, products, and the like associated with each floor.
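The 2nd voice guidance just described, which announces a floor name when the touch is on a button and otherwise a direction hint toward the nearest button, might be sketched as follows; the layout values and the phrasing of the announcements are illustrative assumptions:

```python
# Illustrative 2nd voice guidance for a vertical column of buttons.
# y grows downward; button geometry and message text are hypothetical.

def guidance_text(touch_y, button_centers_y, button_half_height=25):
    """button_centers_y maps floor -> y coordinate of its button center."""
    # On a button: announce the floor name (step S103 case 1).
    for floor, cy in button_centers_y.items():
        if abs(touch_y - cy) <= button_half_height:
            return f"Floor {floor}"
    # Off all buttons: tell the user which way the nearest one lies.
    nearest = min(button_centers_y,
                  key=lambda f: abs(touch_y - button_centers_y[f]))
    direction = "up" if button_centers_y[nearest] < touch_y else "down"
    return f"Move your hand {direction} for floor {nearest}"
```

The actual system would hand such a string (or prerecorded equivalent) to the speaker 5 via the sound output unit 93b.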
In step S104, the touch panel control unit 93c determines whether or not the touch position is stationary in the registrable region Y. When the touch position is not still, that is, while the user continues to search for the destination-layer button 3a corresponding to the target layer, the operation control unit 93 continues the guidance of the user by the 2 nd voice guidance (step S103).
Note that the stationary state does not necessarily require the moving speed of the touched position to be 0, and indicates that the touch speed is changed to such an extent that the operation control unit 93 can determine that the hand of the user touching the operation panel 3 is stopped.
On the other hand, when it is determined in step S104 that the touched position is stationary, the sound output unit 93b outputs the 3rd voice guidance (step S105). The 3rd voice guidance tells the user how to perform the registration operation. More specifically, the audio output unit 93b announces that releasing the hand at the current touch position will register the corresponding floor as the destination floor.
The registration operation may also be performed by a method other than releasing the hand from the touch panel. For example, the touch panel control unit 93c may judge that the registration operation has been performed when a release is followed by a renewed touch. This prevents a destination floor from being registered when the user releases the panel unintentionally.
The touch panel control unit 93c may determine that the user has performed the registration operation when the touched position is stationary on the registrable area Y for a predetermined time, for example. For example, when the touch panel control device 7 can detect the intensity of the touch, or when a pressure equal to or higher than a predetermined intensity is detected in the registrable area Y, that is, when the user presses the touch panel, the touch panel control unit 93c may determine that the user has performed the registration operation. The registration operation is not limited to the above, and the touch panel control unit 93c may recognize another touch operation as the registration operation.
In step S106, the touch panel control unit 93c determines whether or not the user has performed a registration operation in the registrable area Y. When the registration operation is detected, the process proceeds to step S108. In step S108, as described above, the touch panel control unit 93c causes the registration control unit 92 to register the layer corresponding to the position of the registrable area Y. After registration, the process ends. When the registration is completed, the sound output unit 93b may output a sound indicating that the registration is completed. The operation control unit 93 may indicate the completion of registration to the user by vibrating the operation panel 3 instead of the audio data.
On the other hand, when the registration operation is not detected, the touch panel control unit 93c determines whether or not the touched position is deviated from the registrable region Y (step S107). When the registrable area Y is deviated, it is determined that the user has searched for the destination-layer button 3a of another layer, and the process returns to step S103. As long as the touched position remains in the registrable area Y, the touch panel control section 93c waits for detection of the registration operation.
Further, after step S101, when the object (the user's hand or the like) is detected leaving the touch panel through an action other than the registration operation described above, the operation control unit 93 may end the touch-position guidance of steps S102 to S107 (step S109). Thus, when a sighted person erroneously touches the unregisterable region N, the unnecessary voice guidance can be stopped by simply releasing the touch panel at once.
In steps S103 to S104, the operation control unit 93 may announce floor names as the touch position moves vertically at any position on the touch panel, regardless of the shape or arrangement of the destination floor buttons. In this case, the sound output unit 93b may, for example, announce in order the floors lying in the direction of movement, taking the floor corresponding to the current touch position as the reference.
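The overall flow of fig. 3 (steps S101 through S109) can be condensed into a small state machine. The sketch below is a simplification that omits the stationarity check of step S104 and uses invented state and event names; the patent describes the flow only in prose and a flowchart:

```python
# Simplified state machine for the fig. 3 flow. States and events are
# hypothetical; 'Y'/'N' are the registrable/unregisterable areas.

class PanelFlow:
    def __init__(self):
        self.state = "IDLE"
        self.registered = None      # destination floor once registered
        self.candidate = None       # floor under the current touch
        self.announcements = []     # stand-in for the speaker output

    def touch(self, area, floor=None):
        if self.state == "IDLE":
            if area == "Y":                      # S101 yes: sighted user
                self.state = "REGISTERED"        # S108: register at once
                self.registered = floor
            else:                                # S101 no: start guidance
                self.announcements.append("panel is touch type")  # S102
                self.state = "GUIDING"
        elif self.state == "GUIDING" and area == "Y":
            self.announcements.append(f"floor {floor}")           # S103
            self.state = "ON_BUTTON"
            self.candidate = floor

    def leave_button(self):
        if self.state == "ON_BUTTON":            # S107: left area Y
            self.state = "GUIDING"

    def release(self):
        if self.state == "ON_BUTTON":            # S106: release registers
            self.state = "REGISTERED"            # S108
            self.registered = self.candidate
        elif self.state == "GUIDING":            # S109: abort guidance
            self.state = "IDLE"
```

A fuller implementation would add the stationarity test of step S104 and the 3rd voice guidance of step S105 before accepting the release.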
Fig. 4 is a view showing a 2 nd example of the configuration of the operation panel 3 according to the present embodiment.
In example 1 of the configuration of the operation panel 3 described with fig. 2, the destination floor buttons 3a_A1 to 3a_A5 (i.e., the registrable area Y) are circular and arranged vertically, so the unregisterable area N is larger than the registrable area Y. When the operation panel 3 is wide, or when the building has many floors, the unregisterable region N grows even larger, which may reduce convenience for a visually impaired person searching for a destination floor button 3a from the edge of the panel.
Therefore, in the example of fig. 4, the destination floor buttons cover the entire display device of the operation panel 3. More specifically, the display is divided into as many regions as there are floors ("6" in the example of fig. 4), and the divided regions serve as destination floor buttons 3a_B1 to 3a_B6 representing floors 1 to 6. A band of predetermined width along the edge of the touch panel is set as the unregisterable region N, and the area other than region N is the registrable area Y. Descriptions of the stores (tenants), counters, services, products, and the like associated with each floor may also be displayed on the buttons 3a_B1 to 3a_B6.
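The fig. 4 layout, with the screen split into one strip per floor and an edge margin forming the unregisterable region N, reduces to simple arithmetic. All dimensions below are illustrative, and the assumption that the top strip is the highest floor is also ours:

```python
# Illustrative classification for the fig. 4 full-screen layout:
# an edge margin is area N; the interior is split into floor strips.

def classify(x, y, width=300, height=600, floors=6, margin=20):
    """Return ('N', None) in the edge margin, else ('Y', floor)."""
    if x < margin or x > width - margin or y < margin or y > height - margin:
        return ("N", None)
    strip = height / floors
    floor = floors - int(y // strip)   # assume the top strip is the top floor
    return ("Y", floor)
```

Because the margin is thin and the buttons fill the rest of the screen, a groping hand entering from the edge crosses area N briefly before landing on a registrable strip.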
With this configuration, a visually impaired person can, as in the case of fig. 2, find the target destination floor button 3a by touching the unregisterable region N from the edge of the operation panel 3 and following the voice guidance. Because the buttons 3a_B1 to 3a_B6 are arranged without gaps, the time and distance traveled within the unregisterable area N shrink, so the visually impaired user reaches the registrable area Y for the target floor more quickly.
In this configuration the destination floor buttons 3a overlap the unregisterable region N, so when a sighted user happens to touch the part of a button lying in region N, the touch is not judged a registration operation and guidance is played instead, which could impair convenience. In practice, however, the unregisterable region N lies at the edge of the operation panel 3 and is rarely touched by a user who can see the screen, so convenience is not impaired. Moreover, because region N is smaller than in the example of fig. 2, the chance that a sighted user touches it at all is reduced.
Further, a visually impaired person may inadvertently release the hand upon hearing the 1st voice guidance (step S102). In that case, touching the unregisterable region N again replays the 1st guidance from the beginning, which is inconvenient. Worse, if the re-touched position falls in the registrable area Y, the touch may be judged a sighted user's registration operation, and an unintended floor may be registered as the destination floor.
To resolve this inconvenience, when the touch panel control unit 93c detects in step S101 a touch in the unregisterable region N that is immediately released, it may hold the state of step S101 for a predetermined time instead of executing step S109. If the user touches the panel again within that time, the touch panel control unit 93c resumes from step S103, improving convenience for the visually impaired. In this case, the touch panel control unit 93c may also remove (or narrow) the unregisterable region N upon the re-touch; this reduces the chance that the visually impaired user lands in region N again and helps him or her find the registrable area Y sooner.
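The grace period for an accidental release might be sketched with a monotonic-clock timer; the timeout value and the API are assumptions, as the patent only says "a predetermined time":

```python
# Illustrative grace-period timer: after an immediate release in area N,
# guidance state is kept alive briefly so a re-touch resumes from S103.
import time

class GraceTimer:
    def __init__(self, window_s=3.0):
        self.window_s = window_s    # hypothetical "predetermined time"
        self.released_at = None

    def on_release(self, now=None):
        self.released_at = time.monotonic() if now is None else now

    def resumes(self, now=None):
        """True if a new touch arrives within the grace window."""
        if self.released_at is None:
            return False
        now = time.monotonic() if now is None else now
        return (now - self.released_at) <= self.window_s
```

`time.monotonic()` is used rather than wall-clock time so the window is immune to clock adjustments; the `now` parameters exist only to make the sketch testable.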
As described above, according to embodiment 1, the touch panel of the operation panel 3 is divided into the registrable region Y and the unregisterable region N, and when region N is touched first, the operation control unit 93 guides the user's touch position to the registrable region Y by voice. A visually impaired person who gropes from the edge of the operation panel 3 in search of a destination floor button is thus properly guided onto the button 3a (registrable area Y) for the destination floor, improving convenience for the visually impaired.
In the present embodiment, the destination floor buttons 3a can occupy the entire display device, with only a predetermined region from the edge defined as the unregisterable region N. That is, taking the behavior of visually impaired persons into account, region N is made small and the registrable region Y large, which further improves operability for visually impaired users while preserving convenience for sighted users.
In the example of fig. 4, the unregisterable region N may also be distinguished by color. Color-coding the area intended for visually impaired users apart from the area used by sighted users serves the convenience of both.
(embodiment 2)
Next, embodiment 2 will be explained.
In the elevator system according to the present embodiment, the unregisterable region N is not provided on the touch panel of the operation panel 3; instead, a detection sensor is provided in the vicinity of the touch panel. That is, the operation control unit 93 uses the detection sensor to detect an action of a visually impaired person before that person gropingly touches the touch panel, and starts the above-described voice guidance. With this configuration, convenience for the visually impaired can be improved.
Fig. 5 is a diagram showing a configuration example of an elevator system according to the present embodiment.
In fig. 5, the detection sensor 4 is provided in the vicinity of the operation panel 3. The detection sensor control device 8 is connected to the detection sensor 4. The operation control unit 93 includes a detection sensor control unit 93d for controlling the detection sensor control device 8. The configuration other than the detection sensor 4, the detection sensor control device 8, and the detection sensor control unit 93d is the same as that of embodiment 1 (fig. 1).
The detection sensor 4 is, for example, a bar-shaped sensor provided along the side of the operation panel 3 facing the entrance. In consideration of the convenience of visually impaired persons, the longitudinal length of the detection sensor 4 is preferably equal to or longer than the longitudinal length of the operation panel 3. The detection sensor 4 is connected to the elevator control device (EL control device) 9 via the detection sensor control device 8.
The detection sensor control device 8 can detect, for example, that the detection sensor 4 is touched (contacted) by an object. The detection sensor control device 8 may be capable of detecting a touch position (coordinate), speed, or the like when the detection sensor 4 is touched.
The detection sensor control unit 93d detects or recognizes the user's operation of the detection sensor 4 based on the data and the like received from the detection sensor control device 8.
For example, when a user in the car 1 touches the detection sensor 4, a signal indicating the touch to the detection sensor 4, data indicating the touched position, and the like are transmitted to the elevator control device 9 via the detection sensor control device 8. The detection sensor control unit 93d receives the signal and the data, and recognizes that the user touches the detection sensor 4.
Fig. 6 is a flowchart showing an example of the processing of the operation control unit 93 according to the present embodiment.
In step S301, the detection sensor control unit 93d of the operation control unit 93 detects whether the detection sensor 4 has been touched. When a touch to the detection sensor 4 is detected, the sound output unit 93b outputs the 1st voice guidance in step S302. The content of the 1st voice guidance is the same as that described for step S102 of fig. 3.
When no touch to the detection sensor 4 is detected, the operation is recognized as being performed by a sighted person. That is, when the user directly touches the registrable region Y (step S310), the touch panel control unit 93c causes the registration control unit 92 to register the layer corresponding to the touched position in the registrable region Y (step S309).
In step S303, the touch panel control unit 93c determines whether the touch panel of the operation panel 3 is touched within a predetermined time after the touch to the detection sensor 4 was detected. If the touch panel is touched within the predetermined time, the process proceeds to step S304.
On the other hand, when the touch panel is not touched within the predetermined time, the operation control unit 93 may end the process of guiding the touch position for the visually impaired person. That is, a touch made after the predetermined time has elapsed is recognized as a sighted person's operation, and the process proceeds to step S310. This ensures convenience for a sighted person even when that person erroneously touches the detection sensor 4.
The processing in steps S304 to S309 is equivalent to the processing in steps S103 to S108 of fig. 3, and therefore its description is omitted.
In addition, as in step S109 of fig. 3, the operation control unit 93 may end the process of guiding the touch position for the visually impaired person described in steps S305 to S308 when, after step S303, it detects that the object has been released from the touch panel by an operation other than the above-described registration operation (step S311).
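The flow of fig. 6 can be summarized in the following sketch. The interfaces (`panel.wait_touch`, `speaker.say`, `registrar.register`) and the 5-second window are assumptions made for illustration; they are not part of the patent.

```python
GUIDANCE_WINDOW_S = 5.0  # assumed "predetermined time" of step S303

def handle_sensor_touch(panel, speaker, registrar):
    """Illustrative sketch of steps S301-S311 after the detection
    sensor 4 reports a touch (interface names are assumed)."""
    # S302: sensor touched -> assume a visually impaired user and
    # output the 1st voice guidance.
    speaker.say("This operation panel is a touch panel.")

    # S303: wait for a touch on the panel within the predetermined time.
    floor = panel.wait_touch(timeout=GUIDANCE_WINDOW_S)
    if floor is None:
        # Timeout: a later touch is treated as a sighted user's direct
        # registration (S310 -> S309), and guidance ends here.
        return "guidance_ended"

    # S304-S308: announce the floor under the touch position; when the
    # predetermined action is performed, register it (S309).
    speaker.say("Floor {}.".format(floor))
    registrar.register(floor)
    speaker.say("Floor {} registered.".format(floor))
    return "registered"
```

The timeout branch is what lets a single panel serve both user groups: sighted users who brush the sensor by accident simply fall back to direct touch registration.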
Fig. 7 is a view showing a 2 nd example of the configuration of the operation panel 3 according to the present embodiment.
In consideration of the behavioral characteristics of visually impaired persons, the detection sensor 4 is preferably provided on the entrance-side edge of the operation panel 3, as shown in fig. 5. Further, to cope with cases where a visually impaired person takes an unexpected action, the detection sensor 4 may also be provided on at least one, or all, of the remaining three sides of the operation panel 3.
In the example of fig. 7, the detection sensor 4 is disposed outside the operation panel 3 so as to surround it. As in the example of fig. 4, the entire display device of the operation panel 3 serves as the destination layer button 3a. The destination layer button 3a is divided into destination layer buttons 3a_C1 to 3a_C6, one for each layer.
With this arrangement of the detection sensor 4 and the operation panel 3, the unregisterable region N need not be provided on the touch panel of the operation panel 3, so the user can use the entire touch panel as the registrable region Y.
The detection sensor 4 may instead be disposed inside the operation panel 3. In this case, the detection sensor 4 is arranged with a predetermined width from the edge of the operation panel 3, and the display device and the touch panel are arranged in the region of the operation panel 3 excluding the detection sensor 4.
As described above, according to embodiment 2, by providing the detection sensor 4 at a predetermined position independent of the operation panel 3 and detecting a touch (contact) on the detection sensor 4, an operation performed by a visually impaired person can be recognized with high accuracy. In addition, since the entire touch panel area of the operation panel 3 can be used as the registrable region Y, convenience for both sighted and visually impaired persons can be improved.
In the example of fig. 7, the detection sensor 4 may be replaced with an unregisterable region N on the touch panel of the operation panel 3. In this case, fig. 7 corresponds to the example of fig. 4 displayed at a reduced scale, such that the destination layer buttons 3a_B1 to 3a_B6 fit within the registrable region Y. With this configuration, since the unregisterable region N (corresponding to the detection sensor 4 in fig. 7) lies outside the destination layer button 3a, the possibility that a sighted person touches the unregisterable region N can be reduced. A function equivalent to that of fig. 7 can thus be realized without using the detection sensor 4.
(embodiment 3)
Next, embodiment 3 will be explained.
In embodiments 1 and 2, the operation panel 3 for registering a destination floor is provided in the car 1. On the other hand, in an elevator system in which the destination floor is registered in advance before boarding, for example, the operation panel may be provided outside the car 1.
Fig. 8 is a diagram showing an example of the configuration of the operation panel 11 according to the present embodiment. In fig. 8, an operation panel 11 is provided in a hall 10.
In the case of the operation panel 11 installed outside the car 1, a visually impaired person may gropingly recognize the frame 11A from the edge of the operation panel 11 and thereby find the destination layer buttons arranged in the operation area 11B inside the frame 11A, in the same manner as finding the operation panel 3.
Therefore, when the operation area 11B is configured with a touch panel and a display device like the above-described operation panel 3, the operation panel 11 is preferably controlled in the same way as the operation panel 3. In this way, the operation panel 11 also provides the user with the same user interface as the operation panel 3.
In addition, since the frame 11A of the operation panel 11 in the hall 10 is often wider than the operation area 11B (or the extent of the frame 11A is clearly defined) compared with the operation panel 3 in the car 1, the detection sensor 4 can be disposed in the frame 11A, so the system can be configured without giving any sense of incongruity to sighted persons.
According to at least one of the embodiments described above, when a touch panel is used as the operation panel, the operation control unit 93 recognizes a predetermined touch operation performed by a visually impaired person and guides that person's touch position to the registrable region Y by voice guidance. This prevents a visually impaired person from registering a floor to which the person does not intend to go. Further, in an elevator provided with a touch panel type operation panel, a reduction in the elevator's operating efficiency, a visually impaired person getting off at the wrong floor, and the like can be prevented.
Several embodiments of the present invention have been described, but these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in other various forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and the equivalent scope thereof.
Description of the symbols
1 … car; 2 … car door; 3, 11 … operation panel; 4 … detection sensor; 5 … speaker; 6 … display; 7 … touch panel control device; 8 … detection sensor control device; 9 … elevator control device; 10 … hall; 11A … frame; 11B … operation area; 91 … drive control unit; 92 … registration control unit; 93 … operation control unit; 93a … display control unit; 93b … sound output unit; 93c … touch panel control unit; 93d … detection sensor control unit; T1 … destination layer management table.

Claims (7)

1. An elevator system is characterized by comprising:
a touch panel type operation panel that displays a destination layer button corresponding to each layer in advance;
a control unit that controls the operation panel; and
a speaker that outputs voice guidance under the control of the control unit,
the control unit, when detecting a 1st touch by a user in a 1st area on the touch panel, outputs from the speaker a 1st voice guidance indicating that the operation panel is of a touch panel type, then guides the touch position of the 1st touch to a 2nd area, set on the touch panel, in which a destination layer can be registered, and outputs from the speaker a 2nd voice guidance announcing the floor name corresponding to the current touch position when the touch position enters the 2nd area,
and, when a predetermined operation by the user is detected in the 2nd area, outputs from the speaker a 3rd voice guidance announcing that the floor corresponding to the current touch position has been registered as the destination floor.
2. Elevator system according to claim 1,
the 2nd voice guidance is continuously output until the predetermined operation by the user is detected in the 2nd area.
3. Elevator system according to claim 1,
the 1st area includes an area of a predetermined width extending from the edge of the operation panel on the elevator door side.
4. Elevator system according to any one of claims 1-3,
the predetermined operation is one of: an operation of releasing the 1st touch, an operation of performing a 2nd touch after the releasing operation, and an operation of holding the touch position stationary for a predetermined time.
5. An elevator system is characterized by comprising:
a touch panel type operation panel that displays a destination layer button corresponding to each layer in advance;
a detection sensor that detects a contact made by a user;
a control unit that controls the detection sensor and the operation panel; and
a speaker that outputs voice guidance under the control of the control unit,
the control unit, when detecting the user's contact with the detection sensor, outputs from the speaker a 1st voice guidance indicating that the operation panel is of a touch panel type, then guides the user's touch position to a predetermined area, set on the touch panel, in which a destination layer can be registered, and outputs from the speaker a 2nd voice guidance announcing the floor name corresponding to the current touch position when the touch position enters the predetermined area,
and, when a predetermined operation is performed by the user in the predetermined area, outputs from the speaker a 3rd voice guidance announcing that the floor corresponding to the current touch position has been registered as the destination floor.
6. Elevator system according to claim 5,
the 2nd voice guidance is output when a touch on the touch panel is detected within a predetermined time from the user's contact with the detection sensor.
7. Elevator system according to claim 5 or 6,
the detection sensor is provided in the vicinity of the edge of the operation panel on the elevator door side.
CN201810549553.0A 2017-11-27 2018-05-31 Elevator system Active CN109835784B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017227216A JP6513775B1 (en) 2017-11-27 2017-11-27 Elevator system
JP2017-227216 2017-11-27

Publications (2)

Publication Number Publication Date
CN109835784A CN109835784A (en) 2019-06-04
CN109835784B true CN109835784B (en) 2021-06-18

Family

ID=66530855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810549553.0A Active CN109835784B (en) 2017-11-27 2018-05-31 Elevator system

Country Status (2)

Country Link
JP (1) JP6513775B1 (en)
CN (1) CN109835784B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7478690B2 (en) * 2021-02-26 2024-05-07 株式会社日立製作所 Elevator
JP7563420B2 (en) 2022-05-27 2024-10-08 三菱電機株式会社 Elevator landing operation display

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000198630A (en) * 1999-01-11 2000-07-18 Hitachi Building Systems Co Ltd Elevator usable for visual handicapped person
CN2521734Y (en) * 2002-01-22 2002-11-20 丁艺彬 Universal elevator button for blind man
JP2006056700A (en) * 2004-08-23 2006-03-02 Mitsubishi Electric Corp Guide device for elevator
JP2006127170A (en) * 2004-10-29 2006-05-18 Hitachi Omron Terminal Solutions Corp Information terminal input system
CN101678997A (en) * 2007-06-20 2010-03-24 三菱电机株式会社 Elevator destination floor registration device
CN103328365A (en) * 2011-01-26 2013-09-25 三菱电机株式会社 Destination floor registration device for elevator

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003233465A (en) * 2002-02-08 2003-08-22 Nec Soft Ltd Input device and processing program for detecting button-operation
JP2004115151A (en) * 2002-09-24 2004-04-15 Toshiba Elevator Co Ltd Elevator call operation panel
JP2011063366A (en) * 2009-09-16 2011-03-31 Toshiba Elevator Co Ltd Call registration guide device


Also Published As

Publication number Publication date
JP6513775B1 (en) 2019-05-15
CN109835784A (en) 2019-06-04
JP2019094203A (en) 2019-06-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant