CN112573312B - Elevator car position determining method and device, elevator system and storage medium - Google Patents

Elevator car position determining method and device, elevator system and storage medium

Info

Publication number
CN112573312B
CN112573312B (application CN202011415173.1A)
Authority
CN
China
Prior art keywords
car
distance
robot
determining
center
Prior art date
Legal status
Active
Application number
CN202011415173.1A
Other languages
Chinese (zh)
Other versions
CN112573312A (en)
Inventor
屈运
张永生
郭伟文
Current Assignee
Hitachi Building Technology Guangzhou Co Ltd
Original Assignee
Hitachi Building Technology Guangzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Building Technology Guangzhou Co Ltd
Priority to CN202011415173.1A
Publication of CN112573312A
Application granted
Publication of CN112573312B
Legal status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66B - ELEVATORS; ESCALATORS OR MOVING WALKWAYS
    • B66B1/00 - Control systems of elevators in general
    • B66B1/02 - Control systems without regulation, i.e. without retroactive action
    • B66B1/06 - Control systems without regulation, i.e. without retroactive action electric
    • B66B1/14 - Control systems without regulation, i.e. without retroactive action electric with devices, e.g. push-buttons, for indirect control of movements
    • B66B1/34 - Details, e.g. call counting devices, data transmission from car to control system, devices giving information to the control system
    • B66B1/3415 - Control system configuration and the data transmission or communication within the control system
    • B66B1/3492 - Position or motion detectors or driving means for the detector
    • B66B3/00 - Applications of devices for indicating or signalling operating conditions of elevators
    • B66B3/02 - Position or depth indicators
    • B66B5/00 - Applications of checking, fault-correcting, or safety devices in elevators

Abstract

The invention discloses a method and a device for determining the position of an elevator car, an elevator system and a storage medium. The method comprises the following steps: acquiring a target image of the bottom of the car shot by a robot and determining the number of pixels between any edge mark and a center mark in the target image; determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or, based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge mark and the center mark, and the camera focal length of the robot, thereby solving the problem that existing car position determining methods are cumbersome.

Description

Elevator car position determining method and device, elevator system and storage medium
Technical Field
The embodiment of the invention relates to the technical field of elevator control, in particular to a method and a device for determining the position of an elevator car, an elevator system and a storage medium.
Background
When the elevator is powered off, or when slippage occurs between the traction sheave and the traction wire rope, the elevator car slides, so its final position differs from its position at the moment of power-off. To determine the final stop position of the car, the elevator is either controlled to run slowly to a terminal landing, namely the bottommost or topmost floor, after which the car position is re-calibrated by a limit sensor, or a maintenance worker opens the car door, judges the final position of the car by experience, and runs the elevator to a preset service floor accordingly.
In the process of implementing the embodiments of the invention, the inventors found that the existing car position determining methods are time-consuming and labor-intensive.
Disclosure of Invention
The invention provides a method and a device for determining the position of an elevator car, an elevator system and a storage medium, so as to solve the problem that existing methods for determining the position of an elevator car are time-consuming and labor-intensive.
In a first aspect, an embodiment of the present invention provides a method for determining a position of an elevator car, where the method includes:
the method comprises the steps of obtaining a target image of the bottom of a car shot by a robot, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark;
determining the number of pixels between any edge marker and the center marker in the target image;
determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or, based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
In a second aspect, an embodiment of the present invention further provides an elevator car position determining apparatus, including:
the system comprises an image acquisition module, a detection module and a control module, wherein the image acquisition module is used for acquiring a target image of the bottom of a car shot by a robot, the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark;
the pixel determining module is used for determining the number of pixels between any edge identifier and the center identifier in the target image;
the first distance determining module is used for determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or, based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
In a third aspect, an embodiment of the present invention further provides an elevator system, where the elevator system includes:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement an elevator car position determination method as provided by the embodiments of the invention.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer-executable instructions that, when executed by a computer processor, perform an elevator car position determination method as provided by embodiments of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
the method comprises the steps that a target image of the bottom of a car shot by a robot is obtained, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark; determining the number of pixels between any edge identifier and center identifier in the target image; determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot. The first distance between the robot and the bottom of the car is rapidly determined through the relative position relation among all the marks in the target image of the bottom of the car shot by the robot, so that the final stopping position of the car is determined, and the technical effect of rapidly, accurately and safely determining the stopping position of the car is achieved.
Drawings
In order to more clearly illustrate the technical solution of the exemplary embodiment of the present invention, a brief introduction will be made to the drawings required for describing the embodiment. It is clear that the described figures are only figures of a part of the embodiments of the invention to be described, not all figures, and that for a person skilled in the art, without inventive effort, other figures can also be derived from them.
Fig. 1 is a schematic flow chart of a method for determining the position of an elevator car according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a predetermined pattern according to an embodiment of the invention;
FIG. 3 is a schematic diagram of a first distance according to an embodiment of the present invention;
FIG. 4 is another schematic diagram of a first distance according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another predetermined pattern according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an offset angle according to an embodiment of the present invention;
fig. 7 is a schematic flow chart of a method for determining the position of an elevator car according to a second embodiment of the present invention;
fig. 8 is a schematic flow chart of an elevator car position determining method according to a third embodiment of the present invention;
fig. 9A is a schematic structural view of an elevator car position determining apparatus according to a fourth embodiment of the present invention;
fig. 9B is a schematic structural view of an elevator car position determining apparatus according to a fourth embodiment of the present invention;
fig. 10A is a schematic structural view of an elevator system according to a fifth embodiment of the present invention;
fig. 10B is a schematic structural diagram of an elevator system according to the fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some structures related to the present invention are shown in the drawings, not all of them.
Example one
Fig. 1 is a schematic flow chart of a method for determining the position of an elevator car according to an embodiment of the present invention. The embodiment is applicable to the case where a first distance between a robot and the car bottom is obtained, based on an image of the car bottom captured by the robot, in order to determine the position of the car. The method can be performed by an elevator car position determining device, which can be implemented in hardware and/or software, and specifically includes the following steps:
s110, acquiring a target image of the bottom of the car shot by the robot, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark.
The car bottom is a bottom of an outer surface of an elevator box-shaped space for carrying loads such as passengers and cargo.
It is understood that the robot in this embodiment may be an electronic device integrated with a camera function, such as a camera or a video camera. The robot is placed at the center of the bottom of the hoistway, with its camera pointing vertically upward (at 90 degrees), so that the shooting direction of the robot is aligned with the center mark at the bottom center outside the car. Specifically, the robot can achieve automatic focusing on the car bottom by using a phase-detection or contrast-detection method. The target image is the image of the car bottom taken by the robot at the current moment. The robot can be set to shoot the target image of the car bottom at a preset time interval, or it can be triggered by a car stop signal to shoot the target image.
The preset pattern refers to a predetermined pattern which comprises a plurality of identification points and is arranged at the bottom of the outer side of the car, so that when the robot shoots the bottom of the car, a target image containing the preset pattern can be acquired. The center mark identifies the center point of the bottom of the outer side of the car, and the edge marks identify the edge points surrounding that center point. Specifically, the distances from the edge marks to the center mark are equal, and the number and positions of the edge marks can be determined according to the shape of the preset pattern. For example, as shown in fig. 2, when the preset pattern is an equilateral triangle, the center mark is point e, and the number of edge marks is 3, i.e. the edge marks are points a, b, and c in fig. 2; an edge mark may be any point on the lines OA, OB, and OC in the figure, as long as the distances ea, eb, and ec are kept equal.
It can be understood that the shape of the preset pattern in this embodiment may also be a regular shape such as a square or a regular pentagon; when the shape of the preset pattern is a square, the number of edge marks is 4, and when the shape of the preset pattern is a regular pentagon, the number of edge marks is 5.
And S120, determining the number of pixels between any edge identifier and center identifier in the target image.
The pixel number refers to the number of pixels of a linear distance between two pixels in the image, and is used for representing the distance between the edge identifier and the center identifier in the target image. Specifically, the number of pixels between the edge identifier and the center identifier may be determined based on the image distance and the pixel size, i.e., the number of pixels = image distance/pixel size. The image distance refers to a straight line distance between an edge identifier and a center identifier in a target image, and the pixel size refers to a straight line distance between centers of two adjacent pixel points in the target image or a side length of one pixel. Illustratively, the pixel size of the target image is 6.5 μm, and the straight-line distance between a certain edge identifier and the center identifier is 40 μm, then the number of pixels between the edge identifier and the center identifier is 40/6.5=6.15.
S130, determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
The shooting angle refers to the shooting direction of the robot toward the actual position of any edge mark. For example, as shown in fig. 3, point c in the drawing is an edge identifier of the preset pattern, point e is the center identifier of the preset pattern, and the included angle between the line from the robot to point e and the line from the robot to point c is the shooting angle corresponding to point c. The actual distance between the edge mark and the center mark refers to the straight-line distance from the center mark at the bottom of the outer side of the car to the edge mark, such as the distance between point e and point c in fig. 3. The first distance refers to the straight-line distance between the robot and the center mark at the bottom center outside the car, such as the distance from the robot to point e in fig. 3. Specifically, if the shooting angle of the robot corresponding to the number of pixels is θ, and the actual distance between the edge mark and the center mark is L, the first distance between the robot and the bottom of the car is h = L / tan θ.
The image distance corresponding to the number of pixels refers to the straight-line distance between the edge identifier and the center identifier in the target image. For example, as shown in fig. 4, if the center in the target image is identified as point e' and the edge is identified as point c', the image distance corresponding to the number of pixels at point c is the line segment e'c' in fig. 4; the camera focal length of the robot is f in fig. 4, and the first distance is h in fig. 4. From the pinhole imaging principle, e'c' / f = ce / h, so the first distance is h = f × ce / e'c'. It will be understood that this first distance is perpendicular to the plane of the bottom of the outer side of the car, i.e. the plane containing the line ce in fig. 4.
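For illustration, the following is a minimal, non-limiting Python sketch of the two calculations above; the function names, variable names, and example values are assumptions introduced here and are not part of the patent text.

```python
import math

def first_distance_from_angle(actual_distance_l, shooting_angle_deg):
    # h = L / tan(theta): L is the real edge-to-center distance, theta the
    # shooting angle between the robot-to-center and robot-to-edge directions (fig. 3)
    return actual_distance_l / math.tan(math.radians(shooting_angle_deg))

def first_distance_from_pinhole(actual_distance_l, image_segment, focal_length):
    # Pinhole imaging: e'c' / f = ce / h, hence h = f * ce / e'c' (fig. 4).
    # image_segment (e'c') and focal_length (f) must be given in the same unit.
    return focal_length * actual_distance_l / image_segment

# Illustrative values only (not taken from the patent):
h1 = first_distance_from_angle(0.5, 0.57)               # L = 0.5 m, theta = 0.57 deg -> ~50 m
h2 = first_distance_from_pinhole(0.5, 4.0e-5, 4.0e-3)   # ce = 0.5 m, e'c' = 40 um, f = 4 mm -> 50 m
```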
It should be noted that the shooting angle of the robot corresponding to the number of pixels in the present embodiment may be determined from the number of pixels between the edge mark and the center mark and the pixel radian value of the image taken by the robot. The pixel radian value refers to the radian corresponding to a single pixel in an image shot by the robot. The calculation formula is:
θ = N × Rop × 180/π,
wherein θ is the shooting angle of the robot corresponding to the number of pixels, N is the number of pixels between the edge identifier and the center identifier, and Rop is the pixel radian value of the image shot by the robot. Considering that π in the formula takes the value 3.1415926…, the calculated shooting angle cannot be an integer number of degrees, whereas the actual shooting angle is an integer angle. Therefore, the above formula is modified by adding an error term, so that the obtained shooting angle is an integer. The improved formula is:
θ = (N × Rop + offset) × 180/π,
wherein offset is the radian error. Rop and offset can be obtained in advance by having the robot measure the first distance at several different car positions and solving the resulting simultaneous equations.
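As a non-limiting illustration of the improved formula, the following sketch assumes the form θ = (N × Rop + offset) × 180/π (in degrees) implied by the definitions above; the function names and the two-point calibration helper are assumptions introduced here.

```python
import math

def shooting_angle_deg(n_pixels, rop, offset):
    # theta = (N * Rop + offset) * 180 / pi, in degrees
    return (n_pixels * rop + offset) * 180.0 / math.pi

def calibrate_rop_offset(n1, theta1_deg, n2, theta2_deg):
    # Solve the two simultaneous equations obtained from two known car positions:
    #   theta_i (in radians) = n_i * Rop + offset
    t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
    rop = (t1 - t2) / (n1 - n2)
    offset = t1 - n1 * rop
    return rop, offset
```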
According to the technical scheme of the embodiment, a target image of the bottom of the car shot by a robot is obtained, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark; determining the number of pixels between any edge mark and a center mark in a target image; determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on a pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot. The first distance between the robot and the bottom of the car is rapidly determined through the relative position relation among all the marks in the target image of the bottom of the car shot by the robot, so that the final stopping position of the car is determined, and the technical effect of rapidly, accurately and safely determining the stopping position of the car is achieved.
Optionally, the preset pattern includes edge marks arranged on four corners of the bottom of the outer side of the car and a center mark arranged in the center of the bottom of the outer side of the car.
The four corners of the bottom of the outer side of the car refer to the top-left, bottom-left, top-right and bottom-right vertices of the bottom of the outer side of the car, as shown in fig. 5. Points a, b, c and d in fig. 5 are edge marks provided on the four corners of the bottom of the outside of the car, and point e is a center mark provided at the center of the bottom of the outside of the car. In this embodiment, the preset pattern is set to include marks corresponding to the four corners and the center point of the bottom of the outer side of the car. Such a preset pattern makes it convenient to quickly determine the image distance and the actual distance between any edge mark and the center mark, and thus to quickly determine the first distance between the robot and the car bottom.
Optionally, the elevator car position determining method further includes: acquiring a floor where the car is located when the elevator fails and a second distance between the pre-stored car and the robot when the car is at the floor; controlling the car to move downwards for a third distance when the first distance is greater than the second distance; and when the first distance is less than the second distance, controlling the elevator car to move upwards by a third distance, wherein the third distance is equal to the absolute value of the first distance minus the second distance.
The elevator fault can be an abnormal condition such as a power failure of the elevator or slippage between the traction sheave and the traction wire rope. The robot can obtain the floor where the car is located when the elevator fails through a Data Transfer Unit (DTU) of the elevator or a remote monitoring center. After the floor where the car is located at the time of the fault is obtained, the initial height corresponding to that floor, namely the second distance, can be determined. The second distances corresponding to the floors can be obtained in advance through steps S120-S130 and stored accordingly. Specifically, when the elevator breaks down in an ascending state, the first distance is greater than the second distance, i.e. the position of the car at the current moment is higher than the initial height corresponding to the fault, and the car is controlled to move downwards so as to stop at the floor where it was located at the fault, wherein the moving distance is equal to the first distance minus the second distance; when the elevator breaks down in a descending state, the first distance is smaller than the second distance, i.e. the position of the car at the current moment is lower than the initial height corresponding to the fault, and the car is controlled to move upwards so as to stop at the floor where it was located at the fault, wherein the moving distance is equal to the second distance minus the first distance.
In another embodiment, after the floor where the car is located when the elevator fails is obtained, the car can be controlled to move to a floor adjacent to the one it occupied at the fault. Illustratively, if the floor where the car is located at the fault is the 4th floor, the second distance between the car and the robot at the 4th floor is 16 m, the second distance between the car and the robot at the 5th floor is 20 m, and the first distance of the car at the moment is 19 m, then the car can be controlled to move up by 1 m to the 5th floor.
In the embodiment, the floor where the car is located when the elevator fails and the pre-stored second distance between the car and the robot when the car is at the floor are obtained, the car is controlled to move downwards for the third distance when the first distance is larger than the second distance, and the car is controlled to move upwards for the third distance when the first distance is smaller than the second distance, so that the car is moved to the floor where the car is located when the elevator fails, the car stops in time when the elevator fails, the maintenance speed of the elevator is increased, and the safety of the elevator is further improved.
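The logic of this embodiment can be summarized by the following non-limiting sketch; the function name and return convention are assumptions introduced for illustration.

```python
def fault_recovery_move(first_distance, second_distance):
    # first_distance:  robot-to-car distance measured from the current target image
    # second_distance: pre-stored robot-to-car distance for the floor at the fault
    third_distance = abs(first_distance - second_distance)
    if first_distance > second_distance:
        return "down", third_distance   # car is above the fault floor
    if first_distance < second_distance:
        return "up", third_distance     # car is below the fault floor
    return "stay", 0.0                  # already level with the fault floor
```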
Optionally, before the acquiring the target image of the car bottom captured by the robot, the method further includes: acquiring a target image of the bottom of the car shot by a robot at the bottom of the hoistway, and identifying a center mark on the target image; determining an image distance between a center mark and a center point of a target image, an offset included angle between the center mark and a preset direction, an image distance between any two edge mark points on the target image, and acquiring an actual distance between the two edge mark points; determining the scaling of the target image according to the ratio of the actual distance to the image distance; and determining a moving route of the robot according to the offset included angle, determining a correction distance according to the image distance and the scaling, and controlling the robot to move along the moving route for the correction distance.
It can be understood that for the robot just placed in the bottom of the shaft, it is necessary to verify whether the initial placement position corresponds to the center mark of the bottom center outside the car, and when the initial placement position does not correspond to the center mark, the position of the robot is corrected. As shown in fig. 6, the target image is an area defined by a black frame, the preset pattern is an area defined by a dashed line, point g in the figure is a center point of the target image, point e is a center mark on the target image, and ge is an image distance between the center mark and the center point of the target image.
The scaling ratio refers to the ratio between the size of the actual photographed scene and its size in the captured image. Specifically, the scaling ratio may be determined from the actual distance between any two edge identification points and the image distance between the same two points. Illustratively, if the actual distance between two edge identification points is 2 m and the image distance is 2 cm, the scaling ratio is 2 m / 2 cm = 100. Accordingly, after obtaining the scaling of the target image, the correction distance, that is, the distance the robot needs to move in order to be positioned under the center mark, may be determined from the image distance between the center mark and the center point of the target image and the scaling.
The preset direction may be a horizontal direction or a vertical direction, such as β 1 and β 2 in fig. 6, an offset angle between the center mark and the horizontal direction is β 2, and an offset angle between the center mark and the vertical direction is β 1. Specifically, the offset included angle is used for determining a moving route of the robot, and the moving route may be a straight line route or a broken line route, for example, moving in a horizontal direction and then moving in a vertical direction. For example, if the offset included angle is an included angle β 2 with the horizontal direction, the moving route of the robot may be a straight line moving along a path whose direction is the included angle β 2 with the horizontal direction, and the correction distance of the movement is shorter; or the correction distance of the movement is longer when the movement is performed in the vertical direction and then in the horizontal direction.
In the embodiment, the scaling of the target image is determined according to the ratio of the actual distance to the image distance, the moving route of the robot is determined according to the offset included angle, the correction distance is determined according to the image distance and the scaling, and the robot is controlled to move along the moving route for the correction distance, so that the robot corresponds to the center identifier, the accurate correction of the position of the robot is realized, and the accuracy of the position of the car is further improved.
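A non-limiting sketch of the correction step, assuming the robot moves in a straight line along the direction given by the offset angle; the names and the planar decomposition of the move are assumptions introduced here.

```python
import math

def correction_move(actual_edge_gap, image_edge_gap, image_offset, offset_angle_deg):
    # actual_edge_gap:  real distance between two edge marks (e.g. 2 m)
    # image_edge_gap:   distance between the same two marks in the target image (e.g. 0.02 m)
    # image_offset:     image distance ge between the centre mark and the image centre
    # offset_angle_deg: offset angle between the centre mark and the preset direction
    scale = actual_edge_gap / image_edge_gap        # e.g. 2 m / 0.02 m = 100
    correction_distance = image_offset * scale      # real distance the robot has to move
    heading = math.radians(offset_angle_deg)        # straight-line route along the offset angle
    dx = correction_distance * math.cos(heading)
    dy = correction_distance * math.sin(heading)
    return correction_distance, (dx, dy)
```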
Example two
Fig. 7 is a schematic flow chart of a method for determining a position of an elevator car according to a second embodiment of the present invention, in which, based on the above embodiments, a determination of a moving speed of the car based on two first distances and a time difference between two stops of the car is added, and an overspeed notification message is generated when the moving speed of the car is greater than a preset speed threshold. Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
Referring to fig. 7, the elevator car position determining method provided by the present embodiment includes the steps of:
s710, a target image of the bottom of the car shot by the robot is obtained, the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark.
S720, determining the number of pixels between any edge mark and the center mark in the target image.
S730, determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
S740, acquiring a target image of the bottom of the car shot by the robot when the car stops for two adjacent times; determining a first distance between the robot corresponding to each target image and the bottom of the car; and determining the movement speed of the car according to the difference value of the two first distances and the time difference between two times of stopping of the car.
The two adjacent stops of the car mean that the car stops at the floor twice continuously, such as the first stop at the 3 rd floor and the second stop at the 1 st floor. The time difference between two stops of the car, which is the time required for the car to move from the floor at which the car stopped for the first time to the floor at which the car stopped for the second time, can be obtained by a timer. Specifically, a timer is started to determine the current time when the car stops each time, and the time difference between two times of stopping of the car is determined according to the difference value of the current time determined at two adjacent times.
The speed of movement of the car = the difference of the two first distances/time difference between two stops of the car. Illustratively, if the first distance corresponding to the first stop of the car is 10m, the first distance corresponding to the second stop of the car is 20m, and the time difference between the two stops of the car is 4s, the moving speed of the car is (20-10)/4 =2.5m/s.
And S750, if the movement speed is greater than a preset speed threshold value, outputting overspeed prompt information for indicating that the car moves overspeed.
The preset speed threshold can be understood as a preset safety-critical speed of car movement, used to judge whether the current movement speed of the car is a safe speed. The overspeed prompt message can be a text message, a graphic message, an alarm tone, or any combination of these. The overspeed prompt information can be sent to a terminal of a maintenance worker to inform the worker that the elevator is at risk of stalling; it can also be sent to a display interface of the elevator, so that a user riding the elevator can trigger the elevator alarm button in time to seek rescue.
It can be understood that, the embodiment may also set a second preset speed threshold, and when the moving speed is less than the second preset speed threshold, a low speed prompt message indicating that the moving speed of the car is too low is output. And the second preset speed threshold is used for judging whether the current movement speed of the car is too slow or not, and informing maintenance personnel that the elevator has faults when the movement speed is too slow (smaller than the second preset speed threshold).
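The speed monitoring of this embodiment can be illustrated by the following non-limiting sketch; both threshold values are assumptions introduced here, not values from the patent.

```python
def car_speed(prev_first_distance, curr_first_distance, time_diff_s):
    # movement speed = |difference of the two first distances| / time between the two stops
    return abs(curr_first_distance - prev_first_distance) / time_diff_s

def speed_prompt(speed, overspeed_threshold=2.0, low_speed_threshold=0.1):
    # Both thresholds are assumed values used only for illustration.
    if speed > overspeed_threshold:
        return "overspeed: the car moves faster than the preset speed threshold"
    if speed < low_speed_threshold:
        return "low speed: the car moves slower than the second preset speed threshold"
    return None

# Example from the text: 10 m -> 20 m between two stops 4 s apart gives 2.5 m/s
v = car_speed(10.0, 20.0, 4.0)
prompt = speed_prompt(v)
```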
According to the technical scheme of the embodiment, when the car stops for two adjacent times, the robot shoots a target image of the bottom of the car; determining a first distance between the robot corresponding to each target image and the bottom of the car; the movement speed of the lift car is determined according to the difference value of the two first distances and the time difference between two times of stop of the lift car, and overspeed prompt information used for indicating that the lift car moves overspeed is output when the movement speed is larger than a preset speed threshold value, so that the movement speed of the lift is detected, an alarm is given in time when the lift has a stalling risk or hidden danger, and the safety of the lift is further improved.
EXAMPLE III
Fig. 8 is a schematic flow chart of a method for determining a position of an elevator car according to a third embodiment of the present invention, and in this embodiment, based on the above embodiments, a further optimization is performed on "determining a first distance between a robot and a car bottom according to a shooting angle of the robot corresponding to the number of pixels and an actual distance between the edge identifier and the center identifier". Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
Referring to fig. 8, the elevator car position determining method provided by the present embodiment includes the steps of:
s810, acquiring a target image of the bottom of the car shot by the robot, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark.
S820, determining the number of pixels between any edge mark and center mark in the target image.
S830, the shooting angle of the robot corresponding to each edge identifier in the target image is determined, and the maximum and minimum shooting angles are determined from all the determined shooting angles.
Because the car may tilt during movement, the shooting angle of the robot corresponding to each edge mark is determined to obtain the angle of each edge mark's position relative to the robot, thereby realizing safety monitoring of the car angle. It can be understood that, because the robot is located at a position corresponding to the center mark of the bottom center outside the car, and the distances between the edge marks and the center mark are equal, if the car has no inclination, or its inclination is within the safe inclination range, the shooting angles of the robot corresponding to the edge marks should be approximately equal. When the inclination angle of the car exceeds the safe inclination range, the differences between the shooting angles of the robot corresponding to the edge marks become larger. Therefore, the maximum and minimum of the shooting angles corresponding to the edge marks are determined.
S840, if the difference value between the maximum shooting angle and the minimum shooting angle is within a preset angle deviation range, acquiring the actual distance between the edge mark and the center mark, and determining a first distance between the robot and the bottom of the car; and if the difference value between the maximum shooting angle and the minimum shooting angle is larger than the preset angle deviation range, outputting inclination prompt information for indicating the inclination of the car.
If the difference value between the maximum shooting angle and the minimum shooting angle is within the preset angle deviation range, the maximum inclination angle of the car does not exceed the preset angle deviation range, namely the inclination angle of the car is within the safe inclination range, and the position of the car is continuously monitored. If the difference between the maximum shooting angle and the minimum shooting angle is larger than the preset angle deviation range, the maximum inclination angle of the car exceeds the preset angle deviation range, namely the inclination angle of the car is not in the safe inclination range, and at the moment, inclination prompt information is output to prompt maintenance personnel that the elevator has an inclination fault and needs to be maintained in time.
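The tilt check of this embodiment can be illustrated by the following non-limiting sketch; the preset deviation value is an assumption introduced here, and the per-edge angles are assumed to come from the shooting-angle calculation shown earlier.

```python
def check_car_tilt(edge_shooting_angles_deg, preset_deviation_deg=1.0):
    # edge_shooting_angles_deg: shooting angle of the robot for each edge mark
    # preset_deviation_deg:     assumed preset angle deviation range
    spread = max(edge_shooting_angles_deg) - min(edge_shooting_angles_deg)
    if spread > preset_deviation_deg:
        return "tilt: the car inclination exceeds the preset angle deviation range"
    return None  # within the safe inclination range; continue monitoring the car position
```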
According to the technical scheme, the shooting angle of the robot corresponding to each edge mark in the target image is determined, the maximum shooting angle and the minimum shooting angle are determined based on all the determined shooting angles, if the difference value between the maximum shooting angle and the minimum shooting angle is larger than the preset angle deviation range, the inclination prompt information used for indicating the inclination of the car is output, so that maintenance personnel can timely maintain the elevator according to the inclination prompt information, the effective monitoring of the inclination angle of the car is achieved, the probability of danger caused by the inclination of the car is reduced, and the safety of the elevator is improved.
Example four
Fig. 9A is a schematic structural diagram of an elevator car position determining apparatus according to a fourth embodiment of the present invention, which is applicable to a case where a distance between a robot and a car bottom is obtained based on an image of the car bottom photographed by the robot, so as to determine a car position, and the apparatus specifically includes: an image acquisition module 910, a pixel determination module 920, and a first distance determination module 930.
The image acquisition module 910 is configured to acquire a target image of the bottom of the car, which is photographed by the robot, where the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern includes a center identifier arranged at the center of the bottom outside the car and at least three edge identifiers surrounding the center identifier;
a pixel determining module 920, configured to determine the number of pixels between any edge identifier and center identifier in the target image;
a first distance determining module 930, configured to determine a first distance between the robot and the car bottom according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge identifier and the center identifier; or based on a pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
In the embodiment, a target image of the bottom of a car shot by a robot is obtained through an image obtaining module, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark; determining the number of pixels between any edge identifier and center identifier in the target image through a pixel determination module; determining a first distance between the robot and the bottom of the car based on the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark by a first distance determining module; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the pixel number, the actual distance between the edge mark and the center mark and the camera focal length of the robot, so that the car position based on the robot is obtained, the cost is low, the safety factor is high, the position of the elevator car is determined in real time, the speed of obtaining the position of the car is increased, and the precision of the position of the car is also improved.
Optionally, the preset pattern comprises edge marks arranged on four corners of the bottom of the outer side of the car and a center mark arranged at the center of the bottom of the outer side of the car.
Optionally, the elevator car position determining device further includes a fault moving module 940, where the fault moving module 940 is configured to obtain a floor where the car is located when the elevator is in fault and a second distance between the car and the robot when the car is at the floor, where the second distance is stored in advance; controlling the car to move downwards for a third distance when the first distance is greater than the second distance; and controlling the car to move upwards by a third distance when the first distance is smaller than the second distance, wherein the third distance is equal to the absolute value of the first distance minus the second distance.
Optionally, the elevator car position determining apparatus further includes a speed prompting module 950, where the speed prompting module 950 is configured to obtain a target image of the bottom of the car, which is taken by the robot when the car stops twice adjacent to each other; determining a first distance between the robot corresponding to each target image and the bottom of the car; determining the movement speed of the car according to the difference value of the two first distances and the time difference between two times of stopping of the car; and if the movement speed is greater than the preset speed threshold value, outputting overspeed prompt information for indicating that the movement of the car is overspeed.
Optionally, the first distance determining module 930 is specifically configured to determine a shooting angle of the robot corresponding to each edge identifier in the target image, and determine a maximum shooting angle and a minimum shooting angle based on all the determined shooting angles; if the difference value between the maximum shooting angle and the minimum shooting angle is within a preset angle deviation range, acquiring the actual distance between the edge mark and the center mark, and determining a first distance between the robot and the bottom of the car; and if the difference value between the maximum shooting angle and the minimum shooting angle is larger than the preset angle deviation range, outputting inclination prompt information for indicating the inclination of the car.
Optionally, the elevator car position determining apparatus further includes a correction module 900, the correction module 900 is configured to, before obtaining the target image of the car bottom captured by the robot, obtain the target image of the car bottom captured by the robot located at the bottom of the hoistway, and identify the center identifier on the target image; determining an image distance between a center mark and a central point of a target image, an offset included angle between the center mark and a preset direction, an image distance between any two edge mark points on the target image, and acquiring an actual distance between the two edge mark points; determining the scaling of the target image according to the ratio of the actual distance to the image distance; and determining a moving route of the robot according to the offset included angle, determining a correction distance according to the image distance and the scaling, and controlling the robot to move along the moving route for the correction distance.
The elevator car position determining device provided by the embodiment of the invention can execute the elevator car position determining method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executing method.
It should be noted that, the units and modules included in the system are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
EXAMPLE five
Fig. 10A is a schematic structural diagram of an elevator system according to a fifth embodiment of the present invention. Fig. 10A illustrates a block diagram of an exemplary elevator system 100 suitable for use in implementing embodiments of the present invention. The elevator system 100 shown in fig. 10A is only an example and should not impose any limitations on the functionality or scope of use of embodiments of the present invention.
As shown in fig. 10A, the elevator system 100 is embodied in the form of a general purpose computing device. The components of the elevator system 100 may include, but are not limited to: one or more processors or processing units 1001, a system memory 1002, and a bus 1003 that couples the various system components (including the system memory 1002 and the processing unit 1001).
Bus 1003 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The elevator system 100 typically includes a variety of computer system readable media. These media may be any available media that can be accessed by the elevator system 100 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 1002 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 1004 and/or cache memory 1005. The elevator system 100 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 1006 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 10A, commonly referred to as a "hard disk drive"). Although not shown in FIG. 10A, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 1003 by one or more data media interfaces. Memory 1002 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 1008 having a set (at least one) of program modules 1007 may be stored, for example, in memory 1002, such program modules 1007 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may include an implementation of a network environment. Program modules 1007 generally perform functions and/or methods in the described embodiments of the invention.
The elevator system 100 can also communicate with one or more external devices 1009 (e.g., keyboard, pointing device, display 1010, etc.), with one or more devices that enable a user to interact with the elevator system 100, and/or with any devices (e.g., network card, modem, etc.) that enable the elevator system 100 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 1011. Also, the elevator system 100 can communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) through the network adapter 1012. As shown, the network adapter 1012 communicates with the other modules of the elevator system 100 over the bus 1003. It should be understood that although not shown in fig. 10A, other hardware and/or software modules can be used in conjunction with the elevator system 100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
The processing unit 1001 executes various functional applications and data processing by running a program stored in the system memory 1002, for example, to implement steps of an elevator car position determining method provided in this embodiment, the method including:
the method comprises the steps of obtaining a target image of the bottom of a car shot by a robot, wherein the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark;
determining the number of pixels between any edge identifier and center identifier in the target image;
determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
Of course, it will be understood by those skilled in the art that the processor may also implement the solution of the method for determining the position of an elevator car according to any of the embodiments of the present invention.
Optionally, the elevator system 100 further includes a car 1013 and a robot 1014, and as shown in fig. 10B, the car 1013 and the robot 1014 are respectively connected to the processing unit 1001. A preset pattern is arranged at the bottom of the outer side of the car 1013, and the preset pattern comprises a center mark and at least three edge marks surrounding the center mark; the robot 1014 is disposed at the center of the hoistway bottom.
EXAMPLE six
An embodiment of the present invention also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a method of elevator car position determination, the method comprising:
the method comprises the steps that a target image of the bottom of a car shot by a robot is obtained, the target image carries a preset pattern arranged at the bottom of the car, and the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark;
determining the number of pixels between any edge mark and a center mark in a target image;
determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on the pinhole imaging principle, determining a first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge identifier and the center identifier, and the camera focal length of the robot.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in some detail by the above embodiments, the invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the invention, and the scope of the invention is determined by the scope of the appended claims.

Claims (10)

1. An elevator car position determining method, comprising:
acquiring a target image of the bottom of a car captured by a robot, wherein the target image carries a preset pattern arranged at the bottom of the car, the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark, and a camera of the robot faces vertically upward at 90 degrees so that the shooting direction of the robot is aligned with the center mark;
determining the number of pixels between any edge mark and the center mark in the target image;
determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on a pinhole imaging principle, determining the first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge mark and the center mark, and the camera focal length of the robot.
2. The method of claim 1, wherein the preset pattern comprises edge marks disposed at the four corners of the bottom of the outer side of the car and a center mark disposed at the center of the bottom of the outer side of the car.
3. The method of claim 1, further comprising:
acquiring a floor where the car is located when the elevator fails and a second distance between the car and the robot when the car is at the floor, wherein the second distance is stored in advance;
when the first distance is greater than the second distance, controlling the car to move downward by a third distance; and when the first distance is smaller than the second distance, controlling the car to move upward by the third distance, wherein the third distance is equal to the absolute value of the first distance minus the second distance.
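By way of illustration only (not part of the claim language), the leveling correction of claim 3 can be sketched as follows; the function name and return convention are assumptions:

    def leveling_correction(first_distance_m, second_distance_m):
        # Compare the measured robot-to-car distance with the pre-stored distance
        # for the fault floor; the third distance is the absolute difference.
        third_distance_m = abs(first_distance_m - second_distance_m)
        if first_distance_m > second_distance_m:
            return ("down", third_distance_m)  # car sits above the floor level, move down
        if first_distance_m < second_distance_m:
            return ("up", third_distance_m)    # car sits below the floor level, move up
        return ("hold", 0.0)                   # already level with the landing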
4. The method of claim 1, further comprising:
acquiring target images of the bottom of the car captured by the robot at two adjacent stops of the car;
determining, for each target image, a first distance between the robot and the bottom of the car;
determining the movement speed of the car according to the difference between the two first distances and the time difference between the two stops of the car;
and if the movement speed is greater than a preset speed threshold, outputting overspeed prompt information for indicating that the car is overspeeding.
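An illustrative sketch of the overspeed check in claim 4; the threshold value and the names used here are assumptions, not taken from the disclosure:

    def check_overspeed(first_distance_1_m, first_distance_2_m, stop_interval_s, speed_threshold_mps=0.3):
        # Speed estimate from two adjacent stops: change in distance over elapsed time.
        speed_mps = abs(first_distance_2_m - first_distance_1_m) / stop_interval_s
        if speed_mps > speed_threshold_mps:
            return "overspeed prompt: car moved at %.2f m/s" % speed_mps
        return None  # within the preset speed threshold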
5. The method of claim 1, wherein determining the first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark comprises:
determining the shooting angle of the robot corresponding to each edge mark in the target image, and determining a maximum shooting angle and a minimum shooting angle based on all the determined shooting angles;
if the difference between the maximum shooting angle and the minimum shooting angle is within a preset angle deviation range, acquiring the actual distance between the edge mark and the center mark, and determining the first distance between the robot and the bottom of the car;
and if the difference between the maximum shooting angle and the minimum shooting angle exceeds the preset angle deviation range, outputting inclination prompt information for indicating that the car is tilted.
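The tilt check of claim 5 reduces to comparing the spread of the shooting angles against the allowed deviation. A minimal sketch, assuming a deviation value chosen only for illustration:

    def check_tilt(shooting_angles_rad, max_deviation_rad=0.02):
        # If the angles to the edge marks differ too much, the car bottom is not
        # parallel to the image plane, so a tilt prompt is output instead of a distance.
        spread = max(shooting_angles_rad) - min(shooting_angles_rad)
        if spread > max_deviation_rad:
            return "inclination prompt: angle spread %.3f rad exceeds the preset range" % spread
        return None  # within range; proceed to determine the first distance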
6. The method of any one of claims 1-5, further comprising, before the acquiring of the target image of the bottom of the car captured by the robot:
acquiring a target image of the bottom of the car captured by the robot at the bottom of the hoistway, and identifying the center mark in the target image;
determining an image distance between the center mark and the center point of the target image, an offset included angle between the center mark and a preset direction, and an image distance between any two edge marks in the target image, and acquiring an actual distance between the two edge marks;
determining a scaling of the target image according to the ratio of the actual distance to the image distance;
and determining a moving route of the robot according to the offset included angle, determining a correction distance according to the image distance and the scaling, and controlling the robot to move by the correction distance along the moving route.
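The robot-centering step of claim 6 can be sketched as below; the pixel-coordinate inputs and helper names are assumptions introduced for illustration:

    import math

    def centering_move(center_mark_px, image_center_px, edge_pixel_distance, edge_actual_distance_m):
        # Scale (metres per pixel) from a known edge-to-edge spacing in the image.
        scale_m_per_px = edge_actual_distance_m / edge_pixel_distance
        dx = center_mark_px[0] - image_center_px[0]
        dy = center_mark_px[1] - image_center_px[1]
        offset_angle_rad = math.atan2(dy, dx)                # offset of the center mark from the preset direction
        correction_m = math.hypot(dx, dy) * scale_m_per_px   # image offset converted to a travel distance
        return offset_angle_rad, correction_m                # heading and distance for the robot to move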
7. An elevator car position determining apparatus, comprising:
the image acquisition module is used for acquiring a target image of the bottom of a car captured by a robot, wherein the target image carries a preset pattern arranged at the bottom of the car, the preset pattern comprises a center mark arranged at the center of the bottom of the outer side of the car and at least three edge marks surrounding the center mark, and a camera of the robot faces vertically upward at 90 degrees so that the shooting direction of the robot is aligned with the center mark;
the pixel determination module is used for determining the number of pixels between any edge mark and the center mark in the target image;
the first distance determining module is used for determining a first distance between the robot and the bottom of the car according to the shooting angle of the robot corresponding to the number of pixels and the actual distance between the edge mark and the center mark; or based on a pinhole imaging principle, determining the first distance between the robot and the bottom of the car according to the image distance corresponding to the number of pixels, the actual distance between the edge mark and the center mark, and the camera focal length of the robot.
8. An elevator system, characterized in that the elevator system comprises:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the elevator car position determining method of any one of claims 1-6.
9. The system of claim 8, further comprising:
a car, wherein a preset pattern is arranged at the bottom of the outer side of the car, the preset pattern comprising a center mark and at least three edge marks surrounding the center mark; and
a robot arranged at the center of the bottom of the hoistway.
10. A storage medium containing computer executable instructions for performing the elevator car position determination method of any of claims 1-6 when executed by a computer processor.
CN202011415173.1A 2020-12-03 2020-12-03 Elevator car position determining method and device, elevator system and storage medium Active CN112573312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011415173.1A CN112573312B (en) 2020-12-03 2020-12-03 Elevator car position determining method and device, elevator system and storage medium


Publications (2)

Publication Number Publication Date
CN112573312A CN112573312A (en) 2021-03-30
CN112573312B true CN112573312B (en) 2023-02-28

Family

ID=75127594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011415173.1A Active CN112573312B (en) 2020-12-03 2020-12-03 Elevator car position determining method and device, elevator system and storage medium

Country Status (1)

Country Link
CN (1) CN112573312B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113023515B (en) * 2021-04-15 2023-06-23 上海高仙自动化科技发展有限公司 Method, device, equipment, system and storage medium for determining position of carrying equipment
CN113344611B (en) * 2021-05-19 2023-04-18 天津旗滨节能玻璃有限公司 Cost determination method, smart device, and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006256833A (en) * 2005-03-18 2006-09-28 Toshiba Elevator Co Ltd Elevator
CN102706329A (en) * 2012-05-31 2012-10-03 中国航天科技集团公司第五研究院第五一三研究所 Charge coupled device (CCD) measuring method for rendezvous and docking
CN105366470A (en) * 2014-08-11 2016-03-02 通力股份公司 Positioning apparatus, elevator and a method for determining the position of an elevator car
CN105444668A (en) * 2014-09-19 2016-03-30 株式会社东芝 Elevator shaft inner dimension measuring device
CN105636893A (en) * 2013-10-14 2016-06-01 塞德斯股份公司 Coding apparatus and position-finding apparatus and method
CN105800411A (en) * 2016-04-14 2016-07-27 上海之跃信息科技有限公司 Device, method and system for measuring elevator motion
CN107250839A (en) * 2015-02-23 2017-10-13 三菱电机株式会社 Displacement measuring device
US20170349400A1 (en) * 2014-12-16 2017-12-07 Inventio Ag Position-determination system for an elevator
CN108975112A (en) * 2017-06-01 2018-12-11 奥的斯电梯公司 image analysis for elevator maintenance
CN109466980A (en) * 2018-12-28 2019-03-15 长沙慧联智能科技有限公司 A kind of real-time floor detection device, method for vertical lift
CN109641723A (en) * 2016-08-30 2019-04-16 因温特奥股份公司 For analyzing the method for the lift well of lift facility and the measuring system of the lift well for measuring lift facility
CN110612264A (en) * 2017-05-18 2019-12-24 因温特奥股份公司 System and method for determining the position of an elevator car of an elevator installation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004037203A (en) * 2002-07-02 2004-02-05 Toshiba Elevator Co Ltd Measurement instrument for dimension in elevator shaft


Also Published As

Publication number Publication date
CN112573312A (en) 2021-03-30

Similar Documents

Publication Publication Date Title
EP3409629B1 (en) Image analytics for elevator maintenance
CN112573312B (en) Elevator car position determining method and device, elevator system and storage medium
CN111204662B (en) System for recognizing state parameters, hoisting positioning system and hoisting equipment
BR112020024333A2 (en) track vehicles in a warehouse environment
EP3379826A1 (en) Method for monitoring moving target, and monitoring device, apparatus and system
CN212425180U (en) Hoisting information identification system, hoisting positioning system and hoisting equipment
US10939477B2 (en) Service tool wireless access management
JP6139756B1 (en) Warning device
KR102056564B1 (en) Method And Apparatus for Managing Facility by using Machine Vision
CN113213340B (en) Method, system, equipment and storage medium for unloading collection card based on lockhole identification
US11295471B1 (en) Camera-based pallet placement detection and notification
CN113776546A (en) Method and device for determining robot path, electronic equipment and medium
CN113965698B (en) Monitoring image calibration processing method, device and system for fire-fighting Internet of things
CN110072797B (en) Crane control method, computer program, equipment and crane updating method
KR102448233B1 (en) Drone controlling method for precise landing
CN110516551B (en) Vision-based line patrol position deviation identification system and method and unmanned aerial vehicle
CN114407986A (en) Testing method and device for annunciator button, electronic equipment and storage medium
Nam et al. AR-based evacuation route guidance system in indoor fire environment
CN112225025A (en) Method and device for monitoring engagement condition of lock hook of elevator landing door lock
CN114381596B (en) Position detecting and positioning device, method and system thereof and positioning method of system
US20240059525A1 (en) Elevator control
JP2020042323A (en) Driving support device, driving support system, and driving support method
AU2021106241A4 (en) System and method for aerial power lines measurement using computer vision and unmanned aerial vehicle
CN114056389B (en) Train warehouse-in and warehouse-out monitoring method and device and electronic equipment
CN117163788A (en) Elevator position determining method, system and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant