WO2024041508A1 - Infrared touch state data collection method, apparatus, computer device, and medium - Google Patents

Infrared touch state data collection method, apparatus, computer device, and medium

Info

Publication number
WO2024041508A1
Authority
WO
WIPO (PCT)
Prior art keywords
infrared
infrared touch
robotic arm
touch frame
frame
Prior art date
Application number
PCT/CN2023/114188
Other languages
English (en)
French (fr)
Inventor
覃亮
Original Assignee
广州众远智慧科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州众远智慧科技有限公司 filed Critical 广州众远智慧科技有限公司
Publication of WO2024041508A1 publication Critical patent/WO2024041508A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Definitions

  • the present application relates to the field of computer technology, for example, to an infrared touch state data collection method, device, computer equipment and storage medium.
  • the infrared touch screen is equipped with an infrared touch frame; infrared transmitting tubes and infrared receiving tubes are densely arranged in the infrared touch frame and together form an infrared detection network on the surface of the display screen.
  • the infrared detection network continuously scans whether any infrared rays are blocked by the touching object, and calculates the coordinate position of the touching object based on the missing infrared rays after blocking, thereby realizing touch recognition on the infrared touch screen.
  • the purpose of this application is to provide an infrared touch status data collection method, device, computer equipment and storage medium, which can automatically collect infrared touch status data and improve data collection efficiency and accuracy.
  • an infrared touch status data collection method including the following steps:
  • the several frames of infrared sample signals are used as input and the corresponding touch state data of the front end of the robotic arm as output; both are fed into the infrared touch state data collection model for training and learning, and a trained infrared touch state data collection model is obtained;
  • the infrared signal is input to the trained infrared touch state data collection model to obtain infrared touch state data.
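The steps above amount to a supervised-regression pipeline: per-frame infrared signals in, touch-state vectors out. A minimal sketch follows, assuming beam-intensity vectors as the per-frame signal and a linear least-squares model as a stand-in for the application's unspecified collection model; all function names, shapes, and the choice of model are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def train_touch_state_model(sample_frames, touch_states):
    """Fit a linear least-squares map from infrared sample frames
    (n_samples x n_beams intensity vectors) to touch-state vectors
    (n_samples x n_state, e.g. [x, y, z, width, height]).
    A simple stand-in for the unspecified collection model."""
    X = np.asarray(sample_frames, dtype=float)
    Y = np.asarray(touch_states, dtype=float)
    # Append a bias column so the model can learn a constant offset.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
    return W

def predict_touch_state(W, frame):
    """Map one infrared signal frame to a touch-state vector."""
    x = np.append(np.asarray(frame, dtype=float), 1.0)
    return x @ W
```

On data generated by an exactly linear process, the fitted map reproduces the training targets, which is all this sketch is meant to show.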
  • an infrared touch status data collection device including:
  • a movement control module used to control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame
  • a sample signal acquisition module configured to acquire several frames of infrared sample signals generated when the front end of the robotic arm contacts the infrared rays of the infrared touch frame while moving along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm;
  • the collection model acquisition module is used to take the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feed them into the infrared touch state data collection model for training and learning, and obtain the trained infrared touch state data collection model;
  • An infrared signal acquisition module configured to acquire infrared signals in response to a touch operation on the infrared touch frame
  • a status data acquisition module is used to input the infrared signal to the trained infrared touch status data collection model to obtain infrared touch status data.
  • a computer device including: a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor and executed to perform the infrared touch state data collection method described in any one of the above.
  • a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the infrared touch state data collection method described in any one of the above is implemented.
  • the embodiment of the present application controls the front end of the robotic arm to move in a predetermined direction of the infrared touch frame; acquires several frames of infrared sample signals generated when the front end contacts the infrared rays of the infrared touch frame during this movement, together with the corresponding touch state data of the front end;
  • uses the several frames of infrared sample signals as input and the corresponding touch state data of the front end as output to train the infrared touch state data collection model, obtaining a trained model; acquires an infrared signal in response to a touch operation on the infrared touch frame; and inputs the infrared signal into the trained model to obtain infrared touch state data. Infrared touch state data is thus obtained automatically from the trained model, with no need for manual collection, which improves the efficiency and accuracy of data collection; furthermore, the infrared touch state data can also be calibrated, saving calibration time.
  • Figure 1 is a schematic flow chart of an infrared touch status data collection method provided by an embodiment of the present application
  • Figure 2 is a schematic diagram of the front end of the robotic arm contacting the optical mesh layer of the infrared touch frame for the first time, provided by an embodiment of the present application.
  • Figure 3 is a structural block diagram of an infrared touch state data acquisition device provided by an embodiment of the present application.
  • Figure 4 is a schematic structural block diagram of an electronic device provided by an embodiment of the present application.
  • plural means two or more unless otherwise specified.
  • "And/or" describes the relationship between related objects, indicating that there can be three relationships. For example, A and/or B can mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the related objects are in an "or" relationship.
  • the application environment of the infrared touch status data collection method provided in the embodiment of this application includes a robotic arm platform, an infrared touch status data collection device, and an infrared touch frame.
  • the robotic arm platform includes the front end of the robotic arm, the robotic arm controller and the robotic arm driving device.
  • the robotic arm driving device is connected to the robotic arm controller.
  • the robotic arm controller issues commands; the robotic arm driving device receives the commands and controls the operation of the front end of the robotic arm.
  • the infrared touch status data collection device communicates with the robotic arm platform.
  • the infrared touch status data collection device is connected to the robotic arm platform through a wired serial interface and is used to transmit control instructions to the robotic arm controller and obtain status data of the robotic arm front end.
  • the infrared touch status data collection method provided in the embodiment of the present application can be executed by an infrared touch status data collection device.
  • the infrared touch status data collection device can be implemented through software and/or hardware.
  • the infrared touch status data collection device can be composed of two or more physical entities, or of a single physical entity.
  • the infrared touch status data collection device can be any electronic device with an infrared touch status data collection application installed.
  • the electronic device can be a smart device such as a computer, mobile phone, tablet or interactive tablet.
  • the infrared touch frame communicates with the infrared touch status data collection device, and the communication connection can be a wired connection or a wireless connection.
  • the infrared touch frame includes an optical mesh layer and a screen protection glass layer. The infrared touch frame is provided with several pairs of opposing infrared transmitting tubes and infrared receiving tubes; each transmitting tube emits infrared rays, and the corresponding receiving tube receives them.
  • the cooperation between the infrared transmitting tubes and infrared receiving tubes forms an infrared detection network, that is, the optical mesh layer, which is located directly above the surface of the screen protection glass layer.
  • when a touch occurs, the infrared touch frame sends the changed infrared signal to the infrared touch status data collection device for infrared touch recognition.
  • the infrared touch status data acquisition device collects infrared signals.
  • the infrared touch status data acquisition device can collect and calibrate the infrared touch status by acquiring the infrared signal of the infrared touch frame and the status data of the front end of the robotic arm.
  • Figure 1 is a schematic flow chart of an infrared touch status data collection method provided by an embodiment of the present application.
  • the infrared touch status data collection method provided by the embodiment of this application includes the following steps:
  • S10 Control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame.
  • the front end of the robotic arm is a touch object that can block infrared signals.
  • the front end of the robotic arm is a pen-type touch object; the predetermined direction is toward the infrared touch frame. It can be understood that the predetermined direction can be adjusted according to actual needs.
  • when collecting and calibrating the touch status data of a touch point, the predetermined direction can be a direction perpendicular to the infrared touch frame.
  • the predetermined direction may be a direction parallel to the infrared touch frame, etc.
  • when collecting and calibrating the touch status data of a touch point, the front end of the robotic arm initially sits at a certain height above the infrared touch frame. The front end is then controlled to move in a direction perpendicular to the infrared touch frame until it approaches the frame.
  • when the front end of the robotic arm moves in the predetermined direction of the infrared touch frame, it can move at a constant speed according to a preset speed, or accelerate uniformly according to a preset acceleration.
  • S20 Obtain several frames of infrared sample signals when the front end of the robotic arm contacts the infrared rays of the infrared touch frame while moving along the predetermined direction and the corresponding touch state data of the front end of the robotic arm.
  • during this movement, the front end of the robotic arm will touch the infrared rays of the infrared touch frame, thus blocking part of the infrared rays and generating corresponding infrared sample signals.
  • the infrared touch frame scans at preset time intervals whether any infrared rays are blocked by the front end of the robotic arm, thereby obtaining several frames of infrared sample signals.
  • after each frame of infrared sample signal is obtained from the infrared touch frame, the infrared touch status data acquisition device sends instructions to the robotic arm platform and receives the touch status data of the front end of the robotic arm that the platform feeds back according to those instructions.
  • the touch status data of the front end of the robotic arm is the status data recorded while the front end is in contact with the infrared rays of the infrared touch frame; it can be measured automatically by the robotic arm platform.
  • the touch status data of the front end of the robotic arm includes the coordinate position data of the front end, the height of the front end above the infrared touch frame, the width and height data of the front end, and the moving speed and acceleration data of the front end.
  • a spatial rectangular coordinate system is established based on the plane where the infrared touch frame is located.
  • the X-axis direction and the Y-axis direction of the spatial rectangular coordinate system can be the horizontal width direction and the vertical height direction of the infrared touch frame, and the Z-axis direction is perpendicular to the plane where the infrared touch frame is located.
  • the coordinate origin is a fixed point in the lower left corner of the plane where the infrared touch frame is located.
  • the width data and height data of the front end of the robotic arm are the width data and height data of the touch point.
  • the touch point mentioned here is not a point in the mathematical or physical sense, but the cross-sectional shape of the front end of the robotic arm where it contacts the optical mesh layer of the infrared touch frame; in one embodiment, it can be approximated as a rectangle or a circle.
  • the width data of the touch point refers to the length of the touch point along the X-axis direction
  • the height data of the touch point refers to the length of the touch point along the Y-axis direction.
  • S30 Use several frames of infrared sample signals as input, use the corresponding touch state data of the front end of the robotic arm as output, and input it into the infrared touch state data collection model for training and learning, to obtain a trained infrared touch state data collection model.
  • the infrared touch state data collection model can be a machine learning model or a neural network model
  • the training process of the infrared touch state data collection model can be carried out in the infrared touch state data collection device or in separate training equipment; if it is carried out in training equipment, the trained model parameters can be transplanted into the infrared touch state data collection device after training is completed.
  • S40 Acquire infrared signals in response to the touch operation on the infrared touch frame.
  • the touch object contacts the infrared rays of the infrared touch frame
  • the touch operation of the touch object on the infrared touch frame is detected, the infrared rays at the corresponding touch position are blocked, and the corresponding infrared signal is obtained.
  • S50 Input the infrared signal to the trained infrared touch state data collection model to obtain infrared touch state data.
  • the trained infrared touch state data collection model can output the infrared touch state data.
  • the touch status data of the front end of the robotic arm includes the coordinate position data of the front end of the robotic arm, the height data of the front end of the robotic arm from the infrared touch frame, and the width data and height data of the front end of the robotic arm
  • the infrared touch state data includes the coordinate position data of the touching object, the height of the touching object above the screen protection glass layer of the infrared touch frame, and/or the width and height data of the touching object.
  • a touch point is formed on the optical network layer, and the width data and height data of the touch object are the width data and height data of the touch point.
  • the touch point mentioned here is not a point in the mathematical or physical sense, but the cross-sectional shape of the touching object where it contacts the optical mesh layer of the infrared touch frame; in one embodiment, it can be approximated as a rectangle or a circle.
  • the width data of the touch point refers to the length of the touch point along the X-axis direction
  • the height data of the touch point refers to the length of the touch point along the Y-axis direction.
  • the coordinate position data of the touching object can be obtained directly from the infrared touch status data output by the trained infrared touch status data collection model, instead of being computed with a traditional algorithm, which takes less time.
  • the height of the touching object above the screen protection glass layer of the infrared touch frame can be obtained directly, which saves labor costs and is highly efficient.
  • since the width and height data of the touching object can be obtained directly, the area of the touching object can be calculated from them, and the category of the touching object can be identified automatically from the area size, for example distinguishing a thick collection pen from a fine one.
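The area-based category recognition described above can be sketched as follows; the threshold value and the category labels are illustrative assumptions, not values from the application.

```python
def classify_touch_object(width_mm: float, height_mm: float,
                          fine_pen_max_area: float = 30.0) -> str:
    """Classify a touch object by the area of its cross-section
    (width x height); the 30 mm^2 threshold is an assumed example."""
    area = width_mm * height_mm
    return "fine pen" if area <= fine_pen_max_area else "thick pen"
```

A narrow stylus tip yields a small area and maps to "fine pen"; a broad tip maps to "thick pen".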
  • several frames of infrared sample signals are used as input and the corresponding touch state data of the front end of the robotic arm as output to train the infrared touch state data collection model, yielding a trained model; an infrared signal is acquired in response to a touch operation on the infrared touch frame and input into the trained model to obtain infrared touch state data.
  • the trained model thus obtains infrared touch status data automatically, eliminating the need for manual collection and improving the efficiency and accuracy of data collection; furthermore, the infrared touch status data can also be calibrated, saving calibration time.
  • step S11 is as follows:
  • S11 Obtain the motion trajectory file of the robotic arm, and send control instructions to the robotic arm based on the motion trajectory file to control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame.
  • the motion trajectory file of the robot arm is used to control the robot arm to reach the specified target point along the planned path trajectory, including the starting point, the end point and several intermediate points during the movement of the robot arm.
  • the motion trajectory file of the robotic arm is analyzed, and control instructions are sent to the robotic arm based on the analysis results to control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame, thereby realizing the automatic and accurate movement of the front end of the robotic arm. move.
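A hypothetical parser for such a trajectory file might look like this; the whitespace-separated "x y z" waypoint format and the MOVE instruction text are assumptions for illustration, since the application does not specify a file format.

```python
def parse_trajectory(text: str):
    """Parse a minimal motion-trajectory file: one 'x y z' waypoint per
    line; the first line is the starting point, the last the end point,
    and the rest intermediate points. Blank and '#' lines are ignored."""
    points = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        x, y, z = (float(v) for v in line.split())
        points.append((x, y, z))
    return points

def to_move_commands(points):
    """Turn waypoints into simple textual control instructions for the
    robotic arm controller (instruction syntax is assumed)."""
    return [f"MOVE {x:.1f} {y:.1f} {z:.1f}" for x, y, z in points]
```

Parsing a straight vertical descent, for instance, yields one MOVE instruction per waypoint along the Z axis.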
  • the infrared touch frame includes an optical mesh layer and a screen protective glass layer; the optical mesh layer is located directly above the surface of the screen protective glass layer.
  • in step S20, the steps of obtaining several frames of infrared sample signals when the front end of the robotic arm, moving in the predetermined direction, contacts the infrared rays of the infrared touch frame, together with the corresponding touch state data of the front end, include step S21. Step S21 is as follows:
  • S21 Obtain several frames of infrared sample signals and the corresponding touch status data of the front end of the robotic arm during the movement from the first contact with the optical mesh layer of the infrared touch frame to the contact with the screen protection glass layer of the infrared touch frame.
  • the optical network layer is an infrared detection network formed by the infrared transmitting tube of the infrared touch frame emitting infrared rays and the infrared receiving tube receiving the corresponding infrared rays.
  • the screen protection glass layer is a layer of protective glass on the display screen. As the front end of the robotic arm moves slowly toward the screen protection glass layer of the infrared touch screen, it first comes into contact with the optical mesh layer of the infrared touch frame; at that moment, one frame of infrared sample signal and the corresponding touch status data of the front end are acquired.
  • after the first contact with the optical mesh layer of the infrared touch frame, the front end of the robotic arm is controlled to continue moving slowly toward the screen protection glass layer of the infrared touch screen.
  • as it does so, the front end blocks infrared rays, and multiple frames of infrared sample signals and the corresponding touch state data of the front end are acquired. While the front end continues to move, it is determined whether it has come into contact with the screen protection glass layer of the infrared touch screen.
  • in step S21, the steps of obtaining several frames of infrared sample signals and the corresponding touch status data of the front end of the robotic arm during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer include step S211. Step S211 is as follows:
  • before the front end of the robotic arm contacts the optical mesh layer, the infrared sample signal received by the infrared touch frame is unchanged; for example, the signal intensity value of the infrared sample signal is a fixed value.
  • once the front end of the robotic arm contacts the optical mesh layer, the infrared sample signal received by the infrared touch frame will change; for example, the signal strength value of the infrared sample signal becomes less than the fixed value.
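The contact test described here, an unchanged fixed intensity before contact and a reduced intensity after, can be sketched as a baseline comparison; frame layout and tolerance are illustrative assumptions.

```python
def first_contact_index(frames, baseline, tol=1e-6):
    """Return the index of the first frame in which any beam's intensity
    drops below the unblocked baseline value, i.e. the frame in which
    the front end first touches the optical mesh layer; None if no
    frame shows a drop."""
    for i, frame in enumerate(frames):
        if any(v < baseline - tol for v in frame):
            return i
    return None
```

In a sequence where the first drop from the baseline appears in the third frame, the function returns index 2.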
  • in an embodiment, the front end of the robotic arm is a collection pen, and the end of the collection pen facing the infrared touch frame is provided with a pressure sensor. In step S21, the steps of obtaining several frames of infrared sample signals and the corresponding touch status data of the front end during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer include step S212. Step S212 is as follows:
  • while the collection pen has not yet touched the screen protection glass layer, the pressure value displayed by the pressure sensor on the collection pen is 0.
  • when the collection pen touches the screen protection glass layer, it will receive a reaction force from the glass.
  • the pressure sensor on the collection pen detects this reaction force, so the pressure value of the pressure sensor changes; it can thus be quickly determined that the collection pen has come into contact with the screen protection glass layer of the infrared touch frame.
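The glass-contact test via the pressure sensor can be sketched as a simple descent loop; `read_pressure` and `move_down` are hypothetical injected hardware callbacks, and the step size is an assumption.

```python
def lower_until_glass(read_pressure, move_down, z_start,
                      z_step=0.1, z_min=0.0):
    """Lower the collection pen along -Z until the pressure sensor
    reads a nonzero value (contact with the protection glass) or the
    minimum height is reached; returns the Z at which contact occurred
    (or z_min if no contact was detected)."""
    z = z_start
    while z > z_min:
        if read_pressure() > 0:       # reaction force from the glass
            return z
        z = max(z_min, z - z_step)    # descend one step
        move_down(z)
    return z
```

With a simulated glass surface at Z = 0.5, the loop stops as soon as the simulated sensor registers the reaction force.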
  • in step S21, the steps of obtaining several frames of infrared sample signals and the corresponding touch state data of the front end of the robotic arm during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer include steps S213 to S215. Steps S213 to S215 are as follows:
  • S213 Keeping the position of the front end of the robotic arm unchanged relative to the X-axis and Y-axis of the plane where the infrared touch frame is located, control the front end to move along the Z-axis toward the infrared touch frame, and for the current touch point obtain the several frames of infrared sample signals, from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch status data of the front end.
  • a spatial rectangular coordinate system is established based on the plane where the infrared touch frame is located.
  • the X-axis direction and the Y-axis direction of the spatial rectangular coordinate system can be the horizontal width direction and the vertical height direction of the infrared touch frame, the Z-axis direction is perpendicular to the plane where the infrared touch frame is located, and the origin of the coordinates is a fixed point in the lower left corner of that plane.
  • the coordinate position of the front end of the robotic arm is (X1, Y1, Z1).
  • when the front end of the robotic arm moves toward the infrared touch frame and contacts the optical mesh layer, the current touch point (X1, Y1, Zm) is generated.
  • for the current touch point, the several frames of infrared sample signals from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer, together with the corresponding touch status data of the front end of the robotic arm, are obtained.
  • S214 Keep the X-axis and Y-axis coordinates of the front end of the robotic arm unchanged and change only its Z-axis coordinate, slowly increasing the Z-axis coordinate until the front end is no longer in contact with the optical mesh layer of the infrared touch frame.
  • S215 Control the front end of the robotic arm to move along the X-axis or Y-axis direction of the plane where the infrared touch frame is located, so as to change its coordinate position relative to the X-axis or Y-axis of the infrared touch frame; then control the front end to move along the Z-axis direction of the infrared touch frame, obtaining, for a preset number of touch points, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer and the corresponding touch status data of the front end.
  • after the data for the current touch point has been collected, the front end of the robotic arm is controlled to move along the X-axis or Y-axis direction of the plane where the infrared touch frame is located; for example, the coordinate position of the front end becomes (X1, Y2, Z1), (X2, Y1, Z1) or (X2, Y2, Z1).
  • at the new position, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer and the corresponding touch status data are obtained. Repeating the above steps yields several frames of infrared sample signals with the front end contacting the infrared rays of the infrared touch frame at different coordinate positions, together with the corresponding touch status data of the front end, improving the comprehensiveness of data collection.
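Repeating the per-point Z sweep of steps S213 to S215 over a grid of (X, Y) positions can be sketched as follows; `z_sweep` is a hypothetical callback that performs one descend/record/lift cycle at a position and returns its frames and states, and the grid itself is an illustrative assumption.

```python
def collect_grid_samples(x_vals, y_vals, z_sweep):
    """For each (x, y) grid position, run the injected z_sweep(x, y)
    cycle (descend along Z, record frames/states, lift) and accumulate
    all (frame, state) pairs into one training dataset."""
    dataset = []
    for y in y_vals:
        for x in x_vals:
            frames, states = z_sweep(x, y)
            dataset.extend(zip(frames, states))
    return dataset
```

A 2x2 grid with a one-frame sweep per position yields four training pairs, one per touch point.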
  • in step S21, the steps of obtaining several frames of infrared sample signals and the corresponding touch status data of the front end of the robotic arm during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer include step S216. Step S216 is as follows:
  • S216 Replace the front end of the robotic arm with different types of front ends; for each type, obtain the corresponding several frames of infrared sample signals and touch status data of the front end during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer.
  • after the several frames of infrared sample signals for the current touch point on the infrared touch frame and the corresponding touch status data have been acquired, the front end of the robotic arm can be replaced to obtain the several frames of infrared sample signals and corresponding touch status data for the next touch point, thereby improving the diversity of data collection.
  • FIG. 3 shows a schematic structural diagram of an infrared touch state data collection device provided by an embodiment of the present application.
  • the infrared touch status data collection device 3 provided by the embodiment of the present application includes:
  • the movement control module 31 is used to control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame;
  • the sample signal acquisition module 32 is used to acquire several frames of infrared sample signals when the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction and the corresponding touch state data of the front end of the robotic arm;
  • the collection model obtaining module 33 is used to take the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feed them into the infrared touch state data collection model for training and learning, and obtain the trained infrared touch state data collection model;
  • the infrared signal acquisition module 34 is used to acquire infrared signals in response to a touch operation on the infrared touch frame;
  • the status data acquisition module 35 is used to input infrared signals to the trained infrared touch status data collection model to obtain infrared touch status data.
  • by controlling the front end of the robotic arm to move in a predetermined direction toward the infrared touch frame; acquiring several frames of infrared sample signals produced while the front end contacts the infrared rays of the frame, together with the corresponding touch state data; using the sample signals as input and the touch state data as output to train the infrared touch state data collection model and obtain the trained model; acquiring an infrared signal in response to a touch operation on the frame; and inputting the infrared signal into the trained model to obtain infrared touch state data, the apparatus obtains infrared touch state data automatically from the trained model, eliminating the need for manual collection of infrared touch state data and improving the efficiency and accuracy of data collection; further, the infrared touch state data can also be calibrated, saving calibration time.
  • the movement control module 31 includes:
  • the trajectory file acquisition unit is used to obtain the motion trajectory file of the robotic arm, and send control instructions to the robotic arm according to the motion trajectory file to control the front end of the robotic arm to move in the predetermined direction of the infrared touch frame.
  • the sample signal acquisition module 32 includes:
  • the sample signal acquisition unit is used to acquire several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from first contact with the optical mesh layer of the infrared touch frame to contact with its screen protection glass layer.
  • the sample signal acquisition unit includes:
  • the sample signal detection unit is used to determine the first contact between the front end of the robotic arm and the optical mesh layer of the infrared touch frame when a change in the infrared sample signal is detected.
  • the sample signal acquisition unit includes:
  • the pressure detection unit is used to determine that the front end of the robotic arm is in contact with the screen protection glass layer of the infrared touch frame when it detects that the pressure value of the pressure sensor changes for the first time.
  • the sample signal acquisition unit includes:
  • the position control unit is used to keep the position of the front end of the robotic arm unchanged along the X-axis and Y-axis directions of the plane of the infrared touch frame, to control the front end to move along the Z-axis direction toward the frame, and to acquire, for the current touch point, several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data;
  • the first movement direction control unit is used to control the front end of the robotic arm, after it contacts the screen protection glass layer of the infrared touch frame, to move along the Z-axis direction away from the frame until it is no longer in contact with the optical mesh layer;
  • the second movement direction control unit is used to control the front end of the robotic arm to move along the X-axis or Y-axis direction of the plane of the infrared touch frame so as to change its coordinate position relative to the X-axis or Y-axis of the frame, to control the front end to move along the Z-axis direction toward the frame, and to acquire several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data of the front end of the robotic arm.
  • the electronic device 300 can be a computer, a mobile phone, a tablet, an interactive tablet, etc.
  • in an exemplary embodiment, the electronic device 300 is an interactive tablet, which may include: at least one processor 301, at least one memory 302, at least one display, at least one network interface 303, a user interface 304, and at least one communication bus 305.
  • the user interface 304 is mainly used to provide an input interface for the user and obtain data input by the user.
  • the user interface may also include standard wired interfaces and wireless interfaces.
  • the network interface 303 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • the communication bus 305 is used to realize connection communication between these components.
  • the processor 301 may include one or more processing cores.
  • the processor connects the various parts of the entire electronic device through various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory and invoking the data stored in the memory.
  • the processor may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA).
  • the processor can integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a modem, etc.
  • the CPU mainly processes the operating system, user interface, and applications; the GPU is responsible for rendering and drawing the content that needs to be displayed in the display layer; the modem is used to process wireless communications. It is understandable that the above modem may not be integrated into the processor and may be implemented by a separate chip.
  • the memory 302 may include random access memory (RAM) or read-only memory (ROM).
  • the memory includes non-transitory computer-readable storage medium.
  • Memory may be used to store instructions, programs, code, sets of codes, or sets of instructions.
  • the memory may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing each of the above method embodiments, etc.; the data storage area may store the data involved in each of the above method embodiments.
  • the memory may optionally be at least one storage device located remotely from the aforementioned processor.
  • a memory as a computer storage medium may include an operating system, a network communication module, a user interface module, and an operating application program.
  • the processor can be used to call the application program of the infrared touch state data collection method stored in the memory, and execute the method steps shown above.
  • for the execution process, refer to the description above, which will not be repeated here.
  • This application also provides a computer-readable storage medium on which a computer program is stored.
  • the instructions are adapted to be loaded by the processor to execute the above-mentioned method steps.
  • for the execution process, refer to the description of the embodiments, which will not be elaborated here.
  • the device where the storage medium is located can be an electronic device such as a personal computer, laptop, smartphone, tablet, etc.
  • as the device embodiments substantially correspond to the method embodiments, reference may be made to the relevant descriptions of the method embodiments for details.
  • the device embodiments described above are merely illustrative: components described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this application, which persons of ordinary skill in the art can understand and implement without creative effort.
  • embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment that combines software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operational steps to be performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include volatile memory in computer-readable media, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology.
  • Information may be computer-readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • computer-readable media does not include transitory media, such as modulated data signals and carrier waves.

Abstract

This application relates to an infrared touch state data collection method, apparatus, computer device, and storage medium. The method includes: controlling the front end of a robotic arm to move in a predetermined direction toward an infrared touch frame; acquiring several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm; using the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feeding them into an infrared touch state data collection model for training, and obtaining a trained infrared touch state data collection model; acquiring an infrared signal in response to a touch operation on the infrared touch frame; and inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data, thereby improving the efficiency and accuracy of data collection.

Description

Infrared touch state data collection method, apparatus, computer device, and medium
Specification. This application claims priority to the Chinese patent application No. 202211009336.5, entitled "Infrared touch state data collection method, apparatus, computer device, and storage medium", filed with the China National Intellectual Property Administration on August 22, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of computer technology, and in particular to an infrared touch state data collection method, apparatus, computer device, and storage medium.
Background
An infrared touch screen is fitted with an infrared touch frame in which infrared emitting tubes and infrared receiving tubes are densely arranged; the emitting and receiving tubes form an infrared detection grid over the surface of the display. The detection grid continuously scans for infrared rays blocked by a touch object and computes the coordinate position of the touch object from the rays missing after blocking, thereby realizing touch recognition on the infrared touch screen.
However, computing the touch state data of a touch object with conventional algorithms requires manual data collection, which is inefficient and error-prone.
Summary
In view of this, an object of this application is to provide an infrared touch state data collection method, apparatus, computer device, and storage medium that can collect infrared touch state data automatically and improve the efficiency and accuracy of data collection.
According to a first aspect of the embodiments of this application, an infrared touch state data collection method is provided, including the following steps:
controlling the front end of a robotic arm to move in a predetermined direction toward an infrared touch frame;
acquiring several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm;
using the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feeding them into an infrared touch state data collection model for training, and obtaining a trained infrared touch state data collection model;
acquiring an infrared signal in response to a touch operation on the infrared touch frame;
inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
According to a second aspect of the embodiments of this application, an infrared touch state data collection apparatus is provided, including:
a movement control module for controlling the front end of a robotic arm to move in a predetermined direction toward an infrared touch frame;
a sample signal acquisition module for acquiring several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm;
a collection model obtaining module for using the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feeding them into an infrared touch state data collection model for training, and obtaining a trained infrared touch state data collection model;
an infrared signal acquisition module for acquiring an infrared signal in response to a touch operation on the infrared touch frame;
a state data obtaining module for inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
According to a third aspect of the embodiments of this application, a computer device is provided, including a processor and a memory, wherein the memory stores a computer program adapted to be loaded by the processor to execute the infrared touch state data collection method described in any one of the above.
According to a fourth aspect of the embodiments of this application, a computer-readable storage medium is provided, on which a computer program is stored, the computer program, when executed by a processor, implementing the infrared touch state data collection method described in any one of the above.
In the embodiments of this application, the front end of the robotic arm is controlled to move in a predetermined direction toward the infrared touch frame; several frames of infrared sample signals produced while the front end contacts the infrared rays of the frame during this movement are acquired, together with the corresponding touch state data of the front end; the sample signals are used as input and the touch state data as output to train the infrared touch state data collection model and obtain the trained model; an infrared signal is acquired in response to a touch operation on the frame; and the infrared signal is input into the trained model to obtain infrared touch state data. The trained infrared touch state data collection model thus yields infrared touch state data automatically, without manual collection, improving the efficiency and accuracy of data collection; further, the infrared touch state data can also be calibrated, saving calibration time.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only and do not limit this application.
For better understanding and implementation, this application is described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of the infrared touch state data collection method provided by an embodiment of this application;
FIG. 2 is a schematic flowchart, provided by an embodiment of this application, of acquiring several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer;
FIG. 3 is a structural block diagram of the infrared touch state data collection apparatus provided by an embodiment of this application;
FIG. 4 is a schematic structural block diagram of the electronic device provided by an embodiment of this application.
Detailed Description
To make the objects, technical solutions, and advantages of this application clearer, the embodiments of this application are described in further detail below with reference to the accompanying drawings.
It should be clear that the described embodiments are only some, not all, of the embodiments of this application. Based on the embodiments of this application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of this application.
The terms used in the embodiments of this application are for the purpose of describing particular embodiments only and are not intended to limit the embodiments of this application. The singular forms "a", "said", and "the" used in the embodiments of this application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
When the following description refers to the accompanying drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; rather, they are merely examples of apparatuses and methods consistent with some aspects of this application as detailed in the appended claims. In the description of this application, it should be understood that the terms "first", "second", "third", etc. are used only to distinguish similar objects, are not necessarily used to describe a particular order or sequence, and should not be understood as indicating or implying relative importance. A person of ordinary skill in the art can understand the specific meanings of the above terms in this application according to the specific context.
In addition, in the description of this application, "a plurality of" means two or more unless otherwise specified. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the associated objects.
The application environment of the infrared touch state data collection method provided in the embodiments of this application includes a robotic arm platform, an infrared touch state data collection device, and an infrared touch frame.
The robotic arm platform includes the front end of the robotic arm, a robotic arm controller, and a drive unit of the robotic arm. The drive unit is connected to the controller; the controller issues commands, and on receiving a command the drive unit operates the front end of the robotic arm.
The infrared touch state data collection device communicates with the robotic arm platform; optionally, the collection device is connected to the platform through a wired serial interface, through which it transmits control instructions to the robotic arm controller and obtains the state data of the front end of the robotic arm. The infrared touch state data collection method provided in the embodiments of this application may be executed by the infrared touch state data collection device, which may be implemented in software and/or hardware and may consist of one physical entity or of two or more physical entities. The collection device may be any electronic device on which an infrared touch state data collection application is installed, such as a computer, a mobile phone, a tablet, an interactive tablet, or other smart device.
The infrared touch frame communicates with the infrared touch state data collection device over a wired or wireless connection. The infrared touch frame includes an optical mesh layer and a screen protection glass layer. Several opposed infrared emitting tubes and infrared receiving tubes are arranged in the frame; the emitting tubes emit infrared rays, the receiving tubes receive the corresponding rays, and their cooperation forms an infrared detection grid, i.e. the optical mesh layer, which lies directly above the surface of the screen protection glass layer. When an object on the infrared touch frame touches the optical mesh layer, it blocks some of the infrared rays, and the infrared signal received by the receiving tubes changes; the frame sends the changed infrared signal to the collection device for collection. By acquiring the infrared signals of the infrared touch frame and the state data of the front end of the robotic arm, the collection device can collect and calibrate the infrared touch state.
Referring to FIG. 1, a schematic flowchart of the infrared touch state data collection method provided by an embodiment of this application, the method includes the following steps:
S10: Control the front end of the robotic arm to move in a predetermined direction toward the infrared touch frame.
In the embodiments of this application, the front end of the robotic arm is a touch object capable of blocking infrared signals; optionally, it is a pen-type touch object. The predetermined direction is a direction toward the infrared touch frame. It can be understood that the predetermined direction may be adjusted as needed: in one embodiment, when collecting and calibrating the touch state data of one touch point, the predetermined direction may be perpendicular to the infrared touch frame, and when determining the next touch point for collection and calibration, the predetermined direction may be parallel to the infrared touch frame.
When collecting and calibrating the touch state data of one touch point, the front end of the robotic arm initially sits at some height above the infrared touch frame; it is then controlled to move perpendicular to the frame so as to approach it.
It can be understood that when the front end of the robotic arm is controlled to move in the predetermined direction toward the infrared touch frame, it may move at a constant preset speed or with a constant preset acceleration.
S20: Acquire several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm.
In the embodiments of this application, as the front end of the robotic arm moves toward the infrared touch frame at the preset constant speed, it touches the infrared rays of the frame, blocking some of them and producing corresponding infrared sample signals. The infrared touch frame scans at preset time intervals for infrared rays blocked by the front end, yielding several frames of infrared sample signals. After obtaining each frame of infrared sample signal from the infrared touch frame, the collection device sends an instruction to the robotic arm platform and receives the touch state data of the front end of the robotic arm that the platform returns in response.
The touch state data of the front end of the robotic arm are its state data when in contact with the infrared rays of the infrared touch frame and can be measured and obtained automatically by the robotic arm platform; they include the coordinate position of the front end, the height of the front end above the infrared touch frame, the width and height data of the front end, and the movement speed and acceleration of the front end.
A spatial rectangular coordinate system is established on the plane of the infrared touch frame: the X-axis and Y-axis directions may be the horizontal width and vertical height directions of the frame, the Z-axis direction is perpendicular to the plane of the frame, and the origin is a fixed point toward the lower-left corner of that plane. The coordinate position (X, Y) of the front end is obtained, and its height Z above the infrared touch frame can be obtained at the same time. When the front end contacts the infrared rays, a touch point forms on the optical mesh layer; the width and height data of the front end are the width and height data of this touch point. The touch point here is not a point in the mathematical or physical sense but the cross-sectional shape of the front end where it contacts the optical mesh layer of the infrared touch frame; in one embodiment it can be approximated as a rectangle or circle. The width of the touch point is its length along the X-axis direction, and its height is its length along the Y-axis direction.
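The touch-point geometry described above can be illustrated with a short sketch. This is not from the application itself: the beam pitch, the beam-index convention, and the `touch_point_geometry` helper are all illustrative assumptions about how blocked X-axis and Y-axis beams might map to a touch point's centre coordinates and width/height.

```python
# Hypothetical sketch: deriving a touch point's position and size from the
# indices of blocked infrared beams. Beam pitch and indexing are assumptions.

def touch_point_geometry(blocked_x, blocked_y, pitch_mm=5.0):
    """Given the indices of blocked X-axis and Y-axis beams, return the touch
    point's centre coordinates and its width/height (all in millimetres)."""
    if not blocked_x or not blocked_y:
        return None  # nothing is blocking the light grid
    # width/height span the run of blocked beams on each axis
    width = (max(blocked_x) - min(blocked_x) + 1) * pitch_mm
    height = (max(blocked_y) - min(blocked_y) + 1) * pitch_mm
    # centre is the midpoint of the blocked run on each axis
    cx = (max(blocked_x) + min(blocked_x)) / 2 * pitch_mm
    cy = (max(blocked_y) + min(blocked_y)) / 2 * pitch_mm
    return {"x": cx, "y": cy, "width": width, "height": height}
```

A rectangular cross-section is assumed here, matching the application's remark that the touch point can be approximated as a rectangle or circle.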
S30: Use the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feed them into the infrared touch state data collection model for training and learning, and obtain the trained infrared touch state data collection model.
In the embodiments of this application, the infrared touch state data collection model may be a machine learning model or a neural network model.
The training of the infrared touch state data collection model may be performed on the infrared touch state data collection device or on another training device; if it is performed on a training device, the trained model parameters can be transplanted into the infrared touch state data collection device after training is complete.
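The application leaves the model family open ("a machine learning model or a neural network model"), so the following is only a minimal stand-in for the training step: a tiny linear model fitted by stochastic gradient descent that maps a flattened infrared frame vector to touch state targets. The function names and the list-of-floats data layout are assumptions, not details from the application.

```python
# Illustrative sketch only: a linear regressor standing in for the
# "infrared touch state data collection model". Frames are flattened signal
# vectors; states are target vectors such as (x, y, height).

def train_linear_model(frames, states, lr=0.1, epochs=1000):
    """Fit state ≈ W @ frame + b by per-sample gradient descent."""
    n_in, n_out = len(frames[0]), len(states[0])
    W = [[0.0] * n_in for _ in range(n_out)]
    b = [0.0] * n_out
    for _ in range(epochs):
        for x, y in zip(frames, states):
            pred = [sum(w * xi for w, xi in zip(W[k], x)) + b[k]
                    for k in range(n_out)]
            for k in range(n_out):
                err = pred[k] - y[k]  # squared-error gradient
                b[k] -= lr * err
                for j in range(n_in):
                    W[k][j] -= lr * err * x[j]
    return W, b

def predict(model, frame):
    """Apply the trained linear model to one infrared frame vector."""
    W, b = model
    return [sum(w * xi for w, xi in zip(Wk, frame)) + bk
            for Wk, bk in zip(W, b)]
```

In practice a neural network trained on the collected (frame, touch state) pairs would play this role; the linear fit is only meant to make the input/output pairing concrete.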
S40: Acquire an infrared signal in response to a touch operation on the infrared touch frame.
In the embodiments of this application, when a touch object contacts the infrared rays of the infrared touch frame, a touch operation on the frame is detected; the infrared rays at the corresponding touch position are blocked, and the corresponding infrared signal is acquired.
S50: Input the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
In the embodiments of this application, by inputting the infrared signal into the trained infrared touch state data collection model, the trained model outputs the infrared touch state data.
It can be understood that when the touch state data of the front end of the robotic arm include the coordinate position of the front end, its height above the infrared touch frame, and its width and height data, the infrared touch state data correspondingly include the coordinate position data of the touch object, the height data of the touch object above the screen protection glass layer of the frame, and/or the width and height data of the touch object. When the touch object contacts the infrared rays, a touch point forms on the optical mesh layer, and the width and height data of the touch object are the width and height data of this touch point. The touch point here is not a point in the mathematical or physical sense but the cross-sectional shape of the touch object where it contacts the optical mesh layer of the infrared touch frame; in one embodiment it can be approximated as a rectangle or circle. The width of the touch point is its length along the X-axis direction, and its height is its length along the Y-axis direction.
In the embodiments of this application, from the infrared touch state data output by the trained infrared touch state data collection model, the coordinate position of the touch object can be obtained directly, replacing the conventional algorithms used to compute it and taking less time. The height of the touch object above the screen protection glass layer of the frame can be obtained directly, saving labor cost and improving efficiency. The width and height of the touch object can be obtained directly; the area of the touch object can be computed from its width and height, and its category can be identified automatically from the area. For example, when the touch object is a collection pen, thick and thin pens can be distinguished.
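The thick-pen/thin-pen idea above can be sketched as follows. The 100 mm² area threshold is an invented illustrative value, not one given in the application, and the two-class split is only an example of area-based category recognition.

```python
# Hedged sketch of area-based touch-object classification: compute the touch
# point's area from its width and height, then label the pen by a threshold.
# The threshold is an assumption, not a value from the application.

def classify_pen(width_mm, height_mm, thick_threshold_mm2=100.0):
    """Return a coarse pen category based on touch-point area."""
    area = width_mm * height_mm
    return "thick pen" if area >= thick_threshold_mm2 else "thin pen"
```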
Applying the embodiments of this application, the front end of the robotic arm is controlled to move in a predetermined direction toward the infrared touch frame; several frames of infrared sample signals produced while the front end contacts the infrared rays during movement along the predetermined direction are acquired, together with the corresponding touch state data of the front end; the sample signals are used as input and the touch state data as output to train the infrared touch state data collection model and obtain the trained model; an infrared signal is acquired in response to a touch operation on the frame; and the infrared signal is input into the trained model to obtain infrared touch state data. The trained infrared touch state data collection model thus yields infrared touch state data automatically, without manual collection, improving the efficiency and accuracy of data collection. Further, the infrared touch state data can also be calibrated, saving calibration time.
In an optional embodiment, the step of controlling the front end of the robotic arm to move in a predetermined direction toward the infrared touch frame includes step S11, as follows:
S11: Obtain the motion trajectory file of the robotic arm, and send control instructions to the robotic arm according to the motion trajectory file, so as to control the front end of the robotic arm to move in the predetermined direction toward the infrared touch frame.
In the embodiments of this application, the motion trajectory file of the robotic arm is used to guide the arm along a planned path to a specified target point and includes the start point, the end point, and several intermediate points of the arm's motion. By obtaining the motion trajectory file, parsing it, and sending control instructions to the robotic arm according to the parsing result, the front end of the robotic arm is controlled to move in the predetermined direction toward the infrared touch frame, so that it moves automatically and accurately.
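Step S11 can be sketched with a short parser. The application does not specify a file format, so the CSV-like layout (one "x,y,z" waypoint per line, `#` comments) and the `send_command` callback are illustrative assumptions.

```python
# Hypothetical sketch of parsing a motion-trajectory file and turning each
# waypoint into one control command for the robotic arm platform.
import io

def run_trajectory(trajectory_text, send_command):
    """Parse waypoints from the trajectory text and send one move command each."""
    for line in io.StringIO(trajectory_text):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        x, y, z = (float(v) for v in line.split(","))
        send_command({"op": "move_to", "x": x, "y": y, "z": z})
```

In the application environment described earlier, `send_command` would correspond to writing an instruction over the wired serial interface to the robotic arm controller.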
In an optional embodiment, the infrared touch frame includes an optical mesh layer and a screen protection glass layer, the optical mesh layer lying directly above the surface of the screen protection glass layer. In step S20, the step of acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, includes step S21, as follows:
S21: Acquire several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer of the infrared touch frame.
In the embodiments of this application, the optical mesh layer is the infrared detection grid formed by the infrared rays emitted by the frame's infrared emitting tubes and received by the corresponding receiving tubes, and the screen protection glass layer is a layer of protective glass over the display. As the front end of the robotic arm moves slowly toward the screen protection glass layer of the infrared touch screen, it contacts the optical mesh layer of the frame; the frame of infrared sample signal at the moment of this first contact, and the corresponding touch state data of the front end, are acquired.
After the first contact between the front end and the optical mesh layer, the front end is controlled to continue moving slowly toward the screen protection glass layer, blocking infrared rays, and multiple frames of infrared sample signals and the corresponding touch state data are acquired. While the front end keeps moving, it is determined whether it has contacted the screen protection glass layer; when the front end first contacts the glass layer, the frame of infrared sample signal at that moment and the corresponding touch state data of the front end are acquired.
By acquiring the several frames of infrared sample signals, and the corresponding touch state data, over the whole movement from first contact with the optical mesh layer to contact with the screen protection glass layer, the touch height data of the infrared touch frame can be obtained automatically and quickly.
In an optional embodiment, the step in S21 of acquiring the several frames of infrared sample signals, and the corresponding touch state data, during the movement of the front end from first contact with the optical mesh layer to contact with the screen protection glass layer includes step S211, as follows:
S211: When a change in the infrared sample signal is first detected, determine that the front end of the robotic arm has made first contact with the optical mesh layer of the infrared touch frame.
In the embodiments of this application, while the infrared touch frame is operating and the front end of the robotic arm has not contacted the optical mesh layer, the infrared sample signal received by the frame is unchanged; for example, its signal strength is a fixed value. When the front end first contacts the optical mesh layer, the received infrared sample signal changes, its signal strength dropping below the fixed value. By detecting this change in the infrared sample signal, the first contact between the front end and the optical mesh layer can be determined quickly.
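The detection rule of S211 can be sketched as follows: the front end is judged to have first touched the light grid at the first frame whose signal strength deviates from the idle baseline. The baseline value and tolerance are assumptions; the application only says the strength drops below a fixed value.

```python
# Sketch of S211: find the first frame where the received signal strength
# differs from the idle (unblocked) baseline. Baseline/tolerance are assumed.

def first_grid_contact(frame_strengths, baseline, tolerance=1e-6):
    """Return the index of the first changed frame, or None if none changed."""
    for i, s in enumerate(frame_strengths):
        if abs(s - baseline) > tolerance:
            return i
    return None
```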
In an optional embodiment, the front end of the robotic arm is a collection pen, and the end of the collection pen facing the infrared touch frame is provided with a pressure sensor. The step in S21 of acquiring the several frames of infrared sample signals, and the corresponding touch state data, during the movement of the front end from first contact with the optical mesh layer to contact with the screen protection glass layer includes step S212, as follows:
S212: When a change in the pressure value of the pressure sensor is first detected, determine that the front end of the robotic arm is in contact with the screen protection glass layer of the infrared touch frame.
In the embodiments of this application, while the collection pen is not in contact with the screen protection glass layer of the infrared touch frame, the pressure sensor on the pen reads a pressure value of 0. When the pen contacts the glass layer, the glass layer exerts a reaction force on the pen; the pressure sensor detects this reaction force and its pressure value changes, so the contact between the collection pen and the screen protection glass layer can be determined quickly.
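S212 reduces to finding the first non-zero pressure reading. The list-of-floats representation of the sensor samples is an assumption made for illustration.

```python
# Sketch of S212: the collection pen is judged to have reached the screen
# protection glass at the first sample with a non-zero pressure reading.

def first_glass_contact(pressure_samples):
    """Return the index of the first non-zero pressure sample, else None."""
    for i, p in enumerate(pressure_samples):
        if p > 0:
            return i
    return None
```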
In an optional embodiment, referring to FIG. 2, the step in S21 of acquiring the several frames of infrared sample signals, and the corresponding touch state data, during the movement of the front end from first contact with the optical mesh layer to contact with the screen protection glass layer includes steps S213 to S215, as follows:
S213: Keep the position of the front end of the robotic arm unchanged along the X-axis and Y-axis directions of the plane of the infrared touch frame, control the front end to move along the Z-axis direction toward the frame, and acquire, for the current touch point, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data of the front end.
In the embodiments of this application, a spatial rectangular coordinate system is established on the plane of the infrared touch frame: the X-axis and Y-axis directions may be the horizontal width and vertical height directions of the frame, the Z-axis direction is perpendicular to the plane of the frame, and the origin is a fixed point toward the lower-left corner of that plane. At the start of the movement, the front end is at coordinate position (X1, Y1, Z1); its X and Y coordinates are held fixed while only its Z coordinate is slowly decreased. When the front end first contacts the optical mesh layer, the current touch point (X1, Y1, Zm) is produced, and the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data, are acquired for this touch point.
S214: After the front end of the robotic arm contacts the screen protection glass layer of the infrared touch frame, control the front end to move along the Z-axis direction away from the frame until it is no longer in contact with the optical mesh layer.
In the embodiments of this application, after the front end contacts the screen protection glass layer, its X and Y coordinates are held fixed while only its Z coordinate is slowly increased, until the front end no longer contacts the optical mesh layer of the infrared touch frame.
S215: Control the front end of the robotic arm to move along the X-axis or Y-axis direction of the plane of the infrared touch frame so as to change its coordinate position relative to the X-axis or Y-axis of the frame, control the front end to move along the Z-axis direction toward the frame, and acquire, for a preset number of touch points, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data of the front end.
In the embodiments of this application, after the several frames of infrared sample signals and the corresponding touch state data of the current touch point on the infrared touch frame have been acquired, the front end is controlled to move along the X-axis or Y-axis direction of the plane of the frame; for example, its coordinate position becomes (X1, Y2, Z1), (X2, Y1, Z1), or (X2, Y2, Z1). Taking a start position of (X2, Y2, Z1) as an example, the X and Y coordinates of the front end are held fixed while only its Z coordinate is slowly decreased; when the front end first contacts the optical mesh layer, the touch point (X2, Y2, Zm) is produced, and the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data, are acquired for this touch point. Repeating the above steps yields several frames of infrared sample signals, and the corresponding touch state data, for contacts between the front end and the infrared rays of the frame at different coordinate positions, improving the comprehensiveness of data collection.
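The S213 to S215 scan can be summarised as a nested loop over grid positions. The `probe_point` callback, which is assumed to handle the Z-axis descent, frame recording, and retraction for one position, and the explicit coordinate lists are illustrative assumptions; the application only requires a preset number of touch points.

```python
# High-level sketch of the S213–S215 scan: for each (x, y) position the arm
# descends along Z, records frames between grid contact and glass contact,
# then retracts clear of the light grid before the next position.

def scan_touch_points(xs, ys, probe_point):
    """Visit every (x, y) position and collect its per-point samples."""
    dataset = []
    for x in xs:
        for y in ys:
            # probe_point(x, y) is assumed to perform the descend/record/
            # retract cycle and return the samples for this touch point
            dataset.append({"x": x, "y": y, "samples": probe_point(x, y)})
    return dataset
```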
In an optional embodiment, the step in S21 of acquiring the several frames of infrared sample signals, and the corresponding touch state data, during the movement of the front end from first contact with the optical mesh layer to contact with the screen protection glass layer includes step S216, as follows:
S216: Replace the front end of the robotic arm with front ends of different types; for each type of front end, acquire the corresponding several frames of infrared sample signals during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer, together with the corresponding touch state data of the front end.
In the embodiments of this application, after the several frames of infrared sample signals and the corresponding touch state data of the current touch point on the infrared touch frame have been acquired, the front end of the robotic arm can be replaced to acquire the several frames of infrared sample signals, and the corresponding touch state data, of the next touch point, thereby improving the diversity of data collection.
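S216 amounts to repeating the collection for each front-end (pen) type and tagging the resulting records. The pen-type names and the `collect_for_pen` callback are illustrative assumptions about how the per-pen collection might be packaged.

```python
# Sketch of S216: repeat the collection for several front-end types so the
# training set covers different touch-object sizes.

def collect_all_pens(pen_types, collect_for_pen):
    """Run collection once per pen type and tag each sample with its type."""
    records = []
    for pen in pen_types:
        for sample in collect_for_pen(pen):
            records.append(dict(sample, pen_type=pen))
    return records
```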
The following is an apparatus embodiment of this application, which can be used to perform the content of the above method; for details not disclosed in the apparatus embodiment, refer to the content of the above method.
Referring to FIG. 3, a schematic structural diagram of the infrared touch state data collection apparatus provided by an embodiment of this application, the infrared touch state data collection apparatus 3 includes:
a movement control module 31 for controlling the front end of the robotic arm to move in a predetermined direction toward the infrared touch frame;
a sample signal acquisition module 32 for acquiring several frames of infrared sample signals produced while the front end contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end;
a collection model obtaining module 33 for using the several frames of infrared sample signals as input and the corresponding touch state data of the front end as output, feeding them into the infrared touch state data collection model for training and learning, and obtaining the trained infrared touch state data collection model;
an infrared signal acquisition module 34 for acquiring an infrared signal in response to a touch operation on the infrared touch frame;
a state data obtaining module 35 for inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
Applying the embodiments of this application, the front end of the robotic arm is controlled to move in a predetermined direction toward the infrared touch frame; several frames of infrared sample signals produced while the front end contacts the infrared rays during movement along the predetermined direction are acquired, together with the corresponding touch state data; the sample signals are used as input and the touch state data as output to train the infrared touch state data collection model and obtain the trained model; an infrared signal is acquired in response to a touch operation on the frame; and the infrared signal is input into the trained model to obtain infrared touch state data. The trained infrared touch state data collection model thus yields infrared touch state data automatically, without manual collection, improving the efficiency and accuracy of data collection; further, the infrared touch state data can also be calibrated, saving calibration time.
In one embodiment of this application, the movement control module 31 includes:
a trajectory file acquisition unit for obtaining the motion trajectory file of the robotic arm and sending control instructions to the arm according to the trajectory file, so as to control the front end to move in the predetermined direction toward the infrared touch frame.
In one embodiment of this application, the sample signal acquisition module 32 includes:
a sample signal acquisition unit for acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end, during the movement of the front end from first contact with the optical mesh layer of the infrared touch frame to contact with its screen protection glass layer.
In one embodiment of this application, the sample signal acquisition unit includes:
a sample signal detection unit for determining, when a change in the infrared sample signal is detected, that the front end has made first contact with the optical mesh layer of the infrared touch frame.
In one embodiment of this application, the sample signal acquisition unit includes:
a pressure detection unit for determining, when a change in the pressure value of the pressure sensor is first detected, that the front end is in contact with the screen protection glass layer of the infrared touch frame.
In one embodiment of this application, the sample signal acquisition unit includes:
a position control unit for keeping the position of the front end unchanged along the X-axis and Y-axis directions of the plane of the infrared touch frame, moving the front end along the Z-axis direction toward the frame, and acquiring, for the current touch point, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data;
a first movement direction control unit for controlling the front end, after it contacts the screen protection glass layer, to move along the Z-axis direction away from the frame until it is no longer in contact with the optical mesh layer;
a second movement direction control unit for controlling the front end to move along the X-axis or Y-axis direction of the plane of the frame so as to change its coordinate position relative to the X-axis or Y-axis, moving the front end along the Z-axis direction toward the frame, and acquiring, for a preset number of touch points, the several frames of infrared sample signals from first contact with the optical mesh layer to contact with the screen protection glass layer, together with the corresponding touch state data.
The following is a device embodiment of this application, which can be used to perform the content of the above method; for details not disclosed in the device embodiment, refer to the content of the above method.
Referring to FIG. 4, this application further provides an electronic device 300, which may be a computer, a mobile phone, a tablet, an interactive tablet, etc. In an exemplary embodiment of this application, the electronic device 300 is an interactive tablet, which may include: at least one processor 301, at least one memory 302, at least one display, at least one network interface 303, a user interface 304, and at least one communication bus 305.
The user interface 304 is mainly used to provide an input interface for the user and obtain data input by the user. Optionally, the user interface may also include standard wired and wireless interfaces.
The network interface 303 may optionally include a standard wired interface and a wireless interface (such as a Wi-Fi interface).
The communication bus 305 is used to realize connection and communication between these components.
The processor 301 may include one or more processing cores. The processor connects the various parts of the entire electronic device through various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory and invoking the data stored in the memory. Optionally, the processor may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA). The processor may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, etc. The CPU mainly handles the operating system, user interface, applications, etc.; the GPU is responsible for rendering and drawing the content to be displayed by the display layer; the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor and may instead be implemented by a separate chip.
The memory 302 may include random access memory (RAM) or read-only memory (ROM). Optionally, the memory includes a non-transitory computer-readable storage medium. The memory may be used to store instructions, programs, code, code sets, or instruction sets. The memory may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing each of the above method embodiments, etc.; the data storage area may store the data involved in each of the above method embodiments. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor. As shown in FIG. 4, the memory, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an operating application program.
The processor may be used to invoke the application program of the above-described method stored in the memory and execute the method steps shown above; for the execution process, refer to the description above, which will not be repeated here.
This application further provides a computer-readable storage medium on which a computer program is stored; the instructions are adapted to be loaded by the processor to execute the method steps shown above. For the execution process, refer to the description of the embodiments, which will not be repeated here. The device where the storage medium resides may be an electronic device such as a personal computer, laptop, smartphone, or tablet.
As the device embodiments substantially correspond to the method embodiments, reference may be made to the relevant descriptions of the method embodiments. The device embodiments described above are merely illustrative: components described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this application, which persons of ordinary skill in the art can understand and implement without creative effort.
Those skilled in the art should understand that the embodiments of this application may be provided as a method, a system, or a computer program product. Accordingly, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
This application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of this application. It should be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operational steps to be performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
Memory may include volatile memory in computer-readable media, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any of their variants are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The above are merely embodiments of this application and are not intended to limit it. Various modifications and changes may be made to this application by those skilled in the art; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of this application shall be included within the scope of the claims of this application.

Claims (11)

  1. An infrared touch state data collection method, comprising the following steps:
    controlling the front end of a robotic arm to move in a predetermined direction toward an infrared touch frame;
    acquiring several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm;
    using the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feeding them into an infrared touch state data collection model for training, and obtaining a trained infrared touch state data collection model;
    acquiring an infrared signal in response to a touch operation on the infrared touch frame;
    inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
  2. The infrared touch state data collection method according to claim 1, wherein:
    the infrared touch frame comprises an optical mesh layer and a screen protection glass layer, the optical mesh layer lying directly above the surface of the screen protection glass layer;
    the step of acquiring the several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm, comprises:
    acquiring several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer of the infrared touch frame.
  3. The infrared touch state data collection method according to claim 2, wherein:
    the step of acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer comprises:
    when a change in the infrared sample signal is first detected, determining that the front end of the robotic arm has made first contact with the optical mesh layer of the infrared touch frame.
  4. The infrared touch state data collection method according to claim 2, wherein:
    the front end of the robotic arm is a collection pen, and the end of the collection pen facing the infrared touch frame is provided with a pressure sensor;
    the step of acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer comprises:
    when a change in the pressure value of the pressure sensor is first detected, determining that the front end of the robotic arm is in contact with the screen protection glass layer of the infrared touch frame.
  5. The infrared touch state data collection method according to claim 2, wherein:
    the step of acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer comprises:
    keeping the position of the front end of the robotic arm unchanged along the X-axis and Y-axis directions of the plane of the infrared touch frame, controlling the front end to move along the Z-axis direction toward the infrared touch frame, and acquiring, for the current touch point, several frames of infrared sample signals from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer, together with the corresponding touch state data of the front end of the robotic arm;
    after the front end of the robotic arm contacts the screen protection glass layer of the infrared touch frame, controlling the front end to move along the Z-axis direction away from the infrared touch frame until the front end is no longer in contact with the optical mesh layer of the infrared touch frame;
    controlling the front end of the robotic arm to move along the X-axis or Y-axis direction of the plane of the infrared touch frame so as to change the coordinate position of the front end relative to the X-axis or Y-axis of the infrared touch frame, controlling the front end to move along the Z-axis direction toward the infrared touch frame, and acquiring, for a preset number of touch points, several frames of infrared sample signals from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer, together with the corresponding touch state data of the front end of the robotic arm.
  6. The infrared touch state data collection method according to claim 2, wherein:
    the step of acquiring the several frames of infrared sample signals, and the corresponding touch state data of the front end of the robotic arm, during the movement of the front end from its first contact with the optical mesh layer of the infrared touch frame to its contact with the screen protection glass layer comprises:
    replacing the front end of the robotic arm with front ends of different types, and, for each type of front end, acquiring the corresponding several frames of infrared sample signals during the movement from first contact with the optical mesh layer of the infrared touch frame to contact with the screen protection glass layer, together with the corresponding touch state data of the front end of the robotic arm.
  7. The infrared touch state data collection method according to any one of claims 1 to 6, wherein:
    the infrared touch state data comprise coordinate position data of a touch object, height data of the touch object above the screen protection glass layer of the infrared touch frame, and/or width and height data of the touch object.
  8. The infrared touch state data collection method according to any one of claims 1 to 6, wherein:
    the step of controlling the front end of the robotic arm to move in the predetermined direction toward the infrared touch frame comprises:
    obtaining a motion trajectory file of the robotic arm, and sending control instructions to the robotic arm according to the motion trajectory file, so as to control the front end of the robotic arm to move in the predetermined direction toward the infrared touch frame.
  9. An infrared touch state data collection apparatus, comprising:
    a movement control module for controlling the front end of a robotic arm to move in a predetermined direction toward an infrared touch frame;
    a sample signal acquisition module for acquiring several frames of infrared sample signals produced while the front end of the robotic arm contacts the infrared rays of the infrared touch frame during movement along the predetermined direction, together with the corresponding touch state data of the front end of the robotic arm;
    a collection model obtaining module for using the several frames of infrared sample signals as input and the corresponding touch state data of the front end of the robotic arm as output, feeding them into an infrared touch state data collection model for training, and obtaining a trained infrared touch state data collection model;
    an infrared signal acquisition module for acquiring an infrared signal in response to a touch operation on the infrared touch frame;
    a state data obtaining module for inputting the infrared signal into the trained infrared touch state data collection model to obtain infrared touch state data.
  10. A computer device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 8.
  11. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
PCT/CN2023/114188 2022-08-22 2023-08-22 Infrared touch state data collection method and apparatus, computer device, and medium WO2024041508A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211009336.5A CN116700584A (zh) 2022-08-22 2022-08-22 Infrared touch state data collection method and apparatus, computer device, and medium
CN202211009336.5 2022-08-22

Publications (1)

Publication Number Publication Date
WO2024041508A1 true WO2024041508A1 (zh) 2024-02-29

Family

ID=87822750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/114188 WO2024041508A1 (zh) 2022-08-22 2023-08-22 红外触控状态数据采集方法、装置、计算机设备以及介质

Country Status (2)

Country Link
CN (1) CN116700584A (zh)
WO (1) WO2024041508A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102109927A (zh) * 2009-12-25 2011-06-29 康佳集团股份有限公司 一种红外触摸框的控制方法、系统及触摸屏设备
US20140264036A1 (en) * 2013-03-18 2014-09-18 Iman Hung Assembling Infrared Touch Control Module
CN106557209A (zh) * 2016-10-28 2017-04-05 青岛海信电器股份有限公司 红外触摸屏触控信号的处理方法、装置及终端设备
CN112799547A (zh) * 2021-01-26 2021-05-14 广州创知科技有限公司 红外触摸屏的触摸定位方法、模型训练方法、装置、设备及介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102109927A (zh) * 2009-12-25 2011-06-29 康佳集团股份有限公司 一种红外触摸框的控制方法、系统及触摸屏设备
US20140264036A1 (en) * 2013-03-18 2014-09-18 Iman Hung Assembling Infrared Touch Control Module
CN106557209A (zh) * 2016-10-28 2017-04-05 青岛海信电器股份有限公司 红外触摸屏触控信号的处理方法、装置及终端设备
CN112799547A (zh) * 2021-01-26 2021-05-14 广州创知科技有限公司 红外触摸屏的触摸定位方法、模型训练方法、装置、设备及介质

Also Published As

Publication number Publication date
CN116700584A (zh) 2023-09-05

Similar Documents

Publication Publication Date Title
US8619049B2 (en) Monitoring interactions between two or more objects within an environment
US9612675B2 (en) Emulating pressure sensitivity on multi-touch devices
JP5738707B2 (ja) タッチパネル
JP5213183B2 (ja) ロボット制御システム及びロボット制御プログラム
US9886190B2 (en) Gesture discernment and processing system
JP2013539580A (ja) デバイス上の動き制御方法及び装置
EP2984545B1 (en) Virtual touch screen
JP2003280785A (ja) 画像表示処理装置、および画像表示処理方法、並びにコンピュータ・プログラム
CN102568253A (zh) 电子白板中图形的几何特征显示方法及装置
JP2017506399A (ja) 改善されたタッチスクリーン精度のためのシステムおよび方法
US20130321303A1 (en) Touch detection
CN106406572A (zh) 光标的控制方法和装置
WO2024041508A1 (zh) 红外触控状态数据采集方法、装置、计算机设备以及介质
CN108604142B (zh) 一种触屏设备操作方法及触屏设备
CN103279304B (zh) 一种显示选中图标的方法、装置及移动设备
US9927917B2 (en) Model-based touch event location adjustment
CN103412724A (zh) 一种移动终端及其操控方法
JP6141290B2 (ja) マルチポインタ間接入力装置の加速度によるインターラクション
CN112416115A (zh) 一种用于控件交互界面中进行人机交互的方法与设备
CN102955601B (zh) 触控面板的3d感测方法及系统
TWI664997B (zh) 電子設備及玩具控制方法
JP2018185780A (ja) 対話機能を実行するための電子装置及び方法
CN115471852A (zh) 触摸识别方法、触摸设备、存储介质以及计算机设备
CN114546146A (zh) 触摸屏的触摸响应延时测量方法、装置、系统及交互平板
US10423314B2 (en) User interface with quantum curves and quantum arcs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856598

Country of ref document: EP

Kind code of ref document: A1