CN117608424B - Touch knob screen management and control system and method based on Internet of things - Google Patents

Touch knob screen management and control system and method based on Internet of things

Info

Publication number
CN117608424B
Authority
CN
China
Prior art keywords
control surface
surface layer
touch
gui
object information
Prior art date
Legal status
Active
Application number
CN202410096122.9A
Other languages
Chinese (zh)
Other versions
CN117608424A (en)
Inventor
王江昆
季中源
谈雨
Current Assignee
Jiangsu Jinhua Electronics Co ltd
Original Assignee
Jiangsu Jinhua Electronics Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Jinhua Electronics Co ltd filed Critical Jiangsu Jinhua Electronics Co ltd
Priority to CN202410096122.9A
Publication of CN117608424A
Application granted
Publication of CN117608424B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H 19/00 Switches operated by an operating part which is rotatable about a longitudinal axis thereof and which is acted upon directly by a solid body external to the switch, e.g. by a hand
    • H01H 19/02 Details
    • H01H 19/10 Movable parts; Contacts mounted thereon
    • H01H 19/14 Operating parts, e.g. turn knob
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/4401 Bootstrapping
    • G06F 9/4418 Suspend and resume; Hibernate and awake
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02 Power saving arrangements
    • H04W 52/0209 Power saving arrangements in terminal devices
    • H04W 52/0251 Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W 52/0254 Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity detecting a user operation or a tactile contact or a motion of the device

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a touch knob screen control system and method based on the Internet of Things, belonging to the technical field of touch knobs. In hardware, the design is built around a knob piece, a touch screen piece and a wireless networking piece: the knob piece realizes rotation and pressing operations through a mechanical knob encoder, the touch screen piece supports touch operation and function display, and the wireless networking piece periodically switches the equipment-end function wake-up condition between an active state and a sleep state. In software, the function services are synchronously fed back through interface instructions, the calling interfaces of the different function services are registered as drivers, and the controlled objects are rendered on screen. Algorithmically, lightweight algorithm scheduling is adopted to perform first and second display optimization on the UI control surface layers and the GUI graphical objects respectively. By combining software and hardware, the system analyzes the user's operation behaviors and performs spontaneous function-service optimization scheduling prediction, which reduces the probability of user misoperation and simplifies the interaction experience.

Description

Touch knob screen management and control system and method based on Internet of things
Technical Field
The invention relates to the technical field of touch knobs, in particular to a touch knob screen control system and method based on the Internet of things.
Background
With the popularization of the smart home, more and more household appliances adopt human-computer interaction technology, and designers need to consider how to make the appliances easier to understand and use, so as to improve their usability and user experience. In some smart appliances equipped with a knob control device, the device must support rotation, pressing and touch operations while keeping the knob control device small and attractive. When interacting with such a small knob control device, misoperation occurs easily and human-computer interaction becomes more complicated, which is unfavorable to the user's intelligent interaction experience.
Disclosure of Invention
The invention aims to provide a touch knob screen control system and method based on the Internet of Things, so as to solve the problems set forth in the background art.
In order to solve the technical problems, the invention provides the following technical scheme:
A touch knob screen management and control system based on the Internet of Things, the system comprising: a touch knob screen assembly module, a function layer library module, an instruction information module and a display optimization module;
the touch knob screen assembly module comprises a knob piece, a touch screen piece and a wireless networking piece, wherein the knob piece, the touch screen piece and the wireless networking piece are integrated by hardware of the touch knob screen assembly module, and rotation operation, pressing operation and touch operation are provided through the knob piece and the touch screen piece;
The function layer library module designs a graphical user interface describing functions based on functions realized by the equipment end, and a user locks different GUI graphical objects through rotation operation, pressing operation and touch operation to realize function requirements; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
the instruction information module monitors operation behaviors perceived by a background program based on instructions of an interface of the Internet of things, wakes up each function service of the equipment end through the instructions, and displays functions; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
the display optimization module is used for carrying out functional display scheduling through analyzing the operation behaviors corresponding to the functional service based on the operation behaviors, wherein the functional display scheduling comprises a first functional display scheduling and a second functional display scheduling, the object of the first functional display scheduling is a UI control surface layer, and the object of the second functional display scheduling is a GUI graphic object.
Further, the touch knob screen assembly module further comprises a hardware integrated unit and an operation sensing unit;
The hardware integrated unit is used for integrating a knob piece, a touch screen piece and a wireless networking piece through hardware integration, wherein the knob piece supports rotation regulation and press regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
the operation sensing unit is used for sensing the user's operation behaviors through the background program; an operation behavior comprises W operation steps arranged in order of operation from first to last, each operation step corresponds to one of rotation operation, pressing operation and touch operation, and the operation behavior formed by the W operation steps arises from the rotation, pressing and touch operations alternating back and forth with one another.
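As a concrete illustration of this data model only (the type and field names such as op_step_t and op_behavior_t are assumptions, not taken from the patent), an operation behavior can be held as an ordered array of W steps, each tagged with one of the three operation modes:

```c
#include <stddef.h>

/* One of the three supported operation modes. */
typedef enum { OP_ROTATE, OP_PRESS, OP_TOUCH } op_mode_t;

/* A single operation step: its mode plus the object it acted on
 * (a UI control surface layer for rotation, a GUI graphical object
 * for pressing or touch). */
typedef struct {
    op_mode_t mode;
    int       object_id;
} op_step_t;

/* An operation behavior: W steps stored in the order they occurred. */
#define MAX_STEPS 64
typedef struct {
    op_step_t steps[MAX_STEPS];
    size_t    count;   /* W */
} op_behavior_t;

/* Append a sensed step, keeping the first-to-last order of operation. */
static int behavior_push(op_behavior_t *b, op_mode_t mode, int object_id)
{
    if (b->count >= MAX_STEPS) return -1;   /* behavior is full */
    b->steps[b->count].mode = mode;
    b->steps[b->count].object_id = object_id;
    b->count++;
    return 0;
}
```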
Further, the function layer library module further comprises a GUI graphic object unit and a UI control unit;
the GUI graphical object unit designs a graphical user interface according to the functions realized by the equipment end, generates a GUI graphical object library set, and records it as G = {g_1, g_2, ..., g_n}, where g_1, g_2, ..., g_n respectively represent the 1st, 2nd, ..., nth GUI graphical objects, and one GUI graphical object corresponds to one function realized by the equipment end;
The UI control unit renders and produces a UI control interface according to the GUI graphical object library set, displays in the UI control interface the GUI graphical objects corresponding to the different functions of the equipment end, and generates UI control surface layers, recorded as U = {u_1, u_2, ..., u_i, ...} with u_i = {g_{i,1}, g_{i,2}, ..., g_{i,j}, ..., g_{i,m}}, where u_i represents the i-th UI control surface layer, g_{i,j} represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects in the i-th UI control surface layer.
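To make the sets G and U concrete, a minimal sketch follows; the names gui_object_t, ui_layer_t and the fixed capacities are assumptions for illustration and are not part of the patent text:

```c
#include <stddef.h>

/* One GUI graphical object g_k: it stands for exactly one equipment-end function. */
typedef struct {
    int         id;            /* index k within the library set G           */
    const char *function_name; /* the equipment-end function it represents   */
} gui_object_t;

/* One UI control surface layer u_i: the GUI objects rendered on that layer. */
#define MAX_OBJS_PER_LAYER 16
typedef struct {
    int          layer_id;                    /* index i                      */
    gui_object_t objects[MAX_OBJS_PER_LAYER]; /* g_{i,1} ... g_{i,m}          */
    size_t       m;                           /* number of GUI objects on u_i */
} ui_layer_t;

/* The function layer library: library set G plus the rendered layers U. */
#define MAX_GUI_OBJS 64
#define MAX_LAYERS   8
typedef struct {
    gui_object_t library[MAX_GUI_OBJS]; /* G = {g_1, ..., g_n}        */
    size_t       n;
    ui_layer_t   layers[MAX_LAYERS];    /* U = {u_1, ..., u_i, ...}   */
    size_t       layer_count;
} function_layer_library_t;
```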
Further, the instruction information module further comprises an instruction awakening unit and an object information unit;
the instruction awakening unit is used for monitoring, through the wireless networking piece that supports Internet-of-Things connection, the operation behaviors perceived by the background program; when an operation behavior is monitored, the wireless networking piece sends the instruction of the operation behavior, wakes up each function service of the equipment end, and the function display of each function service is performed through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in the sleep state, so the function services of the equipment end are not woken up and no function display is performed;
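The active/sleep switching described above can be pictured with the hedged sketch below; the helper functions operation_sensed, radio_wake, radio_sleep, send_operation_instruction and show_function_display are hypothetical names, not APIs from the patent or any library:

```c
#include <stdbool.h>

/* Hypothetical hooks; these names are assumptions for illustration. */
extern bool operation_sensed(void);            /* background program saw a user action  */
extern void radio_wake(void);                  /* RF circuits on: active state          */
extern void radio_sleep(void);                 /* RF circuits off: sleep state          */
extern void send_operation_instruction(void);  /* wake each equipment-end function service */
extern void show_function_display(void);       /* function display on the touch screen  */

/* Called periodically: switch the equipment-end wake-up condition
 * between the active state and the sleep state. */
void instruction_wakeup_tick(void)
{
    if (operation_sensed()) {
        radio_wake();
        send_operation_instruction();
        show_function_display();
    } else {
        radio_sleep();   /* no wake-up and no function display while idle */
    }
}
```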
the object information unit is used for establishing the characterization object information set of the operation behavior, the characterization object information set containing the single-step operation object information of the W operation steps executed in sequence; a single-step operation is one of the three operation modes of rotation operation, pressing operation and touch operation, where the rotation operation regulates the UI control surface layers while the pressing operation and the touch operation regulate the GUI graphical objects: the pressing operation confirms a GUI graphical object option and the touch operation switches between GUI graphical object options; each piece of single-step operation object information is cached, and the characterization object information set of the operation behavior is recorded as B_y = {b_{y,1}, b_{y,2}, ..., b_{y,W}}, where B_y represents the characterization object information set of the y-th operation behavior and b_{y,x} represents the x-th single-step operation object information under the y-th operation behavior; b_{y,x} takes one of the three forms b_{y,x}^R, b_{y,x}^P and b_{y,x}^T, the single-step operation object information corresponding to a rotation operation, a pressing operation and a touch operation respectively; the single-step operation object information covers UI control surface layers and GUI graphical objects, b_{y,x}^R denoting the UI control surface layer addressed by a rotation operation, and b_{y,x}^P and b_{y,x}^T denoting the GUI graphical objects addressed by a pressing operation and a touch operation respectively.
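A possible in-memory form of each cached entry b_{y,x} is sketched below (the tagged union and its field names are assumptions, not the patent's data layout):

```c
#include <stddef.h>

/* Which operation mode produced this single-step object information. */
typedef enum { STEP_ROTATE, STEP_PRESS, STEP_TOUCH } step_mode_t;

/* One cached entry b_{y,x}: a rotation step carries the UI control surface
 * layer it landed on; a press or touch step carries the GUI graphical object
 * it confirmed or switched to. */
typedef struct {
    step_mode_t mode;
    union {
        int ui_layer_id;   /* valid when mode == STEP_ROTATE              */
        int gui_object_id; /* valid when mode == STEP_PRESS or STEP_TOUCH */
    } target;
} step_object_info_t;

/* Characterization object information set B_y of the y-th operation behavior. */
#define MAX_STEPS 64
typedef struct {
    int                behavior_index;  /* y                    */
    step_object_info_t info[MAX_STEPS]; /* b_{y,1} ... b_{y,W}  */
    size_t             count;           /* W                    */
} characterization_set_t;
```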
Further, the display optimization module further comprises a first scheduling unit and a second scheduling unit;
the first scheduling unit is used for working in the characterization object information sets B_y of the operation behaviors: each UI control surface layer is intercepted; whenever the intercepted single-step operation object information b_{y,x} is of the rotation form b_{y,x}^R, the UI control surface layer corresponding to b_{y,x}^R is automatically intercepted; when the UI control surface layer automatically intercepted for b_{y,x}^R is u_i, the UI control surface layers corresponding to the previous and to the next rotation-type single-step operation object information around u_i are extracted from the characterization object information set B_y; over the characterization object information sets of all operation behaviors in which the intercepted UI control surface layer is u_i, all extracted UI control surface layer results are collected, the two UI control surface layers ranked first and second by number of occurrences, from most to fewest, are selected from these results, and the first- and second-ranked UI control surface layers are taken as the first function display scheduling object when the UI control surface layer is u_i, recorded as D1(u_i) = {u_a, u_b}, where u_a and u_b respectively represent the a-th and the b-th UI control surface layer, with a ≠ b;
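A rough sketch of this first scheduling step is given below, reusing the characterization_set_t and step types from the sketch above; the function names and the counting logic are an assumed reading of the passage, not the patent's implementation. It counts which layers are rotated to immediately before or after the layer u_i and keeps the two most frequent ones:

```c
#include <stddef.h>

#define NUM_LAYERS 8

/* Occurrence counters for the layers rotated to just before or just
 * after the layer under analysis (u_i). */
typedef struct {
    int counts[NUM_LAYERS];
} neighbour_stats_t;

/* Scan one cached characterization set B_y: whenever a rotation step lands
 * on layer_i, count the layers reached by the previous and the next rotation. */
static void collect_neighbours(const characterization_set_t *set,
                               int layer_i, neighbour_stats_t *stats)
{
    int prev_rot = -1;   /* layer reached by the last rotation step seen */
    int pending  = 0;    /* waiting for the rotation step after layer_i  */
    for (size_t x = 0; x < set->count; x++) {
        if (set->info[x].mode != STEP_ROTATE) continue;
        int layer = set->info[x].target.ui_layer_id;
        if (pending && layer != layer_i) {   /* layer reached just after u_i */
            stats->counts[layer]++;
            pending = 0;
        }
        if (layer == layer_i) {
            if (prev_rot >= 0 && prev_rot != layer_i)
                stats->counts[prev_rot]++;   /* layer reached just before u_i */
            pending = 1;
        }
        prev_rot = layer;
    }
}

/* Keep the two most frequent neighbour layers: D1(u_i) = {u_a, u_b}. */
static void first_schedule(const neighbour_stats_t *stats, int *a, int *b)
{
    *a = *b = -1;
    for (int l = 0; l < NUM_LAYERS; l++) {
        if (*a < 0 || stats->counts[l] > stats->counts[*a]) { *b = *a; *a = l; }
        else if (*b < 0 || stats->counts[l] > stats->counts[*b]) { *b = l; }
    }
}
```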
the second scheduling unit is used for taking any one UI control surface layer from the first function display scheduling object D1(u_i), denoted SL, with SL ∈ D1(u_i); in the characterization object information sets B_y of the operation behaviors, whenever the intercepted single-step operation object information is the UI control surface layer SL, the GUI graphical objects corresponding to the pressing-type and touch-type single-step operations performed between reaching SL and the next rotation-type single-step operation object information are picked out to form one single-step operation fragment; the total number Q of single-step operation fragments is counted, and any single-step operation fragment is denoted F_q, where q = 1, 2, ..., Q is the index of the fragment; after the first function display scheduling object has been determined, the second function display scheduling object determination analysis is performed: any GUI graphical object in the UI control surface layer SL is selected and denoted g_r, where r is the serial number of the GUI graphical object, and the second scheduling index Z_r of the GUI graphical object g_r is calculated by the following formula:
Z_r = (1/Q) · Σ_{q=1}^{Q} δ_q
where Z_r represents the second scheduling index of the GUI graphical object g_r, and δ_q indicates whether the GUI graphical object g_r belongs to the single-step operation fragment F_q: if g_r ∈ F_q, then δ_q = 1; otherwise δ_q = 0.
The second scheduling index of every GUI graphical object in the UI control surface layer SL is calculated, and the GUI graphical objects are arranged in descending order of their second scheduling indexes to form the second function display scheduling object, recorded as D2(SL).
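A compact sketch of this second scheduling computation follows, under the assumption that the index is the fraction of single-step operation fragments containing the object, which is consistent with the descending-order ranking above; names such as fragment_t and second_schedule are illustrative only:

```c
#include <stddef.h>
#include <stdlib.h>

#define MAX_OBJS_PER_FRAGMENT 16

/* One single-step operation fragment F_q: the GUI objects pressed or touched
 * between reaching layer SL and the next rotation. */
typedef struct {
    int    gui_object_ids[MAX_OBJS_PER_FRAGMENT];
    size_t count;
} fragment_t;

/* delta_q: 1 if GUI object g_r belongs to fragment F_q, else 0. */
static int delta(const fragment_t *f, int object_id)
{
    for (size_t k = 0; k < f->count; k++)
        if (f->gui_object_ids[k] == object_id) return 1;
    return 0;
}

/* Second scheduling index Z_r = (1/Q) * sum over q of delta_q. */
static double second_index(const fragment_t *frags, size_t Q, int object_id)
{
    if (Q == 0) return 0.0;
    int sum = 0;
    for (size_t q = 0; q < Q; q++) sum += delta(&frags[q], object_id);
    return (double)sum / (double)Q;
}

/* Rank the GUI objects of layer SL by descending Z_r to get D2(SL). */
typedef struct { int object_id; double z; } ranked_t;

static int cmp_desc(const void *a, const void *b)
{
    double za = ((const ranked_t *)a)->z, zb = ((const ranked_t *)b)->z;
    return (za < zb) - (za > zb);   /* larger Z first */
}

static void second_schedule(const fragment_t *frags, size_t Q,
                            const int *sl_objects, size_t m, ranked_t *out)
{
    for (size_t j = 0; j < m; j++) {
        out[j].object_id = sl_objects[j];
        out[j].z = second_index(frags, Q, sl_objects[j]);
    }
    qsort(out, m, sizeof(ranked_t), cmp_desc);
}
```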
According to the above description of the system modules, consider the interaction scene of a smart household appliance with a knob control device, such as a steam-and-bake all-in-one machine, whose different functional interactions are realized through rotating, pressing and touching. In the design of the knob control device, misoperation is avoided and the interaction experience is simplified through cooperative software and hardware control. In hardware, the design is built around the knob piece, the touch screen piece and the wireless networking piece. The knob piece realizes the rotation operation through a mechanical knob encoder; the encoder works in GPIO interrupt mode, so MCUs whose timers have no encoder mode are also supported. The knob piece additionally carries a mechanical button switch: by identifying the pressed state of the mechanical button, the LVGL interface instructions in software are called and the function services in software are synchronously fed back, the supported press types including long press, short press, user-defined presses and the like. The touch screen piece uses a QSPI liquid crystal screen and supports touch types such as tapping and sliding; it likewise feeds back the function services in software by calling the LVGL interface instructions. The wireless networking piece periodically switches the equipment-end function wake-up condition between an active state and a sleep state; in the sleep state the radio-frequency related circuits are switched off to reduce power consumption. In software, the calling interfaces of the different function services are registered as drivers through the function layer library module, and the controlled objects of the different services are rendered on screen. Algorithmically, optimization is performed through the display optimization module, which carries out the first and second display optimizations on the UI control surface layers and the GUI graphical objects respectively to improve the intelligence of the algorithm. A lightweight algorithm scheduling approach is adopted: the UI control surface layers corresponding to the rotation-type single-step operation object information immediately before and after the current UI control surface layer u_i are extracted, a lightweight simulation analysis predicts the user's operation behavior and performs the first function display scheduling, and the single-step operation fragments are quantified to further predict the user's next operation and perform the second function display scheduling, where the larger the second scheduling index, the larger the probability that the GUI graphical object will be selected. Meanwhile, a storage area can be partitioned in software to store the optimized results of the first and second function display scheduling for function display; spontaneous function-service optimization scheduling prediction is then performed in combination with the user's actual operations, which reduces the probability of user misoperation and simplifies the user interaction experience.
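For the LVGL interface instructions mentioned above, the sketch below illustrates one plausible way of registering the mechanical knob encoder and its push button as an LVGL encoder input device; it assumes the LVGL v8 driver API, and the helpers encoder_get_diff() and encoder_button_pressed() are hypothetical board-support functions rather than anything specified by the patent:

```c
#include <stdbool.h>
#include "lvgl.h"

/* Hypothetical board-support helpers fed by the GPIO-interrupt encoder driver. */
extern int  encoder_get_diff(void);       /* detents turned since the last read */
extern bool encoder_button_pressed(void); /* mechanical push-button state       */

/* LVGL read callback: report rotation steps and the press state. */
static void knob_read_cb(lv_indev_drv_t *drv, lv_indev_data_t *data)
{
    (void)drv;
    data->enc_diff = (int16_t)encoder_get_diff();
    data->state = encoder_button_pressed() ? LV_INDEV_STATE_PRESSED
                                           : LV_INDEV_STATE_RELEASED;
}

/* Register the knob as an encoder-type input device (LVGL v8-style API). */
static lv_indev_drv_t knob_drv;

void knob_indev_init(void)
{
    lv_indev_drv_init(&knob_drv);
    knob_drv.type    = LV_INDEV_TYPE_ENCODER;
    knob_drv.read_cb = knob_read_cb;
    lv_indev_t *indev = lv_indev_drv_register(&knob_drv);

    /* Tie the encoder to a group so that rotation moves focus between
     * GUI graphical objects and a press activates the focused one. */
    lv_group_t *g = lv_group_create();
    lv_indev_set_group(indev, g);
    lv_group_set_default(g);
}
```

With such a registration, rotating the knob moves the focus among the GUI graphical objects of the current UI control surface layer and pressing the knob confirms the focused option, matching the operation modes described above.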
A touch knob screen control method based on the Internet of things comprises the following steps:
step S100: the method comprises the steps that a user operates a touch knob screen and perceives operation behaviors of the user, wherein the touch knob screen supports rotation operation, pressing operation and touch operation;
step S200: based on the functions realized by the equipment end, designing a graphical user interface for describing the functions, and locking different GUI graphical objects by a user through rotation operation, pressing operation and touch operation so as to realize function services; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
Step S300: based on an instruction of an interface of the Internet of things, monitoring operation behaviors perceived by a background program, waking up each function service of the equipment end through the instruction, and performing function display; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
step S400: and carrying out functional display scheduling by analyzing the operation behaviors corresponding to the functional service based on the operation behaviors, wherein the functional display scheduling comprises a first functional display scheduling and a second functional display scheduling, the object of the first functional display scheduling is a UI control surface layer, and the object of the second functional display scheduling is a GUI graphic object.
Further, the specific implementation process of the step S100 includes:
step S101: the rotary knob piece, the touch screen piece and the wireless networking piece are integrated through hardware, the rotary knob piece supports rotary regulation and pressing regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
step S102: the background program senses the user's operation behavior; the operation behavior comprises W operation steps arranged in order of operation from first to last to form one operation behavior, each operation step corresponds to one of the rotation operation, the pressing operation and the touch operation, and the operation behavior formed by the W operation steps arises from the rotation, pressing and touch operations alternating back and forth with one another.
Further, the specific implementation process of the step S200 includes:
step S201: according to the function services realized by the equipment end, a graphical user interface is designed and a GUI graphical object library set is generated, recorded as G = {g_1, g_2, ..., g_n}, where g_1, g_2, ..., g_n respectively represent the 1st, 2nd, ..., nth GUI graphical objects, and one GUI graphical object corresponds to one function service of the equipment end;
step S202: a UI control interface is rendered and produced according to the GUI graphical object library set, the GUI graphical objects corresponding to the different function services of the equipment end are displayed in the UI control interface, and UI control surface layers are generated, recorded as U = {u_1, u_2, ..., u_i, ...} with u_i = {g_{i,1}, g_{i,2}, ..., g_{i,j}, ..., g_{i,m}}, where u_i represents the i-th UI control surface layer, g_{i,j} represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects in the i-th UI control surface layer.
Further, the implementation process of the step S300 includes:
step S301: the operation behaviors perceived by the background program are monitored through the wireless networking piece supporting Internet-of-Things connection; when an operation behavior is monitored, the wireless networking piece sends the operation behavior instruction, wakes up each function service of the equipment end, and the function display of each function service is performed through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in the sleep state, so the function services of the equipment end are not woken up and no function display is performed;
step S302: the characterization object information set of the operation behavior is established, the characterization object information set containing the single-step operation object information of the W operation steps executed in sequence; a single-step operation is one of the three operation modes of rotation operation, pressing operation and touch operation, where the rotation operation regulates the UI control surface layers while the pressing operation and the touch operation regulate the GUI graphical objects: the pressing operation confirms a GUI graphical object option and the touch operation switches between GUI graphical object options; each piece of single-step operation object information is cached, and the characterization object information set of the operation behavior is recorded as B_y = {b_{y,1}, b_{y,2}, ..., b_{y,W}}, where B_y represents the characterization object information set of the y-th operation behavior and b_{y,x} represents the x-th single-step operation object information under the y-th operation behavior; b_{y,x} takes one of the three forms b_{y,x}^R, b_{y,x}^P and b_{y,x}^T, the single-step operation object information corresponding to a rotation operation, a pressing operation and a touch operation respectively; the single-step operation object information covers UI control surface layers and GUI graphical objects, b_{y,x}^R denoting the UI control surface layer addressed by a rotation operation, and b_{y,x}^P and b_{y,x}^T denoting the GUI graphical objects addressed by a pressing operation and a touch operation respectively.
Further, the specific implementation process of the step S400 includes:
step S401: in the characterization object information sets B_y of the operation behaviors, each UI control surface layer is intercepted; whenever the intercepted single-step operation object information b_{y,x} is of the rotation form b_{y,x}^R, the UI control surface layer corresponding to b_{y,x}^R is automatically intercepted; when the UI control surface layer automatically intercepted for b_{y,x}^R is u_i, the UI control surface layers corresponding to the previous and to the next rotation-type single-step operation object information around u_i are extracted from the characterization object information set B_y; over the characterization object information sets of all operation behaviors in which the intercepted UI control surface layer is u_i, all extracted UI control surface layer results are collected, the two UI control surface layers ranked first and second by number of occurrences, from most to fewest, are selected from these results, and the first- and second-ranked UI control surface layers are taken as the first function display scheduling object when the UI control surface layer is u_i, recorded as D1(u_i) = {u_a, u_b}, where u_a and u_b respectively represent the a-th and the b-th UI control surface layer, with a ≠ b;
step S402: any one UI control surface layer in the first function display scheduling object D1(u_i) is taken and denoted SL, with SL ∈ D1(u_i); in the characterization object information sets B_y of the operation behaviors, whenever the intercepted single-step operation object information is the UI control surface layer SL, the GUI graphical objects corresponding to the pressing-type and touch-type single-step operations performed between reaching SL and the next rotation-type single-step operation object information are picked out to form one single-step operation fragment; the total number Q of single-step operation fragments is counted, and any single-step operation fragment is denoted F_q, where q = 1, 2, ..., Q is the index of the fragment; after the first function display scheduling object has been determined, the second function display scheduling object determination analysis is performed: any GUI graphical object in the UI control surface layer SL is selected and denoted g_r, where r is the serial number of the GUI graphical object, and the second scheduling index Z_r of the GUI graphical object g_r is calculated by the following formula:
Z_r = (1/Q) · Σ_{q=1}^{Q} δ_q
where Z_r represents the second scheduling index of the GUI graphical object g_r, and δ_q indicates whether the GUI graphical object g_r belongs to the single-step operation fragment F_q: if g_r ∈ F_q, then δ_q = 1; otherwise δ_q = 0.
The second scheduling index of every GUI graphical object in the UI control surface layer SL is calculated, and the GUI graphical objects are arranged in descending order of their second scheduling indexes to form the second function display scheduling object, recorded as D2(SL).
Compared with the prior art, the invention has the following beneficial effects: in the touch knob screen management and control system and method based on the Internet of Things, the hardware is designed around the knob piece, the touch screen piece and the wireless networking piece; the knob piece realizes rotation and pressing operations through a mechanical knob encoder, the touch screen piece supports touch operation and function display, and the wireless networking piece periodically switches the equipment-end function wake-up condition between an active state and a sleep state. In software, the function services are synchronously fed back through interface instructions, the calling interfaces of the different function services are registered as drivers, and the controlled objects are rendered on screen. Algorithmically, lightweight algorithm scheduling is adopted to perform the first and second display optimizations on the UI control surface layers and the GUI graphical objects respectively. By combining software and hardware, the user's operation behaviors are analyzed and spontaneous function-service optimization scheduling prediction is performed, which reduces the probability of user misoperation and simplifies the interaction experience.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a schematic structural diagram of a touch knob screen control system based on the Internet of things of the invention;
fig. 2 is a schematic diagram of steps of a touch knob screen control method based on the internet of things.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1-2, the present invention provides the following technical solutions:
Referring to fig. 1, in a first embodiment: a touch knob screen management and control system based on the Internet of Things is provided, the system comprising: a touch knob screen assembly module, a function layer library module, an instruction information module and a display optimization module;
The touch knob screen assembly module comprises a knob piece, a touch screen piece and a wireless networking piece, wherein the knob piece, the touch screen piece and the wireless networking piece are integrated by hardware of the touch knob screen assembly module, and rotation operation, pressing operation and touch operation are provided through the knob piece and the touch screen piece;
the touch knob screen assembly module further comprises a hardware integrated unit and an operation sensing unit;
the hardware integrated unit is used for integrating a knob piece, a touch screen piece and a wireless networking piece through hardware integration, wherein the knob piece supports rotation regulation and press regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
the operation sensing unit is used for sensing the user's operation behaviors through the background program, wherein an operation behavior comprises W operation steps arranged in order of operation from first to last to form one operation behavior, each operation step corresponds to one of rotation operation, pressing operation and touch operation, and the operation behavior formed by the W operation steps arises from the rotation, pressing and touch operations alternating back and forth with one another;
the function layer library module is used for designing a graphical user interface for describing functions based on functions realized by the equipment end, and a user locks different GUI graphical objects through rotation operation, pressing operation and touch operation so as to realize function requirements; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
The function layer library module further comprises a GUI graphic object unit and a UI control unit;
GUI graphical object unit: according to the functions realized by the equipment end, a graphical user interface is designed and a GUI graphical object library set is generated, recorded as G = {g_1, g_2, ..., g_n}, where g_1, g_2, ..., g_n respectively represent the 1st, 2nd, ..., nth GUI graphical objects, and one GUI graphical object corresponds to one function realized by the equipment end;
the UI control unit is used for rendering and producing a UI control interface according to the GUI graphical object library set, displaying in the UI control interface the GUI graphical objects corresponding to the different functions of the equipment end, and generating UI control surface layers, recorded as U = {u_1, u_2, ..., u_i, ...} with u_i = {g_{i,1}, g_{i,2}, ..., g_{i,j}, ..., g_{i,m}}, where u_i represents the i-th UI control surface layer, g_{i,j} represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects in the i-th UI control surface layer;
the instruction information module is used for monitoring the operation behavior perceived by the background program based on the instruction of the interface of the Internet of things, waking up each function service of the equipment end through the instruction and displaying the functions; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
the instruction information module further comprises an instruction awakening unit and an object information unit;
The instruction awakening unit is used for monitoring, through the wireless networking piece that supports Internet-of-Things connection, the operation behaviors perceived by the background program; when an operation behavior is monitored, the wireless networking piece sends the instruction of the operation behavior, wakes up each function service of the equipment end, and the function display of each function service is performed through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in the sleep state, so the function services of the equipment end are not woken up and no function display is performed;
the object information unit is used for establishing the characterization object information set of the operation behavior, the characterization object information set containing the single-step operation object information of the W operation steps executed in sequence; a single-step operation is one of the three operation modes of rotation operation, pressing operation and touch operation, where the rotation operation regulates the UI control surface layers while the pressing operation and the touch operation regulate the GUI graphical objects: the pressing operation confirms a GUI graphical object option and the touch operation switches between GUI graphical object options; each piece of single-step operation object information is cached, and the characterization object information set of the operation behavior is recorded as B_y = {b_{y,1}, b_{y,2}, ..., b_{y,W}}, where B_y represents the characterization object information set of the y-th operation behavior and b_{y,x} represents the x-th single-step operation object information under the y-th operation behavior; b_{y,x} takes one of the three forms b_{y,x}^R, b_{y,x}^P and b_{y,x}^T, the single-step operation object information corresponding to a rotation operation, a pressing operation and a touch operation respectively; the single-step operation object information covers UI control surface layers and GUI graphical objects, b_{y,x}^R denoting the UI control surface layer addressed by a rotation operation, and b_{y,x}^P and b_{y,x}^T denoting the GUI graphical objects addressed by a pressing operation and a touch operation respectively;
the display optimization module is used for carrying out functional display scheduling by analyzing the operation behaviors corresponding to the functional service when the operation behaviors are monitored, wherein the functional display scheduling comprises first functional display scheduling and second functional display scheduling, an object of the first functional display scheduling is a UI control surface layer, and an object of the second functional display scheduling is a GUI graphic object;
the display optimization module further comprises a first scheduling unit and a second scheduling unit;
a first scheduling unit, working in the characterization object information sets B_y of the operation behaviors: each UI control surface layer is intercepted; whenever the intercepted single-step operation object information b_{y,x} is of the rotation form b_{y,x}^R, the UI control surface layer corresponding to b_{y,x}^R is automatically intercepted; when the UI control surface layer automatically intercepted for b_{y,x}^R is u_i, the UI control surface layers corresponding to the previous and to the next rotation-type single-step operation object information around u_i are extracted from the characterization object information set B_y; over the characterization object information sets of all operation behaviors in which the intercepted UI control surface layer is u_i, all extracted UI control surface layer results are collected, the two UI control surface layers ranked first and second by number of occurrences, from most to fewest, are selected from these results, and the first- and second-ranked UI control surface layers are taken as the first function display scheduling object when the UI control surface layer is u_i, recorded as D1(u_i) = {u_a, u_b}, where u_a and u_b respectively represent the a-th and the b-th UI control surface layer, with a ≠ b;
a second scheduling unit, which takes any one UI control surface layer from the first function display scheduling object D1(u_i), denoted SL, with SL ∈ D1(u_i); in the characterization object information sets B_y of the operation behaviors, whenever the intercepted single-step operation object information is the UI control surface layer SL, the GUI graphical objects corresponding to the pressing-type and touch-type single-step operations performed between reaching SL and the next rotation-type single-step operation object information are picked out to form one single-step operation fragment; the total number Q of single-step operation fragments is counted, and any single-step operation fragment is denoted F_q, where q = 1, 2, ..., Q is the index of the fragment; after the first function display scheduling object has been determined, the second function display scheduling object determination analysis is performed: any GUI graphical object in the UI control surface layer SL is selected and denoted g_r, where r is the serial number of the GUI graphical object, and the second scheduling index Z_r of the GUI graphical object g_r is calculated by the following formula:
Z_r = (1/Q) · Σ_{q=1}^{Q} δ_q
where Z_r represents the second scheduling index of the GUI graphical object g_r, and δ_q indicates whether the GUI graphical object g_r belongs to the single-step operation fragment F_q: if g_r ∈ F_q, then δ_q = 1; otherwise δ_q = 0.
The second scheduling index of every GUI graphical object in the UI control surface layer SL is calculated, and the GUI graphical objects are arranged in descending order of their second scheduling indexes to form the second function display scheduling object, recorded as D2(SL).
Referring to fig. 2, in a second embodiment: a touch knob screen control method based on the Internet of Things is provided, comprising the following steps:
step S100: the user operates the touch knob screen and perceives the operation behavior of the user, and the touch knob screen supports rotation operation, pressing operation and touch operation;
Specifically, the knob piece, the touch screen piece and the wireless networking piece are integrated through hardware, the knob piece supports rotation regulation and press regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
the background program senses the user's operation behavior, wherein the operation behavior comprises W operation steps arranged in order of operation from first to last to form one operation behavior, each operation step corresponds to one of the rotation operation, the pressing operation and the touch operation, and the operation behavior formed by the W operation steps arises from the rotation, pressing and touch operations alternating back and forth with one another;
step S200: based on the functions realized by the equipment end, designing a graphical user interface for describing the functions, and locking different GUI graphical objects by a user through rotation operation, pressing operation and touch operation so as to realize function services; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
specifically, according to the function services realized by the equipment end, a graphical user interface is designed and a GUI graphical object library set is generated, recorded as G = {g_1, g_2, ..., g_n}, where g_1, g_2, ..., g_n respectively represent the 1st, 2nd, ..., nth GUI graphical objects, and one GUI graphical object corresponds to one function service of the equipment end;
a UI control interface is rendered and produced according to the GUI graphical object library set, the GUI graphical objects corresponding to the different function services of the equipment end are displayed in the UI control interface, and UI control surface layers are generated, recorded as U = {u_1, u_2, ..., u_i, ...} with u_i = {g_{i,1}, g_{i,2}, ..., g_{i,j}, ..., g_{i,m}}, where u_i represents the i-th UI control surface layer, g_{i,j} represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects in the i-th UI control surface layer;
step S300: based on an instruction of an interface of the Internet of things, monitoring operation behaviors perceived by a background program, waking up each function service of the equipment end through the instruction, and performing function display; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
specifically, the operation behaviors perceived by the background program are monitored through the wireless networking piece supporting Internet-of-Things connection; when an operation behavior is monitored, the wireless networking piece sends the operation behavior instruction, wakes up each function service of the equipment end, and the function display of each function service is performed through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in the sleep state, so the function services of the equipment end are not woken up and no function display is performed;
the characterization object information set of the operation behavior is established, the characterization object information set containing the single-step operation object information of the W operation steps executed in sequence; a single-step operation is one of the three operation modes of rotation operation, pressing operation and touch operation, where the rotation operation regulates the UI control surface layers while the pressing operation and the touch operation regulate the GUI graphical objects: the pressing operation confirms a GUI graphical object option and the touch operation switches between GUI graphical object options; each piece of single-step operation object information is cached, and the characterization object information set of the operation behavior is recorded as B_y = {b_{y,1}, b_{y,2}, ..., b_{y,W}}, where B_y represents the characterization object information set of the y-th operation behavior and b_{y,x} represents the x-th single-step operation object information under the y-th operation behavior; b_{y,x} takes one of the three forms b_{y,x}^R, b_{y,x}^P and b_{y,x}^T, the single-step operation object information corresponding to a rotation operation, a pressing operation and a touch operation respectively; the single-step operation object information covers UI control surface layers and GUI graphical objects, b_{y,x}^R denoting the UI control surface layer addressed by a rotation operation, and b_{y,x}^P and b_{y,x}^T denoting the GUI graphical objects addressed by a pressing operation and a touch operation respectively;
step S400: based on the monitoring of the operation behaviors, performing function display scheduling by analyzing the operation behaviors corresponding to the function service, wherein the function display scheduling comprises first function display scheduling and second function display scheduling, an object of the first function display scheduling is a UI control surface layer, and an object of the second function display scheduling is a GUI graphic object;
Specifically, in the characterization object information sets B_y of the operation behaviors, each UI control surface layer is intercepted; whenever the intercepted single-step operation object information b_{y,x} is of the rotation form b_{y,x}^R, the UI control surface layer corresponding to b_{y,x}^R is automatically intercepted; when the UI control surface layer automatically intercepted for b_{y,x}^R is u_i, the UI control surface layers corresponding to the previous and to the next rotation-type single-step operation object information around u_i are extracted from the characterization object information set B_y; over the characterization object information sets of all operation behaviors in which the intercepted UI control surface layer is u_i, all extracted UI control surface layer results are collected, the two UI control surface layers ranked first and second by number of occurrences, from most to fewest, are selected from these results, and the first- and second-ranked UI control surface layers are taken as the first function display scheduling object when the UI control surface layer is u_i, recorded as D1(u_i) = {u_a, u_b}, where u_a and u_b respectively represent the a-th and the b-th UI control surface layer, with a ≠ b;
any one UI control surface layer in the first function display scheduling object D1(u_i) is then taken and denoted SL, with SL ∈ D1(u_i); in the characterization object information sets B_y of the operation behaviors, whenever the intercepted single-step operation object information is the UI control surface layer SL, the GUI graphical objects corresponding to the pressing-type and touch-type single-step operations performed between reaching SL and the next rotation-type single-step operation object information are picked out to form one single-step operation fragment; the total number Q of single-step operation fragments is counted, and any single-step operation fragment is denoted F_q, where q = 1, 2, ..., Q is the index of the fragment; for example, if the 5th operation behavior contains 4 single-step operations, namely b_{5,1}^R, b_{5,2}^P, b_{5,3}^T and b_{5,4}^R, then between b_{5,1}^R and b_{5,4}^R one single-step operation fragment is formed from the GUI graphical objects of b_{5,2}^P and b_{5,3}^T; after the first function display scheduling object has been determined, the second function display scheduling object determination analysis is performed: any GUI graphical object in the UI control surface layer SL is selected and denoted g_r, where r is the serial number of the GUI graphical object, and the second scheduling index Z_r of the GUI graphical object g_r is calculated by the following formula:
Z_r = (1/Q) · Σ_{q=1}^{Q} δ_q
where Z_r represents the second scheduling index of the GUI graphical object g_r, and δ_q indicates whether the GUI graphical object g_r belongs to the single-step operation fragment F_q: if g_r ∈ F_q, then δ_q = 1; otherwise δ_q = 0.
The second scheduling index of every GUI graphical object in the UI control surface layer SL is calculated, and the GUI graphical objects are arranged in descending order of their second scheduling indexes to form the second function display scheduling object, recorded as D2(SL).
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that: the foregoing description is only a preferred embodiment of the present invention and is not intended to limit the present invention, but although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the technical solutions described in the foregoing embodiments, or equivalents may be substituted for some of the technical features thereof. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (2)

1. The touch knob screen control method based on the Internet of things is characterized by comprising the following steps of:
step S100: the method comprises the steps that a user operates a touch knob screen and perceives operation behaviors of the user, wherein the touch knob screen supports rotation operation, pressing operation and touch operation;
step S200: based on the functions realized by the equipment end, designing a graphical user interface for describing the functions, and locking different GUI graphical objects by a user through rotation operation, pressing operation and touch operation so as to realize function services; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
step S300: based on an instruction of an interface of the Internet of things, monitoring operation behaviors perceived by a background program, waking up each function service of the equipment end through the instruction, and performing function display; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
step S400: performing functional display scheduling by analyzing the operation behaviors corresponding to the functional service based on the operation behaviors, wherein the functional display scheduling comprises a first functional display scheduling and a second functional display scheduling, the object of the first functional display scheduling is a UI control surface layer, and the object of the second functional display scheduling is a GUI graphic object;
The specific implementation process of the step S100 includes:
step S101: the rotary knob piece, the touch screen piece and the wireless networking piece are integrated through hardware, the rotary knob piece supports rotary regulation and pressing regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
step S102: a background program senses the operation behaviors of the user, wherein an operation behavior comprises W operation steps, the W operation steps are arranged in order from first to last to form one operation behavior, each operation step corresponds to one operation mode among the rotation operation, the pressing operation and the touch operation, and the operation behavior corresponding to the W operation steps is formed by alternately combining the rotation operation, the pressing operation and the touch operation;
the specific implementation process of the step S200 includes:
step S201: according to the function services realized by the equipment end, designing a graphical user interface and generating a GUI graphical object library set, recorded as G = {g_1, g_2, …, g_n}, wherein g_1, g_2, …, g_n respectively represent the 1st, 2nd, …, n-th GUI graphical objects, and one GUI graphical object corresponds to one function service realized by the equipment end;
step S202: rendering and making a UI control interface according to the GUI graphical object library set, displaying in the UI control interface the GUI graphical objects corresponding to the different function services of the equipment end, and generating UI control surface layers, recorded as U = {U_1, U_2, …} with U_i = {g_(i,1), g_(i,2), …, g_(i,m)}, wherein U_i represents the i-th UI control surface layer, g_(i,j) represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects contained in the i-th UI control surface layer;
the specific implementation process of the step S300 includes:
step S301: the operation behaviors perceived by the background program are monitored through the wireless networking piece supporting Internet of Things connection; when an operation behavior is monitored, the wireless networking piece sends an operation behavior instruction, each function service of the equipment end is awakened, and the function display of each function service is carried out through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in a sleep state, so that the function services of the equipment end are not awakened and no function display of the function services is performed;
step S302: establishing a characterization object information set of the operation behaviors, wherein the characterization object information set comprises the single-step operation object information of the W operation steps executed in sequence, a single-step operation refers to one operation mode among the rotation operation, the pressing operation and the touch operation, the rotation operation supports regulation of a UI control surface layer, the pressing operation and the touch operation support regulation of a GUI graphical object, the pressing operation supports option determination on a GUI graphical object, and the touch operation supports option switching on a GUI graphical object; caching each piece of single-step operation object information, and recording the characterization object information set of the operation behaviors as B = {B_1, B_2, …} with B_y = {b_(y,1), b_(y,2), …, b_(y,W)}, wherein B_y represents the characterization object information set of the y-th operation behavior, b_(y,x) represents the x-th single-step operation object information under the y-th operation behavior, and b_(y,x) takes one of the forms b^rot_(y,x), b^press_(y,x) and b^touch_(y,x), which represent single-step operation object information corresponding to the rotation operation, the pressing operation and the touch operation respectively; the single-step operation object information comprises a UI control surface layer or a GUI graphical object, b^rot_(y,x) indicates the UI control surface layer corresponding to the rotation operation, and b^press_(y,x) and b^touch_(y,x) respectively indicate the GUI graphical objects corresponding to the pressing operation and the touch operation;
the specific implementation process of the step S400 includes:
step S401: in the characterization object information set B_y of an operation behavior, intercepting each UI control surface layer: when an intercepted single-step operation object information b_(y,x) is of the rotation type b^rot_(y,x), automatically intercepting the UI control surface layer corresponding to b^rot_(y,x); when the automatically intercepted UI control surface layer corresponding to b^rot_(y,x) is U_i, extracting the UI control surface layer U_i from the characterization object information set B_y; after the UI control surface layers in the characterization object information set of every operation behavior have been intercepted in this way, collecting all extracted UI control surface layer results, ranking them by their number of occurrences from large to small, and selecting the two UI control surface layers ranked first and second as the first function display scheduling object, recorded as {U_a, U_b}, wherein U_a and U_b respectively represent the a-th UI control surface layer and the b-th UI control surface layer, and a ≠ b;
step S402: marking any UI control surface layer in the first function display scheduling object {U_a, U_b} as SL, SL ∈ {U_a, U_b}; in the characterization object information set B_y of an operation behavior, when an intercepted single-step operation object information corresponds to the UI control surface layer SL, taking the span from that occurrence of SL to the next intercepted UI control surface layer, picking out the GUI graphical objects corresponding to the pressing-type and touch-type single-step operation object information b^press_(y,x) and b^touch_(y,x) within that span to form a single-step operation fragment, counting the total number Q of single-step operation fragments, and marking any single-step operation fragment as F_q, 1 ≤ q ≤ Q, wherein q represents the number of the single-step operation fragment; after the first function display scheduling object is determined, performing a rescheduling (second) function display scheduling object determination analysis: arbitrarily selecting a GUI graphical object in the UI control surface layer SL, denoted g_r, wherein r represents the number of the GUI graphical object, and calculating the rescheduling index Z_r of the GUI graphical object g_r; the specific calculation formula is as follows:
Z_r = Σ_{q=1…Q} δ(g_r, F_q), wherein Z_r represents the rescheduling index of the GUI graphical object g_r, and δ(g_r, F_q) represents an indicator: if the GUI graphical object g_r belongs to the single-step operation fragment F_q, let δ(g_r, F_q) = 1; otherwise, let δ(g_r, F_q) = 0;
calculating the rescheduling index of each GUI graphical object in the UI control surface layer SL, and arranging the GUI graphical objects in descending order of their rescheduling indexes to form the rescheduling objects for the second round of function display scheduling.
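
For readability only, the following sketch illustrates one possible in-memory representation of the structures recited in steps S102, S201-S202 and S302 above (operation steps, GUI graphical objects, UI control surface layers and operation behaviors). It is an illustrative assumption, not part of the claimed subject matter; all identifiers (OpMode, GUIObject, UILayer, OpStep, OperationBehavior) are hypothetical names.

    # Illustrative sketch only (hypothetical names): one plausible in-memory model of the
    # structures described in steps S102, S201-S202 and S302.
    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List, Union

    class OpMode(Enum):
        ROTATE = "rotate"    # rotation operation: regulates a UI control surface layer
        PRESS = "press"      # pressing operation: option determination on a GUI graphical object
        TOUCH = "touch"      # touch operation: option switching on a GUI graphical object

    @dataclass
    class GUIObject:
        obj_id: int              # number of the GUI graphical object (g_r)
        function_service: str    # the one function service it realizes

    @dataclass
    class UILayer:
        layer_id: int                                                 # number of the UI control surface layer (U_i)
        gui_objects: List[GUIObject] = field(default_factory=list)    # g_(i,1) ... g_(i,m)

    @dataclass
    class OpStep:
        mode: OpMode                          # one of the three operation modes (b^rot / b^press / b^touch)
        target: Union[UILayer, GUIObject]     # rotation targets a layer; press/touch target a GUI object

    @dataclass
    class OperationBehavior:
        steps: List[OpStep] = field(default_factory=list)    # the W ordered single-step operations of B_y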
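
Under the same assumed data model, the next sketch shows one plausible reading of the first function display scheduling of step S401: count how often each UI control surface layer is reached by rotation-type single-step operations across the cached characterization sets and keep the two most frequent layers. The function name first_schedule is an assumption.

    # Illustrative sketch (assumed data model above): first function display scheduling of
    # step S401 - rank UI control surface layers by occurrence count of rotation-type steps.
    from collections import Counter
    from typing import List, Tuple

    def first_schedule(behaviors: List[OperationBehavior]) -> Tuple[UILayer, UILayer]:
        counts: Counter = Counter()
        layers = {}                                    # layer_id -> UILayer, for lookup
        for behavior in behaviors:                     # every cached characterization set B_y
            for step in behavior.steps:                # every single-step operation b_(y,x)
                if step.mode is OpMode.ROTATE:         # only rotation regulates a UI control surface layer
                    counts[step.target.layer_id] += 1
                    layers[step.target.layer_id] = step.target
        (a_id, _), (b_id, _) = counts.most_common(2)   # occurrence ranks 1 and 2 (assumes >= 2 distinct layers)
        return layers[a_id], layers[b_id]              # the first function display scheduling object {U_a, U_b}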
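
Likewise, the following sketch gives one plausible reading of step S402: single-step operation fragments are collected after each occurrence of the selected layer SL, and the rescheduling index of a GUI graphical object is the number of fragments containing it (the indicator sum above). Treating the next rotation step as the fragment boundary is an assumption, as are the function names.

    # Illustrative sketch (assumed data model above): rescheduling index of step S402 for one
    # selected layer SL. A fragment F_q is assumed to collect the GUI graphical objects of the
    # press/touch steps following an occurrence of SL until the next rotation step.
    from typing import List, Optional, Set

    def fragments_after(sl: UILayer, behaviors: List[OperationBehavior]) -> List[Set[int]]:
        frags: List[Set[int]] = []
        for behavior in behaviors:
            current: Optional[Set[int]] = None
            for step in behavior.steps:
                if step.mode is OpMode.ROTATE:
                    if current:                                   # close a non-empty open fragment
                        frags.append(current)
                    current = set() if step.target.layer_id == sl.layer_id else None
                elif current is not None:                         # press/touch step inside an SL fragment
                    current.add(step.target.obj_id)
            if current:
                frags.append(current)
        return frags                                              # F_1 ... F_Q

    def rescheduling_index(gui_obj: GUIObject, frags: List[Set[int]]) -> int:
        # Z_r = sum over q of the indicator [g_r belongs to F_q]
        return sum(1 for frag in frags if gui_obj.obj_id in frag)

    def second_schedule(sl: UILayer, behaviors: List[OperationBehavior]) -> List[GUIObject]:
        frags = fragments_after(sl, behaviors)
        # GUI graphical objects of SL in descending order of rescheduling index
        return sorted(sl.gui_objects, key=lambda g: rescheduling_index(g, frags), reverse=True)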
2. A touch knob screen management and control system based on the Internet of Things, characterized in that the system comprises: a touch knob screen assembly module, a function layer library module, an instruction information module and a display optimization module;
the touch knob screen assembly module comprises a knob piece, a touch screen piece and a wireless networking piece, wherein the knob piece, the touch screen piece and the wireless networking piece are integrated by hardware of the touch knob screen assembly module, and rotation operation, pressing operation and touch operation are provided through the knob piece and the touch screen piece;
the function layer library module designs a graphical user interface describing functions based on functions realized by the equipment end, and a user locks different GUI graphical objects through rotation operation, pressing operation and touch operation to realize function requirements; rendering and manufacturing a UI control interface based on GUI graphic objects in different functional scenes to generate a UI control surface layer;
The instruction information module monitors operation behaviors perceived by a background program based on instructions of an interface of the Internet of things, wakes up each function service of the equipment end through the instructions, and displays functions; establishing a characterization object information set of the operation behaviors, and determining each single-step operation object information by identifying an execution object of the operation behaviors;
the display optimization module is used for carrying out functional display scheduling through analyzing the operation behaviors corresponding to the functional service based on the operation behaviors, wherein the functional display scheduling comprises a first functional display scheduling and a second functional display scheduling, the object of the first functional display scheduling is a UI control surface layer, and the object of the second functional display scheduling is a GUI graphic object;
the touch knob screen assembly module further comprises a hardware integrated unit and an operation sensing unit;
the hardware integrated unit is used for integrating a knob piece, a touch screen piece and a wireless networking piece through hardware integration, wherein the knob piece supports rotation regulation and press regulation, the touch screen piece supports touch regulation and function display, and the wireless networking piece supports Internet of things connection;
the operation sensing unit is used for sensing, by a background program, the operation behaviors of a user, wherein an operation behavior comprises W operation steps, the W operation steps are arranged in order of operation from front to back to form one operation behavior, each operation step corresponds to one of the rotation operation, the pressing operation and the touch operation, and the operation behavior corresponding to the W operation steps is formed by alternately combining the rotation operation, the pressing operation and the touch operation;
The function layer library module further comprises a GUI graphic object unit and a UI control unit;
the GUI graphical object unit designs a graphical user interface according to the functions realized by the equipment end and generates a GUI graphical object library set, recorded as G = {g_1, g_2, …, g_n}, wherein g_1, g_2, …, g_n respectively represent the 1st, 2nd, …, n-th GUI graphical objects, and one GUI graphical object corresponds to one function realized by the equipment end;
the UI control unit renders and makes a UI control interface according to the GUI graphical object library set, displays in the UI control interface the GUI graphical objects corresponding to the different functions of the equipment end, and generates UI control surface layers, recorded as U = {U_1, U_2, …} with U_i = {g_(i,1), g_(i,2), …, g_(i,m)}, wherein U_i represents the i-th UI control surface layer, g_(i,j) represents the j-th GUI graphical object contained in the i-th UI control surface layer, and m represents the number of GUI graphical objects contained in the i-th UI control surface layer;
the instruction information module further comprises an instruction awakening unit and an object information unit;
the instruction awakening unit is used for monitoring the operation behaviors perceived by the background program through the wireless networking piece supporting Internet of Things connection; when an operation behavior is monitored, the operation behavior instruction is sent through the wireless networking piece, each function service of the equipment end is awakened, and the function display of each function service is carried out through the touch screen piece; when no operation behavior is monitored, the wireless networking piece is in a sleep state, so that the function services of the equipment end are not awakened and no function display of the function services is performed;
the object information unit is used for establishing a characterization object information set of the operation behaviors, wherein the characterization object information set comprises the single-step operation object information of the W operation steps executed in sequence, a single-step operation refers to one operation mode among the rotation operation, the pressing operation and the touch operation, the rotation operation supports regulation of a UI control surface layer, the pressing operation and the touch operation support regulation of a GUI graphical object, the pressing operation supports option determination on a GUI graphical object, and the touch operation supports option switching on a GUI graphical object; each piece of single-step operation object information is cached, and the characterization object information set of the operation behaviors is recorded as B = {B_1, B_2, …} with B_y = {b_(y,1), b_(y,2), …, b_(y,W)}, wherein B_y represents the characterization object information set of the y-th operation behavior, b_(y,x) represents the x-th single-step operation object information under the y-th operation behavior, and b_(y,x) takes one of the forms b^rot_(y,x), b^press_(y,x) and b^touch_(y,x), which represent single-step operation object information corresponding to the rotation operation, the pressing operation and the touch operation respectively; the single-step operation object information comprises a UI control surface layer or a GUI graphical object, b^rot_(y,x) indicates the UI control surface layer corresponding to the rotation operation, and b^press_(y,x) and b^touch_(y,x) respectively indicate the GUI graphical objects corresponding to the pressing operation and the touch operation;
the display optimization module further comprises a first scheduling unit and a second scheduling unit;
the first scheduling unit is used for intercepting, in the characterization object information set B_y of an operation behavior, each UI control surface layer: when an intercepted single-step operation object information b_(y,x) is of the rotation type b^rot_(y,x), the UI control surface layer corresponding to b^rot_(y,x) is automatically intercepted; when the automatically intercepted UI control surface layer corresponding to b^rot_(y,x) is U_i, the UI control surface layer U_i is extracted from the characterization object information set B_y; after the UI control surface layers in the characterization object information set of every operation behavior have been intercepted in this way, all extracted UI control surface layer results are collected, ranked by their number of occurrences from large to small, and the two UI control surface layers ranked first and second are selected as the first function display scheduling object, recorded as {U_a, U_b}, wherein U_a and U_b respectively represent the a-th UI control surface layer and the b-th UI control surface layer, and a ≠ b;
the second scheduling unit is used for marking any UI control surface layer in the first function display scheduling object {U_a, U_b} as SL, SL ∈ {U_a, U_b}; in the characterization object information set B_y of an operation behavior, when an intercepted single-step operation object information corresponds to the UI control surface layer SL, the span from that occurrence of SL to the next intercepted UI control surface layer is taken, the GUI graphical objects corresponding to the pressing-type and touch-type single-step operation object information b^press_(y,x) and b^touch_(y,x) within that span are picked out to form a single-step operation fragment, the total number Q of single-step operation fragments is counted, and any single-step operation fragment is marked as F_q, 1 ≤ q ≤ Q, wherein q represents the number of the single-step operation fragment; after the first function display scheduling object is determined, a rescheduling (second) function display scheduling object determination analysis is performed: a GUI graphical object in the UI control surface layer SL is arbitrarily selected, denoted g_r, wherein r represents the number of the GUI graphical object, and the rescheduling index Z_r of the GUI graphical object g_r is calculated; the specific calculation formula is as follows:
Z_r = Σ_{q=1…Q} δ(g_r, F_q), wherein Z_r represents the rescheduling index of the GUI graphical object g_r, and δ(g_r, F_q) represents an indicator: if the GUI graphical object g_r belongs to the single-step operation fragment F_q, let δ(g_r, F_q) = 1; otherwise, let δ(g_r, F_q) = 0;
the rescheduling index of each GUI graphical object in the UI control surface layer SL is calculated, and the GUI graphical objects are arranged in descending order of their rescheduling indexes to form the rescheduling objects for the second round of function display scheduling.
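
For orientation only, the following sketch maps the four modules and their sub-units recited in claim 2 onto a hypothetical composition that reuses the data model and scheduling functions sketched after claim 1; all class and method names are assumptions, not terms of the claims.

    # Illustrative sketch only: hypothetical composition of the four modules of claim 2,
    # reusing OperationBehavior, UILayer, first_schedule and second_schedule from the
    # sketches after claim 1.
    from typing import List

    class TouchKnobScreenAssemblyModule:
        """Hardware integration unit + operation sensing unit (knob, touch screen, wireless networking)."""
        def sense(self) -> OperationBehavior: ...

    class FunctionLayerLibraryModule:
        """GUI graphical object unit + UI control unit (builds and renders UI control surface layers)."""
        def layers(self) -> List[UILayer]: ...

    class InstructionInformationModule:
        """Instruction wake-up unit + object information unit (wakes services, caches characterization sets)."""
        def characterize(self, behaviors: List[OperationBehavior]) -> List[OperationBehavior]: ...

    class DisplayOptimizationModule:
        """First scheduling unit + second scheduling unit."""
        def schedule(self, behaviors: List[OperationBehavior]) -> dict:
            u_a, u_b = first_schedule(behaviors)                  # first function display scheduling object
            return {sl.layer_id: second_schedule(sl, behaviors)   # rescheduling per selected layer
                    for sl in (u_a, u_b)}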
CN202410096122.9A 2024-01-24 2024-01-24 Touch knob screen management and control system and method based on Internet of things Active CN117608424B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410096122.9A CN117608424B (en) 2024-01-24 2024-01-24 Touch knob screen management and control system and method based on Internet of things

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410096122.9A CN117608424B (en) 2024-01-24 2024-01-24 Touch knob screen management and control system and method based on Internet of things

Publications (2)

Publication Number Publication Date
CN117608424A CN117608424A (en) 2024-02-27
CN117608424B true CN117608424B (en) 2024-04-12

Family

ID=89953883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410096122.9A Active CN117608424B (en) 2024-01-24 2024-01-24 Touch knob screen management and control system and method based on Internet of things

Country Status (1)

Country Link
CN (1) CN117608424B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101118469A (en) * 2006-07-31 2008-02-06 索尼株式会社 Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement
CN102449593A (en) * 2010-01-22 2012-05-09 电子部品研究院 Method for providing a user interface based on touch pressure, and electronic device using same
CN104866005A (en) * 2015-05-07 2015-08-26 上海优悦广告有限公司 Knob mechanism and household electric appliance with same
KR101773032B1 (en) * 2016-09-01 2017-08-30 한국알프스 주식회사 Multifunctional composite input device
CN108055405A (en) * 2017-12-26 2018-05-18 北京传嘉科技有限公司 Wake up the method and terminal of terminal
CN110651242A (en) * 2017-05-16 2020-01-03 苹果公司 Apparatus, method and graphical user interface for touch input processing
CN111052060A (en) * 2017-10-24 2020-04-21 微芯片技术股份有限公司 Touch sensitive user interface including configurable virtual widgets
CN210897102U (en) * 2019-09-27 2020-06-30 华帝股份有限公司 Control panel assembly and household electrical appliance
JP2020160856A (en) * 2019-03-27 2020-10-01 日本精機株式会社 Display controller, gui device, method, and gui program
EP3944069A2 (en) * 2021-07-09 2022-01-26 Guangzhou Xiaopeng Motors Technology Co., Ltd. Method, apparatus and system of voice interaction, vehicles and storage media
CN114095766A (en) * 2020-07-31 2022-02-25 海信视像科技股份有限公司 Display device and rotation control method
CN114488824A (en) * 2021-12-17 2022-05-13 珠海格力电器股份有限公司 Intelligent household appliance touch panel control method and device, storage medium and touch panel
CN115718913A (en) * 2023-01-09 2023-02-28 荣耀终端有限公司 User identity identification method and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111869186A (en) * 2018-05-07 2020-10-30 康维达无线有限责任公司 Mechanism for intelligent service layer to request abstract service
KR102524294B1 (en) * 2018-11-20 2023-04-21 현대자동차주식회사 Method and apparatus for controlling vehicle using dial knob
US10817063B2 (en) * 2019-02-11 2020-10-27 Volvo Car Corporation Facilitating interaction with a vehicle touchscreen using haptic feedback

Also Published As

Publication number Publication date
CN117608424A (en) 2024-02-27

Similar Documents

Publication Publication Date Title
JP5803249B2 (en) Information processing apparatus, information processing method, and program
JP5803248B2 (en) Information processing apparatus, information processing method, and program
CN106993143A (en) A kind of method that multi-functional control is realized by television set single-button
CN103927247B (en) Figure calibration method is shown according to service use state and supports this method mobile terminal
US20140359518A1 (en) Method of Promptly Starting Windowed Applications Installed on a Mobile Operating System and Device Using the Same
CN102929425B (en) A kind of touch key control method and device
CN103067784A (en) Virtual key pressing method based on touch screen television and television
CN105843499A (en) Display state switching method and terminal
CN103604197A (en) Air conditioner touch control interface display method and system
KR101916741B1 (en) Operating Method for three-dimensional Handler And Portable Device supporting the same
JP5472118B2 (en) Operation support method, operation support system, operation support apparatus, and operation support program
CN106681503A (en) Display control method, terminal and display device
CN105824502A (en) Information processing method and electronic equipment
CN106161804A (en) A kind of audio play control method and mobile terminal
CN109243158A (en) A kind of household intelligent control terminal
CN106126090B (en) The control method and electronic equipment of a kind of electronic equipment
CN117608424B (en) Touch knob screen management and control system and method based on Internet of things
CN108334743A (en) Study and optimizing care agreement
CN203212836U (en) Operation panel of industrial sewing machine
CN114114951A (en) Intelligent kitchen electrical equipment information interaction method and device, storage medium and intelligent terminal
JP2004326498A (en) Information terminal equipment and program
CN109298907A (en) Application program display methods, application program display device and terminal
CN112015101A (en) Control method and device of intelligent equipment, intelligent control switch and storage medium
Zhao et al. HUMAN-COMPUTER INTERACTION AND USER EXPERIENCE IN SMART HOME RESEARCH: A CRITICAL ANALYSIS.
Wu Research on the micro-interactive interface design of intelligent washing machines in IOT environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant