CN116361375A - Object management method and system based on user visual interaction - Google Patents

Object management method and system based on user visual interaction

Info

Publication number
CN116361375A
CN116361375A
Authority
CN
China
Prior art keywords
user
terminal
interaction
request
service module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310331985.5A
Other languages
Chinese (zh)
Inventor
许鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Lingsi Yuedong Information Technology Co ltd
Original Assignee
Changzhou Lingsi Yuedong Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Lingsi Yuedong Information Technology Co ltd filed Critical Changzhou Lingsi Yuedong Information Technology Co ltd
Priority to CN202310331985.5A priority Critical patent/CN116361375A/en
Publication of CN116361375A publication Critical patent/CN116361375A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval of structured data, e.g. relational data
    • G06F 16/25 Integrating or interfacing systems involving database management systems
    • G06F 16/252 Integrating or interfacing systems between a database management system and a front-end application
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/904 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses an object management system based on user visual interaction. The system realizes intelligent, visualized object management, offers universality and the capacity to grow with use, and represents a clear technical advance in the field.

Description

Object management method and system based on user visual interaction
Technical Field
The invention relates to the field of network data processing, in particular to an object management method and system based on user visual interaction.
Background
Current discussion of big-data visualization applications focuses mostly on business and industry: business fields such as business intelligence, government decision-making, public services and marketing, and industry fields such as finance, electric power, communications, industrial manufacturing and medical care. Data visualization makes data more meaningful and easier to understand. Data visualization software is helping more and more enterprises extract value from vast volumes of complex data, reduce complexity and achieve more effective decision-making.
Data visualization tools present complex data in a simple, easy-to-use manner, and a visualization system that supports human-computer interaction during display can meet the visualization needs of different users.
However, current data visualization systems have shortcomings. For example, application 201310325767.7, "Descriptive framework for data visualization", discloses a system framework for facilitating data visualization that provides a plurality of module manifests, each representing a particular component of the visualization; a chart manifest may be used to coordinate one or more module manifests, and the visualization is presented based on the chart manifest. Its drawback is that it can neither handle the visualization demands of large-scale data nor realize visual human-computer interaction.
Existing visualization systems therefore suffer from several defects. They cannot, through human-computer interaction, match a suitable visualization algorithm to local or imported data according to its characteristics and display requirements, so visualization performance is low. They have no learning function. Their software design must exhaustively account for every constituent element, which is difficult to achieve in a closed architecture; and a closed architecture lacks universality, so one system cannot serve multiple fields.
Disclosure of Invention
To address the problems of existing object management systems and methods, the invention provides an object management method based on user visual interaction. The method comprises the following steps:
s1, constructing an object management system comprising a center end and a user end;
S2, the user end sends a request to the central end to acquire objects; the central end recommends a corresponding service module according to the registration information entered at the user end; a plurality of objects are packaged in the service module, and each object is displayed as one or a combination of text, graphics, 3D models, outlines and identifiers;
S3, the user end selects one of the service modules recommended by the central end and adds, deletes and modifies its objects according to the actual situation, forming a dedicated service module that is stored in the user's space at the central end. If none of the recommended service modules meets the user's needs, the central end guides the user end to build a new service module, which is stored both in the central end's database and in the user space. The recommended service modules can thus be managed, extended and refined intelligently for each individual user end;
S4, the user end sends an object management request to the central end through an operation instruction. By analyzing the user's historical operation data in the user space, or big data in its database, the central end predicts the next operation the user is likely to need on the object and recommends several operation items to the user end. The user end selects one; if none is suitable, the user end enters the required operation item through its device, and the central end stores the user-added item in both the user space and the central database so that this operation category is displayed preferentially at the user end thereafter.
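To make step S2 concrete, here is a minimal Python sketch of how the central end might match the registration field against preset service modules. The module names, object lists and substring-matching rule are illustrative assumptions, not specified by the patent.

```python
# Hypothetical sketch of step S2: the central end matches the field named in
# the user's registration info against preset service modules and recommends
# the closest one. Module names and contents are illustrative only.

PRESET_MODULES = {
    "hotel": ["assets", "staff", "rooms"],
    "retail": ["inventory", "staff", "suppliers"],
    "logistics": ["vehicles", "drivers", "warehouses"],
}

def recommend_module(registration_field: str) -> tuple[str, list[str]]:
    """Return the service module whose name best matches the registered field."""
    field = registration_field.lower()
    for name, objects in PRESET_MODULES.items():
        if name in field or field in name:
            return name, objects
    # No match: per step S3, the central end guides the user to build a new module.
    return "custom", []

print(recommend_module("Hotel management"))  # → ('hotel', ['assets', 'staff', 'rooms'])
```

A real system would replace the substring match with the database- and big-data-driven recommendation the patent describes; the sketch only shows the request-then-recommend shape of the exchange.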
Preferably, in step S1, a plurality of service modules are preset at the central end; each service module stores a set of objects related to its service, and the central end also stores service modules entered by user ends, which helps extend the application range of the object management system.
Preferably, in step S2, the objects in the service module recommended by the central end are hierarchical, and the user end adds, deletes and modifies them level by level through operation instructions.
Preferably, in step S4, the historical operation data comprises operation instructions, behavior chains and operation selections. An operation instruction, or a behavior chain combining several operation instructions or interactions between operations and objects, serves as input to an object; the object outputs different operation items to the user end depending on the input; and when the object detects that a given input has corresponded to the same operation item many times, the selection step is skipped and that operation item is executed directly.
Preferably, in step S3, when the user end needs to add or modify an object, the added or modified object is one or a combination of text information, graphics, 3D models, outlines and identifiers of the object retrieved by the central end from the network or the central database; alternatively, the central end performs image processing on text and graphic information about the object uploaded by the user end, thereby realizing visualization.
Preferably, the operation instruction comprises: text input, graphic input, image acquisition, or any other action that enables interaction with an object.
Preferably, when the central end initially sets up the objects of a service module in step S1, and when it stores objects added or modified by the user in step S3, each object is classified as passive or active: a passive object cannot interact with or be controlled by the central end or the user end, while an active object can. When a passive or active object needs to be operated on, the central end recommends executors to the user end based on its database or network big data; an executor is an active object within the user end's management service module. The user end selects one, or enters a new active object if no recommended executor is satisfactory; once an executor is chosen, the central end connects it with the user end to complete the operation on the object. When an active object's operation needs to be controlled, an object end is installed on the active object; the object end then interacts with the user end, receiving the user end's control instructions and executing the corresponding functions. The visual images of active and passive objects are distinguished by different visual effects assigned by the central end.
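A minimal sketch of the active/passive classification and executor recommendation, with all class and attribute names assumed for illustration:

```python
class ManagedObject:
    """Passive object: has a name and attributes but cannot be interacted
    with or controlled by the central end or the user end."""
    def __init__(self, name: str, attributes: dict):
        self.name = name
        self.attributes = attributes
        self.active = False

class ActiveObject(ManagedObject):
    """Active object: can receive control instructions through an installed
    object end, and can serve as an 'executor' for operations on objects."""
    def __init__(self, name: str, attributes: dict):
        super().__init__(name, attributes)
        self.active = True
        self.inbox: list[str] = []

    def receive_instruction(self, instruction: str) -> str:
        self.inbox.append(instruction)
        return f"{self.name} executing: {instruction}"

def recommend_executors(module: list[ManagedObject]) -> list[ActiveObject]:
    """The central end recommends executors: the active objects found in the
    user end's management service module."""
    return [o for o in module if isinstance(o, ActiveObject)]

room = ManagedObject("room_101", {"floor": 1})
cleaner = ActiveObject("staff_zhang", {"post": "housekeeping"})
print([e.name for e in recommend_executors([room, cleaner])])  # → ['staff_zhang']
```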
Preferably, in step S2, the service module further includes a visualization space image for carrying its objects, such as a customizable map, sand table, layered plan view, 3D building model, table or analytical chart.
Preferably, in step S2, each object stored in the central database and the user space includes object attributes and object functions. The attributes include several labels that highlight or summarize the object's properties; the user end uses these labels to split and combine objects, and the labels can be modified by the user end or set by the central end. The object functions include: receiving and identifying input information; displaying different operation items at the user end as outputs according to that input; and, when an operation item has been invoked many times, simplifying the user's selection steps and increasing the number of output operation items.
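The label-based splitting and combining of objects can be sketched with plain set operations; representing labels as Python sets is an assumption for illustration.

```python
# Hypothetical sketch: each object carries a set of labels (tags), and the
# user end splits object sets by label and recombines the groups.

def split_by_tag(objects: dict[str, set[str]], tag: str) -> dict[str, set[str]]:
    """Select the subset of objects carrying a given label."""
    return {name: tags for name, tags in objects.items() if tag in tags}

def combine(*groups: dict[str, set[str]]) -> dict[str, set[str]]:
    """Merge object groups back into one set, unioning their labels."""
    merged: dict[str, set[str]] = {}
    for group in groups:
        for name, tags in group.items():
            merged.setdefault(name, set()).update(tags)
    return merged

objs = {
    "staff_zhang": {"staff", "housekeeping"},
    "staff_li": {"staff", "front_desk"},
    "room_101": {"asset", "room"},
}
print(sorted(split_by_tag(objs, "staff")))  # → ['staff_li', 'staff_zhang']
```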
The object management system based on the above object management method comprises a central end, a user end, an object end, and the management system software supporting interaction among them. The central end comprises an object processor for storing and processing objects and a request processor for interacting with the user end; the user end comprises a communication module for interacting with the central end, a visualization processing module for visualizing objects, and a human-computer interaction module.
Preferably, in the central end, the object processor receives and responds to object requests from user ends, maintains a corresponding user space for each user end, and, taking objects as its subject, accesses, processes and analyzes object-related attribute data and executes object functions. The user end comprises a user interface and an interaction unit that interact with the central end through a request service unit, plus a visual interaction processing unit containing a visual interaction component and a template interpreter that rely on an object visual-interaction template library. The object end receives control instructions from the user end or the central end and operates according to those instructions.
Preferably, the object end is either a device with an embedded control chip or a human-operable terminal device running the management system software; the control chip can realize unmanned automatic control according to its programmed instructions.
Preferably, the request processor comprises a request interface unit, a user space unit and a request processing unit. Requests from the user end are received and answered by the request interface unit, which performs the necessary serialization and deserialization and provides support for information security, connection quality, traffic scheduling and the like. The user space unit stores, for each connected user, all interaction states, the states of accessed objects, and other states needed to maintain spatial collaboration and operation. The request processing unit is responsible for interpreting and scheduling user requests; it optimizes the processing procedure and response by analyzing the space state of the requesting user, and completes the actual processing and response by scheduling the object processor.
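A minimal sketch of the three request-processor units working together; the JSON wire format and all names are assumptions standing in for whatever serialization the system actually uses.

```python
import json

class RequestProcessor:
    """Sketch of the request processor: the interface unit deserializes the
    request, the user space unit keeps per-user interaction state, and the
    processing unit interprets the request and dispatches it to a handler
    (standing in for the object processor)."""

    def __init__(self):
        self.user_spaces: dict[str, dict] = {}            # user space unit
        self.handlers = {"get_object": self._get_object}  # request processing unit

    def handle(self, user_id: str, raw_request: str) -> str:
        request = json.loads(raw_request)                 # deserialize (interface unit)
        space = self.user_spaces.setdefault(user_id, {"history": []})
        space["history"].append(request["action"])        # record interaction state
        result = self.handlers[request["action"]](request)
        return json.dumps(result)                         # serialize the response

    def _get_object(self, request: dict) -> dict:
        # Stand-in for scheduling the object processor to fetch object data.
        return {"object": request["name"], "status": "ok"}

rp = RequestProcessor()
print(rp.handle("user_a", '{"action": "get_object", "name": "room_101"}'))
# → {"object": "room_101", "status": "ok"}
```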
Preferably, the object processor comprises an object access unit that relies on an object attribute library, an object execution unit that relies on an object function library, and data processing and image processing units that rely on a data visualization information library.
Preferably, the central end is a cloud storage end or a server, and the user end is a PC or a mobile device.
The key innovations of the invention are as follows:
1. The invention builds a management system on a basic logical framework that can display, for user ends in different fields, the most relevant objects and the most frequently used operation items according to each user end's input. It therefore has universality and can be applied across many fields and industries.
2. The object management system has a learning function: it does not need overly complete, hand-crafted content at initial operation. As user ends increase, the objects and other information stored in the central end's database are modified and saved by user ends, accumulating until the content becomes ever fuller and closer to users' needs.
3. The invention realizes intelligent object management: the central end opens an independent user space for each user end, dedicated to monitoring and analyzing that user end's operations, and recommends the most likely object or management-action options for the user end to select, reducing operation steps and saving time.
4. The invention realizes flexible visualization of objects. The central end holds the optimization rules and mapping logic for mapping data to visual graphics; rather than simply returning raw object data to the user end, it provides an optimal visualization scheme that takes the user's space state into account. The user end holds all object visualization templates, which may be preset, updated, or generated dynamically at runtime, and the object visualization data returned by the central end is mapped to the corresponding template at the user end.
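The template mapping in point 4 can be sketched as follows: the central end's response names a visualization scheme, and the user end looks up a matching local template. The template names and render strings are illustrative assumptions.

```python
# Hypothetical sketch: the central end responds with object visualization
# data plus a recommended scheme, and the user end maps the response to one
# of its local visualization templates.

TEMPLATES = {
    "bar_chart": lambda d: f"<bar series={d['values']}>",
    "3d_model":  lambda d: f"<model mesh={d['mesh']}>",
}

def render(response: dict) -> str:
    """Map the central end's recommended scheme to a user-end template."""
    template = TEMPLATES.get(response["scheme"])
    if template is None:
        # Fall back: templates may also be updated or generated at runtime.
        return f"<raw {response['data']}>"
    return template(response["data"])

print(render({"scheme": "bar_chart", "data": {"values": [3, 1, 4]}}))
# → <bar series=[3, 1, 4]>
```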
5. The invention can control online any object end that can interact with the central end or the user end. The central end divides objects into active and passive according to whether they can interact with the central or user end; an active object can be managed directly through a control chip embedded in the device, or through a terminal device held by a manager and running the management system software.
Drawings
Fig. 1 is a schematic diagram of the overall structure of an object management system.
Fig. 2 is a schematic diagram of a user end structure.
Fig. 3 is a schematic diagram of the central end structure.
Detailed Description
Example 1
For ease of understanding, some of the key terms used above are explained here:
1. Object: the thing to be managed by a user end. An object is set by the central end or edited and generated by the user end; it has a visual image that can be displayed at the user end (it may also be stored without being displayed), plus attributes and functions. An object attribute is a value, or a specific combination of values, and object sets can be distinguished by attribute commonality so that objects sharing none, one, or several commonalities can be split apart or combined. An object function means the object allows one or more algorithms to act on its own attributes, taking input from or producing output to other objects; this feature facilitates interoperation between objects.
2. Central end: stores and sets object attributes and functions, interacts with the user end, and optimizes the user end's management operations on objects. The central end can be a cloud computing service or a physically deployed server. It creates an independent user space for each user end to store that user's object data and to monitor and analyze the user's historical operations, so that it can predict the object functions a user's operation instruction will require. The predicted functions are displayed as a list in decreasing priority for the user to choose from; the user can also edit the required function, and the central end monitors and analyzes this to build a mapping between operation instructions and object functions.
3. User end: realizes interaction with the central end to acquire and manage objects; it performs these functions through dedicated software running on a computer, mobile device or handheld terminal. The user end can directly control the operation of active objects running the object-end software.
The object management method based on user visual interaction comprises the following steps:
s1, constructing an object management system comprising a center end and a user end;
S2, the user end sends a request to the central end to acquire objects; the central end recommends a corresponding service module according to the registration information entered at the user end; a plurality of objects are packaged in the service module, and each object is displayed as one or a combination of text, graphics, 3D models, outlines and identifiers;
S3, the user end selects one of the service modules recommended by the central end and adds, deletes and modifies its objects according to the actual situation, forming a dedicated service module that is stored in the user's space at the central end. If none of the recommended service modules meets the user's needs, the central end guides the user end to build a new service module, which is stored both in the central end's database and in the user space. The recommended service modules can thus be managed, extended and refined intelligently for each individual user end;
S4, the user end sends an object management request to the central end through an operation instruction. By analyzing the user's historical operation data in the user space, or big data in its database, the central end predicts the next operation the user is likely to need on the object and recommends several operation items to the user end. The user end selects one; if none is suitable, the user end enters the required operation item through its device, and the central end stores the user-added item in both the user space and the central database so that this operation category is displayed preferentially at the user end thereafter.
More specifically, in step S1, a plurality of service modules are preset at the central end; each service module stores a set of objects related to its service, and the central end also stores service modules entered by user ends, which helps extend the application range of the object management system.
More specifically, in step S2, the objects in the service module recommended by the central end are hierarchical, and the user end adds, deletes and modifies them level by level through operation instructions.
More specifically, in step S4, the historical operation data comprises operation instructions, behavior chains and operation selections. An operation instruction, or a behavior chain combining several operation instructions or interactions between operations and objects, serves as input to an object; the object outputs different operation items to the user end depending on the input; and when the object detects that a given input has corresponded to the same operation item many times, the selection step is skipped and that operation item is executed directly.
More specifically, in step S3, when the user end needs to add or modify an object, the added or modified object is one or a combination of text information, graphics, 3D models, outlines and identifiers of the object retrieved by the central end from the network or the central database; alternatively, the central end performs image processing on text and graphic information about the object uploaded by the user end, thereby realizing visualization.
More specifically, the operation instruction comprises text input, graphic input, image acquisition, or any other action that can trigger an object's function. Such actions are performed through a touch screen or external devices such as a camera, keyboard, mouse or microphone, and include photographing, text entry and voice input; mouse operations range from single actions such as clicking a button to compound actions such as drag-with-button and gesture drawing.
More specifically, when the central end initially sets up the objects of a service module in step S1, and when it stores objects added or modified by the user in step S3, each object is classified as passive or active: a passive object cannot interact with or be controlled by the central end or the user end, while an active object can. When a passive or active object needs to be operated on, the central end recommends executors to the user end based on its database or network big data; an executor is an active object within the user end's management service module. The user end selects one, or enters a new active object if no recommended executor is satisfactory; once an executor is chosen, the central end connects it with the user end to complete the operation on the object. When an active object's operation needs to be controlled, an object end is installed on the active object; the object end then interacts with the user end, receiving the user end's control instructions and executing the corresponding functions. The visual representations of active and passive objects are distinguished by different visual effects assigned by the central end, including but not limited to color, graphic style and logos.
More specifically, the service module in step S2 further includes a visualization space image for carrying its objects, such as a customizable map, sand table, layered plan view, 3D building model, table or analytical chart; the visualization space image can interact with the objects.
More specifically, in step S2, each object stored in the central database and the user space includes object attributes and object functions. The attributes include several labels that highlight or summarize the object's properties; the user end uses these labels to split and combine objects, and the labels can be modified by the user end or set by the central end. The object functions include: receiving and identifying input information; displaying different operation items at the user end as outputs according to that input; and, when an operation item has been invoked many times, simplifying the user's selection steps and increasing the number of output operation items. After a number of preset inputs have been defined for an object function, if the user end issues an operation instruction or behavior chain that differs from the presets, the system asks the user end to add a new output item and establishes a correspondence between the new input and output, thereby extending the object function. When a given input item is detected to correspond to a given output item many times, the intermediate selection step is skipped and the output item is executed directly. In this way user operations and object functions are kept in correspondence: the object function responds to clicks, drags and other operations by displaying several different options, a new interface, or other output categories.
Example 2
As shown in fig. 1 to 3, the object management system based on user visual interaction according to the invention comprises a central end, a user end, an object end, and the management system software supporting interaction among them. The central end comprises an object processor for storing and processing objects and a request processor for interacting with the user end; the user end comprises a communication module for interacting with the central end, a visualization processing module for visualizing objects, and a human-computer interaction module.
More specifically, the object processor receives and responds to object requests from user ends, maintains a corresponding user space for each user end, and, taking objects as its subject, accesses, processes and analyzes object-related attribute data and executes object functions. The user end comprises a user interface and an interaction unit that interact with the central end through a request service unit, plus a visual interaction processing unit containing a visual interaction component and a template interpreter that rely on an object visual-interaction template library. The object end receives control instructions from the user end or the central end and operates according to those instructions.
More specifically, the object end is either a device with an embedded control chip or a human-operable terminal device running the management system software; the control chip can realize unmanned automatic control according to its programmed instructions.
More specifically, the request processor comprises a request interface unit, a user space unit and a request processing unit. Requests from the user end are received and answered by the request interface unit, which performs the necessary serialization and deserialization and provides support for information security, connection quality, traffic scheduling and the like. The user space unit stores, for each connected user, all interaction states, the states of accessed objects, and other states needed to maintain spatial collaboration and operation. The request processing unit is responsible for interpreting and scheduling user requests; it optimizes the processing procedure and response by analyzing the space state of the requesting user, and completes the actual processing and response by scheduling the object processor.
More specifically, the object processor comprises an object access unit that relies on the object attribute library, an object execution unit that relies on the object function library, and data processing and image processing units that rely on the data visualization information library.
More specifically, the central terminal is a cloud storage node or a server, and the user terminal is a PC or a mobile device.
Example 3
Applying the method to a concrete scenario: user A is a hotel manager who uses the object management system to manage a hotel. User A first downloads user-terminal software supporting the management system onto a computer or mobile phone, registers at the user terminal, and fills in the relevant fields. The central terminal then recommends an object set corresponding to the hotel domain according to the fields filled in; the set contains first-level asset and manpower options, which the user terminal selects and adds.
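The recommendation step can be sketched as a lookup from registration field to a packaged object set. The module names and contents below are illustrative assumptions, not taken from the patent; when no module matches, the method falls back to the general template described in example 4.

```python
# Hypothetical mapping maintained at the central terminal: each service
# module packages a set of first-level objects for one business field.
SERVICE_MODULES = {
    "hotel": {
        "assets": ["rooms", "equipment and facilities", "finance"],
        "manpower": ["front desk", "housekeeping"],
    },
    "it": {"assets": ["servers"], "manpower": ["developers"]},
}


def recommend_module(registration_field: str):
    """Return the service module matching the user's registration field,
    or None so the central terminal can fall back to the general template."""
    return SERVICE_MODULES.get(registration_field.strip().lower())
```

For example, `recommend_module("Hotel")` would return the hotel module with its asset and manpower options, while an unknown field yields `None`.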
For manpower management, clicking the manpower option enters the second-level objects, which are the names of the individual posts; the user terminal adds or removes second-level objects according to its own situation. Clicking any post enters the third-level objects, where staff information is added. Staff are distinguished by preset male and female silhouettes; this classification method and imagery are preset by the central terminal, and the user terminal may also upload photos, work badges, and the like as replacements. When personnel need to be transferred, the corresponding staff member is dragged or cut into the corresponding post. When cleaning or room inspection is required, the corresponding staff member is dragged onto the visual image of a room number under the asset object set, and the system preferentially prompts the user terminal to choose between cleaning and inspection (the preferentially prompted operation items may be preset, or may be formed after the system monitors and analyzes repeated operations of the user terminal). The object-terminal APP downloaded by the staff member receives the object management instruction sent by the user terminal, and after the instruction is completed the result is reported back, so that it can be conveniently queried and recorded at the central terminal and the system can refine its recommended operation items according to the accumulated operation data.
For asset management, clicking the asset option enters the second-level objects, which include rooms, equipment and facilities, finance, and the like. Clicking the equipment-and-facilities option enters the third-level objects, which include a monitoring camera and a hotel robot. The monitoring camera carries a control chip that can exchange information with the system's user terminal or central terminal; scanning the two-dimensional code on the camera enters it into the system and retrieves its visual image and object attributes. The hotel robot is installed with the system's object-terminal software so that it can receive and execute object management instructions from the user terminal.
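The three-level hierarchy in this example (manpower → post → staff member, assets → facilities → device) can be sketched as a simple object tree with drag-and-drop transfer. All names and attributes below are illustrative assumptions for the hotel scenario, not structures mandated by the patent.

```python
class ObjectNode:
    """Minimal sketch of a hierarchical object: level-1 sets (assets,
    manpower), level-2 names (posts, rooms), level-3 entries (staff,
    devices)."""
    def __init__(self, name, attrs=None):
        self.name = name
        self.attrs = attrs or {}
        self.children = {}

    def add(self, path, attrs=None):
        """Create (or update) an object along the given path."""
        node = self
        for part in path:
            node = node.children.setdefault(part, ObjectNode(part))
        node.attrs.update(attrs or {})
        return node

    def find(self, path):
        node = self
        for part in path:
            node = node.children[part]
        return node

    def move(self, src_path, dst_path):
        """Drag-and-drop transfer: detach the source object and reattach
        it under the destination (e.g. moving staff between posts)."""
        *parent_path, name = src_path
        node = self.find(parent_path).children.pop(name)
        self.find(dst_path).children[name] = node
        return node


# Illustrative usage mirroring the embodiment: transfer a staff member
# from housekeeping to the front desk by a drag operation.
hotel = ObjectNode("hotel")
hotel.add(["manpower", "housekeeping", "Zhang"], {"gender": "female"})
hotel.add(["manpower", "front desk"])
hotel.move(["manpower", "housekeeping", "Zhang"], ["manpower", "front desk"])
```

The same `move` call would model dragging a staff member onto a room-number image, with the destination node triggering the cleaning-or-inspection prompt described above.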
Example 4
Applying the method to another concrete scenario: user C is a prospective user in a field for which the central terminal's database has no service module yet. The central terminal guides the user to upload objects in that field using a preset general template. Besides supporting object creation, the general template carries general operation instructions. For example, when user C double-clicks a newly added upper-level object, the central terminal interprets the double click as an intention to split that object into secondary objects, and therefore guides the user into a new interface to create them. If user C box-selects several objects and then right-clicks, the central terminal interprets this as most likely corresponding to operation items such as "merge" or "group" and displays those items preferentially at the user terminal, thereby simplifying the user's operation steps.
Example 5
User D is a programmer and also an online shopper; user D can use an IT service module and can also add a shopping service module, and the two modules can be used alternately or concurrently.
The invention discloses an object management system based on user visual interaction. The invention realizes intelligent, visual object management, possesses both universality and the capacity to grow, and represents a clear technical advance in the field.
Example 6
User E conducts a production inspection on the factory floor, wearing intelligent interactive glasses that help locate production equipment faster and obtain production data.
User E's intelligent interactive glasses identify the production equipment object (or several such objects) within the visual range by photoelectric sensing and upload the corresponding sensing data to the central terminal for analysis and matching. The central terminal feeds back to the glasses the production equipment objects actually corresponding to the different sensing signals, together with their production states and production data. According to the fed-back object information (category and characteristics), the glasses recognize the interactive outline of each piece of equipment and display its corresponding information at a suitable position linked to the outline by an association line. Through the displayed equipment information, user E can conveniently find equipment that currently has problems or check the key capacity indicators of the production line.
When user E finds that the throughput efficiency of a buffer area on the production line is abnormal, user E performs a spreading gesture at the position of the production equipment as seen through the glasses, and the glasses upload the production equipment object and the corresponding gesture information to the central terminal. Because the central terminal has learned from the user's interaction habits that this gesture expresses the intention to examine and solve an equipment-efficiency problem, it directly feeds back the buffer-area usage chart of that production equipment object to the glasses, marks the materials that have remained in the area for a long time, and offers optional combined operation schemes for handling the abnormal materials. According to the feedback, the glasses display the visual image of the buffer area to user E, highlight the abnormal materials and their information, and list the combined operation schemes recommended by the central terminal. After judging the actual situation, user E selects by hand the option to reprocess the abnormal materials through reflow, and the glasses upload this selection to the central terminal. The central terminal issues instructions through the production control system to the corresponding mechanical arm, which grasps the abnormal materials and places them on a reflow production line; an abnormal-material handling sheet is generated automatically and fed back to the glasses. The glasses display the handling sheet to user E, and after user E performs a confirmation gesture, the sheet is signed with user E's digital signature and uploaded to the central terminal.
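The habit learning in this passage (and in claim 4: when an input has corresponded to the same operation item many times, the selection step is skipped) can be sketched as follows. The threshold value and the gesture/operation names are illustrative assumptions.

```python
from collections import Counter


class IntentLearner:
    """Sketch of habit learning at the central terminal: record which
    operation the user chose after each (gesture, object category)
    input; once the same choice recurs `threshold` times, skip the
    selection step and execute directly."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.history = {}  # (gesture, category) -> Counter of chosen ops

    def record(self, gesture, obj_category, chosen_op):
        self.history.setdefault((gesture, obj_category), Counter())[chosen_op] += 1

    def respond(self, gesture, obj_category, candidates):
        counts = self.history.get((gesture, obj_category), Counter())
        if counts:
            op, n = counts.most_common(1)[0]
            if n >= self.threshold:
                return ("execute", op)  # habitual intent: act directly
            # Otherwise still ask, but rank candidates by observed frequency.
            candidates = sorted(candidates, key=lambda c: -counts[c])
        return ("ask", candidates)
```

Under this sketch, user E's spreading gesture on the buffer area would return `("execute", ...)` once the habit is established, which is why the usage chart is fed back without an intermediate menu.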
At this point, one exception-handling cycle is complete.
In this example, the intelligent interactive glasses serve as the user terminal and belong to the class of mobile devices, while the production equipment, fitted with embedded control chips, serves as the object terminal and belongs to the class of active objects: it can monitor its own running state, interact with the central terminal, and store its working conditions at the central terminal in real time. When the user performs a gesture operation captured visually through the glasses' camera component, the glasses analyze the gesture and the production equipment within the same plane to determine whether the operation instruction is a routine one; the input then either executes a routine operation directly or queries the user about the possible operations. This clearly shows how the system realizes visual object management in an intelligent factory.
Example 7
User G is a warehouse transport driver at the factory, responsible for transporting material from one warehouse to another designated warehouse. User H is the factory's warehouse manager, responsible for managing and coordinating warehousing and transportation.
According to the transportation plan scheduled by the system, the central terminal notifies user G's vehicle to start a transportation task. The central terminal downloads the transportation plan to the intelligent terminal of user G's vehicle; the intelligent terminal reminds user G to drive to the designated platform scale, displays the plan's contents, and plans the driving route. User G drives the empty vehicle onto the platform scale for weighing. The platform scale's sensing equipment communicates with the vehicle's intelligent terminal and feeds back the vehicle's positioning status in real time; once the vehicle is fully in place, the intelligent terminal uploads an in-place instruction to the central terminal, the platform scale equipment uploads the weighing data to the central terminal, and the central terminal integrates and verifies the two sets of data to generate the vehicle's empty-vehicle (tare) data.
If the central terminal finds the tare data abnormal, it feeds back an anomaly prompt to the vehicle's intelligent terminal and simultaneously an anomaly alarm to user H's control terminal, at which point user H takes over the subsequent handling. The central terminal automatically collects real-time images of the vehicle, queries the vehicle's journey data and historical task weighing data, and feeds them back to user H's control terminal. User H finds uncleaned material from a previous batch on the vehicle, captures and marks the image on the control terminal, and uploads it to the central terminal. The central terminal automatically generates an exception-handling sheet and feeds it back simultaneously to user H's control terminal and the intelligent terminal of user G's vehicle, and users G and H each sign it with their digital signatures. After signing, the central terminal issues user G a new destination for cleaning the vehicle, removes user G's vehicle from the task queue, and schedules a new vehicle to complete the current task.
If the central terminal confirms that the tare data is correct, it issues user G a new destination for loading material and updates the task state.
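The central terminal's tare check can be sketched as integrating the in-place status from the vehicle terminal with the scale reading and comparing against the vehicle's registered tare weight. The tolerance and all weights below are illustrative assumptions; the patent does not specify the verification rule.

```python
def verify_tare(in_place: bool, measured_kg: float,
                registered_tare_kg: float, tolerance_kg: float = 50.0):
    """Hypothetical empty-vehicle check at the central terminal.

    Returns ("wait", None) until the vehicle is fully on the scale,
    ("anomaly", deviation) when the weight deviates beyond tolerance
    (e.g. uncleaned residual material, as user H finds in this example),
    or ("ok", deviation) when the tare data checks out.
    """
    if not in_place:
        return ("wait", None)
    deviation = measured_kg - registered_tare_kg
    if abs(deviation) > tolerance_kg:
        return ("anomaly", deviation)
    return ("ok", deviation)
```

An "anomaly" result would trigger the alarm to user H's control terminal and the handover described above, while "ok" leads straight to the loading destination.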
In this example, user H's control terminal is the user terminal; the intelligent terminal of user G's vehicle is an object terminal of the "human-operable terminal device running management system software" type, while the platform scale, monitoring cameras, and other devices that collect real-time vehicle data are object terminals of the "device with embedded control chip" type. This example clearly shows how the system realizes visual object management in a logistics transportation task.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (15)

1. An object management method based on user visual interaction, characterized in that the method comprises the following steps:
s1, constructing an object management system comprising a center end and a user end;
s2, a user terminal sends a request for acquiring an object to a central terminal, the central terminal recommends a corresponding service module according to registration information input by the user terminal, a plurality of objects are packaged in the service module, and the objects are displayed in a mode of combining one or more of characters, figures, 3D models, outlines and identifiers;
s3, the user side adds, deletes and modifies the object in the service module according to the actual situation by selecting a certain service module recommended by the center side, so as to form a dedicated service module of the user side, the dedicated service module is stored in a user space of the center side, if the service module recommended by the center side is not needed by the user side, the user side is guided by the center side, a needed service module is newly built, and the service module is stored in a database of the center side and the user space, so that the service module recommended by the center side can be managed and expanded or perfected intelligently for individuation of the user side;
s4, the user side sends an object management request to the center side through an operation instruction, the center side judges the next operation of the object possibly needed by the user side through analyzing historical operation data or database big data of the user in the user space, a plurality of operation items are recommended to the user side, the user side selects the operation items, if the operation items are not needed, the operation items are input through equipment by the user side, the center side stores the operation items added by the user in the user space and the center side database respectively, and the operation categories are preferentially displayed on the user side.
2. The user visual interaction-based object management method according to claim 1, wherein: in step S1, a plurality of service modules are preset at the central terminal, each storing a set formed from a plurality of objects related to the corresponding business; the central terminal also stores service modules input by user terminals, so as to expand the application range of the object management system.
3. The user visual interaction-based object management method according to claim 2, wherein: in the step S2, the objects in the service module recommended by the central terminal are hierarchical, and the user terminal adds, deletes and modifies the objects step by step through the operation instruction.
4. A method of object management based on user visual interaction according to claim 3, wherein: in step S4, the historical operation data comprises operation instructions, behavior chains, and operation selections; an operation instruction, or a behavior chain formed by combining several operation instructions or by interaction between operations and objects, serves as input to the object; the object outputs different operation items to the user terminal according to the user's different inputs, and when the object detects that a given input has corresponded to a given operation item many times, the selection step is omitted and that operation item is executed directly.
5. The user visual interaction-based object management method according to claim 4, wherein: in step S3, when the user terminal needs to add or modify an object, the added or modified object is one or more combinations of the object's textual information, graphics, 3D model, outline, and identifier retrieved from the network or from the central terminal's database; alternatively, the central terminal performs image processing on the textual and graphic information uploaded by the user terminal for the object to be added or modified, thereby realizing its visualization.
6. The user visual interaction-based object management method according to claim 5, wherein: the operation instruction includes: text input, graphical input, image acquisition, or other actions that enable object interaction.
7. The user visual interaction-based object management method according to claim 6, wherein: when the central terminal initially sets the objects of a service module in step S1 and stores the objects added or modified by the user in step S3, the objects are divided into passive objects and active objects: a passive object is one that cannot interact with or be controlled by the central terminal or the user terminal, while an active object is one that can; when a passive or active object needs to be operated on, the central terminal recommends an executor to the user terminal according to the database or network big data, the executor being an active object in the user terminal's management service module; the user terminal selects an active object, or inputs one if no recommended executor is satisfactory; once the executor is selected, the central terminal connects the executor with the user terminal for interaction to complete the operation on the object; when the operation of an active object needs to be controlled, an object terminal is installed at the active object, which then interacts with the user terminal, receiving the user terminal's control instructions and executing their functions; the visual images of active and passive objects are distinguished by different visual effects assigned by the central terminal.
8. The user visual interaction-based object management method according to claim 7, wherein: the service module in step S2 further comprises a visualization space image for carrying the objects, the visualization space image comprising customizable maps, sand tables, hierarchical plan views, 3D building models, tables, and analytical charts.
9. The user visual interaction-based object management method according to claim 8, wherein: in step S2, the objects stored in the central database and in the user space each comprise object attributes and object functions; the object attributes comprise several labels that highlight or summarize the object's properties, which the user terminal uses for splitting and combining objects and which are modified by the user terminal or set by the central terminal; the object functions comprise: receiving and identifying input information, displaying different operation items at the user terminal as output according to the input, simplifying the user's selection steps when an input-operation pair recurs many times, and expanding the number of output operation items.
10. An object management system based on user visual interaction, characterized by comprising: a central terminal, a user terminal, an object terminal, and the management system software supporting interaction among them; the central terminal comprises an object processor for storing and processing objects and a request processor for interacting with the user terminal; the user terminal comprises a communication module for interacting with the central terminal, a visualization processing module for visualizing objects, and a human-machine interaction module.
11. The user visual interaction-based object management system according to claim 10, wherein the central terminal comprises: the request processor, which receives and responds to object requests from user terminals and maintains a corresponding user space for each user terminal; and the object processor, which accesses, processes, and analyzes object-related attribute data with the object as the principal entity and executes the object's associated functions; the user terminal comprises a user interface and an interaction unit that interact with the central terminal through a request service unit, and a visual interaction processing unit comprising a visual interaction component and a template interpreter that rely on an object visual-interaction template library; the object terminal receives control instructions from the user terminal or the central terminal and operates according to those instructions.
12. The user visual interaction-based object management system according to claim 11, wherein: the object terminal is either a device with an embedded control chip or a human-operable terminal device running the management system software, and the control chip can perform unattended automatic control according to its programmed instructions.
13. The user visual interaction-based object management system according to claim 12, wherein the request processor comprises: a request interface unit, a user space unit, and a request processing unit; the request interface unit receives and responds to user-terminal requests, performs the necessary serialization and deserialization, and provides support for information security, connection quality, traffic scheduling, and the like; the user space unit stores, for each connected user, all interaction states, the states of accessed objects, and the other state needed to maintain spatial collaboration and operation; the request processing unit interprets and schedules user-terminal requests, optimizes the processing procedure and response by analyzing the user-space state of the request's source, and completes the actual request processing and response by scheduling the object processor.
14. The user visual interaction-based object management system according to claim 13, wherein the object processor comprises: an object access unit that relies on the object attribute library, an object execution unit that relies on the object function library, and data processing and image processing units that rely on the data visualization information library.
15. The user visual interaction-based object management system according to claim 14, wherein: the central terminal is a cloud storage node or a server, and the user terminal is a PC or a mobile device.
CN202310331985.5A 2023-03-31 2023-03-31 Object management method and system based on user visual interaction Pending CN116361375A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310331985.5A CN116361375A (en) 2023-03-31 2023-03-31 Object management method and system based on user visual interaction

Publications (1)

Publication Number Publication Date
CN116361375A true CN116361375A (en) 2023-06-30

Family

ID=86914790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310331985.5A Pending CN116361375A (en) 2023-03-31 2023-03-31 Object management method and system based on user visual interaction

Country Status (1)

Country Link
CN (1) CN116361375A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination