CN111310424B - Form generation method and display device - Google Patents

Form generation method and display device

Info

Publication number
CN111310424B
CN111310424B
Authority
CN
China
Prior art keywords
characters
user
track
outer frame
display
Prior art date
Legal status
Active
Application number
CN202010061567.5A
Other languages
Chinese (zh)
Other versions
CN111310424A (en)
Inventor
王敏
李莹雪
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010061567.5A
Publication of CN111310424A
Application granted
Publication of CN111310424B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application discloses a table generation method and a display device, relating to the field of computer technology. The method may draw an outer frame on the user interface based on a first track surrounding a plurality of characters, then draw an inner scribe line on the user interface based on a second track separating the plurality of characters, and finally generate a table based on the outer frame, the inner scribe line, and the plurality of characters. With this method, when a plurality of characters are displayed on the user interface, the user can draw tracks according to the arrangement of the characters; the controller draws the outer frame and the inner scribe line from the acquired tracks and automatically generates the table from the outer frame, the inner scribe line, and the plurality of characters. The user does not need to adjust the table while inputting characters, so the table generation method is simple to operate and highly flexible.

Description

Form generation method and display device
Technical Field
The present application relates to the field of computer technology, and in particular to a table generation method and a display device.
Background
Tables are a common way of processing data and make it convenient for users to tally data.
In the related art, when a user needs a table to tally data, the user can trigger the display device to generate a table drawing instruction through an insert-table operation, and the display device draws the table in response to that instruction. The user can then input characters into the table through input operations. In addition, while inputting characters, the user may need to adjust the attributes of the table according to the characters actually entered. The attributes of the table include: the number of rows, the number of columns, the height of each row, and the width of each column.
However, in the related art the table must be drawn before the user inputs characters, and its attributes must be continuously adjusted, so the operation is complex and inflexible.
Disclosure of Invention
The application provides a table generation method and a display device, which can solve the problems in the related art of complex operation and poor flexibility caused by continuously adjusting the attributes of a table. The technical scheme is as follows:
in one aspect, there is provided a display apparatus including:
a display configured to display a user interface, the user interface comprising a plurality of characters;
a touch detector in communication with the display and configured to detect user touch instructions entered by the user on the user interface of the display;
a controller in communication with the display and the touch detector, configured to:
acquiring a first track surrounding the plurality of characters based on a first user touch instruction input on the user interface;
drawing an outer frame surrounding the plurality of characters based on the first track;
acquiring a second track for separating the plurality of characters based on a second user touch instruction input on the user interface in an area surrounded by the outer frame;
drawing an inner scribe line for separating the plurality of characters within the outer frame based on the second track;
and generating a table on the display according to the outer frame, the inner scribing line and the plurality of characters.
In another aspect, a method for generating a table is provided, which is applied to a display device, and includes:
acquiring a first track surrounding a plurality of characters based on a first user touch instruction input on a user interface;
drawing an outer frame surrounding the plurality of characters based on the first track;
acquiring a second track for separating the plurality of characters based on a second user touch instruction input on the user interface in an area surrounded by the outer frame;
drawing an inner scribe line for separating the plurality of characters within the outer frame based on the second track;
and generating a table according to the outer frame, the inner scribing line and the plurality of characters.
In yet another aspect, a computer-readable storage medium is provided, having instructions stored therein which, when run on a computer, cause the computer to perform the table generation method provided in the above aspect.
The technical scheme provided by the application has at least the following beneficial effects:
The application provides a table generation method and a display device. An outer frame can be drawn on the user interface based on a first track surrounding a plurality of characters, an inner scribe line can then be drawn based on a second track separating the plurality of characters, and finally a table is generated from the outer frame, the inner scribe line, and the plurality of characters. With this method, when a plurality of characters are displayed on the user interface, the user can draw tracks separating the characters on the user interface according to their arrangement; the controller draws the outer frame and the inner scribe line from the acquired tracks and automatically generates the table from the outer frame, the inner scribe line, and the plurality of characters. The user does not need to adjust the table while inputting characters, so the table generation method is simple to operate and highly flexible.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment of the present application;
Fig. 2 is a hardware configuration block diagram of a display device provided by an embodiment of the present application;
Fig. 3 is a hardware configuration block diagram of another display device provided by an embodiment of the present application;
Fig. 4 is a block diagram of a control apparatus according to an embodiment of the present application;
Fig. 5 is a flowchart of a table generation method according to an embodiment of the present application;
Fig. 6 is a flowchart of another table generation method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a user interface provided by an embodiment of the present application;
Fig. 8 is a schematic diagram of another user interface provided by an embodiment of the present application;
Fig. 9 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 10 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 12 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 13 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 14 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 15 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 16 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 17 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 18 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 19 is a schematic diagram of yet another user interface provided by an embodiment of the present application;
Fig. 20 is a schematic functional configuration diagram of a display device according to an embodiment of the present application;
Fig. 21 is a block diagram of a display device software system provided by an embodiment of the present application;
Fig. 22 is a schematic diagram of the configuration of an application in a display device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of exemplary embodiments of the present application more apparent, the technical solutions of exemplary embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the exemplary embodiments of the present application, and it is apparent that the described exemplary embodiments are only some embodiments of the present application, not all embodiments.
All other embodiments, which can be made by a person skilled in the art without inventive effort, based on the exemplary embodiments shown in the present application are intended to fall within the scope of the present application. Furthermore, while the present disclosure has been described in terms of an exemplary embodiment or embodiments, it should be understood that each aspect of the disclosure may be separately implemented as a complete solution.
It should be understood that the terms "first," "second," "third," and the like in the description, the claims, and the above drawings are used to distinguish between similar objects and not necessarily to describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the application described herein can, for example, be implemented in orders other than those illustrated or described.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this disclosure refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used herein refers to a component of an electronic device (such as a display device as disclosed herein) that can be controlled wirelessly, typically over a relatively short distance. The electronic device is typically connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include wireless-fidelity (WiFi), wireless universal serial bus (universal serial bus, USB), bluetooth, motion sensor, and other functional modules. For example, a hand-held touch remote control replaces most of the physical built-in hard keys in a typical remote control with a touch screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment of the present application. As shown in fig. 1, a user may operate the display apparatus 200 by controlling the device 100.
The control device 100 may be a remote controller 100A, which communicates with the display apparatus 200 through infrared protocol communication, Bluetooth protocol communication, or other short-range communication, and controls the display apparatus 200 wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like. For example, the user may input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power key on the remote controller to control the functions of the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the display device 200 is controlled using an application running on a smart device. The application may provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
By way of example, the mobile terminal 100B may install a software application associated with the display device 200 and implement connection communication through a network communication protocol, for the purpose of one-to-one control operation and data communication. For example, the mobile terminal 100B may establish a control instruction protocol with the display device 200, a remote-control keyboard may be synchronized to the mobile terminal 100B, and the functions of the display device 200 can be controlled through the user interface on the mobile terminal 100B. Audio/video content displayed on the mobile terminal 100B may also be transmitted to the display device 200 to implement a synchronous display function.
As shown in fig. 1, the display device 200 also communicates data with the server 300 through a variety of communication means. The display device 200 may be permitted to make communication connections via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 300 may provide various content and interactions to the display device 200. By way of example, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interaction. The server 300 may be one group or multiple groups, and of one or more types. The server 300 also provides other web service content such as video on demand and advertising services.
The display device 200 may be a liquid crystal display, an organic light-emitting diode (OLED) display, or a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be subject to some changes in performance and configuration as desired.
The display device 200 may additionally provide an intelligent network television function with computer support in addition to the broadcast receiving television function. Examples include web TV, smart TV, Internet protocol TV (IPTV), and the like.
Fig. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present application. As shown in fig. 2, a modem 220, a communicator 230, a detector 240, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio input interface 272, a power supply may be included in the display apparatus 200.
The tuner demodulator 220 receives broadcast television signals by wire or wirelessly and may perform modulation and demodulation processing such as amplification, mixing, and resonance, to demodulate, from among a plurality of wireless or wired broadcast television signals, the audio/video signal carried at the frequency of the television channel selected by the user, as well as additional information (e.g., an EPG data signal).
The tuner demodulator 220 responds to the television channel frequency selected by the user and the television signal carried by that frequency, under the control of the controller 210.
Depending on the broadcasting system of the television signal, the tuner demodulator 220 may receive signals in various ways, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or internet broadcasting; depending on the modulation type, digital or analog modulation may be used; and depending on the kind of television signal received, both analog and digital signals can be demodulated.
In other exemplary embodiments, the tuner demodulator 220 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs the television audio/video signals after modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
Communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types. For example, communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules or near field communication protocol modules.
The display device 200 may establish a connection of control signals and data signals with an external control device or a content providing device through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100 according to the control of the controller.
The detector 240 is the component the display device 200 uses to collect signals from the external environment or from interaction with the outside. The detector 240 may include a light receiver 242, a sensor for capturing the intensity of ambient light, which allows display parameters to be adapted to the ambient light; it may also include an image collector 241, such as a camera or video camera, which can be used to collect the external environment scene and the attributes or gestures of the user, adaptively change display parameters, and recognize user gestures to implement interaction with the user; it may also include a touch detector 243 for detecting user touch instructions, such as an instruction to move an object from inside the table to outside the table, or from one cell of the table to another.
In other exemplary embodiments, the detector 240 may further include a temperature sensor; by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the ambient temperature is high, the display device 200 may be adjusted to display the image in a cooler color temperature; when the ambient temperature is low, the display device 200 may be adjusted to display the image in a warmer color temperature.
In other exemplary embodiments, the detector 240 may further include a sound collector, such as a microphone, that may be used to receive a user's sound, including a voice signal of a control instruction of the user controlling the display device 200, or collect an ambient sound for identifying an ambient scene type, and the display device 200 may adapt to ambient noise.
The external device interface 250 is a component that enables the controller 210 to control data transmission between the display apparatus 200 and other external apparatuses. It may be connected to external devices such as set-top boxes, game devices, and notebook computers in a wired/wireless manner, and may receive data from the external device such as a video signal (e.g., a moving image), an audio signal (e.g., music), and additional information (e.g., an EPG).
Among other things, the external device interface 250 may include: any one or more of a high definition multimedia interface (high definition multimedia interface, HDMI) terminal 251, a composite video blanking sync (composite video blanking and sync, CVBS) terminal 252, an analog or digital component terminal 253, a USB terminal 254, a Red Green Blue (RGB) terminal (not shown in the figure), and the like.
The controller 210 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and various applications) stored in the memory 290. For example, the memory 290 stores a computer program, and the controller 210 may implement the table generation method provided in the method embodiments of this application when executing the computer program.
As shown in fig. 2, the controller 210 includes a random access memory (RAM) 213, a read-only memory (ROM) 214, a graphics processor 216, a central processing unit (CPU) 212, a communication interface 218, and a communication bus. The RAM 213 and the ROM 214 are connected to the graphics processor 216, the CPU 212, and the communication interface 218 via the bus.
The ROM 214 stores instructions for various system starts. When the display device 200 receives a power-on signal and begins to start, the CPU 212 executes the system start instructions in the ROM 214 and copies the operating system stored in the memory 290 into the RAM 213 to run the operating system. After the operating system is started, the CPU 212 copies the various applications in the memory 290 into the RAM 213 and then runs them.
The graphics processor 216 generates various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an operator that receives the various interaction instructions input by the user, performs operations, and displays various objects according to their display attributes, and a renderer that generates the various objects based on the operator's results and renders them on the display 280.
The CPU 212 executes the operating system and application instructions stored in the memory 290, and executes various applications, data, and content according to interaction instructions received from the outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU 212 may include multiple processors: one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays the picture in the normal mode. The sub-processor(s) perform operations in standby mode and the like.
Communication interface 218 may include first interface 218-1 through nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the display apparatus 200. For example, in response to receiving a user command for selecting a UI object displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image linked to by a hyperlink, or executing the program corresponding to an icon. The user command for selecting a UI object may be a command input through an input means (e.g., a mouse, keyboard, or touch pad) connected to the display device 200, or a voice command corresponding to speech uttered by the user.
The memory 290 stores various software modules for driving and controlling the display device 200, including: a base module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is a management module for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module controls the display 280 to display image content and can be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module performs data communication with browsing servers. The service module provides various services and applications.
Meanwhile, the memory 290 is also used to store received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
The user input interface transmits the user's input signals to the controller 210, or transmits signals output from the controller to the user. Illustratively, the control device (e.g., a mobile terminal or remote control) may send input signals entered by the user, such as a power switch signal, channel selection signal, or volume adjustment signal, to the user input interface, which forwards them to the controller; alternatively, the control device may receive output signals such as audio, video, or data output from the user input interface via the controller, and display the received output signals or output them in the form of audio or vibration.
In some embodiments, a user may input a user command through a graphical user interface (graphical user interface, GUI) displayed on the display 280, and the user input interface receives the user input command through the GUI. Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
The video processor 260-1, by way of example, includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream (e.g., an input MPEG-2 stream) into a video signal, an audio signal, and so on.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module, such as an image synthesizer, superimposes and mixes the GUI signal input by the user or generated by the graphics generator with the scaled video image, to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting 24 Hz, 25 Hz, 30 Hz, or 60 Hz input video to a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate to the refresh rate of the display screen. This is usually implemented by frame interpolation.
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting it into RGB data signals for output.
The display 280 receives the image signals from the video processor 260-1 and displays video content, images, and the menu manipulation interface. It includes a display screen assembly for presenting pictures and a drive assembly for driving image display. The displayed video content may come from the video in the broadcast signal received by the tuner demodulator 220, or from video input through the communicator or the external device interface. The display 280 also displays the user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
Depending on the type of display 280, the display also includes a drive assembly for driving the display. Alternatively, if the display 280 is a projection display, it may also include a projection device and a projection screen.
In one particular example, the display 280 is configured to display a user interface including a plurality of characters.
The audio processor 260-2 is configured to receive the audio signal, decompress and decode according to the standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain an audio signal that can be played in the speaker 272.
The audio output interface 270 receives, under the control of the controller 210, the audio signal output by the audio processor 260-2. It may include a speaker 272, or an external sound output terminal 274, such as an external sound terminal or an earphone output terminal, for output to the sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 260-1 may include one or more chip components. The audio processor 260-2 may also include one or more chip components.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or integrated with the controller 210 in one or more chips.
The power supply provides, under the control of the controller 210, power support for the display device 200 using power input from an external power source. The power supply may include a built-in power circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200, for example a power interface in the display apparatus 200 that takes an external power supply.
In some examples, particularly for display apparatus 200 such as electronic whiteboards, commercial large screen displays, etc., as shown in fig. 3, the modem 220, HDMI interface 251, AV interface 252, component interface 253, video processor 260-1, audio processor 260-2, and audio input interface 272 may not be included in the display apparatus 200.
Fig. 4 is a block diagram of a control apparatus according to an embodiment of the present application. As shown in fig. 4, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives the user's input operation instructions and converts them into instructions that the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control apparatus 100 may be a smart device. For example, the control apparatus 100 may install various applications for controlling the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may serve a function similar to the control apparatus 100 after installing an application for manipulating the display device 200. For example, by installing the application, the user can use various function keys or virtual buttons of the graphical user interface available on the mobile terminal 100B or other intelligent electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 controls the running and operation of the control device 100, the communication and collaboration among its internal components, and the external and internal data processing functions.
The communicator 130 performs communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is transmitted to the display device 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
The user input/output interface 140: the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and the like. The user can input user instructions through actions such as voice, touch, gesture, and press; the input interface converts the received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends it to the display device 200.
The output interface includes an interface that transmits the received user instructions to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, with an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module. With a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency sending terminal.
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. When the control apparatus 100 is configured with a communicator 130, such as a WiFi, Bluetooth, or NFC module, it may send user input instructions to the display device 200 encoded through the WiFi protocol, the Bluetooth protocol, or the near field communication (NFC) protocol.
The memory 190 stores various operating programs, data, and applications for driving and controlling the control device 100, under the control of the controller 110. The memory 190 may store various control signal instructions input by the user. For example, the memory 190 stores a computer program, and the controller 110 may implement the table generation method provided in the method embodiments of this application when executing the computer program.
The power supply 180 provides operating power support for the elements of the control device 100 under the control of the controller 110. It may be a battery and its associated control circuitry.
The display device 200 provided in this embodiment, in particular an electronic whiteboard or a commercial large-screen display, can display a table on the display interface of the display 280, presenting data to the user in an orderly way through the table. In related approaches, while editing a table the user has to continuously adjust the size of the table according to its content and the position where the content is written in the table, giving a poor user experience. For this reason, the controller 210 provided in this embodiment is further configured to perform a table generation method.
Fig. 5 is a flowchart of a table generation method according to an embodiment of the present application. The method may be applied to the controller 210 of the display device 200 shown in fig. 1. Referring to fig. 5, the method may include the following steps:
Step 401, acquiring a first track surrounding a plurality of characters based on a first user touch instruction input on a user interface.
The controller may display a plurality of characters on the user interface of the display before acquiring the first track. The characters may be written directly on the user interface by the user through sliding instructions, or input to the controller by the user via a keyboard.
When a plurality of characters are displayed on the user interface, the user can draw a first track surrounding the characters through a first user touch instruction. When the controller detects the first user touch instruction acting on the user interface, it can acquire the track of that instruction; this track is the first track.
Step 402, drawing an outer frame surrounding a plurality of characters based on the first track.
Because the first track is generated by the user through the first user touch instruction, it is generally an irregular figure that only approximates a rectangle. In the embodiment of the application, when drawing the outer frame based on the first track, the controller can therefore correct the first track to obtain an outer frame that is a regular figure.
For example, assuming the first track is a curve surrounding the characters, the outer frame the controller obtains by correcting that curve may be a rectangle surrounding the characters.
Of course, the outer frame the controller draws based on the first track may also have the same shape as the first track; that is, the controller may directly use the first track as the outer frame. For example, if the first track the user draws around the characters on the user interface is already a regular rectangle, the controller may use it directly as the outer frame.
Step 403, acquiring, in the area surrounded by the outer frame, a second track for separating the plurality of characters, based on a second user touch instruction input on the user interface.
After the controller draws the outer frame on the user interface, the user can draw, within the area surrounded by the outer frame, a second track separating the characters through a second user touch instruction. When the controller detects the second user touch instruction acting on the user interface, it can acquire the track of that instruction; this track is the second track.
Step 404, drawing an inner scribe line for separating the plurality of characters within the outer frame, based on the second track.
Because the second track is generated by the user through the second user touch instruction, it is usually an irregular line. In the embodiment of the application, when drawing the inner scribe line based on the second track, the controller can therefore correct the second track to obtain an inner scribe line that is a regular line.
For example, assuming the second track is a curve separating the characters, the inner scribe line the controller obtains by correcting that curve may be a straight line separating the characters.
Of course, the inner scribe line the controller draws based on the second track may also have the same shape as the second track; that is, the controller may directly use the second track as the inner scribe line.
Step 405, generating a table according to the outer frame, the inner scribe line, and the plurality of characters.
In the embodiment of the application, the controller can take the grids formed by the outer frame and the inner scribe lines as the cells of the table, and take the characters displayed in the area of each cell as that cell's content data, thereby obtaining the table. The controller may then edit, select, copy, or save the table in response to user-triggered operation instructions. The user usually edits the table in units of cells; optionally, editing a cell includes moving characters into and out of the cell.
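The description does not fix an implementation for this step, but the cell-assignment logic can be sketched in Python roughly as follows. The Rect type, the build_table signature, and the center-point assignment rule are illustrative assumptions; v_lines and h_lines stand for the x positions of the vertical inner scribe lines and the y positions of the horizontal ones, after correction.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def build_table(frame: Rect, v_lines: list[float], h_lines: list[float],
                characters: list[tuple[str, Rect]]) -> list[list[str]]:
    """Split the outer frame into cells along the inner scribe lines and
    assign each character to the cell containing its center point."""
    xs = [frame.left] + sorted(v_lines) + [frame.right]
    ys = [frame.top] + sorted(h_lines) + [frame.bottom]
    # One content string per cell, indexed as table[row][column].
    table = [["" for _ in xs[:-1]] for _ in ys[:-1]]
    for text, box in characters:
        cx = (box.left + box.right) / 2
        cy = (box.top + box.bottom) / 2
        for r in range(len(ys) - 1):
            for c in range(len(xs) - 1):
                if Rect(xs[c], ys[r], xs[c + 1], ys[r + 1]).contains(cx, cy):
                    table[r][c] += text
    return table
```

For the layout of fig. 11, for example, one vertical scribe line and no horizontal lines would yield a one-row, two-column grid whose two cells collect the two character groups.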
In summary, the embodiment of the application provides a table generation method that draws an outer frame on the user interface based on a first track surrounding a plurality of characters, then draws an inner scribe line based on a second track separating the characters, and finally generates a table from the outer frame, the inner scribe line, and the characters. With this method, when a plurality of characters are displayed on the user interface, the user can draw tracks separating the characters according to their arrangement; the controller draws the outer frame and the inner scribe line from the acquired tracks and automatically generates the table from them and the characters. The user does not need to adjust the table while inputting characters, so the table generation method is simple to operate and highly flexible.
Fig. 6 is a flowchart of another table generation method according to an embodiment of the present application. The method may be applied to the controller 210 of the display device 200 shown in fig. 1. Referring to fig. 6, the method may include the following steps:
Step 501, acquiring a first track surrounding a plurality of characters based on a first user touch instruction input on the user interface.
A plurality of characters 601 may be displayed on the user interface 60 before the controller acquires the first track surrounding them. Each character may be a digit, a letter, a Chinese character, and so on. The characters 601 may be written on the user interface 60 by the user through sliding instructions, or input to the controller by the user via a keyboard.
As shown in fig. 7 and fig. 8, assume that the characters 601 displayed on the user interface 60 are: "first part", "second part", "description 1", "description 2", "responsible person 1", and "responsible person 2". The user may draw a first track 602 surrounding the characters 601 on the user interface 60 through a first user touch instruction. When the controller detects the first user touch instruction acting on the user interface 60, it may acquire the track 602 of that instruction; this track 602 is the first track.
The first user touch instruction may be an instruction the user inputs by sliding a finger on the user interface, by sliding an electronic pen on the user interface, or by dragging with a mouse.
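However the touch instruction is input, the controller ultimately needs the instruction's track as an ordered point sequence. The sketch below shows one plausible shape for that accumulation; the on_touch_* callback names are assumptions standing in for whatever events the touch detector 243 actually reports.

```python
class TrackRecorder:
    """Accumulates touch points between a press and a release into one track."""

    def __init__(self):
        self.track: list[tuple[float, float]] = []
        self.active = False

    def on_touch_down(self, x: float, y: float) -> None:
        self.track = [(x, y)]
        self.active = True

    def on_touch_move(self, x: float, y: float) -> None:
        if self.active:
            self.track.append((x, y))

    def on_touch_up(self, x: float, y: float) -> list[tuple[float, float]]:
        self.track.append((x, y))
        self.active = False
        return self.track  # the completed first (or second) track
```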
Step 502, if the first track is detected to be a rectangle, drawing a rectangular outer frame surrounding the plurality of characters.
In the embodiment of the application, since the first track is generated by the user through the first user touch instruction, it is usually an irregular figure. Therefore, when drawing the outer frame based on the first track, the controller may first identify the first track to determine whether it is a rectangle. To do so, the controller can first determine the area of the region enclosed by the first track; if the ratio of that area to the area of the first track's minimum bounding rectangle is greater than a ratio threshold, the first track can be determined to be a rectangle. This rectangle may be irregular; that is, the first track is an irregular figure that approximates a rectangle.
Referring to fig. 9, when the controller determines that the first track is an irregular rectangle, it may correct the first track 602 to obtain a regular rectangular outer frame 603 whose sides are all straight lines. For example, the outer frame 603 may be the minimum bounding rectangle of the first track 602.
It should be noted that, if it is detected that the first track is not rectangular, the controller may directly display the first track on the table display interface.
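A minimal sketch of this rectangle test, assuming the first track arrives as a list of sampled (x, y) points: the enclosed area is estimated with the shoelace formula and compared against the minimum bounding rectangle's area. The 0.8 ratio threshold is an assumed value; the description only requires the ratio to exceed some configured threshold.

```python
def min_bounding_rect(points: list[tuple[float, float]]) -> tuple[float, float, float, float]:
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)  # left, top, right, bottom

def polygon_area(points: list[tuple[float, float]]) -> float:
    # Shoelace formula, closing the stroke back to its first point.
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

def detect_outer_frame(track: list[tuple[float, float]], ratio_threshold: float = 0.8):
    """Return the corrected rectangular outer frame, or None if the track
    is not rectangle-like."""
    left, top, right, bottom = min_bounding_rect(track)
    rect_area = (right - left) * (bottom - top)
    if rect_area > 0 and polygon_area(track) / rect_area > ratio_threshold:
        return left, top, right, bottom  # snap to the minimum bounding rectangle
    return None
```

Returning the minimum bounding rectangle directly corresponds to the correction of fig. 9; returning None corresponds to the case where the first track is simply displayed as drawn.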
Step 503, acquiring, in the area surrounded by the outer frame, a second track for separating the plurality of characters, based on a second user touch instruction input on the user interface.
After the controller draws the outer frame 603 on the user interface 60, the user may draw, within the area surrounded by the outer frame, a second track separating the characters through a second user touch instruction. When the controller detects the second user touch instruction acting on the user interface, it can acquire the track of that instruction; this track is the second track.
Like the first user touch instruction, the second user touch instruction may be input by sliding a finger or an electronic pen on the user interface, or by dragging with a mouse.
For example, assuming the second track drawn by the user on the user interface 60 through the second user touch instruction is the line 604a shown in fig. 10, the second track 604a acquired by the controller divides the characters 601 into two groups: the first group comprises "first part", "description 1", and "responsible person 1"; the second group comprises "second part", "description 2", and "responsible person 2".
Step 504, detecting whether the second track meets the drawing condition.
In the embodiment of the application, the drawing conditions may include: the area of the overlapping region between the second track and the characters is smaller than an area threshold, and the distances from the start point and the end point of the second track to the edges of the outer frame are both smaller than a distance threshold. Optionally, the distance threshold may range from 5 to 35 pixels.
Note that when the characters are written on the user interface through sliding instructions, the strokes of some character may run long because of the user's writing habits, so the second track drawn through the second user touch instruction may overlap with that character's strokes. For this reason, the controller may be preconfigured with an area threshold; judging whether the second track satisfies the drawing condition against this area threshold improves the reliability of the judgment.
For example, assume the distances from the start point and the end point of the second track to the edges of the outer frame are both smaller than the distance threshold. If the area of the overlapping region between the second track and the characters is smaller than the area threshold, the controller may conclude that the second track merely overlaps at least one of the characters because of the user's writing habits, and that the second track satisfies the drawing condition.
If instead the area of the overlapping region between the second track and the characters is greater than or equal to the area threshold, the controller may conclude that the overlap is too large, probably the result of a user misoperation, and that the second track does not satisfy the drawing condition.
When the controller detects that the second track satisfies the drawing condition, step 505 may be performed; when the controller detects that the second track does not satisfy the drawing condition, step 506 may be performed.
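A sketch of the step 504 check under this first set of conditions. The overlap estimate and the threshold values are assumptions (the distance threshold is picked from the 5-35 pixel range mentioned above); a real implementation would more likely rasterize the stroke against the character pixels.

```python
def point_in_rect(x: float, y: float, rect: tuple[float, float, float, float]) -> bool:
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def approx_overlap_area(track, char_boxes, stroke_width: float = 4.0) -> float:
    """Crude estimate: each sampled track point lying inside a character's
    bounding box contributes one stroke_width x stroke_width patch."""
    hits = sum(1 for x, y in track
               if any(point_in_rect(x, y, box) for box in char_boxes))
    return hits * stroke_width * stroke_width

def satisfies_basic_drawing_condition(track, frame, char_boxes,
                                      area_threshold: float = 200.0,
                                      distance_threshold: float = 20.0) -> bool:
    """Step 504, first set of conditions: small overlap with the characters,
    and both endpoints close to an edge of the outer frame."""
    left, top, right, bottom = frame

    def edge_distance(x: float, y: float) -> float:
        # Distance from an interior point to the nearest frame edge.
        return min(x - left, right - x, y - top, bottom - y)

    (sx, sy), (ex, ey) = track[0], track[-1]
    return (approx_overlap_area(track, char_boxes) < area_threshold
            and edge_distance(sx, sy) < distance_threshold
            and edge_distance(ex, ey) < distance_threshold)
```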
Step 505, drawing an inner scribe line for separating the plurality of characters within the outer frame, based on the second track.
Because the second track is generated by the user through the second user touch instruction, it is usually an irregular line. In the embodiment of the application, when the controller detects that the second track satisfies the drawing condition and draws the inner scribe line based on it, the controller can correct the second track to obtain a regular inner scribe line.
For example, referring to fig. 10, the second track 604a has no overlapping region with the characters 601. The first distance, between the start point of the second track 604a and the upper edge of the outer frame 603, and the second distance, between its end point and the lower edge of the outer frame 603, are both smaller than the distance threshold. Since the second track 604a satisfies the drawing condition, the controller may correct it to draw the inner scribe line 605a within the outer frame 603. Referring to fig. 11, the inner scribe line 605a may be a straight line parallel to the left and right edges of the outer frame 603.
In the embodiment of the application, referring to fig. 12, the second track 604b drawn by the user on the user interface 60 through the second user touch instruction may also be a curved line. If the controller detects that this curve satisfies the drawing conditions, it can correct the curve to obtain a regular inner scribe line, for example a broken line or a straight line.
For example, as shown in fig. 12, assume the second track acquired by the controller is a curved line 604b that divides the characters into two groups: the first group comprises "first part", "second part", and "description 2"; the second group comprises "description 1", "responsible person 1", and "responsible person 2". From the drawing conditions above, the second track 604b shown in fig. 12 also satisfies the drawing conditions, and the controller can draw an inner scribe line 605b within the outer frame 603 based on it. For example, referring to fig. 13, the inner scribe line 605b may be a broken line. Alternatively, referring to fig. 14, the controller may adjust the positions of the characters on the user interface and draw an inner scribe line 605b that is a straight line. In fig. 14, the inner scribe line 605b divides the characters into a left part and a right part and may be parallel to the left and right edges of the outer frame. Of course, the inner scribe line 605b could also divide the characters into an upper part and a lower part and be parallel to the upper and lower edges of the outer frame; the embodiment of the application does not limit this.
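One plausible way to "correct" a qualifying second track into a regular inner scribe line is to snap it to a straight divider spanning the outer frame, as in the sketch below. The mean-coordinate snapping rule is an assumption; an implementation could equally fit a broken line, as fig. 13 illustrates.

```python
def correct_to_scribe_line(track, frame):
    """Snap a qualifying second track to a straight inner scribe line.
    Endpoints near the top and bottom edges give a vertical divider at the
    stroke's mean x; endpoints near the left and right edges give a
    horizontal divider at the mean y."""
    left, top, right, bottom = frame
    (sx, sy), (ex, ey) = track[0], track[-1]
    to_top_bottom = min(sy - top, bottom - sy) + min(ey - top, bottom - ey)
    to_left_right = min(sx - left, right - sx) + min(ex - left, right - ex)
    if to_top_bottom <= to_left_right:
        x = sum(px for px, _ in track) / len(track)  # mean x of the stroke
        return (x, top), (x, bottom)                 # vertical scribe line
    y = sum(py for _, py in track) / len(track)      # mean y of the stroke
    return (left, y), (right, y)                     # horizontal scribe line
```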
In this embodiment of the present application, the drawing conditions may further include: the starting point and the ending point of the second track are both positioned in the outer frame, the distance between the starting point of the second track and the first side line of the outer frame is smaller than a distance threshold, and the distance between the ending point of the second track and the second side line of the outer frame is smaller than the distance threshold. The first side line and the second side line may be two opposite sides of the outer frame. That is, the second track is located in the outer frame.
Further, the drawing conditions may also include: the difference between the length of the straight line connecting the start point and the end point of the second track and the actual length of the second track is smaller than a difference threshold, and the displacement between the start point and the end point of the second track in the target direction is smaller than a distance threshold, where the target direction is parallel to the first side line and the second side line. That is, the second track should be substantially straight and substantially perpendicular to both the first side line and the second side line.
Alternatively, the distance threshold may be 50 pixels, and the difference threshold may also be 50 pixels.
In the embodiment of the present application, assume that the row direction of the pixels handled by the controller is the horizontal axis (X-axis) and the column direction is the vertical axis (Y-axis). The first side line and the second side line of the rectangular frame drawn by the controller may be parallel to the X-axis, i.e., they may be the upper and lower side lines of the outer frame, respectively. Alternatively, the first side line and the second side line may both be parallel to the Y-axis, i.e., they may be the left and right side lines of the outer frame, respectively.
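As an illustration of how a controller might evaluate these conditions, the sketch below checks a vertical second track against the thresholds above. The point-list input, the tuple layout of the frame, and the omission of the character-overlap test are simplifying assumptions, not the patent's implementation.

```python
import math

def satisfies_drawing_condition(track, frame, dist_thresh=50, diff_thresh=50):
    """Check the further drawing conditions for a vertical inner scribe line.

    track: list of (x, y) points sampled from the second user touch
    instruction; frame: (left, top, right, bottom) of the outer frame.
    """
    left, top, right, bottom = frame
    (sx, sy), (ex, ey) = track[0], track[-1]

    # Start and end points must both lie inside the outer frame.
    for x, y in (track[0], track[-1]):
        if not (left <= x <= right and top <= y <= bottom):
            return False

    # Start near the upper side line, end near the lower side line
    # (the horizontal case is symmetric).
    if abs(sy - top) >= dist_thresh or abs(ey - bottom) >= dist_thresh:
        return False

    # Near-straightness: the chord must be close to the actual arc length.
    chord = math.hypot(ex - sx, ey - sy)
    arc = sum(math.hypot(x2 - x1, y2 - y1)
              for (x1, y1), (x2, y2) in zip(track, track[1:]))
    if arc - chord >= diff_thresh:
        return False

    # Small displacement in the target direction (parallel to the upper and
    # lower side lines): the endpoints should share roughly the same x.
    return abs(ex - sx) < dist_thresh
```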
For example, the second track 604a shown in fig. 10 has no overlapping area with the plurality of characters, its start point and end point are both located within the outer frame, the distance between its start point and the upper side line of the outer frame is small, the distance between its end point and the lower side line of the outer frame is small, and the track is substantially straight and perpendicular to the upper and lower side lines. Therefore, the second track 604a shown in fig. 10 satisfies the drawing conditions.
Although the second track 604b shown in fig. 12 has no overlapping area with the plurality of characters, its start point and end point are both located within the outer frame, the distance between its start point and the left side line of the outer frame is small, and the distance between its end point and the right side line of the outer frame is small, the track is far from straight and does not run substantially perpendicular to the left and right side lines of the outer frame 603. Therefore, the second track 604b shown in fig. 12 does not satisfy these further drawing conditions.
In an embodiment of the present application, referring to fig. 11, after the controller draws the inner scribe line 605a on the user interface 60 based on the second track, the user may input another second user touch instruction in the user interface 60 shown in fig. 11. Referring to fig. 15, the controller may then acquire the track 604c of that instruction and, as shown in fig. 16, draw an inner scribe line 605c based on the track 604c. Further, referring to fig. 17, if the user inputs a second user touch instruction yet again in the user interface 60 shown in fig. 16, the controller may acquire the track 604d and, as shown in fig. 18, draw an inner scribe line 605d based on the track 604d.
Step 506, displaying the second track.
When the controller detects that the second track does not satisfy the drawing condition, the controller may simply display the second track. In particular, when the area of the overlapping region between the second track and the plurality of characters is greater than or equal to the area threshold, the controller may determine that the second track was generated by a misoperation. In this case, after displaying the second track, the controller may also display prompt information asking the user whether to delete it.
Step 507, when a table generation instruction is detected, determining the plurality of grids formed by the outer frame and the inner scribe lines as a plurality of cells of the table.
In the embodiment of the application, the table generation instruction may be a user touch instruction acting outside the outer frame, or a click instruction acting on a generate-table button in the user interface. When the controller detects the table generation instruction, it may determine the grids formed by the outer frame and the inner scribe lines as the cells of the table.
By way of example, referring to fig. 11, assuming that one inner scribe line is drawn in the outer frame, the outer frame and the inner scribe line form two grids, which may serve as two cells of the table; that is, the table is a table with 1 row and 2 columns.
Alternatively, referring to fig. 16, assuming that two inner scribe lines are drawn in the outer frame, the outer frame and the inner scribe lines form four grids, which may serve as four cells of the table; that is, the table is a table with 2 rows and 2 columns.
Alternatively, referring to fig. 18, assuming that three inner scribe lines are drawn in the outer frame, the outer frame and the inner scribe lines form six grids, which may serve as six cells of the table; that is, the table is a table with 3 rows and 2 columns.
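The examples above amount to partitioning the outer frame along the inner scribe lines. A minimal sketch of that partition, assuming axis-aligned lines given by their x or y coordinates (the function and its signature are illustrative, not the patent's API):

```python
def build_cells(frame, v_lines, h_lines):
    """Partition the outer frame into grid cells.

    frame: (left, top, right, bottom); v_lines / h_lines: x / y coordinates
    of the vertical / horizontal inner scribe lines.
    """
    left, top, right, bottom = frame
    xs = [left] + sorted(v_lines) + [right]
    ys = [top] + sorted(h_lines) + [bottom]
    # Adjacent coordinate pairs delimit one cell each, row by row; this also
    # gives every cell in a row the same height and every cell in a column
    # the same width.
    return [[(x0, y0, x1, y1) for x0, x1 in zip(xs, xs[1:])]
            for y0, y1 in zip(ys, ys[1:])]
```

With one vertical line this yields the 1-row, 2-column grid of fig. 11; adding one horizontal line yields the 2 x 2 grid of fig. 16, and a second horizontal line the 3-row, 2-column grid of fig. 18.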
Step 508, determining the characters displayed in the area where each cell is located as the content data of that cell, so as to obtain the table.
After determining the plurality of cells of the table, the controller may identify the characters displayed in the area where each cell is located and fill them into that cell; that is, the characters displayed in the area occupied by a cell are determined as the content data of that cell, thereby obtaining the table.
For example, referring to fig. 18, the controller may determine the character "first portion" displayed in the area of the cell in the first row and first column as the content data of that cell; the character "second portion" displayed in the area of the cell in the first row and second column as the content data of that cell; the character "description 1" displayed in the area of the cell in the second row and first column as the content data of that cell; the character "description 2" displayed in the area of the cell in the second row and second column as the content data of that cell; the character "responsible person 1" displayed in the area of the cell in the third row and first column as the content data of that cell; and the character "responsible person 2" displayed in the area of the cell in the third row and second column as the content data of that cell.
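A possible way to perform this matching is to assign each character block to the cell that contains the center of its bounding box. The center-point rule and all names below are assumptions for illustration; the patent only states that characters are matched to the area where each cell is located.

```python
def fill_cells(cells, characters):
    """Assign each character block to the cell containing its center point.

    cells: grid returned by build_cells(); characters: list of
    (text, (cx, cy)) pairs, where (cx, cy) is the center of the
    character block's bounding box.
    """
    table = [["" for _ in row] for row in cells]
    for text, (cx, cy) in characters:
        for r, row in enumerate(cells):
            for c, (x0, y0, x1, y1) in enumerate(row):
                if x0 <= cx < x1 and y0 <= cy < y1:
                    table[r][c] = text
    return table
```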
It should be noted that a table generally includes multiple rows and/or columns, and changing the size of one cell may affect the other cells in the same row or column. Therefore, when the controller generates the table, it sets the same height for every cell in a row and the same width for every cell in a column, which keeps the table neat and well coordinated.
Step 509, adjusting the display effect of at least one element among the plurality of characters, the outer frame and the inner scribe lines.
In the embodiment of the application, before detecting the table generation instruction, the controller may display the outer frame, the inner scribe lines and the plurality of characters on the user interface with a preset display effect. When the controller detects the table generation instruction, it may adjust the display effect of at least one element among the plurality of characters, the outer frame and the inner scribe lines, so that the user can easily tell that the table has been generated. The controller may also edit, select, copy or save the table in response to operation instructions triggered by the user.
Alternatively, the display effect of the at least one element may include the colors of the outer frame, the inner scribe lines and the plurality of characters, the line widths of the outer frame and the inner scribe lines, the font size of the plurality of characters, and the like.
For example, before the controller detects the table generation instruction, the outer frame, the inner scribe lines and the plurality of characters displayed on the user interface may all be green. When the controller detects the table generation instruction, it may adjust the colors of the outer frame, the inner scribe lines and the plurality of characters to black.
Alternatively, referring to fig. 19, before detecting the table generation instruction, the controller displays the lines of the outer frame and the inner scribe lines with a width of 0.75 pt. When the table generation instruction is detected, the controller may adjust the width of both the outer frame lines and the inner scribe lines to 2 pt.
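A minimal sketch of this state switch, with a hypothetical style record; the field names and the default values mirror the examples above but are otherwise invented:

```python
from dataclasses import dataclass

@dataclass
class TableStyle:
    color: str = "green"         # preset display effect before generation
    line_width_pt: float = 0.75

def on_table_generated(_current: TableStyle) -> TableStyle:
    # Switch to the "generated" look described above so the user can tell
    # at a glance that the table has been created.
    return TableStyle(color="black", line_width_pt=2.0)
```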
In summary, the embodiments of the present application provide a table generation method that draws an outer frame on a user interface based on a first track surrounding a plurality of characters, draws inner scribe lines on the user interface based on second tracks separating the plurality of characters, and finally generates a table based on the outer frame, the inner scribe lines and the plurality of characters. With this method, when a plurality of characters are displayed on the user interface, the user simply draws tracks according to the arrangement of the characters; the controller draws the outer frame and the inner scribe lines from the acquired tracks and automatically generates the table. The user does not need to adjust the table while inputting characters, so the method is simple to operate and highly flexible.
Fig. 20 is a schematic diagram of the functional configuration of a display device according to an embodiment of the present application. As shown in fig. 20, the memory 290 is used to store an operating system, application programs, contents, user data and the like, and to drive the system operation of the display device 200 and respond to various user operations under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory. For example, the memory 290 stores a computer program, and when the controller 210 executes the computer program, the table generation method provided in the above method embodiments can be implemented.
The memory 290 is specifically used to store the operating program that drives the controller 210 of the display device 200, the various application programs built into the display device 200, the various application programs downloaded by the user from external devices, the various graphical user interfaces related to these applications, the various objects related to those interfaces, user data information, and the internal data that supports the applications. The memory 290 is also used to store system software such as the operating system (OS) kernel, middleware and applications, and to store input video data and audio data as well as other user data.
The memory 290 is specifically configured to store drivers and related data for the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the modem 220, the detector 240, the input/output interface, and the like.
In some embodiments, the memory 290 may store software and/or programs. The software programs representing the operating system (OS) include, for example, a kernel, middleware, application programming interfaces (APIs) and/or application programs. The kernel may control or manage system resources and the functions implemented by other programs (such as the middleware, the APIs or the applications), and may provide interfaces that allow the middleware, the APIs or the applications to access the controller, so as to control or manage system resources.
By way of example, the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, other applications 2912, a browser module, and the like. By running these software programs in the memory 290, the controller 210 performs functions such as broadcast television signal reception and demodulation, television channel selection control, volume selection control, image control, display control, audio control, external instruction recognition, communication control, optical signal reception, power control, a software platform supporting various other functions, and browsing.
In some examples, particularly for display devices 200 such as electronic whiteboards, commercial large screen displays, and the like, the memory 290 may not include a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an audio control module 2906, a light receiving module 2909, a browser module, and the like.
Fig. 21 is a block diagram of the software system of a display device according to an embodiment of the present application. As shown in fig. 21, the operating system 2911 includes operating software for handling various basic system services and performing hardware-related tasks, and acts as a medium for data processing between application programs and hardware components.
In some embodiments, portions of the operating system kernel may contain a series of software to manage display device hardware resources and to serve other programs or software code.
In other embodiments, portions of the operating system kernel may contain one or more device drivers, which may be sets of software code in the operating system that help operate or control the devices or hardware associated with the display device. A driver may contain code to operate video, audio and/or other multimedia components; examples include display screen, camera, flash, Wi-Fi and audio drivers.
The accessibility module 2911-1 is configured to modify or access application programs so as to realize the accessibility of the applications and the operability of their displayed content.
The communication module 2911-2 is used for connecting with other peripheral devices via related communication interfaces and communication networks.
The user interface module 2911-3 is configured to provide the objects that display the user interface, for access by the various applications, so as to enable user operation.
The control application 2911-4 is used to control process management, including managing runtime applications and the like.
The event delivery system 2914 may be implemented within the operating system 2911 or in the application 2912. In some embodiments it is implemented partly within the operating system and partly in the application, and is used to listen for various user input events and to execute one or more sets of predefined operations in response to the recognition results of the various events or sub-events.
The event monitoring module 2914-1 is configured to listen for events or sub-events input through the user input interface.
The event recognition module 2914-2 is configured to recognize the various events or sub-events according to the event definitions for the various user input interfaces, and to dispatch them to the processes that execute the corresponding sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (such as the control apparatus 100), for example sub-events of voice input, gesture input sub-events from gesture recognition, or sub-events of remote-control key instruction input. By way of example, the sub-events of a remote control may take many forms, including but not limited to pressing the up/down/left/right keys or the OK key, holding a key down, and operations on non-physical keys such as move, hold and release.
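As an illustration of the listen/recognize/dispatch flow just described, here is a minimal sketch; the EventDeliverySystem class, its method names and the string event types are invented for illustration and are not the patent's modules 2914-1/2914-2 themselves.

```python
from collections import defaultdict
from typing import Any, Callable

class EventDeliverySystem:
    """Toy dispatcher: processes register handlers for event types, and
    recognized events are delivered to every registered handler."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[Any], None]) -> None:
        # A process registers the set of operations it runs for this event.
        self._handlers[event_type].append(handler)

    def dispatch(self, event_type: str, payload: Any) -> None:
        # Recognition has reduced a raw input to an event type; run every
        # predefined handler registered for it.
        for handler in self._handlers[event_type]:
            handler(payload)

# Usage: deliver a remote-control key sub-event to an interested process.
events = EventDeliverySystem()
events.subscribe("key_ok", lambda payload: print("OK pressed:", payload))
events.dispatch("key_ok", {"source": "remote"})
```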
The interface layout management module 2913 receives, directly or indirectly, the user input events or sub-events from the event delivery system 2914, and updates the layout of the user interface, including but not limited to the positions of controls or child controls in the interface and various operations related to the interface layout, such as the size, position and level of containers.
As shown in fig. 22, the application layer 2912 contains various applications that may be executed on the display device 200. The applications may include, but are not limited to, a live television application, a video-on-demand application, a media center application, an application center, a gaming application, and the like.
Live television applications can provide live television from different signal sources. For example, a live television application may use input from cable television, radio broadcast, satellite services or other types of live television services, and may display the video of the live television signal on the display device 200.
Video-on-demand applications can provide video from different storage sources. Unlike live television applications, video on demand plays video from storage sources; for example, the video may come from cloud storage on the server side or from a local hard disk that stores video programs.
The media center application may provide various applications for playing multimedia content. For example, a media center may be a service distinct from live television or video on demand, through which a user can access various images or audio.
An application center may be provided to store various applications. An application may be a game or another application that is associated with a computer system or other device but can run on a smart television. The application center may obtain these applications from different sources, store them in local storage, and run them on the display device 200.
With the display device provided by this embodiment, the user can draw tracks on the user interface, the controller draws the outer frame and the inner scribe lines from the acquired tracks, and a table is automatically generated based on the outer frame, the inner scribe lines and the plurality of characters. The user does not need to adjust the table while inputting characters, so the table generation method is simple to operate and highly flexible.
It should be noted that the above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The embodiment of the application also provides a computer-readable storage medium storing instructions which, when run on a computer, cause the computer to execute the table generation method provided by the above method embodiments.
The foregoing description of the exemplary embodiments of the application is not intended to limit the application to the particular embodiments disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the application.

Claims (9)

1. A display device, characterized by comprising:
a display configured to display a user interface, the user interface comprising a plurality of characters;
a touch detector in communication with the display and configured to detect a user touch instruction entered by a user on a user interface of the display;
a controller in communication with the display and the touch detector configured to:
acquiring a first track surrounding the plurality of characters based on a first user touch instruction input on the user interface;
drawing an outer frame surrounding the plurality of characters based on the first trajectory;
acquiring a second track for separating the plurality of characters based on a second user touch instruction input on the user interface in an area surrounded by the outer frame;
if it is detected that the second track satisfies a drawing condition, drawing an inner scribe line separating the plurality of characters in the outer frame based on the second track;
wherein the drawing condition comprises: the area of the overlapping region between the second track and the plurality of characters is smaller than an area threshold, and the distances between the start point and the end point of the second track and the side lines of the outer frame are both smaller than a distance threshold; and
generating a table on the display according to the outer frame, the inner scribe line and the plurality of characters.
2. The display device of claim 1, wherein the controller is configured to:
when a table generation instruction is detected, generate a table on the display according to the outer frame, the inner scribe line and the plurality of characters.
3. The display device according to claim 2, wherein the table generation instruction is a user touch instruction acting outside the outer frame, or a click instruction acting on a generate-table button in the user interface.
4. A display device according to any one of claims 1 to 3, wherein the controller is configured to:
determine a plurality of grids formed by the outer frame and the inner scribe line as a plurality of cells of a table; and
determine the characters displayed in the area where each cell is located as the content data of that cell, so as to obtain the table.
5. The display device of claim 4, wherein the controller is further configured to:
and adjusting the display effect of at least one element in the outer frame and the inner scribing.
6. A display device according to any one of claims 1 to 3, wherein the controller is configured to:
if it is detected that the first track is a rectangle, draw a rectangular outer frame surrounding the plurality of characters.
7. A display device as claimed in any one of claims 1 to 3, wherein the outer frame is a regular rectangle, and the inner scribe line is parallel to at least one side line of the outer frame.
8. A form generation method, characterized by being applied to a display device, the method comprising:
acquiring a first track surrounding a plurality of characters based on a first user touch instruction input on a user interface;
drawing an outer frame surrounding the plurality of characters based on the first trajectory;
acquiring a second track for separating the plurality of characters based on a second user touch instruction input on the user interface in an area surrounded by the outer frame;
if it is detected that the second track satisfies a drawing condition, drawing an inner scribe line separating the plurality of characters in the outer frame based on the second track, wherein the drawing condition comprises: the area of the overlapping region between the second track and the plurality of characters is smaller than an area threshold, and the distances between the start point and the end point of the second track and the side lines of the outer frame are both smaller than a distance threshold; and
generating a table according to the outer frame, the inner scribe line and the plurality of characters.
9. A computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the table generation method of claim 8.
CN202010061567.5A 2020-01-16 2020-01-16 Form generation method and display device Active CN111310424B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010061567.5A CN111310424B (en) 2020-01-16 2020-01-16 Form generation method and display device


Publications (2)

Publication Number Publication Date
CN111310424A CN111310424A (en) 2020-06-19
CN111310424B (en) 2023-09-22

Family

ID=71158241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010061567.5A Active CN111310424B (en) 2020-01-16 2020-01-16 Form generation method and display device

Country Status (1)

Country Link
CN (1) CN111310424B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1487392A (en) * 2002-08-23 2004-04-07 Apparatus and method for identificating data input table through touch pen moving
CN105677629A (en) * 2015-12-30 2016-06-15 联想(北京)有限公司 Information processing method and electronic device
CN106293436A (en) * 2015-05-27 2017-01-04 仁宝电脑工业股份有限公司 Chart Drawing Method
CN107203278A (en) * 2017-05-17 2017-09-26 努比亚技术有限公司 One-handed performance input method, mobile terminal and storage medium
CN108334486A (en) * 2018-01-19 2018-07-27 广州视源电子科技股份有限公司 table control method, device, equipment and storage medium
CN110647795A (en) * 2019-07-30 2020-01-03 正和智能网络科技(广州)有限公司 Form recognition method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant