CN114849240A - UI editing method and device applied to game client and electronic equipment - Google Patents
UI editing method and device applied to game client and electronic equipment
- Publication number
- CN114849240A (application CN202210565469.4A)
- Authority
- CN
- China
- Prior art keywords
- editing
- node
- interface
- user
- control node
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Stored Programmes (AREA)
Abstract
The application provides a UI editing method and device applied to a game client, and electronic equipment. The method includes: when a developer-mode entry instruction is detected, calling out an editing interface of a UI editor; when a control node created by a user is detected on the editing interface, mounting, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node; and generating a user-defined UI (User Interface) on the game interface based on the editing operation performed by the user on the selectable editing item. The embodiments of the application can lower the threshold of UI editing, so that the user can edit the UI in a personalized manner.
Description
Technical Field
The application relates to the field of UI editing, in particular to a UI editing method and device applied to a game client and electronic equipment.
Background
In the related art, a professional UI engine (e.g., UNITY) is required to handle UI (User Interface) editing. These specialized UI engines typically require the user to have some programming background and to follow the complex operation flow designed by the engine in order to edit the UI. As a result, the threshold for UI editing in the prior art is high, making it inconvenient for the user to edit a personalized UI.
Disclosure of Invention
An object of the present application is to provide a UI editing method and apparatus applied to a game client, and an electronic device, which can lower the threshold of UI editing, thereby making it convenient for the user to edit the UI in a personalized manner.
According to an aspect of an embodiment of the present application, a UI editing method applied to a game client is disclosed, the method including:
when a developer-mode entry instruction is detected, calling out an editing interface of a UI editor;
when a control node created by a user is detected on the editing interface, mounting, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and generating a user-defined UI (User Interface) on the game interface based on the editing operation performed by the user on the selectable editing item.
According to an aspect of an embodiment of the present application, a UI editing apparatus applied to a game client is disclosed, the apparatus including:
the interface calling-out module is configured to call out an editing interface of the UI editor when a developer-mode entry indication is detected;
the component mounting module is configured to, when a control node created by the user is detected on the editing interface, mount, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and the UI generating module is configured to generate a user-defined UI on the game interface based on the editing operation performed by the user on the selectable editing item.
In an exemplary embodiment of the present application, the component mounting module is configured to:
and mounting, on the control node, a controller component for switching the interface page where a child node of the control node is located, so as to obtain a state editing item of each child node of the control node.
In an exemplary embodiment of the present application, the UI generation module is configured to:
generating a state list for describing the state of each child node based on the editing operation performed by the user on the state editing item of each child node;
and generating the user-defined UI on the game interface according to the state list.
In an exemplary embodiment of the present application, the state of each child node includes at least one of the following attribute information: the node display attribute, the node hiding attribute, the node position attribute, the node size attribute, the node icon attribute, the node color attribute, the node font attribute and the node appearance attribute.
In an exemplary embodiment of the present application, the apparatus is configured to:
and when the game interface detects a switching operation triggering the control node to perform state switching, switching, through the controller component, the interface page where the child node is located, so as to control the control node to perform state switching.
In an exemplary embodiment of the present application, the apparatus is configured to:
and, by using an event callback on page switching, switching the interface page where the child node is located through the controller component while also switching the interface page where a grandchild node of the control node is located.
In an exemplary embodiment of the present application, the interface callout module is configured to:
and when the developer mode entering indication is detected, taking a game interface in the current game scene as the editing interface.
According to an aspect of an embodiment of the present application, an electronic device is disclosed, including: one or more processors; and storage means for storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the method described in any of the above embodiments.
According to an aspect of embodiments herein, a computer program medium is disclosed, having computer-readable instructions stored thereon which, when executed by a processor of a computer, cause the computer to perform the method described in any of the above embodiments.
According to an aspect of embodiments herein, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations described above.
According to the method and device, the editing interface of the UI editor is opened to the user at the game client, so that the user can edit the UI in a personalized manner; and when a control node created by the user is detected on the editing interface, a component for modifying the control node is automatically mounted on the control node, so that the user does not need a deep understanding of the abstract component concept and can edit the UI by focusing only on the intuitive, visual control node. This lowers the threshold of UI editing and makes it convenient for the user to edit the UI in a personalized manner.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1 shows a flowchart of a UI editing method applied to a game client according to one embodiment of the present application.
FIG. 2 illustrates an example use diagram of a project management system of a UI editor in accordance with one embodiment of the application.
FIG. 3 illustrates an example diagram of a node component system of a UI editor in accordance with one embodiment of the application.
FIG. 4 illustrates an example use diagram of an interface construction system for a UI editor in accordance with one embodiment of the application.
FIG. 5 shows a schematic diagram of the internal structure of a node according to one embodiment of the present application.
Fig. 6 shows a block diagram of a UI editing apparatus applied to a game client according to an embodiment of the present application.
FIG. 7 shows a hardware diagram of an electronic device according to one embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The drawings are merely schematic illustrations of the present application and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more example embodiments. In the following description, numerous specific details are provided to give a thorough understanding of example embodiments of the present application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, steps, and so forth. In other instances, well-known structures, methods, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The application provides a UI editing method applied to a game client, which is mainly used for enabling a user to conveniently perform user-defined editing on the UI of the game client according to the user's own preferences.
Fig. 1 is a flowchart illustrating a UI editing method applied to a game client according to an embodiment of the present application. The method comprises the following steps:
step S110, when a developer-mode entry instruction is detected, calling out an editing interface of a UI editor;
step S120, when a control node created by a user is detected on the editing interface, mounting, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and step S130, generating a user-defined UI on the game interface based on the editing operation performed by the user on the selectable editing item.
Specifically, the game client provided by the embodiment of the application is internally integrated with a UI editor which can be opened to the user. After the user enters the game client, the game client may be instructed to enter the developer mode through a preset instruction (e.g., a preset shortcut key instruction, a preset instruction code, or the like) or through a preset interface of the game client (e.g., a preset option of a menu in the game client).
After the game client enters the developer mode, the editing interface of the UI editor is called out so that the user can customize the UI on the editing interface. The user customizes the UI by creating control nodes and editing them in the editing interface, for example by creating a button control node and editing its size, position, color and the like to obtain the button in the user-defined UI.
When the game client detects a control node created by the user on the editing interface, it automatically mounts, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node, and exposes the selectable editing item to the user. The modification that the component applies to the control node includes, but is not limited to: modifying the appearance of the control node, modifying the action of the control node, modifying the layout of the control node, and the like.
After the selectable editing items are exposed to the user, the user can edit them according to the user's own requirements. The game client generates a user-defined UI on the game interface based on the editing operations performed by the user on the selectable editing items, so that once the UI editing on the editing interface is finished, the user can perform various operations on the game interface through the edited, user-defined UI.
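The following TypeScript sketch illustrates steps S110 to S130 under stated assumptions: the class and member names (EditingInterface, ControlNode, mount, selectableEditingItems, generateUI and so on) are illustrative only and are not taken from the patent or from any particular engine.

```typescript
// Illustrative sketch of steps S110-S130; all names are assumptions.
interface EditingItem { nodeId: string; name: string; value: unknown }

class UIComponent {
  constructor(public readonly kind: "appearance" | "action" | "layout") {}
  editingItems(nodeId: string): EditingItem[] {
    return [{ nodeId, name: this.kind, value: null }];
  }
}

class ControlNode {
  private components: UIComponent[] = [];
  constructor(public readonly id: string) {}
  mount(component: UIComponent): void { this.components.push(component); }
  selectableEditingItems(): EditingItem[] {
    return this.components.flatMap(c => c.editingItems(this.id));
  }
}

class EditingInterface {
  // Step S120: auto-mount modification components on a newly created control node.
  onControlNodeCreated(node: ControlNode): EditingItem[] {
    node.mount(new UIComponent("appearance"));
    node.mount(new UIComponent("action"));
    node.mount(new UIComponent("layout"));
    return node.selectableEditingItems(); // exposed to the user for editing
  }
  // Step S130: turn the user's editing operations into a user-defined UI description.
  generateUI(edits: EditingItem[]): Record<string, unknown> {
    const ui: Record<string, unknown> = {};
    for (const e of edits) ui[`${e.nodeId}.${e.name}`] = e.value;
    return ui;
  }
}

// Step S110: on a developer-mode entry instruction, call out the editing interface.
function onDeveloperModeEntry(): EditingInterface {
  return new EditingInterface();
}
```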
Therefore, in the embodiment of the application, by opening the editing interface of the UI editor to the user at the game client, the user can edit the UI in a personalized manner; and when a control node created by the user is detected on the editing interface, a component for modifying the control node is automatically mounted on the control node, so that the user does not need a deep understanding of the abstract component concept and can edit the UI by focusing only on the intuitive, visual control node. This lowers the threshold of UI editing and makes it convenient for the user to edit the UI in a personalized manner.
FIG. 2 illustrates an example use diagram of the project management system of a UI editor of an embodiment of the application.
Referring to fig. 2, in this embodiment, the project management system is mainly used for managing and saving UI-related data. Each UI project corresponds to one interface, and its content includes: an interface configuration file, a screenshot, triggers, and the like. One UI library includes a plurality of UI projects, and the UI editor in this embodiment includes the following UI libraries: a map library (each map archive corresponds to one map library), a master library (each account corresponds to one master library), an online library (the client loads the online library when online), an official template library, a player template library, and the like.
UIProjectLibManager in fig. 2 is the UI library manager, responsible for the various operations on the UI libraries, such as adding, deleting, changing, and checking. UIEditorManager in fig. 2 is the UI editing manager, responsible for the various operations of the UI editor.
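As a rough illustration of how such a project management system might be organized, the TypeScript sketch below models UI projects and the add/delete/change/check operations of a library manager. Only the name UIProjectLibManager comes from the description; the fields, library kinds, and method signatures are assumptions.

```typescript
// Hypothetical data layout for the project management system; only the name
// UIProjectLibManager comes from the description, the rest is assumed.
interface UIProject {
  id: string;
  interfaceConfig: string; // interface configuration file (e.g. serialized JSON)
  screenshot?: string;     // preview screenshot path
  triggers: string[];      // trigger definitions attached to the interface
}

type LibraryKind = "map" | "master" | "online" | "officialTemplate" | "playerTemplate";

class UIProjectLibManager {
  private libraries = new Map<LibraryKind, Map<string, UIProject>>();

  // "Add": add a UI project to a library.
  add(lib: LibraryKind, project: UIProject): void {
    if (!this.libraries.has(lib)) this.libraries.set(lib, new Map());
    this.libraries.get(lib)!.set(project.id, project);
  }
  // "Delete": remove a UI project from a library.
  remove(lib: LibraryKind, id: string): boolean {
    return this.libraries.get(lib)?.delete(id) ?? false;
  }
  // "Change": overwrite an existing UI project.
  update(lib: LibraryKind, project: UIProject): void { this.add(lib, project); }
  // "Check": look up a UI project.
  get(lib: LibraryKind, id: string): UIProject | undefined {
    return this.libraries.get(lib)?.get(id);
  }
}
```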
FIG. 3 illustrates an example diagram of a node component system of a UI editor of an embodiment of the application.
Referring to fig. 3, in this embodiment, the node component system is mainly used for building the data framework, so that the structure is more extensible and easier to maintain. The DevUINode nodes form a node tree. Each control corresponds to a node, and various UIComponent components can be mounted on a node. Components are used to modify nodes so as to provide the nodes with various capabilities.
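A minimal sketch of this node/component structure is shown below. DevUINode and UIComponent are named in the description, but their members here (children, mount, apply) and the AppearanceComponent example are assumed for illustration.

```typescript
// Minimal sketch of the node/component structure; the member names are assumed.
abstract class UIComponent {
  // A component modifies the node it is mounted on to give it a capability.
  abstract apply(node: DevUINode): void;
}

class DevUINode {
  children: DevUINode[] = [];
  components: UIComponent[] = [];
  constructor(public name: string, public parent: DevUINode | null = null) {
    parent?.children.push(this); // nodes form a node tree
  }
  mount(component: UIComponent): void {
    this.components.push(component);
    component.apply(this); // the mounted component decorates this node
  }
}

// Example: a component that gives a node a visual-appearance capability.
class AppearanceComponent extends UIComponent {
  apply(node: DevUINode): void {
    console.log(`appearance capability added to ${node.name}`);
  }
}

const root = new DevUINode("root");
const button = new DevUINode("button_1", root); // child of root in the node tree
button.mount(new AppearanceComponent());
```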
FIG. 4 illustrates an example use diagram of an interface construction system of a UI editor of an embodiment of the application.
Referring to fig. 4, in this embodiment, the interface construction system is mainly used to build a UI project, which has been edited and saved through the project management system and the node component system, into a UI interface, and to load and display that UI interface in the game client, so as to obtain the corresponding game interface. Component creation in the interface construction system adopts the factory design pattern, which has the advantages of decoupling and easy extension.
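The factory design pattern mentioned here can be sketched as follows; the registry, the ControllerComponent class, and the type names are illustrative assumptions rather than the patent's actual implementation. The design choice is that the construction system only needs the component type name stored in the saved UI project, so new component types can be added without touching the construction code.

```typescript
// Sketch of factory-style component creation; the registry and names are assumed.
interface UIComponent { apply(nodeName: string): void }

type ComponentCtor = new () => UIComponent;

class ComponentFactory {
  private registry = new Map<string, ComponentCtor>();

  // Component types are registered by name, so construction stays decoupled
  // from the interface construction system and is easy to extend.
  register(type: string, ctor: ComponentCtor): void {
    this.registry.set(type, ctor);
  }
  create(type: string): UIComponent {
    const ctor = this.registry.get(type);
    if (!ctor) throw new Error(`unknown component type: ${type}`);
    return new ctor();
  }
}

// Usage: the construction system only knows the type name found in the saved
// UI project, not the concrete component class.
class ControllerComponent implements UIComponent {
  apply(nodeName: string): void { console.log(`controller mounted on ${nodeName}`); }
}

const factory = new ComponentFactory();
factory.register("controller", ControllerComponent);
factory.create("controller").apply("button_1");
```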
In one embodiment, when the developer mode entering indication is detected, the game interface in the current game scene is used as the editing interface.
In this embodiment, after the user instructs the game client to enter the developer mode, the game client calls out the UI editor and takes the game interface of the current game scene where the user is located as the editing interface of the UI editor. For example, the game scenes of the game client include a free exploration scene, a battle scene, and the like. After entering the free exploration scene, the user finds that the game interface in this scene does not match the user's preference. When the user, while in the free exploration scene, instructs the game client to enter the developer mode, the game client calls out the UI editor and uses the current game interface, namely the game interface of the free exploration scene, as the editing interface, so that the user can directly customize the UI of the game interface in the free exploration scene.
The advantage of this embodiment is that, by using the current game interface as the editing interface of the UI editor, the user can directly customize the UI of the game interface in the current game scene while playing, which meets the user's UI customization needs for that game scene in a targeted manner.
In one embodiment, upon detecting a developer mode entry indication, the game interface in the default game scenario is taken as an editing interface.
In an embodiment, a controller component for switching the interface page where a child node of the control node is located is mounted on the control node, so as to obtain a state editing item of each child node of the control node.
In this embodiment, one or more child nodes are set for the control node, and different child nodes are located on different interface pages so as to set the states of the control node under different events. A controller component is mounted on the control node, and through the controller component the control node can be switched among the different interface pages, so that the state editing items of all the child nodes can be exposed to the user and the user can edit the state editing item of each child node.
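A hedged sketch of such a controller component is given below; the ChildNode shape and the method names are assumptions.

```typescript
// Hedged sketch of a controller component that switches which interface page
// (and therefore which child node) is currently shown; names are assumed.
interface ChildNode { page: number; name: string }

class ControllerComponent {
  private currentPage = 0;
  constructor(private children: ChildNode[]) {}

  get activePage(): number { return this.currentPage; }

  // Switch to the interface page holding the child node for the wanted state.
  switchToPage(page: number): ChildNode[] {
    this.currentPage = page;
    return this.children.filter(c => c.page === page);
  }

  // One state editing item per child node, exposed so the user can edit each state.
  stateEditingItems(): { page: number; child: string }[] {
    return this.children.map(c => ({ page: c.page, child: c.name }));
  }
}

const controller = new ControllerComponent([
  { page: 0, name: "icon_normal" },
  { page: 1, name: "icon_pressed" },
]);
controller.switchToPage(1); // the control node now shows its "pressed" state
```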
In one embodiment, a state list describing the state of each child node is generated based on the editing operations performed by the user on the state editing item of each child node. The user-defined UI is then generated on the game interface according to the state list.
In this embodiment, a state list is mounted on the control node and is used to store the state of the child node on each interface page. After the state editing items of the child nodes are exposed to the user, the user performs editing operations on them. The game client generates the state list corresponding to these editing operations and then generates the corresponding user-defined UI on the game interface according to the state list.
In one embodiment, each child node state includes at least one of the following attribute information: the node display attribute, the node hiding attribute, the node position attribute, the node size attribute, the node icon attribute, the node color attribute, the node font attribute and the node appearance attribute.
The node display attribute identifies whether the node is in a displayed state; the node hiding attribute identifies whether the node is in a hidden state; the node position attribute identifies the position of the node; the node size attribute identifies the size of the node; the node icon attribute identifies the icon shown when the node is visualized; the node color attribute identifies the color shown when the node is visualized; the node font attribute identifies the font type or font size used by the node; and the node appearance attribute identifies the rotation or transparency shown when the node is visualized.
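One possible shape for the per-child-node state and the state list, assuming the attribute names above map onto record fields, is sketched below; every field and function name here is an assumption.

```typescript
// Possible shape of the per-child-node state and the state list mounted on the
// control node; every field name is an assumption based on the attribute list above.
interface NodeState {
  display?: boolean;                          // node display attribute
  hidden?: boolean;                           // node hiding attribute
  position?: { x: number; y: number };        // node position attribute
  size?: { width: number; height: number };   // node size attribute
  icon?: string;                              // node icon attribute
  color?: string;                             // node color attribute
  font?: { family: string; size: number };    // node font attribute
  appearance?: { rotation?: number; opacity?: number }; // node appearance attribute
}

// The state list maps each interface page (i.e. each state) to the recorded state.
type StateList = Map<number, NodeState>;

// Building the state list from the user's editing operations (step S130 style).
function buildStateList(edits: { page: number; state: NodeState }[]): StateList {
  const list: StateList = new Map();
  for (const e of edits) list.set(e.page, { ...list.get(e.page), ...e.state });
  return list;
}

const stateList = buildStateList([
  { page: 0, state: { display: true, color: "#ffffff" } },
  { page: 1, state: { display: true, color: "#ff0000", appearance: { opacity: 0.5 } } },
]);
```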
In an embodiment, when the game interface detects a switching operation that triggers the control node to perform state switching, the interface page where the child node is located is switched through the controller component to control the control node to perform state switching.
In this embodiment, after the user-defined UI is generated on the game interface, if a switching operation triggering the control node to perform state switching is detected, the interface page where the child node is located is switched through the controller component, and the control node is visually presented through the child node on the switched-to interface page, thereby controlling the control node to perform state switching.
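An illustrative runtime handler for this state switching is sketched below; the wiring between the game interface, the control node, and the controller component is assumed.

```typescript
// Illustrative runtime handling of a state-switching operation; names are assumed.
class ControllerComponent {
  private currentPage = 0;
  switchToPage(page: number): void {
    this.currentPage = page;
    console.log(`children on page ${this.currentPage} are now shown`);
  }
}

class ControlNode {
  constructor(private controller: ControllerComponent) {}
  // Called when the game interface detects a switching operation on this node.
  onSwitchOperation(targetState: number): void {
    // Switching the interface page where the child node is located is what
    // visually switches the control node's state.
    this.controller.switchToPage(targetState);
  }
}

const node = new ControlNode(new ControllerComponent());
node.onSwitchOperation(2); // the control node switches to state 2
```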
In an embodiment, by using an event callback on page switching, the interface page where the child node is located is switched through the controller component, and the interface page where a grandchild node of the control node is located is switched as well.
In this embodiment, the controller components of the UI editor of the game client are cascaded, so that the controller of the control node can control not only the states of its child nodes but also the states of its grandchild nodes. Specifically, the UI editor uses an event callback on page switching, so that when the controller switches the interface page where a child node is located, the interface page where the corresponding grandchild node is located is switched together with it, and the controller thus controls the states of the child nodes and the grandchild nodes at the same time.
The advantage of this embodiment is that, since the state of a child node generally needs to change together with its parent node, setting the controller components to a cascaded state simplifies the UI editing process.
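A sketch of how the cascading might be wired with an event callback fired on page switching is given below; the callback registration API shown is an assumption about how the described callback could be implemented.

```typescript
// Sketch of cascaded controllers driven by an event callback fired on page
// switching; the callback registration API is an assumption.
class ControllerComponent {
  private currentPage = 0;
  private pageChangedCallbacks: ((page: number) => void)[] = [];

  get activePage(): number { return this.currentPage; }

  onPageChanged(callback: (page: number) => void): void {
    this.pageChangedCallbacks.push(callback);
  }
  switchToPage(page: number): void {
    this.currentPage = page;
    // Event callback fired on page switching.
    for (const callback of this.pageChangedCallbacks) callback(page);
  }
}

// Cascade: when the child-level controller switches its page, the grandchild-level
// controller switches to the same page, so one switch updates both levels.
const childController = new ControllerComponent();
const grandchildController = new ControllerComponent();
childController.onPageChanged(page => grandchildController.switchToPage(page));

childController.switchToPage(3);
console.log(childController.activePage, grandchildController.activePage); // 3 3
```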
Fig. 5 is a schematic diagram illustrating an internal structure of a node according to an embodiment of the present application.
Referring to fig. 5, in the present embodiment, a controller component and a state list are mounted under a control node of the container type.
The controller component is mainly used for controlling the child nodes to switch among page 0 to page 10, thereby controlling the control node to switch among different states.
The state list is mainly used to record the detailed expression of each of page 0 to page 10 on the attribute information: the concrete representation of the display attribute, the concrete representation of the position attribute, the concrete representation of the appearance attribute, and the like.
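Putting the pieces of fig. 5 together, the sketch below shows a container-type control node carrying controller behaviour and a state list covering page 0 to page 10; the field and method names are illustrative assumptions.

```typescript
// Container-type control node with a controller and a state list covering
// page 0 to page 10, as in fig. 5; field and method names are assumptions.
interface NodeState { display?: boolean; position?: [number, number]; opacity?: number }

class ContainerControlNode {
  private currentPage = 0;
  // State list: one recorded state per page (0..10).
  private stateList = new Map<number, NodeState>();

  get activePage(): number { return this.currentPage; }

  setState(page: number, state: NodeState): void {
    this.stateList.set(page, state);
  }
  // Controller behaviour: switching the page applies that page's recorded state.
  switchToPage(page: number): NodeState | undefined {
    this.currentPage = page;
    return this.stateList.get(page);
  }
}

const container = new ContainerControlNode();
for (let page = 0; page <= 10; page++) {
  container.setState(page, { display: page % 2 === 0, position: [10 * page, 0] });
}
container.switchToPage(10); // { display: true, position: [100, 0] }
```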
Fig. 6 is a block diagram illustrating a UI editing apparatus applied to a game client according to an embodiment of the present application, the apparatus including:
an interface calling-out module 210, configured to call out an editing interface of the UI editor when a developer-mode entry instruction is detected;
a component mounting module 220, configured to, when a control node created by the user is detected on the editing interface, mount, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and a UI generating module 230, configured to generate a user-defined UI on the game interface based on the editing operation performed by the user on the selectable editing item.
In an exemplary embodiment of the present application, the component mounting module is configured to:
and mounting, on the control node, a controller component for switching the interface page where a child node of the control node is located, so as to obtain a state editing item of each child node of the control node.
In an exemplary embodiment of the present application, the UI generation module is configured to:
generating a state list for describing the state of each child node based on the editing operation performed by the user on the state editing item of each child node;
and generating the user-defined UI on the game interface according to the state list.
In an exemplary embodiment of the present application, the state of each child node includes at least one of the following attribute information: the node display attribute, the node hiding attribute, the node position attribute, the node size attribute, the node icon attribute, the node color attribute, the node font attribute and the node appearance attribute.
In an exemplary embodiment of the present application, the apparatus is configured to:
and when the game interface detects a switching operation triggering the control node to perform state switching, switching, through the controller component, the interface page where the child node is located, so as to control the control node to perform state switching.
In an exemplary embodiment of the present application, the apparatus is configured to:
and, by using an event callback on page switching, switching the interface page where the child node is located through the controller component while also switching the interface page where a grandchild node of the control node is located.
In an exemplary embodiment of the present application, the interface callout module is configured to:
and when the developer mode entering indication is detected, taking a game interface in the current game scene as the editing interface.
An electronic device 30 according to an embodiment of the present application is described below with reference to fig. 7. The electronic device 30 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 30 is in the form of a general purpose computing device. The components of the electronic device 30 may include, but are not limited to: the at least one processing unit 310, the at least one memory unit 320, and a bus 330 that couples various system components including the memory unit 320 and the processing unit 310.
The storage unit stores program code executable by the processing unit 310 to cause the processing unit 310 to perform the steps according to the various exemplary embodiments of the present invention described in the exemplary method section above. For example, the processing unit 310 may perform the steps shown in fig. 1.
The storage unit 320 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 3201 and/or a cache memory unit 3202, and may further include a read-only memory unit (ROM) 3203.
The storage unit 320 may also include a program/utility 3204 having a set (at least one) of program modules 3205, such program modules 3205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 30 may also communicate with one or more external devices 400 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 30, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 30 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 350. An input/output (I/O) interface 350 is connected to the display unit 340. Also, the electronic device 30 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 360. As shown, the network adapter 360 communicates with the other modules of the electronic device 30 via the bus 330. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 30, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the method described in the above method embodiment section.
According to an embodiment of the present application, there is also provided a program product for implementing the method in the above method embodiment, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods herein are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
Claims (10)
1. A UI editing method applied to a game client, the method comprising:
when a developer-mode entry instruction is detected, calling out an editing interface of a UI editor;
when a control node created by a user is detected on the editing interface, mounting, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and generating a user-defined UI (User Interface) on the game interface based on the editing operation performed by the user on the selectable editing item.
2. The method of claim 1, wherein mounting, on the control node, the component for modifying the control node to obtain the selectable editing item of the control node comprises:
and mounting, on the control node, a controller component for switching the interface page where a child node of the control node is located, so as to obtain a state editing item of each child node of the control node.
3. The method of claim 2, wherein generating the user-defined UI on the game interface based on the editing operation performed by the user on the selectable editing item comprises:
generating a state list for describing the state of each child node based on the editing operation performed by the user on the state editing item of each child node;
and generating the user-defined UI on the game interface according to the state list.
4. The method of claim 3, wherein the status of each child node comprises at least one of the following attribute information: the node display attribute, the node hiding attribute, the node position attribute, the node size attribute, the node icon attribute, the node color attribute, the node font attribute and the node appearance attribute.
5. The method of claim 2, further comprising:
and when the game interface detects a switching operation triggering the control node to perform state switching, switching, through the controller component, the interface page where the child node is located, so as to control the control node to perform state switching.
6. The method of claim 5, further comprising:
and, by using an event callback on page switching, switching the interface page where the child node is located through the controller component while also switching the interface page where a grandchild node of the control node is located.
7. The method of claim 1, wherein upon detecting a developer mode entry indication, invoking an editing interface of a UI editor, comprising:
and when the developer mode entering indication is detected, taking a game interface in the current game scene as the editing interface.
8. A UI editing apparatus applied to a game client, the apparatus comprising:
the interface calling-out module is configured to call out an editing interface of the UI editor when a developer-mode entry indication is detected;
the component mounting module is configured to, when a control node created by the user is detected on the editing interface, mount, on the control node, a component for modifying the control node, so as to obtain a selectable editing item of the control node;
and the UI generating module is configured to generate a user-defined UI on the game interface based on the editing operation performed by the user on the selectable editing item.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the electronic device to carry out the method of any one of claims 1 to 7.
10. A computer-readable storage medium having computer-readable instructions stored thereon which, when executed by a processor of a computer, cause the computer to perform the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210565469.4A CN114849240A (en) | 2022-05-23 | 2022-05-23 | UI editing method and device applied to game client and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210565469.4A CN114849240A (en) | 2022-05-23 | 2022-05-23 | UI editing method and device applied to game client and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114849240A (en) | 2022-08-05
Family
ID=82638811
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210565469.4A (Pending) | UI editing method and device applied to game client and electronic equipment | 2022-05-23 | 2022-05-23
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114849240A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118260017A (en) * | 2024-05-31 | 2024-06-28 | 北京格如灵科技有限公司 | UI management method, device, equipment and medium for computer user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |