
CN112395838B - Method, device and equipment for synchronously editing object and readable storage medium

Info

Publication number
CN112395838B
CN112395838B
Authority
CN
China
Prior art keywords
editing
target
association
synchronous
target object
Legal status
Active
Application number
CN201910750040.0A
Other languages
Chinese (zh)
Other versions
CN112395838A (en)
Inventor
林晓晴
许铭洁
曾奋飞
赖志强
周梓煜
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd
Priority to CN201910750040.0A
Publication of CN112395838A
Application granted
Publication of CN112395838B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, a device, equipment, and a readable storage medium for synchronously editing objects. According to the method, when a synchronous editing request from a user for a target object is received, the corresponding synchronous editing information is acquired, and the association identifier and the element editing parameters of the target object element to be edited in the target object are determined. The associated object elements in the objects associated with the target object are then determined from the association identifier, and the target object element and the associated object elements are each edited based on the element editing parameters. As a result, the user only needs to edit the target object in order to synchronously edit the objects associated with it.

Description

Method, device and equipment for synchronously editing object and readable storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to an object synchronous editing method, apparatus, device, and readable storage medium.
Background
Campaigns and projects often need to be publicized and promoted. A designer can create corresponding graphic design works for a campaign or project in order to promote it. A graphic design work may be a physical design (such as a billboard, product packaging, or a cover) or a virtual design (such as a page advertisement).
In general, a campaign or project is promoted through multiple promotion entrances or exposure placements (for example, physical advertising spaces in different locations, or advertising slots in different application pages), and these placements differ in size and display form. The graphic design work therefore often needs to be adapted and re-laid-out for each placement. Such work requires no design creativity, yet it consumes a great deal of communication and time from designers; for projects or campaigns with heavy promotion needs, this brings high time and labor costs. Moreover, promotion requirements change over the course of a campaign or project, and the corresponding modification cost is very high.
Disclosure of Invention
It is an object of the present invention to provide a new solution for synchronously editing objects.
According to a first aspect of the present invention, there is provided an object synchronous editing method, comprising:
responding to an object synchronous editing request implemented on a target object, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target object element to be edited in the target object and an element editing parameter;
determining an associated object element with the same association identifier as the target object element in an associated object associated with the target object;
and respectively editing the target object element and the associated object element according to the element editing parameters, so as to realize synchronous editing of the target object and the associated object.
According to a second aspect of the present invention, there is provided a display method of object synchronous editing, including:
displaying canvas through user equipment;
displaying a target object and the associated object associated with the target object through the canvas, and enabling a user to implement object synchronous editing on the target object and the associated object through the object synchronous editing method according to any one of the first aspect of the invention;
and displaying the target object and the associated object subjected to the object synchronous editing through the canvas.
According to a third aspect of the present invention, there is provided a video synchronization editing method, including:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target video frame element to be edited in a target video frame and an element editing parameter;
determining an associated video frame element with the same association identifier as the target video frame element in an associated video frame associated with the target video frame;
and respectively editing the target video frame element and the associated video frame element according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame.
According to a fourth aspect of the present invention, there is provided an object synchronization editing apparatus comprising:
the information acquisition unit is used for responding to an object synchronous editing request applied to a target object and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target object element to be edited in the target object and an element editing parameter;
an association determining unit, configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element;
and the synchronous editing unit is used for editing the target object element and the associated object element respectively according to the element editing parameters so as to realize synchronous editing of the target object and the associated object.
According to a fifth aspect of the present invention, there is provided an object synchronization editing apparatus comprising:
a memory for storing executable instructions;
and a processor configured to operate the object synchronous editing device under the control of the executable instructions, so as to execute the object synchronous editing method according to any implementation of the first aspect of the invention.
According to a sixth aspect of the present invention, there is provided a readable storage medium storing a computer program which, when read and executed by a computer, performs the object synchronous editing method according to any implementation of the first aspect of the present invention.
According to the embodiments of the disclosure, each object element is given a unique association identifier, and object elements in a plurality of associated objects are linked through these association identifiers. When a synchronous editing request from a user for a target object is received, the corresponding synchronous editing information is acquired, and the association identifier and the element editing parameters of the target object element to be edited are determined. The associated object elements in the objects associated with the target object are then determined from the association identifier, and the target object element and the associated object elements are each edited based on the element editing parameters. In this way, the user only needs to edit the target object to synchronously edit all of its associated objects, without operating on each object separately. This greatly reduces the labor and time required for editing, lowers the time and labor cost of synchronous editing, and improves the user's design efficiency. The method is particularly suited to scenarios in which a plurality of objects must be frequently edited in synchronization.
Other features of the present invention and its advantages will become apparent from the following detailed description of exemplary embodiments of the invention, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic device 1000 that can be used to implement an embodiment of the present invention.
Fig. 2 shows a flowchart of an object synchronous editing method of an embodiment of the present invention.
Fig. 3 shows a flowchart of an example of an object synchronization editing method of an embodiment of the present invention.
Fig. 4 shows a schematic diagram of a source object of an example of an object synchronous editing method of an embodiment of the present invention.
Fig. 5 shows a schematic diagram of batch generation of new objects in an example of an object synchronization editing method of an embodiment of the present invention.
Fig. 6 shows still another schematic diagram of batch generation of new objects in an example of an object synchronization editing method of an embodiment of the present invention.
Fig. 7 shows a schematic diagram of a synchronous editing object in an example of an object synchronous editing method of an embodiment of the present invention.
Fig. 8 shows a block diagram of an object synchronization editing apparatus 3000 of an embodiment of the present invention.
Fig. 9 shows a block diagram of an object synchronization editing apparatus 4000 of an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic device 1000 in which an embodiment of the present invention can be implemented.
The electronic device 1000 may be a laptop, desktop, cell phone, tablet, etc. The electronic device 1000 may also be a cloud server, a blade server, a server cluster, or the like.
For example, as shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be a central processing unit (CPU), a microcontroller unit (MCU), or the like. The memory 1200 includes, for example, ROM (read-only memory), RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, for example Wi-Fi, Bluetooth, or 2G/3G/4G/5G communication. The display device 1500 is, for example, a liquid crystal display or a touch display. The input device 1600 may include, for example, a touch screen, a keyboard, or somatosensory input. The user may input and output voice information through the speaker 1700 and the microphone 1800.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 is configured to store instructions that control the processor 1100 to perform any of the object synchronous editing methods provided by the embodiments of the present invention. It will be appreciated by those skilled in the art that although several components are shown for the electronic device 1000 in fig. 1, the present invention may involve only some of them, for example only the processor 1100 and the memory 1200. A skilled person can design the instructions according to the disclosed solution; how instructions control the processor to operate is well known in the art and will not be described in detail here.
< example >
In this embodiment, an object synchronous editing method is provided. The object may be anything that can be edited. For example, it may be a graphic design work, which in turn may be a physical design (such as a billboard, product packaging, a cover, or a poster) or a virtual design (such as a page advertisement or an application interface). Alternatively, the object may be any editable digital image, for example each video frame image in a video file. The object may also be design material, promotional content created for marketing campaigns, dynamic portals for presenting content, and so on.
The object synchronous editing method, as shown in fig. 2, comprises the following steps: steps S2100-S2300.
In step S2100, corresponding synchronous editing information is acquired in response to an object synchronous editing request implemented in a target object.
In this embodiment, the target object may be an object selected by the user for editing, or an object edited by default by the system. The object synchronous editing request may be generated by operations performed by the user. For example, an object editing interface may be provided to display the target object and receive human-computer interaction operations that express the user's editing requirements (such as changing a color or font size, scaling, rotating, or adding content); the object synchronous editing request is then generated in response to those operations. Alternatively, the request may be generated from the user's configuration: a configuration interface may be provided through which the user enters editing information that meets their editing requirements for the target object (for example, using the target object as a template), and the corresponding object synchronous editing request is generated from that information.
In one example, the method provided in this embodiment further includes:
receiving an object synchronous editing operation performed by the user, and generating the object synchronous editing request accordingly.
The object synchronous editing operation may be a man-machine interaction operation performed by a user on an application interface for displaying the target object. In this example, the object synchronization editing operation includes at least one of an enlargement operation and a reduction operation. The user can intuitively and rapidly trigger the synchronous editing request for the target object by implementing the synchronous editing operation of the object, so that the synchronous editing efficiency of the object is improved.
For example, the user can perform an overall enlargement of a target object, or a local enlargement of one of its object elements, and thereby intuitively and quickly trigger the corresponding synchronous editing request. With the synchronous editing method provided in this embodiment, the target object and its associated objects can then be synchronously enlarged, either as a whole or locally, so the user can carry out synchronous editing more intuitively and quickly. The corresponding synchronous editing information can be obtained by parsing the object synchronous editing request or by extracting the information it carries.
The synchronous editing information is information related to a user's need to perform object synchronous editing based on a target object. In this embodiment, the synchronous editing information includes at least an association identifier and an element editing parameter of a target object element to be edited in the target object. The association identifier of the target object element is an identifier for associating the target object element with an object element in other objects. Each object element is provided with a unique association identifier. The association identifier of the object element may be set according to a unique identifier of the object element or according to a preset rule.
By giving each object element an association identifier and combining this with the other steps of this embodiment, multiple object elements can be edited synchronously on the basis of their association identifiers, so that multiple objects are edited automatically and in sync. This avoids the time and labor of editing each object individually to keep them consistent, and greatly reduces the cost of synchronous editing.
The element editing parameters are parameters related to the editing of the target object element. Editing the target object element may include modifying its element style (e.g., color or font), rotating or scaling it, adding content to it (e.g., pictures, text, or multimedia assets), and so on; the specific content of the corresponding element editing parameters is determined by the specific edit applied to the target object.
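Purely for illustration, and using hypothetical type and field names rather than anything prescribed by this disclosure, the synchronous editing information and an object element carrying its association identifier could be modeled along the following lines (the sketches in this document use TypeScript; the two relationship fields anticipate the example described next):

```typescript
// Hypothetical data model for the concepts above; all names are illustrative only.

// Element editing parameters, e.g. a new relative coordinate/size, style, or content.
interface ElementEditParams {
  targetCoordinateOffset?: { dx: number; dy: number }; // offset from the object reference point
  targetSizeRatio?: { w: number; h: number };          // element size relative to the object size
  style?: { color?: string; font?: string };
  content?: string;
}

// Synchronous editing information: at minimum, the association identifier of the
// target object element to be edited plus its element editing parameters.
interface SyncEditInfo {
  associationId: string;
  editParams: ElementEditParams;
}

// An object element (e.g. a layer) carries a unique association identifier that
// links it to its counterpart elements in other objects.
interface ObjectElement {
  associationId: string;
  kind: "text" | "image" | "shape";
  x: number;                 // element coordinates inside its object
  y: number;
  width: number;
  height: number;
  style?: { color?: string; font?: string };
  content?: string;
}

// An editable object (e.g. a poster) is a sized canvas holding object elements.
interface EditableObject {
  id: string;
  width: number;
  height: number;
  elements: ObjectElement[];
}
```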
In one example, the element editing parameter of the target object element includes at least one of a target coordinate relationship and a target size relationship of the edited target object element.
The target coordinate relationship is the relative coordinate offset between the target object element and an object reference point of the target object. The object reference point may be the center point of the target object or one of its vertices; several vertices may also be used for cross-reference; or the target object may be divided in advance into several fixed areas, with the object reference point being the center point, or one or more vertices, of the area in which the target object element lies. Once the object reference point is determined, the coordinates of the reference point and of the target object element can be obtained in the same coordinate system, and the relative coordinate offset between them is taken as the target coordinate relationship.
The target size relationship is the relative size ratio between the target object element and the target object. The relative size ratio may be obtained from the element size of the target object element and the object size of the target object; it may specifically be the ratio of the aspect ratio of the element to that of the object, the ratio of their widths, or the ratio of their heights. It should be understood that references to "size" in this embodiment include the corresponding height and width, or other parameters describing the dimensions of a shape.
Because the element editing parameters of the target object element include at least one of the target coordinate relationship and the target size relationship of the edited element, any change to the element after editing (scaling, moving, rotating, and so on) can be represented simply and accurately through these two relationships, which improves the efficiency of synchronous editing based on the element editing parameters.
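As a minimal sketch, assuming (as one of the options above) that the object reference point is the object's center point, the two relationships could be computed as follows; function and type names are illustrative:

```typescript
// Sketch: compute the target coordinate relationship and target size relationship,
// taking the object's center point as the object reference point.

interface Rect { x: number; y: number; width: number; height: number }

// Target coordinate relationship: relative offset between the element and the
// object reference point, in the object's own coordinate system.
function coordinateRelationship(element: Rect, object: Rect): { dx: number; dy: number } {
  const refX = object.x + object.width / 2;   // object reference point (center)
  const refY = object.y + object.height / 2;
  return { dx: element.x - refX, dy: element.y - refY };
}

// Target size relationship: relative size ratio between the element and the object
// (here expressed as the width ratio and the height ratio).
function sizeRelationship(element: Rect, object: Rect): { w: number; h: number } {
  return { w: element.width / object.width, h: element.height / object.height };
}
```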
The element editing parameters may also include object styles (including colors, fonts, etc.) of the edited target object, object content changes (including changes in pictures, text, multimedia resources, etc.), which are not listed here.
Step S2200, in the associated object associated with the target object, determining the associated object element with the same associated identifier as the target object element.
In this embodiment, the associated object with which the target object is associated is an object that the user desires to edit in synchronization with the target object.
In one example, the method in this embodiment further includes:
and determining an associated object associated with the target object according to the acquired association relation.
The association relationship describes a plurality of objects that are associated with one another.
In this example, the association relationship is customized by the user: for example, the user specifies which objects are associated with one another through an external configuration operation, or provides a set of associated objects by uploading or ticking them. Once set by the user, the association relationship can be stored locally so that it can be read when the associated objects of a target object need to be determined.
Alternatively, the association relationship may be recorded by default by the apparatus implementing this embodiment. For example, when the target object and the associated objects are generated from the same source object, the apparatus can record the association relationship between these objects as they are generated.
By determining the associated objects of the target object from the acquired association relationship, there is no need to parse the target object and match it against other objects in order to find them, which improves processing efficiency.
In another example, the method in this embodiment further includes:
acquiring association data, wherein the association data at least comprises object content and object element position relations; and generating the association object associated with the target object according to the association data.
The association data is data related to the association between the target object and the associated object. It includes at least the object content and the object element positional relationships. The object content is the content through which the target object and the associated object are associated, and may be text content, image content, and so on. The object element positional relationship may include the relationship between the position of any object element in the target object and the position of the corresponding object element in the associated object.
A data setting interface can be provided on the user's device for the association data, so that the user can set it according to their own personalized requirements.
Through the association data, the associated object can be generated from the target object. Because the association data can be set according to the user's personalized requirements, the generated associated object meets those requirements and is ready for synchronous editing.
In another example, the method in this embodiment further includes:
the method comprises the steps of obtaining the size of user equipment and generating an association object which is matched with the size of the user equipment and is associated with a target object.
The user equipment is equipment used by the user. The associated object may be presented or applied by the user device. In this example, the user device size may be obtained through a device interface supported by the user device. It should be appreciated that where there are a plurality of different types of user equipment, a plurality of different user equipment sizes may be correspondingly obtained.
According to the acquired user equipment size, an associated object which is associated with the target object and is matched with the user equipment size can be automatically generated, and the object application requirements of the user can be met more accurately.
In another example, the target object and the associated object are generated from the same source object. The method further comprises the steps of: steps S2010-S2020.
Step S2010, analyzing the obtained source object, extracting the element attribute of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element.
In this example, the source object may be provided by the user by uploading it or by selecting it from a set of displayed objects. For example, the source object may be a graphic design work uploaded by the user, or the user may export the work directly through an interface provided by the design software after designing it there.
Parsing the source object determines the source object elements it contains. A source object element is an object element in the source object. The parsing can be performed in a variety of ways; for example, the source object can be read through design tooling such as Adobe CEP.
In this example, once the source object elements contained in the source object have been determined, their element attributes can be extracted. The element attributes may include the element size, element style, element coordinates, and so on. For example, if a source object consists of multiple layers, each layer being an object element, then the element attributes include the original canvas size, the coordinate position of each element layer, its layer type, its dimensions, and so on.
The association identifier of each source object element may be set, according to a preset rule, at the time the source object elements are determined, for example as a unique number or ID. Alternatively, the identifier that each source object element already carries in the source object can be used: for example, if the source object contains multiple layers, each layer being an object element, the layers can be parsed through Adobe CEP technology and the unique ID of each layer used as its association identifier.
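The parsing step itself depends on the design software (the example mentions reading layers through tooling such as Adobe CEP, whose API is not shown here). The sketch below, with hypothetical names, only illustrates the step that follows parsing: turning already-extracted layer attributes into source object elements with unique association identifiers.

```typescript
// Sketch: given layer attributes already read out of a source file (e.g. via a
// design-software plugin), assign each layer a unique association identifier.
// The parsing step itself is not shown; all names are illustrative.

interface ParsedLayer {
  layerId?: string;          // unique layer ID, if the design software provides one
  type: string;              // layer type, e.g. "text" or "image"
  x: number; y: number;      // coordinate position of the element layer
  width: number; height: number;
}

interface SourceObjectElement extends ParsedLayer {
  associationId: string;     // unique association identifier for cross-object linking
}

interface SourceTemplate {
  canvasWidth: number;       // original canvas size of the source object
  canvasHeight: number;
  elements: SourceObjectElement[];
}

let nextId = 0;
function buildSourceTemplate(canvasWidth: number, canvasHeight: number,
                             layers: ParsedLayer[]): SourceTemplate {
  const elements = layers.map((layer) => ({
    ...layer,
    // Use the layer's own unique ID when available, otherwise generate one.
    associationId: layer.layerId ?? `elem-${nextId++}`,
  }));
  return { canvasWidth, canvasHeight, elements };
}
```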
In step S2020, for each of several different preset object sizes, a corresponding new object is generated according to the element attributes and association identifiers of the source object elements, so as to obtain a target object and associated objects of different object sizes.
In this example, each new object includes an object element corresponding to each source object element, respectively, and having the same association identifier.
The preset object size may be set by default or specified by the user, for example, when the source object is acquired, a plurality of object sizes may be provided for the user to select a plurality of object sizes meeting the requirement as the preset object size.
In this example, the element attributes and association identifier of each source object element may be packaged together as a unit of information, and a source template generated from these packages. Based on the source template, for each different preset object size, the corresponding object size scaling, element size scaling, source object element displacement, and so on are performed to generate a corresponding new object. When any one of the new objects is used as the target object, the other new objects are its associated objects.
In this way, new objects of different sizes can be generated automatically and in batches from the source object for a number of different preset object sizes, and used as the target object and associated objects for the user's synchronous editing. The user does not need to design and produce each of them manually, which further saves labor and time and improves processing efficiency.
In a more specific example, step S2020 may include: steps S2021 to S2022.
In step S2021, for each preset object size, the element attributes of the new object element corresponding to each source object element are set according to the element attributes of that source object element, the object size of the source object, and the preset object size; the association identifier of the new object element is set to be the same as that of the source object element. In this way, all new object elements corresponding to the preset object size are obtained.
The element attributes of an object element may include the element size (element height and width), the coordinate position of the element, and so on. Taking the element's coordinate position as an example, suppose the aspect ratio obtained from the object size of the source object is A, the aspect ratio obtained from the preset object size is B, and the center point of the object is chosen as the object reference point in both the source object and the new object to be generated. If the coordinate offset between the source object element and the object reference point of the source object is X (X may be a straight-line offset distance, or may comprise a horizontal offset X1 and a vertical offset X2), then the coordinate offset between the corresponding new object element and the object reference point of the new object is Y = X × B / A.
From the calculated Y, and based on the preset coordinates of the new object's reference point, the element coordinates of the new object element corresponding to the source object element can be determined.
Similarly, the size (width and height) and other attributes of the new object element can be obtained by the same method.
Where the element attributes include other items such as element color, element font, or element content, the color, font, and content of the new object element may simply be set to be the same as those of the source object element.
In step S2022, a new object corresponding to the preset object size is generated according to all new object elements corresponding to each preset object size.
Because the element attributes and association identifiers of the newly generated object elements are set, for each preset object size, from those of the source object elements, the object elements in the different new objects are linked through their association identifiers, which associates the objects themselves; at the same time, the new objects can be generated automatically in batches from these elements, improving processing efficiency.
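A rough sketch of steps S2021-S2022 follows, applying the scaling rule above literally (offsets and sizes scaled by the ratio B/A of the new and source aspect ratios, with the center point as the object reference point). This is one illustrative reading, and all names are hypothetical.

```typescript
// Sketch: batch-generate new objects for several preset object sizes from a source
// template, applying the scaling rule of step S2021 literally.

interface ObjectSize { width: number; height: number }

interface TemplateElement {
  associationId: string;
  dx: number; dy: number;        // offset X of the element from the source object's center
  width: number; height: number;
}

interface SizedTemplate { size: ObjectSize; elements: TemplateElement[] }

function generateNewObject(template: SizedTemplate, presetSize: ObjectSize) {
  const a = template.size.width / template.size.height; // aspect ratio A of the source object
  const b = presetSize.width / presetSize.height;       // aspect ratio B of the preset size
  const scale = b / a;                                   // Y = X * B / A
  const refX = presetSize.width / 2;                     // object reference point of the new object
  const refY = presetSize.height / 2;
  return {
    size: presetSize,
    elements: template.elements.map((e) => ({
      associationId: e.associationId,                    // same association identifier
      x: refX + e.dx * scale,
      y: refY + e.dy * scale,
      width: e.width * scale,
      height: e.height * scale,
    })),
  };
}

// Batch generation for several preset sizes (e.g. 600x240, 400x400, 1200x180).
function generateBatch(template: SizedTemplate, presetSizes: ObjectSize[]) {
  return presetSizes.map((s) => generateNewObject(template, s));
}
```

With such a sketch, the new objects of the later example would be produced by a single generateBatch call over one source template.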
In the associated objects of the target object, the associated object elements with the same association identifier are found through the association identifier of the target object element. Combined with the other steps, this allows multiple object elements to be edited synchronously on the basis of their association identifiers, so that multiple objects are edited automatically and in sync. There is no longer any need to spend considerable time and labor editing each object individually to keep them consistent, which greatly reduces the time and labor cost of synchronous editing.
Step S2300, respectively editing the target object element and the associated object element according to the element editing parameters, so as to realize synchronous editing of the target object and the associated object.
The element editing parameter related content has been described in the foregoing, and will not be described in detail here.
The element editing parameters correspond to the object synchronous editing request applied to the target object. Editing the target object element according to the specific content of those parameters edits the target object as the user requires; the editing may include scaling, moving, or restyling the target object element, changing its content, and so on.
Editing the associated object elements according to the element editing parameters means that, while the target object element is edited, the elements linked to it through the association identifier are edited at the same time, so that the associated objects of the target object are also edited in accordance with the user's requirements.
The target object element is an object element in the target object, and the associated object element is the corresponding object element, with the same association identifier, in an associated object. Editing the target object element and the associated object elements according to the element editing parameters therefore edits the corresponding elements in the target object and in its associated objects at the same time; in other words, the target object and the associated objects are edited synchronously. Multiple object elements are edited synchronously on the basis of their association identifiers, multiple objects are edited automatically and in step, and the large investment of time and labor otherwise needed to edit each object individually is avoided, greatly reducing the time and labor cost of synchronous editing.
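A minimal sketch of steps S2100-S2300 under the assumptions above: the element carrying the requested association identifier is located in the target object and in each associated object, and the element editing parameters are applied to it. The per-object parameter conversion of steps S2310-S2320 (described below) is left as a callback, and all names are hypothetical.

```typescript
// Sketch of the overall flow: find, in the target object and in each associated
// object, the element with the given association identifier and edit it.

interface Elem { associationId: string; [attr: string]: unknown }
interface EditableObj { width: number; height: number; elements: Elem[] }

function syncEdit(
  target: EditableObj,
  associated: EditableObj[],
  associationId: string,                       // from the synchronous editing information
  editParams: Record<string, unknown>,         // element editing parameters
  adapt: (params: Record<string, unknown>, obj: EditableObj) => Record<string, unknown>,
): void {
  for (const obj of [target, ...associated]) {
    const element = obj.elements.find((e) => e.associationId === associationId);
    if (!element) continue;                    // no element with this identifier in this object
    // For the target object the parameters apply directly; for associated objects they
    // are first converted into associated editing parameters for that object's size.
    const params = obj === target ? editParams : adapt(editParams, obj);
    Object.assign(element, params);
  }
}
```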
In one example, editing the associated object element according to the element editing parameters may include: steps S2310 through S2320.
Step S2310, obtaining the corresponding associated editing parameters according to the element editing parameters.
The associated editing parameters are the parameters used for editing the associated object element; they correspond to the element editing parameters used for editing the target object element.
For example, based on the above example, the element editing parameters of the target object element include at least one of a target coordinate relationship and a target size relationship of the edited target object element; correspondingly, the association editing parameter at least comprises one of association coordinate relation and association size relation of the edited association object element.
The associated coordinate relationship is the relative coordinate offset between the associated object element and an object reference point of the associated object. The object reference point may be the center point of the associated object or one of its vertices; several vertices may also be used for cross-reference; or the associated object may be divided in advance into several fixed areas, with the object reference point being the center point, or one or more vertices, of the area in which the associated object element lies. Once the object reference point is determined, the coordinates of the reference point and of the associated object element can be obtained in the same coordinate system, and the relative coordinate offset between them is taken as the associated coordinate relationship.
The associated size relationship is the relative size ratio between the associated object element and the associated object. The relative size ratio may be obtained from the element size of the associated object element and the object size of the associated object; it may specifically be the ratio of their aspect ratios, the ratio of their widths, or the ratio of their heights.
Because the associated editing parameters, like the element editing parameters, include at least one of the associated coordinate relationship and the associated size relationship of the edited associated object element, any change to the associated object element after editing (scaling, moving, rotating, and so on) can be represented simply and accurately through these relationships, which improves the efficiency of synchronous editing based on the associated editing parameters.
The element editing parameters may also include the style (including color, font, etc.) of the edited target object element, the content change (including the change of picture, text, multimedia resource, etc.), and the corresponding associated editing parameters also include the style, content change, etc. of the edited associated object element.
In a more specific example, according to the element editing parameters, obtaining the corresponding associated editing parameters includes: steps S2311-S2312.
In step S2311, the parameter relative relationship is obtained according to the object size and the element editing parameter of the target object.
The parameter relative relationship is used to describe a relative relationship between an element editing parameter for editing a target object element and an object size of the target object.
For example, the corresponding object aspect ratio may be derived from the object size of the target object, and the parameter relative relationship may be a ratio between the element editing parameter and the object aspect ratio.
In step S2312, the association editing parameters are obtained according to the object size and parameter relative relationship of the association object.
In this example, the relative relationship between the associated editing parameter and the object size of the associated object should be the same as the parameter relative relationship, so that it can be ensured that editing the associated object element based on the associated editing parameter is truly synchronous with editing the target object element based on the element editing parameter.
For example, assume that the element editing parameter is the relative coordinate relationship X' of the edited target object element and that the aspect ratio of the target object is A'. The parameter relative relationship is then the ratio between the relative coordinate relationship of the target object element and the aspect ratio of the target object, i.e. X' / A'. If the aspect ratio obtained from the object size of the associated object is B', the corresponding associated coordinate relationship of the associated object element is Y' = (X' / A') × B'.
Similarly, the associated size relationship of the associated object, and so on, can be obtained by the same method.
Through the parameter relative relationship, the element editing parameters used to edit the target object element can be quickly converted into the associated editing parameters used to edit the associated object elements, realizing synchronous editing among the object elements linked by the association identifier and improving processing efficiency.
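Following the worked example above (parameter relative relationship X' / A', associated coordinate relationship (X' / A') × B'), steps S2311-S2312 could be sketched for the coordinate case as follows, with hypothetical names:

```typescript
// Sketch of steps S2311-S2312 for the coordinate case: convert the element editing
// parameter of the target object element into the associated editing parameter of
// the associated object element via the parameter relative relationship.

interface Offset { dx: number; dy: number }
interface ObjSize { width: number; height: number }

function aspectRatio(size: ObjSize): number {
  return size.width / size.height;
}

// S2311: parameter relative relationship = element editing parameter / aspect ratio A'
// S2312: associated editing parameter = parameter relative relationship * aspect ratio B'
function associatedCoordinateRelation(
  targetOffset: Offset,          // X': relative coordinates of the edited target element
  targetSize: ObjSize,
  associatedSize: ObjSize,
): Offset {
  const aPrime = aspectRatio(targetSize);       // A'
  const bPrime = aspectRatio(associatedSize);   // B'
  const relative = { dx: targetOffset.dx / aPrime, dy: targetOffset.dy / aPrime };
  return { dx: relative.dx * bPrime, dy: relative.dy * bPrime }; // Y' = X' * B' / A'
}
```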
After the corresponding associated editing parameters are acquired, entering:
step S2320, editing the associated object element according to the associated editing parameters.
The associated editing parameters correspond to element editing parameters for editing the target object element.
In this example, according to the associated editing parameters, the associated object element is edited, so that synchronous editing with the target object element can be realized, and synchronous editing between the target object and the associated object can be realized.
< example >
The object synchronous editing method of this embodiment is further illustrated below in conjunction with fig. 3 to 7. In this example, the object is a graphic design work, such as a poster.
As shown in fig. 3, the object synchronous editing method includes: S201-S208.
S201, providing a source object upload portal and receiving the source object uploaded by the user.
In this example, the source object is a user-designed poster source file, as shown in FIG. 4.
The source object upload portal can be connected to a work transfer interface provided by the design software used by the user, so that after the user finishes the source object in the design software, it is uploaded to the portal directly through that interface. Alternatively, the source object upload portal may be an interface provided for uploads, through which the user submits the source object.
S202, analyzing the source object, extracting the element attribute of each source object element included in the source object, and setting the association identifier of each source object element.
In this example, each layer of the source object may be parsed by design software such as Adobe CEP, to obtain an element attribute of each layer as a source object element, and a layer unique ID is set as an association identifier of the corresponding source object element.
S203, generating a source template according to the element attribute and the association identifier of each source object element.
S204, providing a plurality of object sizes for a user to select preset object sizes which meet requirements and are used for generating objects in batches.
S205, generating a plurality of new objects with different object sizes in batches based on the source template according to a plurality of different preset object sizes.
In this example, assuming that the user selects three different preset object sizes 600×240, 400×400, and 1200×180, three new objects generated in batch may be as shown in fig. 5: object 1 with size 600 x 240, object 2 with size 400 x 400, object 3 with size 1200 x 180.
Alternatively, in this example, steps S204-S205 may be skipped; instead, the sizes of the user devices on which the associated objects will be used are obtained directly, and the corresponding preset object sizes are adapted and set automatically according to those device sizes, so that several new objects of different object sizes are generated in batch from the source template. For example, as shown in fig. 6, assume the user devices are a desktop computer, a mobile phone, and a tablet computer. Their device sizes are obtained, and the preset object size adapted to the desktop computer is set to 600×240, the size adapted to the mobile phone to 400×400, and the size adapted to the tablet computer to 1200×180.
S206, responding to synchronous editing operation of selecting a new object as a target object by a user, and acquiring an association identifier and an element editing parameter of a target object element to be edited.
For example, as shown in fig. 7, the user selects object 1 as the target object and modifies the target object element "i am material big title" in object 1, changing its content to "the title is modified now".
S207, editing the target object element according to the element editing parameters to finish the editing target object.
For example, after the user performs an editing operation on the target object, the edited object 1 is as shown in fig. 7.
S208, acquiring corresponding associated editing parameters according to the element editing parameters, so as to edit associated object elements with the same associated identifiers according to the associated editing parameters, and completing synchronous editing of associated objects corresponding to the target objects.
The manner of obtaining the corresponding associated editing parameters is as described above, and is not described herein.
In this example, objects 1, 2, and 3 are all generated from the same source template, that is, from the same source object. After object 1 is selected as the target object, objects 2 and 3 are its associated objects; accordingly, objects 2 and 3 are edited according to the associated editing parameters, with the result shown in fig. 7. Objects 2 and 3 are thus edited synchronously with object 1 without the user having to operate on them individually.
The example above has been described with reference to the drawings. In it, the user only has to upload a source object to obtain automatically generated new objects of different object sizes in batch; the user then selects any new object as the target object and performs a synchronous editing operation, and the other new objects are edited synchronously and automatically. This greatly reduces the manpower and time the user needs to generate and edit the objects, lowers the time and labor cost of batch generation and synchronous editing, and improves the user's design efficiency. The method is particularly suitable for scenarios in which multiple objects need to be generated and synchronously edited frequently.
In this embodiment, there is also provided a display method for object synchronous editing, including:
displaying canvas through user equipment;
displaying a target object and an associated object associated with the target object through a canvas, and enabling a user to implement object synchronous editing on the target object and the associated object through any one of the object synchronous editing methods provided in the embodiment;
and displaying the target object and the associated object subjected to the object synchronous editing through the canvas.
The user equipment can be any electronic device with a display screen capable of showing a human-computer interaction interface, such as a mobile phone, a desktop computer, a tablet computer, or a laptop. The canvas is the interface on which the target object and the associated objects are presented. By displaying the canvas on the user equipment, with the target object and the associated objects shown on it, the entire process of editing them synchronously through the object synchronous editing method can be presented to the user intuitively, so the user can grasp the whole editing process quickly and intuitively, improving the user experience.
In this embodiment, there is also provided a video synchronization editing method, including:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target video frame element to be edited in a target video frame and an element editing parameter;
determining an associated video frame element with the same association identifier as the target video frame element in an associated video frame associated with the target video frame;
and respectively editing the target video frame element and the associated video frame element according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame.
The target video frame is any video frame in the target video file, and the associated video frame is the corresponding video frame in an associated video file that is associated with the target video file. A video frame element is an element in the corresponding video frame; it may be a region of the frame, or a scene, person, or other picture element in the frame image. The target video frame element is a video frame element in the target video frame, and the associated video frame element is a video frame element in an associated video frame.
The implementation of the video synchronization editing method may refer to any one of the object synchronization editing methods provided in the embodiments, where the implementation is performed by using a target video frame as a target object, an associated video frame as an associated object, a target video frame element as a target object element, and an associated video frame element as an associated object element, which will not be described herein.
By the video synchronous editing method, the target video file and the associated video file which is associated with the target video file can be automatically and synchronously edited, and video editing efficiency is improved.
For example, a user who records an original short video and publishes it on several short-video platforms such as Douyin and Kuaishou faces different video specifications on each platform, so several associated short videos have to be generated from the original and published on the corresponding platforms. When a frame of the original short video needs to be edited, those associated videos usually have to be edited manually one by one. With the video synchronous editing method of this embodiment, the original short video recorded by the user is treated as the target video and each video published on a short-video platform as an associated video; the user only needs to edit the target video for the videos on all platforms to be edited synchronously, which greatly improves the user's video editing efficiency.
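To make the reuse concrete, here is a minimal sketch that treats a video frame as the object and a frame region as the element; for brevity it applies the same parameters to every frame, whereas the full method would first derive associated editing parameters for each frame's size (step S2310). All names are hypothetical.

```typescript
// Sketch: reusing the object-level synchronous edit for video frames. A video frame
// stands in for the object and a frame region (e.g. a caption box) for the element.

interface FrameElement { associationId: string; x: number; y: number; width: number; height: number }
interface VideoFrame { frameIndex: number; width: number; height: number; elements: FrameElement[] }

type FrameEditParams = Partial<Pick<FrameElement, "x" | "y" | "width" | "height">>;

function syncEditFrames(target: VideoFrame, associated: VideoFrame[],
                        associationId: string, edit: FrameEditParams): void {
  for (const frame of [target, ...associated]) {
    const element = frame.elements.find((e) => e.associationId === associationId);
    // Applies the same element edit to each frame; per-frame parameter conversion omitted.
    if (element) Object.assign(element, edit);
  }
}
```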
< device for object synchronous editing >
In the present embodiment, there is also provided an object synchronous editing apparatus 3000, as shown in fig. 8, including an information acquisition unit 3100, an association determining unit 3200, and a synchronous editing unit 3300, which are used to implement the object synchronous editing method of this embodiment and are not described again here.
An information acquisition unit 3100 configured to acquire corresponding synchronous editing information in response to an object synchronous editing request implemented on a target object; the synchronous editing information at least comprises the association identification of the target object element to be edited in the target object and the element editing parameter.
The association determining unit 3200 is configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element.
And the synchronous editing unit 3300 is configured to edit the target object element and the associated object element according to the element editing parameters, so as to implement synchronous editing of the target object and the associated object.
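As a purely illustrative sketch, the division into the three units may be pictured as three cooperating functions; the request structure and the element method used below are assumptions made for the sketch, not details taken from the embodiments.

```python
# Illustrative decomposition mirroring units 3100 / 3200 / 3300.
# The request and object structures are assumptions made for this sketch.

def information_acquisition_unit(sync_edit_request):
    """Unit 3100: extract the association identifier of the target object
    element and the element editing parameters from the request."""
    return sync_edit_request["assoc_id"], sync_edit_request["edit_params"]

def association_determination_unit(associated_objects, assoc_id):
    """Unit 3200: find, in each associated object, the element carrying the
    same association identifier as the target object element."""
    return [obj.elements[assoc_id] for obj in associated_objects
            if assoc_id in obj.elements]

def synchronous_editing_unit(target_element, associated_elements, edit_params):
    """Unit 3300: apply the editing parameters to the target object element
    and to every associated object element."""
    for element in [target_element, *associated_elements]:
        element.apply(edit_params)          # 'apply' is an assumed element method
```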
Optionally, the object synchronization editing apparatus 3000 is further configured to:
and determining the association object associated with the target object according to the acquired association relation.
Optionally, the target object and the associated object are generated from the same source object;
the object synchronization editing apparatus 3000 is also configured to:
analyzing the acquired source object, extracting element attributes of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element;
generating corresponding new objects according to the element attribute of each source object element and the association identifier aiming at different preset object sizes so as to acquire the target objects and the association objects with different object sizes; and each new object comprises the object elements which respectively correspond to each source object element and have the same association identification.
Further optionally, the generating, for different preset object sizes, a corresponding new object according to the element attribute of each source object element and the association identifier includes:
setting the element attribute of a new object element corresponding to the source object element according to the element attribute of each source object element, the object size of the source object and the preset object size, wherein the association identifier of the new object element is the same as the association identifier of the source object element, so that all new object elements corresponding to the preset object size are obtained;
And generating the new object conforming to the preset object size according to all the new object elements corresponding to each preset object size.
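A minimal sketch of this generation step, assuming rectangular objects whose elements are described by position and size; the attribute names and the use of uuid for the association identifier are assumptions for illustration only.

```python
# Sketch of generating size-adapted new objects from one source object.
# Element attributes are scaled by the ratio between the preset object size
# and the source object size; the association identifier is carried over
# unchanged so that elements stay linked across all generated objects.
import uuid

def generate_new_objects(source_elements, source_size, preset_sizes):
    # assign each source object element a unique association identifier
    for element in source_elements:
        element["assoc_id"] = str(uuid.uuid4())

    new_objects = []
    for preset_w, preset_h in preset_sizes:
        sx, sy = preset_w / source_size[0], preset_h / source_size[1]
        elements = [{
            "assoc_id": e["assoc_id"],       # same identifier as the source element
            "x": e["x"] * sx, "y": e["y"] * sy,
            "width": e["width"] * sx, "height": e["height"] * sy,
        } for e in source_elements]
        new_objects.append({"size": (preset_w, preset_h), "elements": elements})
    return new_objects
```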
Optionally, the synchronous editing unit 3300 is further configured to:
acquiring corresponding associated editing parameters according to the element editing parameters;
and editing the associated object element according to the associated editing parameter.
Optionally, the obtaining the corresponding associated editing parameter according to the element editing parameter includes:
acquiring a parameter relative relation according to the object size of the target object and the element editing parameter;
and acquiring the association editing parameters according to the relative relation between the object size of the association object and the parameters.
Optionally, the element editing parameter at least includes one of a target coordinate relationship and a target size relationship of the edited target object element; the target coordinate relationship is a relative coordinate offset between the target object element and an object reference point of the target object; the target size relationship is a relative size ratio between the target object element and the target object;
the association editing parameters at least comprise one of association coordinate relation and association size relation of the edited association object elements; the associated coordinate relationship is a relative coordinate offset between the associated object element and an object reference point of the associated object; the association size relationship is a relative size ratio between the association object element and the association object.
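The following sketch illustrates one plausible reading of these relative relationships, assuming the object reference point is the top-left corner and the offsets and ratios are normalized by the object size; all names are hypothetical.

```python
# Sketch of deriving associated editing parameters from the element editing
# parameters via relative relationships. The object reference point is
# assumed to be the top-left corner of the object.

def to_relative(element, obj_size):
    """Express an edited element as offsets/ratios relative to its object."""
    return {
        "offset_x": element["x"] / obj_size[0],     # relative coordinate offset
        "offset_y": element["y"] / obj_size[1],
        "ratio_w": element["width"] / obj_size[0],  # relative size ratio
        "ratio_h": element["height"] / obj_size[1],
    }

def to_absolute(relation, obj_size):
    """Map the relative relationship onto an associated object of another size."""
    return {
        "x": relation["offset_x"] * obj_size[0],
        "y": relation["offset_y"] * obj_size[1],
        "width": relation["ratio_w"] * obj_size[0],
        "height": relation["ratio_h"] * obj_size[1],
    }

# Example: an element edited to (100, 50, 200x80) in a 1000x500 target object
# maps to (80, 30, 160x48) in an 800x300 associated object.
rel = to_relative({"x": 100, "y": 50, "width": 200, "height": 80}, (1000, 500))
abs_assoc = to_absolute(rel, (800, 300))
```

Because only normalized relationships are propagated, the same edit maps consistently onto associated objects of any size.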
It should be apparent to those skilled in the art that the object synchronization editing apparatus 3000 may be implemented in various ways. For example, it may be realized by configuring a processor with instructions: the instructions may be stored in a ROM and, when the device is started, read from the ROM into a programmable device to realize the object synchronization editing apparatus 3000. As another example, the object synchronization editing apparatus 3000 may be solidified into a dedicated device (for example, an ASIC). The object synchronization editing apparatus 3000 may be divided into mutually independent units, or these units may be combined together. The object synchronization editing apparatus 3000 may be implemented by one of the above implementations, or by a combination of two or more of them.
In this embodiment, the object synchronization editing apparatus 3000 may take various forms. For example, it may be an encapsulated Web application that is invoked by providing a corresponding Web address for the user to access; it may be packaged in a software development kit (SDK) and provided for invocation by other software or applications having an object synchronization editing requirement; or it may be a functional module provided in object editing software, and so on.
< object synchronous editing apparatus >
In the present embodiment, there is also provided an object synchronization editing apparatus 4000, as shown in fig. 9, including:
a memory 4100 for storing executable instructions;
a processor 4200, configured to control the object synchronization editing apparatus according to the executable instructions to perform any one of the object synchronization editing methods described in the present embodiment.
In the present embodiment, the object synchronization editing apparatus 4000 may be an electronic apparatus such as a mobile phone, a palmtop computer, a tablet computer, a notebook computer, or a desktop computer; for example, the object synchronization editing apparatus 4000 may be a computer on which software implementing the object synchronization editing method of the present embodiment is installed. Alternatively, the object synchronization editing apparatus 4000 may be a server such as a blade server or a cloud server; for example, it may be a web server that implements the object synchronization editing method of the present embodiment. Alternatively, the object synchronization editing apparatus 4000 may consist of a plurality of entity apparatuses, for example, a user-facing front-end apparatus and a back-end apparatus that performs the processing.
The object synchronization editing apparatus 4000 may further include other devices, for example, a display device, an input device, or a communication device, as in the electronic apparatus 1000 shown in fig. 1.
< readable storage medium >
In the present embodiment, there is also provided a readable storage medium storing a computer program which, when read and executed by a computer, performs the object synchronization editing method described in the present embodiment.
The readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A readable storage medium as used herein is not to be construed as a transitory signal itself, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber optic cable), or an electrical signal transmitted through an electrical wire.
The embodiments of the present invention have been described above with reference to the accompanying drawings. According to these embodiments, an object synchronous editing method, apparatus, device, and readable storage medium are provided. Each object element is given a unique association identifier, and the object elements in a plurality of associated objects are linked through this association identifier. When a synchronous editing request from a user for a certain target object is received, the corresponding synchronous editing information is acquired, the association identifier and the element editing parameters of the target object element to be edited in the target object are determined, the associated object element in the associated object is determined according to the association identifier, and the target object element and the associated object element are edited respectively based on the element editing parameters. In this way, the associated objects of the target object are edited synchronously while the user edits only the target object, so that synchronous editing of a plurality of objects is achieved without operating on each of them. This greatly saves the manpower and time required for editing, reduces the time cost and labor cost of synchronous editing, and improves the design efficiency of the user. The method is particularly suitable for scenes in which a plurality of objects need to be edited synchronously and frequently.
The present invention may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. Computer readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (for example, light pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), with state information of the computer readable program instructions, so that the electronic circuitry can execute the computer readable program instructions.
Various aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvements in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (12)

1. An object synchronous editing method, comprising:
responding to an object synchronous editing request implemented on a target object, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target object element to be edited in the target object and an element editing parameter;
determining an associated object element with the same association identifier as the target object element in an associated object associated with the target object;
editing the target object element and the associated object element respectively according to the element editing parameters to realize synchronous editing of the target object and the associated object;
Wherein the target object and the associated object are generated from the same source object;
wherein editing the associated object element according to the element editing parameter includes: acquiring corresponding associated editing parameters according to the element editing parameters; and editing the associated object element according to the associated editing parameter.
2. The method according to claim 1, wherein the obtaining the corresponding associated editing parameters according to the element editing parameters includes:
acquiring a parameter relative relation according to the object size of the target object and the element editing parameter;
and acquiring the association editing parameters according to the relative relation between the object size of the association object and the parameters.
3. The method according to claim 1, wherein:
the element editing parameters at least comprise one of a target coordinate relation and a target size relation of the edited target object element; the target coordinate relationship is a relative coordinate offset between the target object element and an object reference point of the target object; the target size relationship is a relative size ratio between the target object element and the target object;
The association editing parameters at least comprise one of association coordinate relation and association size relation of the edited association object elements; the associated coordinate relationship is a relative coordinate offset between the associated object element and an object reference point of the associated object; the association size relationship is a relative size ratio between the association object element and the association object.
4. The method according to claim 1, wherein the method further comprises:
analyzing the acquired source object, extracting element attributes of each source object element included in the source object, and setting a unique corresponding association identifier for each source object element;
generating corresponding new objects according to the element attribute of each source object element and the association identifier aiming at different preset object sizes so as to acquire the target objects and the association objects with different object sizes; and each new object comprises the object elements which respectively correspond to each source object element and have the same association identification.
5. The method according to claim 4, wherein the generating the corresponding new object according to the element attribute of each source object element and the association identifier for different preset object sizes includes:
Setting the element attribute of a new object element corresponding to the source object element according to the element attribute of each source object element, the object size of the source object and the preset object size, wherein the association identifier of the new object element is the same as the association identifier of the source object element, so that all new object elements corresponding to the preset object size are obtained;
and generating the new object conforming to the preset object size according to all the new object elements corresponding to each preset object size.
6. The method according to claim 1, wherein the method further comprises:
determining the association object associated with the target object according to the acquired association relation;
and/or,
acquiring association data, wherein the association data at least comprises object content and object element position relations;
and generating the association object associated with the target object according to the association data.
7. The method according to claim 1, wherein the method further comprises:
acquiring a user equipment size, and generating the association object which is adaptive to the user equipment size and is associated with the target object;
and/or,
receiving an object synchronous editing operation implemented by a user, and generating an object synchronous editing request; the object synchronous editing operation at least comprises one of an amplifying operation and a shrinking operation.
8. A display method for synchronous editing of an object, comprising:
displaying a canvas through user equipment;
displaying a target object and the associated object associated with the target object through the canvas, and enabling a user to implement object synchronous editing on the target object and the associated object through the object synchronous editing method according to any one of claims 1-7;
and displaying the target object and the associated object subjected to the object synchronous editing through the canvas.
9. A method for video synchronous editing, comprising:
responding to a synchronous editing request implemented on a target video frame, and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target video frame element to be edited in a target video frame and an element editing parameter;
determining an associated video frame element with the same association identifier as the target video frame element in an associated video frame associated with the target video frame;
Editing the target video frame element and the associated video frame element respectively according to the element editing parameters, so as to realize synchronous editing of the target video frame and the associated video frame;
wherein, according to the element editing parameter, editing the associated video frame element includes: acquiring corresponding associated editing parameters according to the element editing parameters; and editing the associated video frame element according to the associated editing parameter.
10. An object synchronization editing apparatus, comprising:
the information acquisition unit is used for responding to an object synchronous editing request applied to a target object and acquiring corresponding synchronous editing information; the synchronous editing information at least comprises an associated identifier of a target object element to be edited in the target object and an element editing parameter;
an association determining unit, configured to determine, in an association object associated with the target object, an association object element having the same association identifier as the target object element;
the synchronous editing unit is used for editing the target object element and the associated object element respectively according to the element editing parameters so as to realize synchronous editing of the target object and the associated object;
Wherein the target object and the associated object are generated from the same source object;
wherein editing the associated object element according to the element editing parameter includes: acquiring corresponding associated editing parameters according to the element editing parameters; and editing the associated object element according to the associated editing parameter.
11. An object synchronization editing apparatus, characterized by comprising:
a memory for storing executable instructions;
a processor for controlling the object synchronization editing apparatus according to the executable instructions to perform the object synchronization editing method according to any one of claims 1 to 7.
12. A readable storage medium, characterized in that,
the readable storage medium stores a computer program which, when read and executed by a computer, performs the object synchronous editing method according to any one of claims 1 to 7.
CN201910750040.0A 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium Active CN112395838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910750040.0A CN112395838B (en) 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium

Publications (2)

Publication Number Publication Date
CN112395838A CN112395838A (en) 2021-02-23
CN112395838B true CN112395838B (en) 2023-12-05

Family

ID=74601434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910750040.0A Active CN112395838B (en) 2019-08-14 2019-08-14 Method, device and equipment for synchronously editing object and readable storage medium

Country Status (1)

Country Link
CN (1) CN112395838B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518187B (en) * 2021-07-13 2024-01-09 北京达佳互联信息技术有限公司 Video editing method and device
CN114283184A (en) * 2021-12-24 2022-04-05 中国工商银行股份有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN115619905A (en) * 2022-10-24 2023-01-17 北京力控元通科技有限公司 Primitive editing method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1567381A (en) * 2003-06-20 2005-01-19 北京北佳信息系统有限公司 Multimedia material synchronous editing device
CN105393246A (en) * 2013-06-28 2016-03-09 微软技术许可有限责任公司 Selecting and editing visual elements with attribute groups
CN109558448A (en) * 2018-10-10 2019-04-02 北京海数宝科技有限公司 Data processing method, device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10198413B2 (en) * 2016-12-30 2019-02-05 Dropbox, Inc. Image annotations in collaborative content items

Also Published As

Publication number Publication date
CN112395838A (en) 2021-02-23

Similar Documents

Publication Publication Date Title
CN109344352B (en) Page loading method and device and electronic equipment
CN109618222B (en) A kind of splicing video generation method, device, terminal device and storage medium
CN112395838B (en) Method, device and equipment for synchronously editing object and readable storage medium
CN105808044A (en) Information push method and device
CN110909275B (en) Page browsing method and device and electronic equipment
KR101968977B1 (en) Cartoon providing system, cartoon providing device and cartoon providing method
TW201516968A (en) Browser-based image processing
CN110766772A (en) Flatter-based cross-platform poster manufacturing method, device and equipment
US20150033117A1 (en) Information processing device, information processing method, and program
CN110909274B (en) Page browsing method and device and electronic equipment
CN110782387A (en) Image processing method and device, image processor and electronic equipment
US11190653B2 (en) Techniques for capturing an image within the context of a document
KR101679791B1 (en) Display opperation system using mobile apparatus and method thereof
CN114047864A (en) Special effect data packet generating and displaying method, device, equipment, medium and product
CN114528816B (en) Collaborative editing information display method and device, electronic equipment and readable medium
KR20220036016A (en) Solution for making of art gallery employing virtual reality
CN108255917B (en) Image management method and device and electronic device
CN115756452A (en) Target page code generation method, device, storage medium and program product
CN111475664B (en) Object display method and device and electronic equipment
CN111506841B (en) Webpage display method, device, equipment and readable storage medium
US20150055869A1 (en) Method and apparatus for providing layout based on handwriting input
US20140380194A1 (en) Contents sharing service
CN115408763A (en) BIM platform-based component generation method
CN103488451A (en) Information input method and device
CN113947450A (en) Rendering method and editing method and device of seating chart and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant