CN106896826B - Motion scene digital synthesis system and method - Google Patents
- Publication number
- CN106896826B (application CN201710107056.0A)
- Authority
- CN
- China
- Prior art keywords
- equipment
- motion
- server
- field
- data
- Prior art date
- Legal status: Active
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/04—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention provides a motion scene digital synthesis system and method, relating to the technical field of motion scene digital synthesis. The system comprises a field acquisition subsystem, a flying object acquisition subsystem, an equipment acquisition subsystem and a server. The field acquisition subsystem is used for establishing a field model to obtain field range parameters; the equipment acquisition subsystem is used for acquiring equipment attitude data according to the field range parameters; the flying object acquisition subsystem is used for acquiring flying object trajectory data according to the field range parameters; and the server is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data. Without affecting the independent use of each subsystem, the invention establishes the association among the specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of the flying object, and synthesizes the data of the three acquisition subsystems into a whole.
Description
Technical Field
The invention relates to the technical field of motion scene digital synthesis, in particular to a motion scene digital synthesis system and a motion scene digital synthesis method.
Background
Existing digital reconstruction of a competition or training process is fragmented: a specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of a flying object cannot be synthesized together, which makes it difficult for participants or other applications to analyze the motion data.
Disclosure of Invention
In view of the above, the present invention provides a motion scene digital synthesis system and method, which establish the association among a specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of a flying object without affecting the independent use of each subsystem, and synthesize the data of the three acquisition subsystems into a whole.
In a first aspect, an embodiment of the present invention provides a motion scene digital synthesis system, including a field acquisition subsystem, a flying object acquisition subsystem, an equipment acquisition subsystem and a server;
the field acquisition subsystem is connected with the flying object acquisition subsystem and the equipment acquisition subsystem respectively, and is used for establishing a field model to obtain field range parameters;
the flying object acquisition subsystem is connected with the server and is used for acquiring flying object trajectory data according to the field range parameters;
the equipment acquisition subsystem is connected with the server and is used for acquiring equipment attitude data according to the field range parameters;
and the server is connected with the field acquisition subsystem and is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the field acquisition subsystem includes a matching terminal and a field configuration module;
the field configuration module is used for setting the matching terminal and the calibration point, obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point, and obtaining the field range parameters according to the first corner coordinate and the standard length and width of the field;
the matching terminal is used for reading the characteristic information of the equipment acquisition device and sending the characteristic information to the server, so as to establish the connection between the equipment acquisition device and the server.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where the flying object acquisition subsystem is further configured to receive the system unified clock sent by the server to complete time synchronization, acquire the flying object trajectory data according to the field range parameters, and upload the flying object trajectory data to the server in real time.
With reference to the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where the equipment acquisition subsystem includes an equipment acquisition device and a mobile device;
the mobile device is used for displaying the characteristic information of the equipment acquisition device, receiving the system unified clock and the calibration point sent by the server, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and the equipment acquisition device is used for acquiring the equipment attitude data according to the field range parameters.
With reference to the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, wherein the motion data set is obtained by synthesizing the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
The embodiment of the invention provides a motion scene digital synthesis system, which comprises a field acquisition subsystem, a flying object acquisition subsystem, an equipment acquisition subsystem and a server. The field acquisition subsystem is used for establishing a field model to obtain field range parameters; the equipment acquisition subsystem is used for acquiring equipment attitude data according to the field range parameters; the flying object acquisition subsystem is used for acquiring flying object trajectory data according to the field range parameters; and the server is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data. Without affecting the independent use of each subsystem, the invention establishes the association among the specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of the flying object, and synthesizes the data of the three acquisition subsystems into a whole.
In a second aspect, an embodiment of the present invention further provides a motion scene digital synthesis method, including:
establishing a field model to obtain field range parameters;
acquiring flying object trajectory data and equipment attitude data respectively according to the field range parameters;
synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set;
and processing the motion data set to obtain motion live data.
With reference to the second aspect, an embodiment of the present invention provides a first possible implementation manner of the second aspect, where the establishing a field model to obtain field range parameters includes:
setting a matching terminal and a calibration point, and obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point;
and obtaining the field range parameters according to the first corner coordinate and the standard length and width of the field.
With reference to the second aspect, an embodiment of the present invention provides a second possible implementation manner of the second aspect, where the method further includes:
connecting to the server, so that the server sends the system unified clock to the flying object acquisition subsystem to complete time synchronization;
and acquiring the flying object trajectory data according to the field range parameters, and uploading the flying object trajectory data to the server in real time.
With reference to the second aspect, an embodiment of the present invention provides a third possible implementation manner of the second aspect, where the method further includes: reading characteristic information of an equipment acquisition device, and sending the characteristic information to the server to establish the connection between the equipment acquisition device and the server;
receiving the system unified clock and the calibration point sent by the server, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and acquiring the equipment attitude data according to the field range parameters, and uploading the equipment attitude data to the server.
With reference to the second aspect, an embodiment of the present invention provides a fourth possible implementation manner of the second aspect, wherein the motion data set is obtained by synthesizing the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
The invention provides a motion scene digital synthesis method, which comprises the following steps: establishing a field model to obtain field range parameters; acquiring flying object trajectory data and equipment attitude data respectively according to the field range parameters; synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set; and processing the motion data set to obtain motion live data. Without affecting the independent use of each subsystem, the invention establishes the association among the specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of the flying object, and synthesizes the data of the three acquisition subsystems into a whole.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of a motion scene digital synthesis system according to an embodiment of the present invention;
fig. 2 is a schematic diagram of establishing a field model in a motion scene digital synthesis system according to an embodiment of the present invention;
fig. 3 is a schematic view of an application scenario of a motion scene digital synthesis system according to an embodiment of the present invention;
fig. 4 is a flowchart of a motion scene digital synthesis method according to a second embodiment of the present invention.
Icon:
10 - field acquisition subsystem; 20 - flying object acquisition subsystem; 30 - equipment acquisition subsystem; 40 - server.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Existing digital reconstruction of a competition or training process is fragmented: a specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of a flying object cannot be synthesized together, which makes it difficult for participants or other applications to analyze the motion data.
Based on this, the motion scene digital synthesis system and method provided by the embodiments of the invention establish the association among a specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of a flying object without affecting the independent use of each subsystem, and synthesize the data of the three acquisition subsystems into a whole.
To facilitate understanding of the present embodiment, the motion scene digital synthesis system disclosed in the embodiments of the present invention is first described in detail.
Embodiment one:
fig. 1 is a schematic structural diagram of a motion scene digital synthesis system according to an embodiment of the present invention.
Referring to fig. 1, the motion scene digital synthesis system includes a field acquisition subsystem 10, a flying object acquisition subsystem 20, an equipment acquisition subsystem 30, and a server 40;
the field acquisition subsystem 10 is connected with the flying object acquisition subsystem 20 and the equipment acquisition subsystem 30 respectively, and is used for establishing a field model to obtain field range parameters;
the flying object acquisition subsystem 20 is connected with the server 40 and is used for acquiring flying object trajectory data according to the field range parameters;
the equipment acquisition subsystem 30 is connected with the server 40 and is used for acquiring equipment attitude data according to the field range parameters;
and the server 40 is connected with the field acquisition subsystem 10 and is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data.
Specifically, compared with conventional single-source digital reconstruction, the field acquisition subsystem 10, the flying object acquisition subsystem 20 and the equipment acquisition subsystem 30 of the embodiment of the invention can each be used independently; on that basis, an association among them is established so that their data are combined into a whole, which facilitates analysis of the motion data and other applications.
Further, the field acquisition subsystem 10 includes a matching terminal (not shown) and a field configuration module (not shown);
the field configuration module is used for setting the matching terminal and the calibration point, obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point, and obtaining the field range parameters according to the first corner coordinate and the standard length and width of the field;
the matching terminal is used for reading the characteristic information of the equipment acquisition device and sending the characteristic information to the server 40, so as to establish the connection between the equipment acquisition device and the server 40.
Here, the field configuration module in the field acquisition subsystem 10 is responsible for digitally modeling the venue and the field, setting the initial-position calibration point, the field calibration points and the field range coordinates, and also for manually measuring and recording information such as the field type, serial number, position and distance. The matching terminal confirms the characteristic information of the equipment acquisition device by scanning and sends it to the server 40, so as to realize equipment time-axis synchronization and setting of the equipment's initial position; at the same time, the server 40 confirms the association of the equipment acquisition device with the venue. As an optional implementation, the matching terminal comprises a scanner, and the characteristic information of the equipment acquisition device may be embodied as a two-dimensional code, a bar code, a digital code, a wireless RF sensing chip, or the like;
specifically, as shown in fig. 2, fig. 2 is a schematic view of establishing a field model of a sports scene digital synthesis system according to an embodiment of the present invention; the field type and the location of the index point in fig. 2 are examples, and the location of the index point is not limited to that shown in fig. 2, and may be any fixed location, such as a suitable location on the same horizontal plane beside the field, as long as the location is calibrated;
specifically, referring to fig. 2, the index point Gn is shaped like the footprint pattern in the upper right corner of fig. 2: the two feet face the matching terminal, and the direction is a positive direction and is perpendicular to a side line of the field; the center position between the soles, namely the gravity center is marked as Gn, the coordinate is set as Gn (0, 0), and a matching terminal Gn (n is shown as the nth mark point) is arranged on the vertical surface (the wall surface) where the mark point of each field is located;
further, the length and width of the field Dm (m represents the mth field) are a and B, respectively, and the field range is a rectangular area formed by { Dm (n, 1), Dm (n, 2), Dm (n, 3), Dm (n, 4) }.
Specifically, the calibration point corresponding to the field m is Gn (0, 0), and the coordinate relative coordinate offset of the distance Dm (n, 1) is (a, B), so that the coordinate of Dm (n, 1) can be determined as (-a, B), and then according to the type of the field and the standard length and width dimensions a and B of the field, the other coordinates of the field Dm relative to the boundary of Gn are obtained: dm (n, 2), DmD (n, 3), Dm (n, 4), and determining that the final site range is Dm (x, y) { (-x is less than or equal to a-a), (B is less than or equal to y is less than or equal to B + B };
in other words, according to the initial coordinates of the calibration point, the first corner coordinates (-a, b) nearest to the calibration point Gn (0, 0) are obtained, according to the first corner coordinates (-a, b) and the standard length and width of the field, the second corner coordinates, the third corner coordinates and the fourth corner coordinates are obtained, and according to the first corner coordinates, the second corner coordinates, the third corner coordinates and the fourth corner coordinates, the field range parameters are obtained;
here, the field shape differs depending on the type of sport, and is not limited to a quadrangle;
here, one site may correspond to a plurality of calibration points, or a plurality of sites may share one calibration point and a matching terminal. For a site corresponding to multiple calibration points, the operations in fig. 2 are repeated to establish multiple sets of site range parameters.
Further, the flying object acquisition subsystem 20 is also configured to receive the system unified clock sent by the server 40 to complete time synchronization, acquire the flying object trajectory data according to the field range parameters, and upload the flying object trajectory data to the server 40 in real time.
Specifically, when the flying object acquisition subsystem 20 is started, it connects to the server 40, receives the system unified clock t0 sent by the server 40, and resets its own time to the system unified clock. Clock synchronization is performed automatically each time the flying object acquisition subsystem 20 starts, which improves the accuracy of time synchronization.
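A minimal sketch of this clock synchronization, assuming only that the unified clock t0 arrives as a single timestamp (the patent does not specify the message format or any of the names used below), is:

```python
import time

# Minimal sketch: keep the offset between the server's unified clock t0 and the local
# monotonic clock, and stamp every trajectory sample with the corrected system-wide time.

class FlyingObjectAcquisitionClock:
    def __init__(self) -> None:
        self.offset = 0.0

    def synchronize(self, t0: float) -> None:
        """Reset to the system unified clock received from the server at startup."""
        self.offset = t0 - time.monotonic()

    def now(self) -> float:
        """Current time expressed on the system unified clock."""
        return time.monotonic() + self.offset

clock = FlyingObjectAcquisitionClock()
clock.synchronize(t0=1_700_000_000.0)                 # value sent by the server on startup
sample = {"t": clock.now(), "xyz": (1.0, 2.0, 3.0)}   # timestamped trajectory point
print(sample)
```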
Further, the equipment acquisition subsystem 30 includes an equipment acquisition device (not shown) and a mobile device (not shown);
the mobile device is used for displaying the characteristic information of the equipment acquisition device, receiving the system unified clock and the calibration point sent by the server 40, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and the equipment acquisition device is used for acquiring the equipment attitude data according to the field range parameters.
The mobile device may be, for example, a smart band, but is not limited to such a device;
specifically, positioning matching is the key to establishing the data association, and the offline matching process is as follows: the user wears the mobile device and the equipment acquisition device, starts the equipment acquisition subsystem 30, stands on the calibration point as required, and brings the mobile device close to the matching terminal; the matching terminal reads the characteristic information of the equipment acquisition device displayed by the mobile device and sends it to the server 40; the server 40 then confirms the access of the equipment acquisition device and sends the system unified clock and the calibration point to the mobile device; and the mobile device resets the clock, initial coordinate, initial orientation and initial time of the equipment acquisition subsystem 30 accordingly.
By matching the characteristic information of the equipment acquisition device offline, the system can objectively record how many times and when a user participates, and prevents erroneous records caused by purely online operation or spoofed operations generated with uniform codes;
in addition, offline matching of the characteristic information of the equipment acquisition device is beneficial to positioning and calibration; in terms of current detection technology, it is a comparatively simple and practical short-range active calibration method.
Here, as shown in fig. 3, fig. 3 is a schematic view of an application scenario of a motion scene digital synthesis system according to an embodiment of the present invention, in which the field acquisition subsystem includes a field configuration module A and a matching terminal G; the equipment acquisition subsystem includes a human motion posture acquisition device C and a handheld equipment acquisition device D; and the flying object acquisition subsystem acquires the trajectory data of the flying object B through a dual camera E fixed above the middle of the field and uploads the trajectory data wirelessly to the server F. The human motion posture acquisition device C is responsible for acquiring the posture of the human body, and the handheld equipment acquisition device D is responsible for acquiring the posture data of the handheld equipment; paired with the mobile device, they form a personal equipment set, and the mobile device uploads the equipment posture data to the server F;
here, the equipment posture data includes the posture data of the human body and the posture data of the handheld equipment.
Further, the motion data set is synthesized from the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
Specifically, the equipment acquisition subsystem and the flying object acquisition subsystem respectively acquire timestamped equipment attitude data and flying object trajectory data and upload them to the server; the server combines the flying object trajectory data and all (possibly several sets of) equipment attitude data from the same field in the same time period into one motion data set, completing the data synthesis; and the motion data set is then processed and restored into a digital representation of the live motion, for participants to analyze or for other applications.
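As an illustration of this synthesis step, the sketch below groups timestamped flying object samples and equipment attitude samples that belong to the same field and fall in the same time period into one motion data set; the record layout, field names and the one-second period are assumptions, not the patent's own data format:

```python
from collections import defaultdict

# Hypothetical record layout: each sample carries a timestamp t on the system unified
# clock, the id of the field it belongs to, and a payload (trajectory point or attitude).
# Samples from the same field whose timestamps fall in the same period are merged into
# one motion data set.

def synthesize(trajectory_samples, attitude_samples, period_s=1.0):
    """Merge timestamped flying object trajectory data and equipment attitude data into
    motion data sets keyed by (field id, time period index)."""
    motion_sets = defaultdict(lambda: {"trajectory": [], "attitude": []})
    for kind, samples in (("trajectory", trajectory_samples), ("attitude", attitude_samples)):
        for s in samples:
            key = (s["field_id"], int(s["t"] // period_s))
            motion_sets[key][kind].append(s)
    return dict(motion_sets)

trajectory = [{"t": 10.02, "field_id": "D1", "xyz": (1.0, 2.0, 3.1)},
              {"t": 10.55, "field_id": "D1", "xyz": (1.4, 2.2, 2.8)}]
attitude = [{"t": 10.10, "field_id": "D1", "device": "C", "quat": (1.0, 0.0, 0.0, 0.0)},
            {"t": 10.60, "field_id": "D1", "device": "D", "quat": (0.9, 0.1, 0.0, 0.4)}]

# Each value is one motion data set for one field and one time period; downstream
# processing restores it into a digital representation of the live motion for analysis.
for key, data in synthesize(trajectory, attitude).items():
    print(key, len(data["trajectory"]), len(data["attitude"]))
```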
The embodiment of the invention provides a motion scene digital synthesis system, which comprises a field acquisition subsystem, a flying object acquisition subsystem, an equipment acquisition subsystem and a server. The field acquisition subsystem is used for establishing a field model to obtain field range parameters; the equipment acquisition subsystem is used for acquiring equipment attitude data according to the field range parameters; the flying object acquisition subsystem is used for acquiring flying object trajectory data according to the field range parameters; and the server is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data. Without affecting the independent use of each subsystem, the invention establishes the association among the specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of the flying object, and synthesizes the data of the three acquisition subsystems into a whole.
Embodiment two:
fig. 4 is a flowchart of a digital synthesis method for a motion scene according to a second embodiment of the present invention.
As shown in fig. 4, the digital synthesis method for a motion scene includes the following steps:
step S110, establishing a field model to obtain field range parameters;
step S120, acquiring flying object trajectory data and equipment attitude data respectively according to the field range parameters;
step S130, synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set;
step S140, processing the motion data set to obtain motion live data.
Specifically, in the above embodiment of the method for digitally synthesizing a motion scene, step S110 may be implemented by the following steps, including:
step S210, setting a matching terminal and a calibration point, and obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point;
and step S220, obtaining a field range parameter according to the first corner coordinate and the standard length and width of the field.
Further, the digital synthesis method for the motion scene further includes:
connecting to the server, so that the server sends the system unified clock to the flying object acquisition subsystem to complete time synchronization;
and acquiring the flying object trajectory data according to the field range parameters, and uploading the flying object trajectory data to the server in real time.
Further, the digital synthesis method for the motion scene further includes:
reading the characteristic information of the equipment acquisition device, and sending the characteristic information to the server to establish the connection between the equipment acquisition device and the server;
receiving the system unified clock and the calibration point sent by the server, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and acquiring the equipment attitude data according to the field range parameters, and uploading the equipment attitude data to the server.
Further, the motion data set is synthesized from the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
The invention provides a motion scene digital synthesis method, which comprises the following steps: establishing a field model to obtain field range parameters; acquiring flying object trajectory data and equipment attitude data respectively according to the field range parameters; synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set; and processing the motion data set to obtain motion live data. Without affecting the independent use of each subsystem, the invention establishes the association among the specific motion scene, the actions of the athletes, the equipment attitude and the motion trajectory of the flying object, and synthesizes the data of the three acquisition subsystems into a whole.
The motion scene digital synthesis method provided by the embodiment of the invention has the same technical features as the motion scene digital synthesis system provided by the above embodiment, so it can solve the same technical problems and achieve the same technical effects, and details are not repeated herein.
The computer program product of the method and system for digital synthesis of a motion scene provided in the embodiments of the present invention includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiments, and specific implementation may refer to the method embodiments, and will not be described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical connection or an electrical connection; and as a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (8)
1. A motion scene digital synthesis system, comprising: a field acquisition subsystem, a flying object acquisition subsystem, an equipment acquisition subsystem and a server;
the field acquisition subsystem is connected with the flying object acquisition subsystem and the equipment acquisition subsystem respectively, and is used for establishing a field model to obtain field range parameters;
the flying object acquisition subsystem is connected with the server and is used for acquiring flying object trajectory data according to the field range parameters;
the equipment acquisition subsystem is connected with the server and is used for acquiring equipment attitude data according to the field range parameters;
the server is connected with the field acquisition subsystem and is used for synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set, and processing the motion data set to obtain motion live data;
the field acquisition subsystem comprises a matching terminal and a field configuration module;
the field configuration module is used for setting the matching terminal and the calibration point, obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point, and obtaining the field range parameters according to the first corner coordinate and the standard length and width of the field;
and the matching terminal is used for reading the characteristic information of the equipment acquisition device and sending the characteristic information to the server to establish the connection between the equipment acquisition device and the server.
2. The motion scene digital synthesis system according to claim 1, wherein the flying object acquisition subsystem is further configured to receive the system unified clock sent by the server to complete time synchronization, acquire the flying object trajectory data according to the field range parameters, and upload the flying object trajectory data to the server in real time.
3. The motion scene digital synthesis system according to claim 1, wherein the equipment acquisition subsystem comprises an equipment acquisition device and a mobile device;
the mobile device is used for displaying the characteristic information of the equipment acquisition device, receiving the system unified clock and the calibration point sent by the server, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and the equipment acquisition device is used for acquiring the equipment attitude data according to the field range parameters.
4. The motion scene digital synthesis system according to claim 1, wherein the motion data set is synthesized from the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
5. A motion scene digital synthesis method, comprising:
establishing a field model to obtain field range parameters;
acquiring flying object trajectory data and equipment attitude data respectively according to the field range parameters;
synthesizing the flying object trajectory data and the equipment attitude data to obtain a motion data set;
and processing the motion data set to obtain motion live data;
wherein the establishing a field model to obtain field range parameters comprises:
setting a matching terminal and a calibration point, and obtaining the first corner coordinate closest to the calibration point according to the initial coordinate of the calibration point;
and obtaining the field range parameters according to the first corner coordinate and the standard length and width of the field.
6. The motion scene digital synthesis method according to claim 5, further comprising:
connecting to the server, so that the server sends the system unified clock to the flying object acquisition subsystem to complete time synchronization;
and acquiring the flying object trajectory data according to the field range parameters, and uploading the flying object trajectory data to the server in real time.
7. The motion scene digital synthesis method according to claim 5, further comprising:
reading characteristic information of an equipment acquisition device, and sending the characteristic information to the server to establish the connection between the equipment acquisition device and the server;
receiving the system unified clock and the calibration point sent by the server, and resetting the clock, initial coordinate, initial orientation and initial time of the equipment acquisition device according to the system unified clock and the calibration point;
and acquiring the equipment attitude data according to the field range parameters, and uploading the equipment attitude data to the server.
8. The motion scene digital synthesis method according to claim 5, wherein the motion data set is synthesized from the flying object trajectory data and the equipment attitude data collected in the same time period and on the same field.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710107056.0A CN106896826B (en) | 2017-02-27 | 2017-02-27 | Motion scene digital synthesis system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710107056.0A CN106896826B (en) | 2017-02-27 | 2017-02-27 | Motion scene digital synthesis system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106896826A CN106896826A (en) | 2017-06-27 |
CN106896826B true CN106896826B (en) | 2020-04-28 |
Family
ID=59184952
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710107056.0A Active CN106896826B (en) | 2017-02-27 | 2017-02-27 | Motion scene digital synthesis system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106896826B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2180446A1 (en) * | 2008-10-27 | 2010-04-28 | Sony Corporation | Image processing apparatus, image processing method, and program |
CN102223536A (en) * | 2011-06-10 | 2011-10-19 | 清华大学 | Compressed-sensing-based distributed video coding and decoding system and method thereof |
CN103218910A (en) * | 2013-04-10 | 2013-07-24 | 张有彬 | System and method for operation of traffic information |
CN103226838A (en) * | 2013-04-10 | 2013-07-31 | 福州林景行信息技术有限公司 | Real-time spatial positioning method for mobile monitoring target in geographical scene |
CN103258339A (en) * | 2012-02-16 | 2013-08-21 | 克利特股份有限公司 | Real-time compositing of live recording-based and computer graphics-based media streams |
CN103893950A (en) * | 2012-12-27 | 2014-07-02 | 卡西欧计算机株式会社 | Exercise information display system and exercise information display method |
CN105711597A (en) * | 2016-02-25 | 2016-06-29 | 江苏大学 | System and method for sensing local driving environment in front |
CN105749525A (en) * | 2016-04-22 | 2016-07-13 | 江苏卡罗卡国际动漫城有限公司 | Basketball training device based on AR technology |
CN105944354A (en) * | 2016-05-27 | 2016-09-21 | 新乡医学院 | A basketball shooting monitoring auxiliary device |
CN106331823A (en) * | 2016-08-31 | 2017-01-11 | 北京奇艺世纪科技有限公司 | Video playing method and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7629994B2 (en) * | 2006-07-11 | 2009-12-08 | Sony Corporation | Using quantum nanodots in motion pictures or video games |
JP2008269747A (en) * | 2007-04-25 | 2008-11-06 | Hitachi Ltd | Recording and reproducing device |
GB2489675A (en) * | 2011-03-29 | 2012-10-10 | Sony Corp | Generating and viewing video highlights with field of view (FOV) information |
- 2017-02-27 CN CN201710107056.0A patent/CN106896826B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2180446A1 (en) * | 2008-10-27 | 2010-04-28 | Sony Corporation | Image processing apparatus, image processing method, and program |
CN102223536A (en) * | 2011-06-10 | 2011-10-19 | 清华大学 | Compressed-sensing-based distributed video coding and decoding system and method thereof |
CN103258339A (en) * | 2012-02-16 | 2013-08-21 | 克利特股份有限公司 | Real-time compositing of live recording-based and computer graphics-based media streams |
CN103893950A (en) * | 2012-12-27 | 2014-07-02 | 卡西欧计算机株式会社 | Exercise information display system and exercise information display method |
CN103218910A (en) * | 2013-04-10 | 2013-07-24 | 张有彬 | System and method for operation of traffic information |
CN103226838A (en) * | 2013-04-10 | 2013-07-31 | 福州林景行信息技术有限公司 | Real-time spatial positioning method for mobile monitoring target in geographical scene |
CN105711597A (en) * | 2016-02-25 | 2016-06-29 | 江苏大学 | System and method for sensing local driving environment in front |
CN105749525A (en) * | 2016-04-22 | 2016-07-13 | 江苏卡罗卡国际动漫城有限公司 | Basketball training device based on AR technology |
CN105944354A (en) * | 2016-05-27 | 2016-09-21 | 新乡医学院 | A basketball shooting monitoring auxiliary device |
CN106331823A (en) * | 2016-08-31 | 2017-01-11 | 北京奇艺世纪科技有限公司 | Video playing method and device |
Also Published As
Publication number | Publication date |
---|---|
CN106896826A (en) | 2017-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110645986B (en) | Positioning method and device, terminal and storage medium | |
EP3579192B1 (en) | Method, apparatus and device for determining camera posture information, and storage medium | |
CN102171726B (en) | Information processing device, information processing method, program, and information storage medium | |
TW201229962A (en) | Augmenting image data based on related 3D point cloud data | |
CN108114471B (en) | AR service processing method and device, server and mobile terminal | |
JP5067477B2 (en) | Imaging parameter acquisition apparatus, imaging parameter acquisition method, and program | |
CN105074776A (en) | In situ creation of planar natural feature targets | |
CN110544302A (en) | Human body action reconstruction system and method based on multi-view vision and action training system | |
CN104252712A (en) | Image generating apparatus and image generating method | |
WO2018078986A1 (en) | Information processing device, information processing method, and program | |
KR20180094554A (en) | Apparatus and method for reconstructing 3d image | |
CN115568823B (en) | Human body balance capability assessment method, system and device | |
CN114241012B (en) | High-altitude parabolic determination method and device | |
CN109643455A (en) | Camera calibration method and terminal | |
CN112446254A (en) | Face tracking method and related device | |
CN106896826B (en) | Motion scene digital synthesis system and method | |
CN108932055B (en) | Method and equipment for enhancing reality content | |
US20230306636A1 (en) | Object three-dimensional localizations in images or videos | |
CN117095002A (en) | Hub defect detection method and device and storage medium | |
JP6426594B2 (en) | Image processing apparatus, image processing method and image processing program | |
CN116152141A (en) | Target object repositioning method and device, storage medium and electronic device | |
CN111988732A (en) | Multi-user set method and device applied to multi-user set | |
CN116524217B (en) | Human body posture image matching method and device, electronic equipment and storage medium | |
CN112330793A (en) | Obtaining method of ear mold three-dimensional model, earphone customizing method and computing device | |
CN117407480B (en) | Map display method and device based on photoelectric holder |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||