EP3446289A1 - Verfahren und system zur darstellung einer simulationsumgebung - Google Patents
Method and system for displaying a simulation environment (Verfahren und System zur Darstellung einer Simulationsumgebung)
- Publication number
- EP3446289A1 (application EP17726539.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- real
- environment
- image recordings
- simulation
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/003—Simulators for teaching or training purposes for military purposes and tactics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
Definitions
- The present invention relates to a method and a system for displaying a computer-generated simulation environment that simulates a real environment, using a database comprising data for the geospecific mapping of the terrain and the objects of the real environment, the data describing at least one surface course of the real environment.
- Generic methods for representing simulation environments are used in various forms. In particular, but by no means exclusively, such methods are used for training purposes or for surveillance or reconnaissance purposes.
- The respective training purpose, as well as the respective reason for the monitoring, can vary widely.
- Applications from the field of security technology may be mentioned here, in which security forces such as the military carry out monitoring and/or reconnaissance missions, for example in order to prepare a military deployment.
- One application of such methods lies in the deployment preparation of military task forces, who rehearse a military deployment using generic simulation environments and methods for representing them, or who are trained in general behavior during deployment.
- Both the training effect achievable within the simulation environment and the information content, as well as the correct interpretability of the information, depend particularly on the correct embedding of reconnaissance data into the spatial context of the real environment.
- Information, particularly image recordings, relating to a particular section or part of the real environment should not be detached from the remainder of the real environment or viewed, analyzed and/or used for training purposes in isolation, but should be set in context with the rest of the real environment, in order to increase the information content, to allow it to be interpreted better, and to enable more realistic training.
- A disadvantage of known methods for displaying simulation environments is that the database of the simulation environment is based on less recent data, so that current or recent changes are not known and therefore cannot be considered by the user, evaluator or trainees.
- It is the object of the present invention to provide a method and a system for displaying a computer-generated simulation environment that simulates a real environment, having a database comprising data for the geospecific mapping of the real terrain and the real objects within it, the data describing at least one surface course of the real environment, which overcome the disadvantages of the prior art described above.
- This object is achieved, in a method of the type mentioned above, in that real-time image recordings of at least part of the real environment are recorded with a recording device and a part of the simulation environment is displayed on a display, a part of the representation being formed by the real-time image recordings or derived from the real-time image recordings.
- The basic concept of the present invention is therefore that real-time image recordings recorded in the real environment are integrated into a representation of the simulation environment that maps the real environment.
- The part of the real environment captured by the real-time image recordings, and the corresponding part of the displayed simulation environment, thus have a particularly high degree of timeliness.
- Live images are integrated into the representation of the simulation environment, or at least visual content is displayed which is based on live images.
- The method according to the invention therefore also makes it possible to monitor buildings, whereby it can be determined which person is currently moving toward the building, where a person is currently entering the building, and/or where a person is currently leaving the building.
- For example, a representation of a simulation environment can be made which includes a street, a street intersection or one or more properties; real-time image recordings taken of a specific building, or at least of a building front or a section of road, can then be displayed or integrated in the representation of the simulation environment with or without further processing.
- The method enables and improves situational analyses.
- The highly up-to-date information that results from displaying real-time image recordings in the presentation of the simulation environment can provide special advantages: the trainees can be trained within the representation of the simulation environment as if they were currently in the real environment, since real-time image recordings form or underlie a part of the representation. As a result, known simulation systems and methods are extended by a real-time component which enables a better training effect.
- The inventive concept is illustrated for the situation in which the simulation environment is rendered from the point in the simulation environment corresponding to the point of the real environment from which the real-time image recordings are taken. The display direction of the simulation environment is also assumed to coincide with the recording direction of the real-time image recordings.
- The real-time image recordings can then be integrated largely unchanged, without special preprocessing, into the representation of the simulation environment, so that a representation of at least a part of the simulation environment is achieved in which part of the representation is formed by the real-time image recordings.
- The part of the simulation environment that is not shown or represented by the real-time image recordings can already convey a highly realistic impression of the real environment imaged by the simulation environment.
- Methods, devices and systems that are necessary and advantageous for creating the database and generating the simulation environment are described, for example, in German patent application DE 10 2015 120 999.3.
- The content of said patent application is hereby incorporated in full into the present description.
- The simulation environment comprises a high-resolution surface course, in particular with a resolution of less than 10 cm per spatial direction. It can be provided, particularly advantageously, that the terrain and the objects of the simulation environment that map the real environment are spanned by a finely triangulated mesh or height grid. In contrast to other simulation environments, this avoids a complex and error-prone reconstruction of the real environment, in particular of real objects, by means of polygonal models, while enabling an accurate and sharply delineated mapping of the real environment.
- A disadvantage of the known methods and devices for representing a computer-generated simulation environment that simulates a real environment is that data relating to the real environment must first be collected and the database, or the corresponding simulation environment, must subsequently be generated.
- For simulation environments that simulate a real environment, this inevitably means a time lag between the collection of the data describing the real environment and the representation of the simulation environment, and this lag grows the more realistically the simulation environment depicts the real environment.
- The particularly high timeliness of parts of the displayed simulation environment can be used particularly advantageously in pure monitoring missions as well as in simulations for the preparation of a military mission. Temporal changes in the real environment, and their spatial context, can thus be perceived within the simulation environment and made the basis for deployment preparation or taken into account in the monitoring.
- The recording device can be used to record real-time image recordings that image the visible spectrum of electromagnetic waves.
- Alternatively, real-time image recordings of thermal radiation can be taken with the recording device.
- Other frequency or wavelength ranges may also be advantageous.
- A corresponding embodiment of the recording device can accordingly enable the recording of real-time image recordings in the infrared range, in the near-infrared range, in the radar range or in the terahertz range.
- The display position from which the simulation environment is represented is freely selectable or freely changeable. This allows the viewer or user of the simulation environment to move freely in the simulation environment.
- At least one real-time texture is generated from the real-time image recordings for at least part of the real environment, and the at least one real-time texture is projected onto a part of the surface course during the representation of the simulation environment.
- A surface course of the real environment is first generated and subsequently provided with corresponding textures in order to be able to represent, in addition to the purely three-dimensional surface course of the real environment, further properties of the real environment, such as coloring or the like, in the representation of the simulation environment.
- A particular further development of the proposed embodiment consists in converting the recorded real-time image recordings into corresponding real-time textures for at least part of the real environment, namely the part captured by the real-time image recordings, and then projecting these real-time textures, instead of static and possibly outdated textures, onto a part of the surface course as part of the simulation environment. This ultimately enables a correctly positioned and perspectively correct representation of the corresponding parts of the surface course of the simulation environment with temporally highly up-to-date textures.
- A particularly preferred embodiment uses real-time color textures that serve to color the surface course of the simulation environment.
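The texture projection described above can be illustrated with a minimal sketch: each point of the surface course is projected through the recording device's camera model to find which pixel of the real-time image recording textures it. This is not taken from the patent; the simple pinhole model, the function name and all parameters are illustrative assumptions.

```python
# Illustrative sketch (hypothetical, not from the patent): mapping a 3D
# surface point to normalized texture coordinates of a real-time image
# recording, using a basic pinhole camera looking along +z from cam_pos.

def project_to_texture(point, cam_pos, focal, width, height):
    """Return (u, v) texture coordinates in [0, 1] for a surface point,
    or None if the point lies behind the camera or outside the image."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:          # behind the recording device
        return None
    # pinhole projection onto the image plane
    px = focal * x / z + width / 2.0
    py = focal * y / z + height / 2.0
    if not (0 <= px < width and 0 <= py < height):
        return None     # outside the recorded real-time image
    return (px / width, py / height)

# A surface point straight ahead of the camera maps to the image center.
uv = project_to_texture((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 500.0, 640, 480)
```

Points for which `None` is returned would keep their static database texture instead of the real-time texture.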
- The position and/or the orientation of the recording device is also recorded.
- The position of the recording device can be determined on the basis of GPS information and recorded together with the real-time image recordings.
- The orientation of the recording device can be determined during, and together with, the recording of the real-time image recordings, for example by motion sensors, such as magnetic-field sensors or similar devices, and recorded together with the real-time image recordings.
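One way to pair each real-time frame with the pose metadata described above (GPS position plus sensor-derived orientation) is a simple record type. This is a hypothetical sketch; the field names and the bundling format are assumptions, not part of the patent.

```python
# Hypothetical sketch: tagging a real-time image recording with the
# recording device's position (GPS) and orientation (motion sensors).
from dataclasses import dataclass
import time

@dataclass
class FramePose:
    timestamp: float
    lat: float          # GPS latitude, degrees
    lon: float          # GPS longitude, degrees
    alt: float          # altitude, metres
    yaw: float          # heading from a magnetic-field sensor, degrees
    pitch: float
    roll: float

def tag_frame(frame_bytes, pose):
    """Bundle raw image data with its recording pose for transmission
    to the simulation device."""
    return {"pose": pose, "image": frame_bytes}

record = tag_frame(b"...", FramePose(time.time(), 52.52, 13.40, 34.0, 90.0, 0.0, 0.0))
```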
- Distance measurements are taken of distances between a rangefinder arranged in the real environment and a part of the real environment.
- The rangefinder can be arranged on the recording device.
- Alternatively, the rangefinder can be arranged separately, at a different location than the recording device.
- The rangefinder measures distances to the part of the real environment that is also captured by the recording device in the real-time image recordings.
- A further embodiment of the method provides that the real-time image recordings are recorded as 3D image recordings or converted into 3D image recordings.
- A real-time surface texture is created from the 3D real-time image recordings, which is used in addition or as an alternative to the surface course of the database in the representation of the simulation environment.
- The use of 3D real-time image recordings has the advantage that the surface course of the real environment encompassed by the database can not only be extended, in particular textured, with particularly up-to-date two-dimensional image information, but that the surface course itself can also be supplemented or replaced on the basis of the real-time image recordings.
- By recording the real-time image recordings and integrating them into the simulation environment, spatial changes of the real environment, namely changes concerning the surface course of the real environment, can be reflected with particular relevance within the simulation environment.
- The database of the simulation environment, in particular the surface course of the simulation environment, can thus be supplemented or replaced by creating and displaying real-time surface textures.
- The real-time surface textures may, for example, take recent explosion effects into account in the presentation of the simulation environment.
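Supplementing or replacing the stored surface course with 3D real-time measurements can be sketched as an update to a height grid, where each new 3D sample overwrites the older stored height of its raster cell (so that, for instance, a fresh crater replaces the outdated flat terrain). This sketch and its names are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: merging 3D real-time measurements into the
# database's height grid, so recent changes replace the older surface.

def update_height_grid(grid, points, cell_size):
    """grid: dict mapping (ix, iy) raster indices -> height value.
    points: iterable of (x, y, z) samples from 3D real-time recordings.
    The most recent sample wins in each raster cell."""
    for x, y, z in points:
        ix, iy = int(x // cell_size), int(y // cell_size)
        grid[(ix, iy)] = z
    return grid

# Stored surface at 10 m; a real-time sample reports a 7.5 m crater floor
# in cell (0, 0), while cell (1, 0) keeps its stored height.
grid = {(0, 0): 10.0, (1, 0): 10.0}
update_height_grid(grid, [(0.4, 0.2, 7.5)], cell_size=1.0)
```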
- A fundamentally advantageous embodiment of the method according to the invention can provide that a plurality of recording devices record real-time image recordings of at least part of the real environment.
- The parts of the real environment recorded by the individual recording devices may at least partially overlap.
- Alternatively, at least some of the recording devices each record a part of the real environment which is not captured, even partially, by any other recording device.
- The following embodiments address the need to make time-variable which part of the simulation environment is to have particularly high topicality within the representation of a simulation environment that maps the real environment geospecifically. For example, when monitoring a building, it may be particularly useful to record real-time image recordings of one building facade during the day and of a building interior or another building front at night, and to integrate them into the presentation of the simulation environment.
- A first embodiment of the method provides that the recording device changes its position, in particular under remote control, during the recording of the real-time image recordings.
- Alternatively or additionally, the recording device changes the respectively recorded part of the real environment during the recording, that is, it changes its orientation while recording. This makes it possible, within certain limits, to adapt the recorded part of the real environment even when the recording device is stationary.
- In this way, the respective part of the simulation environment which has particularly high topicality can be adapted and selected.
- The use of at least one static and at least one moving recording device can be combined with the recording of three-dimensional real-time image recordings by the recording devices.
- Another particularly advantageous embodiment of the method relates to the representation of the simulation environment from a changeable display position and/or under a changeable display direction.
- A part of the real environment captured by the recording device, and the corresponding part of the simulation environment, may not be visible from the display position and display direction of the simulation environment due to occlusion by another part of the simulation environment.
- For this reason, a further, particularly preferred embodiment of the method provides that a check of the visibility of the real-time image recordings from a variable display position and/or under a variable display direction of the simulation environment is performed.
- For this purpose, the part of the real environment recorded by the recording device is assigned to a corresponding part or section of the simulation environment. It is then checked whether the part of the simulation environment for which the real-time image recordings are taken is completely or at least partially visible from the changeable display position and under the changeable display direction of the representation. A representation taking the corresponding real-time image recordings into account is then made only for the parts of the simulation environment visible from that display position and under that display direction.
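A visibility check of this kind can be sketched as a line-of-sight test against the surface course: terrain heights are sampled along the ray from the display position to the candidate point, and the real-time content is suppressed if any sample occludes the ray. The sampling approach, step count and names below are illustrative assumptions.

```python
# Hypothetical sketch: testing whether a surface point covered by a
# real-time image recording is visible from the current display position,
# by sampling terrain heights along the line of sight.

def is_visible(height_at, eye, target, steps=100):
    """height_at(x, y) returns the terrain height of the surface course;
    eye and target are (x, y, z) points. Returns False if an intermediate
    terrain sample rises above the straight line of sight."""
    ex, ey, ez = eye
    tx, ty, tz = target
    for i in range(1, steps):
        t = i / steps
        x, y = ex + t * (tx - ex), ey + t * (ty - ey)
        line_z = ez + t * (tz - ez)
        if height_at(x, y) > line_z:
            return False   # occluded: do not display the real-time texture
    return True

flat = lambda x, y: 0.0                                # open terrain
ridge = lambda x, y: 5.0 if 4.0 < x < 6.0 else 0.0     # occluding ridge
```

With `flat` terrain the target is visible; the 5 m `ridge` between eye and target blocks a 2 m-high line of sight.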
- The invention further relates to a system for displaying the computer-generated simulation environment simulating a real environment, with a database comprising data for the geospecific mapping of the real environment, the data describing at least one surface course of the real environment, with a simulation device comprising a display for displaying the simulation environment from a changeable display position and/or under a changeable display direction, and with a recording device for recording real-time image recordings of at least part of the real environment, wherein the recording device has a transmission means for the real-time transmission of the image recordings to the simulation device, and the simulation device is set up to display at least a part of the simulation environment on a display, a part of the representation being formed by the real-time image recordings or derived from the real-time image recordings.
- The system according to the invention allows the simulation environment to have a high degree of realism overall and, moreover, a particularly high degree of timeliness in the part of the simulation environment corresponding to the part of the real environment captured by the recording device.
- The basic idea according to the invention is thus realized by providing, in addition to a database based on temporally less recent data relating to the real environment, transmission means for the real-time transmission of image recordings from a recording device arranged in the real environment, and by taking the real-time image recordings transmitted to the simulation device into account in the representation of the simulation environment, so that a part of the representation is formed by the real-time image recordings or derived from them.
- A recording device which generates real-time image recordings in the visible spectral range of the electromagnetic spectrum can be used as the recording device. However, infrared recording devices or radar or terahertz recording devices can also be used advantageously.
- In a further embodiment, the simulation device is set up to generate at least one real-time texture, in particular a real-time color texture, for at least part of the real environment from the real-time image recordings, and to project the real-time texture, in particular the real-time color texture, onto a part of the surface course of the simulation environment during the presentation of the simulation environment.
- The creation of real-time textures and their projection onto the surface course of the simulation environment can also serve the equalization or intentional distortion of the real-time image recordings, for example to compensate for a divergence between the display position and display direction of the simulation environment on the one hand and the recording position and recording direction of the recording device on the other.
- A carrier, which is set up for the transport and/or storage of the recording device, may additionally be provided.
- The carrier can also be set up for holding or fastening the recording device.
- The carrier may comprise, for example, a stand or a tripod.
- Alternatively, the carrier comprises a vehicle or an aircraft, preferably an unmanned vehicle or unmanned aircraft.
- Another particularly preferred embodiment of the system provides a rangefinder located in the real environment, configured to measure distances between the rangefinder and a portion of the real environment, and connected to a transmission means for the real-time transmission of distance measurements to the simulation device.
- The distance measurements are particularly preferably real-time distance measurements.
- Distance measurements from a corresponding rangefinder can be used in conjunction with the real-time image recordings to give a portion of the simulation environment a particularly high degree of timeliness during its presentation.
- The distance measurements of the rangefinder are combined with the real-time image recordings of the recording device and displayed as part of the simulation environment, or a part of the displayed simulation environment is derived from the real-time image recordings and the distance measurements.
- In a further embodiment, the simulation device is set up to generate at least one real-time surface texture of at least part of the real environment from 3D real-time image recordings and to project the real-time surface texture onto a part of the surface course of the simulation environment, or to display the real-time surface texture in place of, or in addition to, a portion of the surface course during the presentation of the simulation environment.
- The 3D real-time image recordings can be generated from a plurality of 2D real-time image recordings that are recorded by one recording device, or by different recording devices, from different positions and under different directions.
- The 3D real-time image recordings can also be generated by means of a recording device and a rangefinder and a corresponding combination of distance measurements and image recordings.
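The combination of a distance measurement with a known recording direction yields a 3D point directly, which is one way to understand how range and image data together produce 3D real-time recordings. The spherical-to-Cartesian conversion below is a standard construction; the function name and angle conventions are assumptions for illustration.

```python
# Hypothetical sketch: converting a rangefinder distance measured along a
# known viewing direction into a 3D point of the real environment.
import math

def range_to_point(cam_pos, yaw_deg, pitch_deg, distance):
    """yaw is the heading about the vertical axis, pitch the elevation
    above the horizon, both in degrees; cam_pos is the rangefinder
    position (x, y, z). Returns the measured 3D point."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)
    return (cam_pos[0] + distance * dx,
            cam_pos[1] + distance * dy,
            cam_pos[2] + distance * dz)

# 10 m measured straight ahead (yaw 0, pitch 0) from 1.5 m above ground.
p = range_to_point((0.0, 0.0, 1.5), 0.0, 0.0, 10.0)
```

Repeating this per pixel, or per measured direction, yields the point samples from which a real-time surface texture can be built.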
- The real-time surface textures for partially supplementing or partially replacing the surface course of the simulation environment have the particular advantage that the spatial representation of the simulation environment, at least for the part of the real environment imaged in the 3D real-time image recordings, has a particularly high temporal quality.
- A static real-time surface texture can be created; in addition or alternatively, a real-time surface texture can be generated in which only the perspective of the image of the real environment changes, a movable mount of the recording device changing the imaged section of the real environment without changing the position of the recording device.
- Furthermore, real-time surface textures can be generated which are recorded from a variable position and in which the imaged area of the real environment changes.
- The system may have a display which is configured as a head-mounted display.
- In this way, a realistic representation of the simulation environment can be made possible in a particularly advantageous manner.
- A user of the system can influence the display position and/or the display direction of the presentation of the simulation environment by means of a user interface.
- Fig. 1 is a schematic representation of a system according to the invention;
- Fig. 2 is a schematic diagram relating to the process steps for generating real-time textures.
- Fig. 1 shows components of a system which can be used in carrying out the method according to the invention.
- Fig. 1 shows a simulation device 1 comprising a user interface 2, a first display 3, a second display 4, a cable connection 5 for physically connecting the simulation device 1 to a data memory, and a wireless transmission means 6 for the real-time transmission of information, in particular real-time image recordings 27, to the simulation device.
- The simulation device may have a database generator 7, a database 22 and a display device 8.
- The image recordings 12 recorded by a sensor device 9 and its sensor 11 cover, for example, the entire real environment 10 to be mapped by the simulation environment, including the real terrain 15 and the real objects 16 located in the real terrain 15. Based on the image recordings 12 of the sensors 11 of the sensor device 9, the data of the database 22 are generated, which allow the geospecific mapping of the real environment and describe at least one surface course of the real environment.
- Each of the recorded image recordings 12 is assigned a GPS location coordinate as well as a course specification and a speed indication.
- Extrinsic and intrinsic data relating to the image recordings 12 are stored with the image recordings 12. These can, for example, be stored in the form of metadata together with the respective image recordings 12 or with the image data of the respective image recordings 12.
- The intrinsic properties of the image recordings 12 may, for example, be properties of the sensor 11.
- Based on the positions, directions, velocities and the information on the sensor 11 associated with the image recordings 12, it is possible to generate geospecific, or optionally also georeferenced, simulation environments from the image recordings 12, whose databases each contain data for the geospecific and/or georeferenced mapping of the real environment and at least one surface course of the real environment 10. In order to generate a complete database for the geospecific mapping of the real environment 10, it can be provided that the sensor device 9 completely flies over the real environment 10 along the route 20 at least once.
- Even if the sensor device 9 is equipped with a wireless transmission device 17 for transmitting information to the simulation device 1 already during the overflight over the real environment 10 along the route 20, it still takes a certain time to generate, from the image recordings 12, a database on the basis of which the simulation environment can be displayed. Accordingly, a presentation of the simulation environment based only on the image recordings 12 or comparable sensor data always suffers from a certain time lag or lack of timeliness.
- The system provides that at least one recording device for recording real-time image recordings is arranged in the real environment 10.
- The system comprises a first recording device 14 and a second recording device 19.
- The first recording device 14 is designed as a stereoscopic recording device and is thus able to generate 3D real-time image recordings 27 and to transfer them to the simulation device 1 by means of a transmission means 21.
- The first recording device 14 is arranged on a carrier 18 in the form of a tripod.
- The second recording device 19 is likewise set up to record real-time image recordings 27.
- The second recording device 19 comprises a rangefinder (not shown in Fig. 1).
- The second recording device 19 likewise comprises a carrier 18 which, in the case of the second recording device 19, however, comprises an unmanned aircraft 23.
- The configuration of the carrier 18 of the second recording device 19 as an unmanned aerial vehicle 23 makes it possible for the position and orientation of the second recording device 19 in the real environment 10 to be changed, in particular by remote control.
- The aircraft 23 is designed as an unmanned aircraft 23 which also has the ability to remain in flight at one and the same position. As a result, the position and orientation of the recording device 19 can both be changed and kept constant with the aircraft 23.
- The second recording device 19 is also connected to a transmission means 21 for transmitting the real-time image recordings 27 to the simulation device 1.
- By means of the real-time image recordings 27 recorded with the recording devices 14, 19, the disadvantages described above, which result from the time required between the recording of the image recordings 12 and the representation of the simulation environment by means of the simulation device 1, can be eliminated for at least part of the real environment.
- The real-time image recordings 27 can be transmitted to the simulation device 1 and displayed or integrated into the representation of the simulation environment, or can at least influence the representation of the simulation environment.
- the position and orientations of the receiving devices 14, 19 are transmitted to the simulation device 1. Similar to the image recordings 12 of the sensor 11, it is furthermore particularly advantageous if the properties or data pertinent to the recording devices 14, 19 are transmitted to the simulation device 1 or are known based on an identification of the respective recording device 14, 19 of the simulation device 1.
- the real-time image recordings 27 taken by the recording devices 14, 19 can be transferred from the reference system of the real environment into the reference environment of the simulation environment, which forms the basis for the real-time image recordings 27 or optical images derived from the real-time image recordings Contents are displayed or displayed in the context of the presentation of the simulation environment.
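The transfer of a recording device's position and orientation from the real-environment reference system into the simulation reference system can be sketched, for example, as a rigid coordinate transform. This is purely illustrative: the function names and the assumption that the two reference systems are related by a known rotation matrix R and translation vector t are not taken from the patent.

```python
def matvec(M, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

def real_to_sim_pose(pos_real, dir_real, R, t):
    """Transfer the position and viewing direction of a recording device
    from the real-environment reference system into the reference system
    of the simulation environment. Assumption (not from the patent): the
    two systems are related by a rigid transform (rotation R, translation t)."""
    pos_sim = [p + o for p, o in zip(matvec(R, pos_real), t)]
    dir_sim = matvec(R, dir_real)  # directions rotate but are not translated
    return pos_sim, dir_sim
```

With the identity rotation and a translation of [1, 0, 0], for example, a device at position [2, 3, 4] maps to [3, 3, 4] while its viewing direction is unchanged.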
- the real-time image recordings 27 generated by the recording devices 14 or 19 can, as far as possible, be integrated unprocessed into the representation of the simulation environment.
- since the display position and/or the display direction of the simulation environment is freely selectable during the display, the system according to the invention or the method according to the invention must ensure that graphic content derived from the real-time image recordings 27, which is to be displayed together with the simulation environment, does not adversely affect or distort the representation of the simulation environment.
- suitable measures include, among other things, a rectification or deliberate distortion of the real-time image recordings 27 and an examination of whether the parts of the real environment or simulation environment recorded by the recording devices 14, 19 are visible from the respective display position and/or display direction of the simulation environment.
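The visibility examination mentioned above can be sketched as a simple ray march from the display position towards a recorded surface point, testing whether the surface course rises above the line of sight. This is only an illustration; the function name, the `height_at` callback and the fixed sampling scheme are assumptions, not details given in the patent.

```python
def surface_point_visible(eye, point, height_at, steps=100, eps=1e-3):
    """March along the sight line from the display position 'eye' to a
    surface point and report whether any part of the surface course
    (sampled via height_at(x, y)) occludes it."""
    for k in range(1, steps):
        s = k / steps
        # interpolate a sample position on the sight line
        x = eye[0] + s * (point[0] - eye[0])
        y = eye[1] + s * (point[1] - eye[1])
        z = eye[2] + s * (point[2] - eye[2])
        if height_at(x, y) > z + eps:
            return False  # terrain or an object blocks the line of sight
    return True
```

Only surface parts that pass such a test would then receive graphic content derived from the real-time image recordings.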
- FIG. 2 shows, by way of example, a section of a simulation environment 24 in which real-imaging objects 25 are arranged in a real-imaging terrain 26, the simulation environment 24 as a whole having a surface course 30 which describes the surfaces of the real-imaging objects 25 and of the real-imaging terrain 26.
- Parts of the surface course 30 are, for example, the surfaces 28.1, 28.2 and 28.3.
- the surface course 30 of the simulation environment 24 can be spanned, for example, by a height raster in which each raster point, for example each raster point in the drawing plane of FIG. 2, is assigned a height value, for example perpendicular to the drawing plane of FIG. 2.
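Such a height raster can be represented, for example, as a two-dimensional array of height values. The class below is a minimal sketch under assumed names (uniform cell size, nearest-raster-point lookup); it is not an implementation prescribed by the patent.

```python
class SurfaceCourse:
    """Height raster spanning a surface course: each raster point (i, j)
    in the horizontal plane is assigned a height value perpendicular to it."""

    def __init__(self, heights, cell_size=1.0):
        self.heights = heights      # 2D list: heights[i][j]
        self.cell = cell_size       # raster spacing, e.g. in metres

    def height_at(self, x, y):
        # nearest raster point, clamped to the raster bounds;
        # a real system would likely interpolate between raster points
        i = min(max(int(round(y / self.cell)), 0), len(self.heights) - 1)
        j = min(max(int(round(x / self.cell)), 0), len(self.heights[0]) - 1)
        return self.heights[i][j]
```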
- the image planes 27.1 and 27.2 of real-time image recordings 27 are shown by way of example in FIG. 2.
- the image planes 27.1 and 27.2 arise from the assignment of the position and orientation of the recording devices 14, 19 in the reference system of the simulation environment 24. In itself, however, it is not necessary for carrying out the method according to the invention that the image planes 27.1 or 27.2 are generated or determined. Rather, they serve to illustrate the method steps for integrating or superimposing real-time image recordings in the representation of the simulation environment 24.
- a mapping rule, illustrated by the dotted and dashed lines, makes it possible to project the respective parts of the real-time image recordings 27 with the image planes 27.1 and 27.2 onto the surfaces 28.1 to 28.3 and thus to extend the representation of the simulation environment with content resulting from the real-time image recordings 27.
- the sections 29.1 and 29.2 of the surface course affected by this projection are likewise indicated by the dotted and dashed lines.
- the projection of the real-time image recordings 27 onto the surface course of the simulation environment 24, as sketched in FIG. 2, makes it possible to display the real-time image recordings 27 in the correct position and in the correct perspective in the representation of the simulation environment 24, even if the display position and display direction of the simulation environment differ from the recording position and recording direction of the recording devices.
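A mapping rule of this kind can be sketched with a pinhole camera model: each point of the surface course is transformed into the camera coordinates of the recording device and projected into its image plane, and the pixel found there textures the surface point. The patent does not specify the mapping rule in this detail, so the pinhole model and the parameters f, cx, cy (focal length and principal point) are illustrative assumptions.

```python
def project_to_image(point, cam_pos, R, f, cx, cy):
    """Project a point of the surface course (simulation coordinates) into
    the image plane of a recording device with pose (cam_pos, R), using a
    simple pinhole model. Returns image coordinates (u, v) or None if the
    point lies behind the image plane."""
    # transform into camera coordinates: R * (point - cam_pos)
    d = [point[i] - cam_pos[i] for i in range(3)]
    p = [sum(R[i][k] * d[k] for k in range(3)) for i in range(3)]
    if p[2] <= 0:
        return None  # behind the camera: no projection possible
    u = f * p[0] / p[2] + cx
    v = f * p[1] / p[2] + cy
    return (u, v)
```

Surface points for which this returns coordinates inside the image boundaries can be textured with the corresponding pixel of the real-time image recording; points returning None, or falling outside the image, keep their previous appearance.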
- the real-time image recordings 27 can also be 3D image recordings.
- the projection of the 3D image recordings onto the surfaces 28 of the surface course 30 proceeds analogously to the projection of the 2D image recordings 27, with the difference that the corresponding projections can modify and/or replace the surface course 30.
- In this way, real-time surface textures are formed from the 3D image recordings.
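Replacing parts of the surface course with heights measured in a 3D real-time image recording could look like the following sketch, which writes each measured (x, y, z) point into the nearest raster cell. The function name and the nearest-cell scheme are assumptions for illustration only.

```python
def update_surface_from_3d(heights, points_3d, cell_size=1.0):
    """Overwrite height raster cells with heights measured in a 3D
    real-time image recording (iterable of (x, y, z) points); points
    outside the raster bounds are ignored."""
    for x, y, z in points_3d:
        i = int(round(y / cell_size))
        j = int(round(x / cell_size))
        if 0 <= i < len(heights) and 0 <= j < len(heights[0]):
            heights[i][j] = z
    return heights
```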
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016107251.6A DE102016107251A1 (de) | 2016-04-19 | 2016-04-19 | Verfahren und System zur Darstellung einer Simulationsumgebung |
PCT/DE2017/100230 WO2017182021A1 (de) | 2016-04-19 | 2017-03-22 | Verfahren und system zur darstellung einer simulationsumgebung |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3446289A1 true EP3446289A1 (de) | 2019-02-27 |
Family
ID=58800582
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17726539.4A Withdrawn EP3446289A1 (de) | 2016-04-19 | 2017-03-22 | Verfahren und system zur darstellung einer simulationsumgebung |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3446289A1 (de) |
CA (1) | CA3016345A1 (de) |
DE (1) | DE102016107251A1 (de) |
WO (1) | WO2017182021A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108510939A (zh) * | 2018-04-24 | 2018-09-07 | 中航大(天津)模拟机工程技术有限公司 | 一种基于柔性oled显示屏的仿真成像系统 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050195096A1 (en) * | 2004-03-05 | 2005-09-08 | Ward Derek K. | Rapid mobility analysis and vehicular route planning from overhead imagery |
US7583275B2 (en) * | 2002-10-15 | 2009-09-01 | University Of Southern California | Modeling and video projection for augmented virtual environments |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6080063A (en) * | 1997-01-06 | 2000-06-27 | Khosla; Vinod | Simulated real time game play with live event |
US8190585B2 (en) * | 2010-02-17 | 2012-05-29 | Lockheed Martin Corporation | Supporting multiple different applications having different data needs using a voxel database |
DE102015120999A1 (de) | 2015-12-02 | 2017-06-08 | Krauss-Maffei Wegmann Gmbh & Co. Kg | Verfahren zur Erzeugung und Darstellung einer computergenerierten, eine Realumgebung abbildenden Simulationsumgebung |
- 2016
  - 2016-04-19 DE DE102016107251.6A patent/DE102016107251A1/de not_active Withdrawn
- 2017
  - 2017-03-22 EP EP17726539.4A patent/EP3446289A1/de not_active Withdrawn
  - 2017-03-22 CA CA3016345A patent/CA3016345A1/en not_active Abandoned
  - 2017-03-22 WO PCT/DE2017/100230 patent/WO2017182021A1/de active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7583275B2 (en) * | 2002-10-15 | 2009-09-01 | University Of Southern California | Modeling and video projection for augmented virtual environments |
US20050195096A1 (en) * | 2004-03-05 | 2005-09-08 | Ward Derek K. | Rapid mobility analysis and vehicular route planning from overhead imagery |
Non-Patent Citations (1)
Title |
---|
See also references of WO2017182021A1 * |
Also Published As
Publication number | Publication date |
---|---|
CA3016345A1 (en) | 2017-10-26 |
DE102016107251A1 (de) | 2017-10-19 |
WO2017182021A1 (de) | 2017-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1650534B1 (de) | Verfahren zur Pilotenunterstützung bei Landungen von Helikoptern im Sichtflug unter Brown-Out oder White-Out Bedingungen | |
EP2491530B1 (de) | Bestimmung der pose einer kamera | |
EP2464098B1 (de) | Umfeld-Darstellungsvorrichtung sowie ein Fahrzeug mit einer derartigen Umfeld-Darstellungsvorrichtung und Verfahren zur Darstellung eines Panoramabildes | |
DE102006060045A1 (de) | Sehhilfe mit dreidimensionaler Bilderfassung | |
DE102011075703A1 (de) | Verfahren und Vorrichtung zur Kalibrierung einer Projektionseinrichtung eines Fahrzeugs | |
DE3219032A1 (de) | Stereophotogrammetrisches aufnahme- und auswerteverfahren sowie auswertevorrichtung | |
DE102015120999A1 (de) | Verfahren zur Erzeugung und Darstellung einer computergenerierten, eine Realumgebung abbildenden Simulationsumgebung | |
DE102019216548A1 (de) | Verfahren und mobile Erfassungsvorrichtung zur Erfassung von Infrastrukturelementen eines unterirdischen Leitungsnetzwerks | |
DE102006006001B3 (de) | Verfahren und Anordnung zum Einblenden ortsbezogener Informationen in eine visuelle Darstellung oder Ansicht einer Szene | |
EP2859531B1 (de) | Verfahren zur bildbasierten veränderungserkennung | |
EP3384480A1 (de) | Verfahren zur vorbereitenden simulation eines militärischen einsatzes in einem einsatzgebiet | |
DE19746639A1 (de) | Verfahren zur digitalen Erfassung räumlicher Objekte und Szenen für eine 3D-Bildkarte sowie 3D-Bildkarte | |
WO2017182021A1 (de) | Verfahren und system zur darstellung einer simulationsumgebung | |
DE102014116904A1 (de) | Verfahren zum optischen Abtasten und Vermessen einer Szene und zur automatischen Erzeugung einesVideos | |
EP2831839A1 (de) | Verfahren zum automatischen betreiben einer überwachungsanlage | |
DE102011084596A1 (de) | Verfahren zum Assistieren eines Fahrers in einer fremden Umgebung | |
DE102016103056A1 (de) | Verfahren zum Betrieb einer Anzeigevorrichtung und System zur Anzeige von realen Bildinhalten einer Realumgebung überlagerten virtuellen Bildinhalten | |
DE102012213336A1 (de) | Verfahren und Vorrichtung zur Ergänzung einer digitalen Karte um Höhenangaben an vorbestimmten geografischen Positionen | |
WO2017144033A1 (de) | Verfahren zur ermittlung und darstellung von veränderungen in einer ein reales gelände und darin befindliche reale objekte umfassenden realumgebung | |
DE102016224886B3 (de) | Verfahren und Vorrichtung zur Ermittlung der Schnittkanten von zwei sich überlappenden Bildaufnahmen einer Oberfläche | |
EP2940624A1 (de) | Verfahren zum Erstellen eines dreidimensionalen virtuellen Modells einer Umgebung für Anwendungen zur Positionsbestimmung | |
DE102019102423A1 (de) | Verfahren zur Live-Annotation von Sensordaten | |
DE102015120929A1 (de) | Verfahren zur vorbereitenden Simulation eines militärischen Einsatzes in einem Einsatzgebiet | |
EP3384469A2 (de) | Verfahren zur darstellung einer simulationsumgebung | |
DE102007061273A1 (de) | Verfahren zur Darstellung von Fluginformationen und Anzeigeinstrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20180906 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the european patent | Extension state: BA ME |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| | 17Q | First examination report despatched | Effective date: 20190909 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
| | 18W | Application withdrawn | Effective date: 20191212 |