Step inside your GIS map.

 

Visualize data through the lens of an integrative Earth Systems approach and Synthetic Environment.

Capabilities - 3D, 4D, VR, and AR

 

Real-Time Situational Data within the HUD

· Sliding compass

· Rotational compass with Azimuth

· Mini-map within rotational compass showing top-down view of user location

· Forward-looking direction in degrees

· Pitch in degrees with rotationally dynamic pitch indicators 

· Latitude & Longitude - decimal degrees

· Rangefinder for center crosshair (m)

· Altitude (m) & Elevation (m)

· Frames per second “FPS”

· Frame render latency (ms)  

· Field of view “FOV”

· Component X velocity (m/s)

· Component Y velocity (m/s)

· Component Z velocity (m/s)

· Speed in knots (kn) and (km/h)
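The HUD's speed readouts follow from the component velocities listed above: speed is the magnitude of the velocity vector, converted from m/s to knots and km/h. A minimal sketch (the function name and m/s source unit are assumptions for illustration):

```python
import math

# Standard conversion factors from m/s.
MS_TO_KMH = 3.6          # 1 m/s = 3.6 km/h
MS_TO_KN = 1.943844      # 1 kn = 1852 m/h, so 1 m/s ≈ 1.943844 kn

def hud_speed(vx: float, vy: float, vz: float) -> tuple[float, float]:
    """Return (knots, km/h) from component velocities in m/s."""
    speed_ms = math.sqrt(vx * vx + vy * vy + vz * vz)
    return speed_ms * MS_TO_KN, speed_ms * MS_TO_KMH

# A 3-4-0 velocity triangle gives 5 m/s of total speed.
kn, kmh = hud_speed(3.0, 4.0, 0.0)
```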


OSI Reference Model – THOR Application (Layer 7)

In its current stage, THOR™ rests entirely within the Application layer (Layer 7) of the OSI Model. Originally designed to operate without a network connection, THOR is entirely self-contained from the moment a map is built. However, there are plans to extend THOR to Layer 5 of the OSI Model so that terrain files can be streamed on demand and the location data of tracked targets can be updated (and visualized) in real time. THOR's framework is structured so that sending and receiving data is possible, though this capability has not yet been fully implemented.
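As a purely hypothetical illustration of what on-demand terrain streaming could involve, a tile request can be framed as a small fixed-size binary message. The wire format below is an assumption made for this sketch, not THOR's actual protocol:

```python
import struct

# Hypothetical wire format for a terrain-tile request (illustrative only):
# network byte order, u8 message type, f64 lat, f64 lon, u8 zoom level.
REQUEST_FMT = "!BddB"
MSG_TILE_REQUEST = 0x01

def pack_tile_request(lat: float, lon: float, zoom: int) -> bytes:
    """Serialize a tile request for a lat/lon in decimal degrees."""
    return struct.pack(REQUEST_FMT, MSG_TILE_REQUEST, lat, lon, zoom)

def unpack_tile_request(payload: bytes):
    """Deserialize a tile request back into its fields."""
    return struct.unpack(REQUEST_FMT, payload)

msg = pack_tile_request(38.8977, -77.0365, 3)  # 18-byte message
```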

DEM Data Sources

Elevation data are derived from ground-based and aerial LIDAR, IfSAR, drones, manually built DEMs, satellite-derived DTMs and DSMs, side-scan sonar, and bathymetric data.

All spatial resolutions are supported; resolutions from sub-millimeter to kilometer scale have been tested.

Point Cloud Data Sources

Airborne LIDAR, terrestrial/drone laser scanners, and Structure from Motion / photogrammetry. Up to one trillion points have been rendered simultaneously within the same scene. All spatial densities work, as THOR™ is resolution independent.

Subterranean Data

Data derived from drones or 3D models can be conflated with other data sources to allow insight into obscured or otherwise inaccessible areas. For example, point clouds of caves and tunnels can be combined with 3D-modeled subterranean structures, surficial DEMs, imagery, and surface models to create a comprehensive view of the environment.

Built-in & Projected Overlays

Overlays such as heatmaps or satellite imagery can be custom built or supplied to be overlaid atop any section of terrain. These overlays can be streamed or built into the base package. *streaming still in beta

Landscapes can be textured with any virtual material upon request. Conflated data layers can combine to form lightweight projected overlays such as height map gradients, terrain inclination, gravity, geology, soil type, and merged topographic quadrangles, etc.

Overlays available on command by the user include but are not limited to: grayscale or colored Fresnel; full or partial wireframe; single, dual, or multicolor gradient heightmaps; static or dynamically lit polygonal mesh; all flat surfaces highlighted; fog; drivable terrain via slope analysis; etc.
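The "drivable terrain via slope analysis" overlay can be illustrated with a minimal slope computation on a gridded DEM using central differences; the toy grid, cell size, and drivability threshold below are assumptions for illustration:

```python
import math

# Toy DEM: elevations (m) on a regular grid with 10 m cell spacing (assumed).
dem = [
    [100.0, 101.0, 103.0],
    [100.0, 102.0, 106.0],
    [100.0, 103.0, 110.0],
]
CELL = 10.0          # grid spacing in meters (assumed)
MAX_SLOPE_DEG = 30.0  # hypothetical drivability threshold

def slope_deg(dem, r, c, cell):
    """Slope in degrees at an interior cell via central differences."""
    dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cell)
    dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2 * cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

# Mark the center cell drivable if its slope is under the threshold.
drivable = slope_deg(dem, 1, 1, CELL) <= MAX_SLOPE_DEG
```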

Gravity & Drag

Gravity and drag coefficients can be manually altered or they can be tied to atmospheric and environmental conditions. *dynamic alterations still in beta.
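Tying drag to atmospheric conditions can be sketched with the standard drag equation, F = ½ρv²C_dA, where air density ρ varies with the environment; the specific densities and object parameters below are illustrative, not THOR's internal values:

```python
def drag_force(rho: float, v: float, cd: float, area: float) -> float:
    """Drag force (N): 0.5 * air density (kg/m^3) * speed^2 (m/s) * Cd * area (m^2)."""
    return 0.5 * rho * v * v * cd * area

# Same object, same speed, different atmospheres:
f_sea = drag_force(1.225, 20.0, 0.5, 2.0)  # sea-level air density
f_alt = drag_force(0.90, 20.0, 0.5, 2.0)   # thinner air (~3 km altitude)
```

Scaling the `rho` input from live atmospheric data is one way a drag coefficient's effect could be made environment-dependent.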

Waypoints & Text

Users can place high-visibility waypoints (pins) to mark locations of interest and spawn customizable text that sits atop the waypoint. Pins can be placed at the user's location or at a specified lat/lon. Text rotates about all three axes to always face the user.

Line Segments

Users can draw line segments along a user-selected AB path for lineament mapping and route planning. These lines can cut through the landscape or be drawn at a set height (m) above it. Lines are movable after being placed.

Measurements

Users can extract elevation profiles from the landscape between any two points into a .txt document. The total number of points to extract is user selectable and is not limited by the resolution of the source data. Extracted elevation profiles may be upsampled while maintaining the error of the source data. Extracted data can be easily viewed and manipulated in common programs such as MS Excel.
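Upsampling a profile beyond the source resolution can be sketched as sampling the DEM at evenly spaced points along the A-to-B segment with bilinear interpolation; the helper names and tiny grid are illustrative, not THOR's implementation:

```python
def bilinear(dem, x, y):
    """Elevation at fractional grid coords (x, y) via bilinear interpolation."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(dem[0]) - 1)
    y1 = min(y0 + 1, len(dem) - 1)
    fx, fy = x - x0, y - y0
    top = dem[y0][x0] * (1 - fx) + dem[y0][x1] * fx
    bot = dem[y1][x0] * (1 - fx) + dem[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def profile(dem, a, b, n):
    """n evenly spaced elevation samples along segment a -> b (grid coords)."""
    (ax, ay), (bx, by) = a, b
    samples = []
    for i in range(n):
        t = i / (n - 1)
        samples.append(bilinear(dem, ax + t * (bx - ax), ay + t * (by - ay)))
    return samples

# A 2-column grid upsampled to 5 profile points between its corners.
dem = [[0.0, 10.0], [0.0, 10.0]]
samples = profile(dem, (0, 0), (1, 0), 5)
# One elevation per line is enough for a .txt export readable in Excel:
txt = "\n".join(f"{e:.2f}" for e in samples)
```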

Pre-Rendered Cross Sections

Pre-rendered 2D cross sections containing relevant information can be placed in their proper locations beneath the map to provide better insight into subterranean composition. These 2D cross sections are viewable from beneath the map and can be moved vertically above the landscape on command to aid conceptual understanding. Cross sections can be animated to show change vs. time with "playback" controls easily accessible to the user.

Flying / Walking / Driving

Users can switch between first-person walking and flying modes to better understand the data and terrain provided. The relative speed of the "flying" mode can be scaled, and the user can choose to be affected by gravity in either mode. Users can also enter a driving mode in which they move across the terrain in a vehicle.

Viewshed Analysis Tools

The viewshed tools insert colored light sources to aid in line-of-sight signal mapping. Colors mix to show areas of signal overlap. Line-of-sight can be displayed within a 2D plane or 3D space depending on the use case. A spherical point cloud with associated sight lines can be toggled.
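The overlap coloring can be illustrated as clamped additive RGB mixing: where two coverage areas intersect, their color contributions sum. The source colors and scenario below are illustrative:

```python
def mix(colors):
    """Clamped additive blend of RGB triples (0-255 per channel)."""
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

RED = (255, 0, 0)    # e.g., coverage of transmitter A
BLUE = (0, 0, 255)   # e.g., coverage of transmitter B

# Where both signals reach, red + blue mix to magenta.
overlap = mix([RED, BLUE])
```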

Lighting

Lighting can be tied to the real-world time of day to accurately reflect realistic time-sensitive lighting conditions. Time of day can be manually adjusted to simulate an environment under alternative lighting conditions, or the map can be statically lit. Multiple light sources and shadow casting are optional. *streaming still in beta.

Atmospheric Effects

Atmospheric effects can be tied to the real-world weather conditions to accurately reflect realistic environmental situations. Atmospheric effects can be manually adjusted to simulate an environment under alternative conditions or they can be removed entirely. *streaming still in beta.

Flood Mapping

Realistic water can be built into the landscape to accurately reflect flooding at various stages by using a water plane or modeled surfaces. Water effects are built to spec, can be dynamic (index of refraction, reflections, color, wave mechanics, etc.), and can be visualized in a 4D space. The user has fine control over the vertical movement/time component of the water plane or modeled surface. The water plane can reflect the mapped potentiometric surface or can be planar.
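The vertical-movement/time control can be sketched as piecewise-linear interpolation of the water-plane elevation between user-defined flood stages; the stage times and levels below are illustrative:

```python
# (time in hours, water-plane elevation in m) keyframes - illustrative values.
stages = [(0.0, 120.0), (6.0, 123.5), (12.0, 121.0)]

def water_level(t, stages):
    """Water-plane elevation at time t, linearly interpolated between stages."""
    if t <= stages[0][0]:
        return stages[0][1]
    for (t0, z0), (t1, z1) in zip(stages, stages[1:]):
        if t <= t1:
            return z0 + (z1 - z0) * (t - t0) / (t1 - t0)
    return stages[-1][1]

level = water_level(3.0, stages)  # halfway up the first rise
```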

AI Movement

Objects placed within the map can be controlled via an external text file or by an AI. Controlled objects can be highlighted so they remain viewable through the terrain. A navigation mesh can be built to spec.
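Driving an object from an external text file can be sketched as parsing timestamped position keyframes; the "time x y z" line format here is an assumption for illustration, not THOR's actual schema:

```python
def parse_track(text: str):
    """Parse 'time x y z' lines (assumed format) into (t, (x, y, z)) keyframes."""
    frames = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        t, x, y, z = map(float, line.split())
        frames.append((t, (x, y, z)))
    return frames

track = parse_track("""
# time  x      y    z
0.0     0.0    0.0  50.0
5.0     100.0  0.0  55.0
""")
```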

Miscellaneous

Maps can be built to be semitransparent or opaque when viewed from beneath the map.

A keyboard and mouse or controller can be used, or both.

Map boundaries can be custom built by our team, or the map can be boundless.

Landscapes may be animated to show change vs. time with "playback" controls easily accessible to the user.

The user can switch to a top-down perspective view to free up the mouse pointer.

5 levels of zoom are currently implemented.

Multiplayer is in beta.

VR is working and in beta.

AR capable.