As part of a drone surveying project called SkyMap, I developed a web-based point cloud/mesh rendering application that could be viewed in VR in near real-time. The program not only enabled viewing of the point cloud but also supported virtual measurements that could be scaled to reflect real-life distances.
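The measurement scaling amounts to taking the in-scene distance between two marked points and multiplying by a calibration factor. A minimal sketch in Python, assuming two points marked with the VR controllers (in scene units) and a known scene-unit-to-metre factor; the function and parameter names here are illustrative, not taken from the project code:

```python
import math

def real_distance(p1, p2, metres_per_scene_unit):
    """Scale the in-scene distance between two marked points to a real-world distance."""
    scene_dist = math.dist(p1, p2)          # Euclidean distance in scene units
    return scene_dist * metres_per_scene_unit

# Example: two points 2.5 scene units apart, where 1 scene unit == 0.4 m
print(real_distance((0, 0, 0), (0, 2.5, 0), 0.4))  # -> 1.0 m
```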
The reconstruction shown was generated from data captured during a flight test outside the University of Waterloo.
Skills: JavaScript, Python, HTML, Flask (Web Server), Point Clouds/Mesh Rendering, VR, Analog Controls, Dynamic File Loading
Captured scans are processed by chunking the data into 1x1x1 meter cubes in space. Each chunk is stored under a three-coordinate naming scheme, "chunk_x_y_z.ply", in a database. The VR user's position and view direction are then used to dynamically load the relevant chunks via the JavaScript A-Frame library, and the rendered pages are fed through a reverse proxy (ngrok) to make them publicly accessible.
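A minimal sketch of the chunking step, assuming the scan is available as an (N, 3) NumPy array of XYZ points in metres; the directory layout and helper names are illustrative, not the project's actual code, and the PLY files are written with a bare ASCII header rather than the project's real exporter:

```python
import os
import numpy as np

def write_ascii_ply(path, points):
    """Write an (N, 3) array of XYZ points as a minimal ASCII PLY file."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

def chunk_point_cloud(points, out_dir="chunks"):
    """Bucket points into 1x1x1 m cubes and save each cube as chunk_x_y_z.ply."""
    os.makedirs(out_dir, exist_ok=True)
    # Integer cube index of every point (floor of its metre coordinates).
    cube_ids = np.floor(points).astype(int)
    for cube in np.unique(cube_ids, axis=0):
        mask = np.all(cube_ids == cube, axis=1)
        name = "chunk_{}_{}_{}.ply".format(*cube)
        write_ascii_ply(os.path.join(out_dir, name), points[mask])

# Example: three points falling into two different 1 m cubes.
chunk_point_cloud(np.array([[0.2, 0.5, 0.1], [0.7, 0.9, 0.3], [1.4, 2.1, 0.8]]))
```

On the serving side, a hypothetical Flask endpoint for handing individual chunk files to the A-Frame client could look like the following; the route path and chunk directory are assumptions for illustration:

```python
from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route("/chunks/<path:filename>")
def get_chunk(filename):
    # Serve a single chunk_x_y_z.ply file requested by the VR client.
    return send_from_directory("chunks", filename)

if __name__ == "__main__":
    app.run()
```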