This research established a real-time 3D reconstruction pipeline for a dynamic outdoor environment, focusing on the criteria of efficiency, accuracy, and applicability on later generations of Android smartphones. The research aimed to determine an effective data capture density through key metrics such as the number of stored vertices in the point cloud, the confidence threshold for estimated depth data, image resolution, memory consumption, and computational time. Using the determined data capture density, the accuracy and consistency of the real-time 3D reconstruction pipeline were evaluated through point cloud registration. The implementation of this pipeline was assessed through experimental research. The methodology leveraged the Record API, Playback API, and Raw Depth API provided by the ARCore library. Image and sensor data were recorded and played back systematically while varying the values of the key metrics. The reconstruction process involved acquiring depth data from depth images. The resulting data capture density was a point cloud with 500,000 stored vertices, filtered with a high confidence threshold and created from 160x90-pixel depth images. The pipeline demonstrated acceptable memory consumption and an average computational time under 16 milliseconds, confirming real-time operation. Point cloud registration yielded an average RMSE of approximately 2 cm and a fitness score of approximately 0.6 between point clouds generated from different datasets, and an RMSE of approximately 0.9 cm and a fitness score above 0.84 between point clouds generated from the same dataset. The findings demonstrate that real-time 3D reconstruction with high spatial accuracy and reasonable consistency can be achieved by utilising the ARCore library and an Android smartphone without additional equipment or active sensors.
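
The depth acquisition and confidence filtering summarised above can be illustrated with a minimal Kotlin sketch. This is not the thesis implementation: it assumes an ARCore session already configured for raw depth, and the function name collectConfidentVertices and the confidence cutoff of 200 are illustrative placeholders. The sketch reads one raw depth frame and its confidence image via the Raw Depth API, discards low-confidence pixels, and back-projects the remaining pixels into camera-space vertices using the rescaled camera intrinsics.

```kotlin
import com.google.ar.core.Frame
import java.nio.ByteOrder

// Minimal sketch (assumed, not the thesis implementation): read one raw depth frame,
// drop low-confidence pixels, and back-project the rest into camera-space vertices.
// Assumes the session is configured with raw depth enabled (e.g. DepthMode.RAW_DEPTH_ONLY).
fun collectConfidentVertices(frame: Frame, minConfidence: Int = 200): List<FloatArray> {
    val depthImage = frame.acquireRawDepthImage16Bits()          // depth per pixel, in millimetres
    val confidenceImage = frame.acquireRawDepthConfidenceImage() // confidence per pixel, 0..255
    val vertices = ArrayList<FloatArray>()
    try {
        val width = depthImage.width                             // e.g. 160
        val height = depthImage.height                           // e.g. 90
        val depthPlane = depthImage.planes[0]
        val depthBuffer = depthPlane.buffer.order(ByteOrder.nativeOrder()).asShortBuffer()
        val depthStride = depthPlane.rowStride / 2               // row stride in 16-bit samples
        val confidencePlane = confidenceImage.planes[0]
        val confidenceBuffer = confidencePlane.buffer

        // The camera intrinsics refer to the full camera texture, so scale them down
        // to the raw depth resolution before back-projecting.
        val intrinsics = frame.camera.textureIntrinsics
        val dims = intrinsics.imageDimensions
        val fx = intrinsics.focalLength[0] * width / dims[0]
        val fy = intrinsics.focalLength[1] * height / dims[1]
        val cx = intrinsics.principalPoint[0] * width / dims[0]
        val cy = intrinsics.principalPoint[1] * height / dims[1]

        for (y in 0 until height) {
            for (x in 0 until width) {
                val confidence =
                    confidenceBuffer.get(y * confidencePlane.rowStride + x).toInt() and 0xFF
                if (confidence < minConfidence) continue         // high-confidence filter

                val depthMm = depthBuffer.get(y * depthStride + x).toInt() and 0xFFFF
                if (depthMm == 0) continue                       // 0 means no depth estimate
                val z = depthMm / 1000f                          // metres

                // Pinhole back-projection of the pixel into camera space.
                vertices.add(floatArrayOf((x - cx) * z / fx, (y - cy) * z / fy, z))
            }
        }
    } finally {
        depthImage.close()
        confidenceImage.close()
    }
    return vertices
}
```

In a full pipeline, the per-frame vertices would additionally be transformed into world space using the camera pose before being accumulated into the stored point cloud.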