The first contribution concerns a narrow-band standard in LTE intended for Internet of Things (IoT) devices. This LTE standard includes a special positioning reference signal that is transmitted synchronously by all base stations (BSs) to all IoT devices. Each device can then compute several pairwise time differences, each of which corresponds to a hyperbola. Using multilateration methods, the intersection of a set of such hyperbolas can be computed. An extensive performance study using a professional simulation environment with realistic user models is presented, indicating that a decent position accuracy can be achieved despite the narrow bandwidth of the channel.
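To make the geometry concrete, the sketch below illustrates one common way to solve the hyperbola-intersection problem: a nonlinear least-squares fit over pairwise range differences. This is only a minimal illustration, not the algorithm used in the thesis; the base-station layout, device position, and noise level are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light [m/s]

def tdoa_residuals(p, bs_pos, tdoa, ref=0):
    """Residuals between measured and predicted time differences.

    p      : candidate device position (x, y)
    bs_pos : (N, 2) array of base-station positions
    tdoa   : (N-1,) measured time differences w.r.t. the reference BS
    """
    ranges = np.linalg.norm(bs_pos - p, axis=1)
    pred = (np.delete(ranges, ref) - ranges[ref]) / C
    return pred - tdoa

# Hypothetical scenario: four BSs on a 1 km square, device in between.
bs_pos = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
true_p = np.array([320.0, 540.0])
ranges = np.linalg.norm(bs_pos - true_p, axis=1)
tdoa = (ranges[1:] - ranges[0]) / C + np.random.normal(0.0, 3e-8, 3)  # noisy

est = least_squares(tdoa_residuals, x0=np.array([500.0, 500.0]),
                    args=(bs_pos, tdoa))
print(est.x)  # position estimate near true_p
```

Each measured time difference constrains the device to one hyperbola with the two involved base stations as foci; the least-squares solution is the point that best agrees with all hyperbolas simultaneously.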
The second contribution is a study of how downlink measurements in LTE can be combined. The time of flight (ToF) to the serving BS and the time differences of arrival (TDoA) to the neighboring BSs are used as measurements. From a geometrical perspective, the position estimation problem amounts to computing the intersection of a circle and a set of hyperbolas, all with uncertain parameters. We propose a fusion framework for both snapshot estimation and filtering, and evaluate it on both simulated data and experimental field-test data. The results indicate that the position accuracy is better than 40 meters 95% of the time.
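A minimal sketch of the snapshot part of such a fusion is given below, assuming the ToF and TDoA residuals are simply stacked and weighted by their (hypothetical) noise standard deviations; the filtering part, which would track the position over time, is not shown, and the geometry and noise values are made up for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light [m/s]

def fused_residuals(p, serving, neighbors, tof, tdoa, sigma_tof, sigma_tdoa):
    """Stack the ToF (circle) and TDoA (hyperbola) residuals, weighted by noise std."""
    r_serv = np.linalg.norm(serving - p)
    r_nbrs = np.linalg.norm(neighbors - p, axis=1)
    res_tof = (r_serv / C - tof) / sigma_tof
    res_tdoa = ((r_nbrs - r_serv) / C - tdoa) / sigma_tdoa
    return np.concatenate(([res_tof], res_tdoa))

# Hypothetical scenario: one serving BS and two neighbors.
serving = np.array([0.0, 0.0])
neighbors = np.array([[800.0, 0.0], [0.0, 900.0]])
true_p = np.array([250.0, 400.0])
tof = np.linalg.norm(serving - true_p) / C
tdoa = (np.linalg.norm(neighbors - true_p, axis=1)
        - np.linalg.norm(serving - true_p)) / C

est = least_squares(fused_residuals, x0=np.array([100.0, 100.0]),
                    args=(serving, neighbors, tof, tdoa, 1e-7, 1e-7))
print(est.x)  # fused snapshot position estimate
```

The ToF measurement contributes the circle around the serving BS, and each TDoA measurement contributes one hyperbola; weighting the residuals lets the more accurate measurement type dominate the fused estimate.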
The third study in the thesis analyzes the statistical distribution of timing measurement errors in LTE systems. Three different machine learning methods are applied to experimental data to fit Gaussian mixture distributions to the observed measurement errors. Since current positioning algorithms are mostly based on Gaussian error models, a good model of the measurement errors can be used to improve the accuracy and robustness of these algorithms. The obtained results indicate that a single Gaussian distribution is not adequate for modeling the real time-of-arrival (ToA) measurement errors. A possible direction for future work is to further develop standard algorithms using these error models.
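As a minimal sketch of one way such a mixture fit could be carried out (standard expectation-maximization with model order chosen by the Bayesian information criterion, not necessarily one of the three methods used in the thesis), the example below fits Gaussian mixtures to synthetic ToA errors in which a second, delayed component mimics non-line-of-sight propagation; all values are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic ToA errors [m]: a dominant near-zero mode plus a positively
# biased, wider mode mimicking non-line-of-sight delays (values hypothetical).
rng = np.random.default_rng(0)
errors = np.concatenate([rng.normal(0.0, 10.0, 8000),
                         rng.normal(60.0, 30.0, 2000)]).reshape(-1, 1)

# Fit mixtures with 1-4 components and keep the one with the lowest BIC.
best = None
for k in range(1, 5):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(errors)
    bic = gmm.bic(errors)
    if best is None or bic < best[0]:
        best = (bic, k, gmm)

print(f"selected {best[1]} components")
print("means [m]:", best[2].means_.ravel())
print("weights  :", best[2].weights_)
```

On data of this kind the criterion typically prefers more than one component, which is the same qualitative conclusion the study draws for the real measurement errors: a single Gaussian is too restrictive.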