# 07-05 5 GHz Results and Outlook

The goal of the last week was to test how running the software over 5 GHz Wifi affects the accuracy of measurements. In a further push towards better accuracy, I also ran the software on localhost, with network traffic stemming only from the gateway and background processes.

# Results

Advertising intervals were set to 100 ms to save battery and simulate a more realistic business application environment; this induces higher fluctuations in the RSSI. The 2.4 GHz telemetry was collected on the 23rd of June and the 5 GHz telemetry was collected yesterday. The location, device orientations and distances were similar within an acceptable margin of error. External environmental influences such as temperature and Wifi AP network traffic were most likely different. The demonstration below relies on a single 20 second sequence of measurements (at a 100 ms advertising interval that yields roughly 200 samples, which matches the counts in the tables below), but the data is fairly representative of the other sequences: 9 other measurement runs at 2.4 GHz with 100 ms advertising intervals were conducted between the 19th and 23rd of June with similar standard deviations.

# 5 GHz remote API

Measurements are taken by the gateway and the client, then uploaded to the computer in my room over 5 GHz Wifi and then Ethernet.

Server on remote PC over 5 GHz (RSSI in dBm)

|          | Moving Avg. Client | Moving Avg. Gateway | Raw Client | Raw Gateway | Real Time Correction |
|----------|--------------------|---------------------|------------|-------------|----------------------|
| Mean     | -60.00720          | -57.68279           | -60.33333  | -58.18867   | -59.89242            |
| Max      | -59.22219          | -56.43289           | -58        | -54         | -58.35234            |
| Min      | -60.96040          | -59.60921           | -64        | -62         | -61.30525            |
| Count    | 205                | 205                 | 204        | 212         | 205                  |
| Median   | -60.01818          | -57.66458           | -59.0      | -59.0       | -60.01487            |
| Stddev   | 0.3531616          | 0.6097899           | 1.9058289  | 2.1290729   | 0.6451923            |
| Variance | 0.1247231          | 0.3718438           | 3.6321839  | 4.5329518   | 0.4162732            |

# 5 GHz localhost

Measurements from the client are sent internally to a socket on the client's localhost; gateway data is also forwarded to the client.
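For illustration, here is a minimal sketch of how the client could hand measurements to a local collector socket. The port, message format, and field names are assumptions made for the sketch, not the project's actual protocol.

```python
import json
import socket
import time

# Hypothetical address and message layout for the localhost handoff --
# the project's real protocol may differ. Assumes a collector process
# is already listening on this port.
LOCAL_COLLECTOR = ("127.0.0.1", 5005)

def send_measurement(sock: socket.socket, source: str, rssi: int) -> None:
    """Serialize one RSSI reading and push it to the local collector."""
    payload = json.dumps({
        "source": source,          # "client" or "gateway"
        "rssi": rssi,              # dBm
        "timestamp": time.time(),
    }).encode("utf-8")
    sock.sendall(payload + b"\n")  # newline-delimited JSON stream

with socket.create_connection(LOCAL_COLLECTOR) as sock:
    send_measurement(sock, "client", -60)
    send_measurement(sock, "gateway", -58)
```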

Server on RPi localhost over 5 GHz (RSSI in dBm)

|          | Moving Avg. Client | Moving Avg. Gateway | Raw Client | Raw Gateway | Real Time Correction |
|----------|--------------------|---------------------|------------|-------------|----------------------|
| Mean     | -60.53555          | -57.07341           | -61.0      | -57.75242   | -60.54851            |
| Max      | -59.66950          | -55.42165           | -58        | -54         | -59.01339            |
| Min      | -61.50989          | -58.55489           | -65        | -60         | -61.95265            |
| Count    | 211                | 211                 | 215        | 206         | 211                  |
| Median   | -60.53225          | -57.09019           | -60.0      | -59.0       | -60.56148            |
| Stddev   | 0.3650452          | 0.6854410           | 2.08913    | 2.3873383   | 0.6461570            |
| Variance | 0.1332580          | 0.4698293           | 4.36448    | 5.6993843   | 0.4175189            |

# 2.4 GHz remote API for reference

Client and gateway data is sent to my PC over 2.4 GHz Wifi.


Server on remote PC over 2.4 GHz (RSSI in dBm)

|          | Moving Avg. Client | Moving Avg. Gateway | Raw Client | Raw Gateway | Real Time Correction |
|----------|--------------------|---------------------|------------|-------------|----------------------|
| Mean     | -49.83979          | -49.10924           | -54.54455  | -53.493446  | -48.79216            |
| Max      | -45.50784          | -43.33737           | -40        | -39         | -38.60860            |
| Min      | -57.43263          | -55.97614           | -61        | -65         | -58.619695           |
| Count    | 198                | 198                 | 202        | 229         | 198                  |
| Median   | -48.86372          | -48.05356           | -56.0      | -55.0       | -49.17930            |
| Stddev   | 2.9613178          | 3.3228663           | 6.0495873  | 5.5753373   | 4.4155353            |
| Variance | 8.7694032          | 11.041440           | 36.597507  | 31.084386   | 19.496952            |

The accuracy improvement with 5 GHz Wifi is dramatic. Relative to the 2.4 GHz reference, the standard deviation of the client moving average decreased by roughly 88% and that of the gateway moving average by roughly 82%; the variances, being the squares of the standard deviations, decreased by over 96%. Unfortunately the real time correction has consistently had a larger standard deviation than the moving average of the client data. This is most likely due to the lack of any correlation between client and gateway fluctuations. I consider the real time correction implementation to be finished.
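These percentages can be checked directly against the stddev rows of the tables above (the variable names below are my own):

```python
# Moving-average stddevs from the remote-API tables above.
stddev_24ghz = {"client": 2.9613178, "gateway": 3.3228663}  # 2.4 GHz reference
stddev_5ghz  = {"client": 0.3531616, "gateway": 0.6097899}  # 5 GHz remote

for source in ("client", "gateway"):
    ratio = stddev_5ghz[source] / stddev_24ghz[source]
    print(f"{source}: stddev reduced by {1 - ratio:.1%}, "
          f"variance reduced by {1 - ratio**2:.1%}")
# client: stddev reduced by 88.1%, variance reduced by 98.6%
# gateway: stddev reduced by 81.6%, variance reduced by 96.6%
```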

# Problem

Next November I would optimally like to have a general purpose indoor positioning system with low latency and high accuracy in tracking the movement of the target. As with most engineering projects, I think I will need to make some tradeoffs here and tailor my software to a certain use case. For example, the current moving average is fairly accurate, but it requires a second of artificial latency so that future measurements can arrive in the backend. The software therefore registers movements faithfully but with a delay, which would be perfectly fine for tracking customer movement in a store, where it is unimportant where the device is in real time. The latency might be crippling for use cases where real time knowledge of the position is paramount, e.g. a robot that stacks boxes precisely in a warehouse.
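To make that latency tradeoff concrete, here is a minimal sketch of the buffering involved in a centered moving average. The window length and the generator interface are illustrative assumptions, not the project's actual implementation.

```python
from collections import deque
from typing import Iterable, Iterator

def centered_moving_average(stream: Iterable[float], window: int = 21) -> Iterator[float]:
    """Centered moving average over an odd-length window.

    Each output value needs window // 2 *future* samples before it can
    be emitted; with a 100 ms advertising interval and window = 21 that
    is 10 samples, i.e. roughly one second of artificial latency.
    """
    buf: deque = deque(maxlen=window)
    for rssi in stream:
        buf.append(rssi)
        if len(buf) == window:
            # The emitted estimate describes the *middle* of the buffer,
            # a moment window // 2 samples in the past.
            yield sum(buf) / window
```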

The angle of attack on positioning is very different between the EKF and the moving average. The EKF tries to maintain a real time state estimate of the displacement vector based on the presumed speed and orientation of the client. There is a certain amount of inertia in the state of the EKF, since it can only predict what will happen to the displacement next based on existing measurements and adjust when new data is ingested. Due to the Bayesian nature of the EKF, the position estimate will continue moving in the same direction for a short distance even after the mobile target turns and moves the other way. The averaging algorithm, conversely, just waits for the movement to happen and registers it a posteriori. Since its only job is to observe what happened, it is much simpler and more accurate.
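To illustrate the inertia, here is a simplified 1-D constant-velocity Kalman filter, a linear stand-in for the EKF. All noise parameters and the simulated walk are illustrative assumptions; the point is that shortly after the target reverses, the estimate keeps drifting forward because the filter's velocity state still points the old way.

```python
import numpy as np

# 1-D constant-velocity Kalman filter -- a linear stand-in for the EKF,
# just to demonstrate the "inertia" described above.
dt = 0.1                                 # 100 ms between measurements
F = np.array([[1, dt], [0, 1]])          # state transition: x += v * dt
H = np.array([[1, 0]])                   # we only measure position
Q = np.diag([1e-4, 1e-2])                # process noise (assumed)
R = np.array([[0.25]])                   # measurement noise (assumed)

x = np.array([[0.0], [1.0]])             # state: position 0 m, velocity 1 m/s
P = np.eye(2)

# Target walks forward for 2 s, then abruptly reverses direction.
truth = [t * dt if t < 20 else 2.0 - (t - 20) * dt for t in range(40)]

for t, z in enumerate(truth):
    # Predict: roll the state forward under the constant-velocity model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend in the new position measurement.
    y = np.array([[z]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    if 18 <= t <= 24:
        # Right after the reversal the estimate overshoots for a few
        # steps while the velocity state slowly swings around.
        print(f"t={t * dt:.1f}s  true={z:+.2f}  estimated={x[0, 0]:+.2f}")
```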

One reason most research papers focus on Kalman filtering is that they do not get as much data from their beacons. An averaging filter needs many samples per window, so with limited data availability both the accuracy and the latency of an averaged estimate would be abysmal. But the extremely short advertising intervals on my beacons give me ample information to construct a fairly low latency estimate with pretty good accuracy.

# Solution

A hybrid solution combining the best of both worlds would make only limited sense, because the robot doesn't care where it was precisely a second ago, and the shop owner does not care whether the customer is looking at the milk right now. Especially with a more realistic advertising interval, the averaging part would be next to useless.

After putting some thought into what I want to demonstrate in November, I am leaning towards high accuracy use cases such as finding keys in a room or tracking customer movements in a store rather than real time, low latency applications. I will therefore move forward with the averaging filter as a base for trilateration and implement Kalman filtering afterwards as a lower priority waypoint. A good reason why it still makes sense to implement the algorithm is that I lifted the battery life constraint in order to improve accuracy, unknowingly distancing myself from a more realistic business application. I regard short advertising intervals as a fallback strategy in case Kalman filtering does not work, and will push forward with trilateration now.

# Next steps

The next step in trilateration is to obtain reliable parameters for the power/distance function. I think my calibration walk is more or less production ready, but some small things need to be tweaked. I need to take more measurements at each distance, since there is definitely a chance for the RSSI to sit in a local maximum over a 5 second interval, as shown above. Also, the RSSI at 1 m distance (the y-axis offset) is fairly device specific, so each beacon needs its own set of parameters.
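One plausible form for the power/distance function is the standard log-distance path-loss model RSSI(d) = RSSI_1m − 10·n·log10(d), where RSSI_1m is the device-specific 1 m offset mentioned above and n is the path-loss exponent. The sketch below fits both per beacon with linear least squares; the function names and calibration samples are illustrative, not data from my walks.

```python
import numpy as np

def fit_path_loss(distances_m, rssi_dbm):
    """Fit RSSI(d) = RSSI_1m - 10 * n * log10(d) by linear least
    squares and return (RSSI_1m, n) for one beacon."""
    log_d = np.log10(np.asarray(distances_m, dtype=float))
    A = np.column_stack([np.ones_like(log_d), -10.0 * log_d])
    (rssi_1m, n), *_ = np.linalg.lstsq(A, np.asarray(rssi_dbm, dtype=float), rcond=None)
    return rssi_1m, n

def distance_from_rssi(rssi_dbm, rssi_1m, n):
    """Invert the fitted model to estimate distance in metres."""
    return 10 ** ((rssi_1m - rssi_dbm) / (10.0 * n))

# Made-up calibration-walk samples, several readings per distance as
# suggested above; real values would come from the walk itself.
d    = [1, 1, 2, 2, 4, 4, 8, 8]
rssi = [-60, -61, -66, -65, -72, -71, -78, -77]
rssi_1m, n = fit_path_loss(d, rssi)
print(f"RSSI@1m = {rssi_1m:.1f} dBm, path-loss exponent n = {n:.2f}")
```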

Range based trilateration with uncertain distances can be approached in a few different ways. The problems and solutions will be discussed in a blog post in the near future.
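As a small taste of what is to come: one common approach treats trilateration with uncertain distances as a nonlinear least-squares problem over the ranging residuals. The sketch below assumes scipy is available; the anchor positions and ranges are made up.

```python
import numpy as np
from scipy.optimize import least_squares

def trilaterate(anchors, ranges_m, x0=None):
    """Estimate a 2-D position from noisy ranges to known anchors by
    minimizing the ranging residuals in a least-squares sense."""
    anchors = np.asarray(anchors, dtype=float)
    ranges_m = np.asarray(ranges_m, dtype=float)

    def residuals(p):
        # Difference between geometric distance to each anchor and
        # the (uncertain) measured range.
        return np.linalg.norm(anchors - p, axis=1) - ranges_m

    if x0 is None:
        x0 = anchors.mean(axis=0)  # start from the anchor centroid
    return least_squares(residuals, x0).x

# Illustrative anchor layout and noisy ranges, not real measurements.
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
ranges  = [3.1, 3.9, 4.2]          # e.g. from the fitted path-loss model
print(trilaterate(anchors, ranges))
```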
