On Mon, Jun 25, 2018 at 03:10:07PM +0200, Guido Iaquinti <guido.iaquinti@gmail.com> wrote a message of 16 lines which said:
> I was discussing the above with a few friends and I understand that there might be some security related concerns, but I think with a proper user policy and technical implementation they could be easily mitigated:
It is not purely technical. There is an ethics problem as well <https://labs.ripe.net/Members/kistel/ethics-of-ripe-atlas-measurements>. Basically, in places where you can get into trouble for your Internet activity, HTTP is often actively monitored (which is not the case with ICMP and DNS).
> - allow HTTP(s) queries only on '/' and without any args
> - global rate limit on Atlas based on domain. Example: top 1000 domains can get 100 req/s globally while everything else is throttled to 10 req/s (optional: site owners can override this value via `robots.txt` or something similar)
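As a rough sketch of what such a per-domain global rate limit could look like: a sliding-window counter keyed by domain, with a higher budget for an allowlist of popular domains. All names here are illustrative (the real "top 1000 domains" list, the 100/10 req/s thresholds from the proposal, and the enforcement point inside Atlas are all assumptions, not an actual Atlas implementation):

```python
import time
from collections import defaultdict, deque

# Placeholder for a "top 1000 domains" allowlist (assumption, not real data).
TOP_DOMAINS = {"example.com", "example.org"}

class DomainRateLimiter:
    """Global per-domain sliding-window rate limiter (illustrative sketch)."""

    def __init__(self, top_limit=100, default_limit=10, window=1.0):
        self.top_limit = top_limit          # req/s for allowlisted domains
        self.default_limit = default_limit  # req/s for everything else
        self.window = window                # window length in seconds
        self.hits = defaultdict(deque)      # domain -> recent request timestamps

    def allow(self, domain, now=None):
        """Return True if a request to `domain` fits within its budget."""
        now = time.monotonic() if now is None else now
        q = self.hits[domain]
        # Evict timestamps that have fallen out of the sliding window.
        while q and now - q[0] >= self.window:
            q.popleft()
        limit = self.top_limit if domain in TOP_DOMAINS else self.default_limit
        if len(q) < limit:
            q.append(now)
            return True
        return False
```

The `robots.txt` override mentioned above would simply replace the static limit lookup with a per-domain value fetched (and cached) from the target site.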
It solves the "RIPE Atlas probes as a DDoS botnet" issue (at the price of some complexity for Atlas) but not the ethical one. Imagine people asking a Saudi probe to access <https://femen.org/>