
My wish list would look like:

- Limits on how many tests are run against a specific target (globally)
- Robots.txt/SRV record to increase the limit
- Ability to see if the ISP injected headers (supercookies)
- Ability to see if the server changes anything about the website (ads etc.)
- Verify TLS certificates (see the sketch below)

- Thomas
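For the last item, a minimal sketch of what probe-side certificate verification could look like, assuming Python and its stdlib ssl module; the target host is illustrative, and the policy (system trust store, default hostname check) is an assumption rather than an existing Atlas feature:

    import socket
    import ssl

    def fetch_verified_cert(host: str, port: int = 443) -> dict:
        # create_default_context() enforces chain and hostname validation,
        # so a failed handshake is itself the "certificate invalid" signal.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.getpeercert()

    cert = fetch_verified_cert("example.com")  # illustrative target, not a real test host
    print(cert["subject"], cert["notAfter"])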
On Oct 21, 2016, at 3:29, Marat Khalili <mkh@rqc.ru> wrote:
I don't know how hard it'd be to implement, but it'd be natural to use robots.txt for this. Something like:
    User-agent: RIPE-ATLAS
    Allow: /test-me-here/

It would need to be checked both on the control centers (to make sure probing is allowed at all) and on the probes (to react to changes quickly). (A probe-side sketch follows after this message.)
--
With Best Regards, Marat Khalili
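A probe-side check of Marat's robots.txt idea could look like the following, assuming Python's stdlib urllib.robotparser; the RIPE-ATLAS user-agent token is taken from the example above and is hypothetical, not a registered convention:

    from urllib import robotparser
    from urllib.parse import urlparse

    def measurement_allowed(target_url: str, agent: str = "RIPE-ATLAS") -> bool:
        # Fetch and parse the target site's robots.txt, then ask whether
        # the (hypothetical) RIPE-ATLAS agent may fetch this URL.
        parsed = urlparse(target_url)
        rp = robotparser.RobotFileParser()
        rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
        rp.read()
        return rp.can_fetch(agent, target_url)

    print(measurement_allowed("https://example.com/test-me-here/page"))

The control centers could run the same check before scheduling a measurement, while the probes re-check periodically to react to changes, as suggested above.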
On 21/10/16 07:49, Joseph B wrote:
Indeed, it would be really useful to be able to measure HTTP/HTTPS sites from Atlas probes. As a network operator, even if those measurements were only allowed inside our own (or downstream) ASes, that would be very useful (a rough sketch of such a check follows below).
Cheers,
Joseph
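As a rough sketch of the AS-scoped restriction Joseph describes, assuming Python and a controller that knows which prefixes the operator (or its downstreams) originates; the prefixes below are documentation ranges, not real data:

    import ipaddress

    # Hypothetical list of prefixes originated by the operator's ASes.
    OUR_PREFIXES = [ipaddress.ip_network(p) for p in ("192.0.2.0/24", "2001:db8::/32")]

    def target_in_our_networks(addr: str) -> bool:
        # Mixed IPv4/IPv6 comparisons simply return False, so one list
        # can hold prefixes of both address families.
        ip = ipaddress.ip_address(addr)
        return any(ip in net for net in OUR_PREFIXES)

    print(target_in_our_networks("192.0.2.10"))   # True
    print(target_in_our_networks("203.0.113.1"))  # False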