HTTP measurements at willing targets / protocol suggestion
Hi,

We are interested (like many others, I guess) in the ability to perform HTTP measurements against our own non-anchored network. Understanding the potential for abuse, I would like to suggest the following authentication protocol, which is based on best practices from other services with similar abuse or privacy implications.

1. Confirm control of the domain registration:
* This is usually done by mailing the technical contact in the relevant WHOIS entry a confirmation email containing a unique hash, thus validating ownership.
2. Confirm control of the DNS servers:
* This is usually done by publishing a TXT record with a unique hash at the domain root, or a CNAME with a unique hash.
3. Confirm control of the web servers:
* This is usually done by placing a uniquely-hashed file in the web server's root directory, a unique hash in the meta tags of the index HTML file, or a unique value in a file such as robots.txt.

I believe this protocol is sufficient to ensure that a website owner agrees to the implications of allowing free HTTP measurements against their servers, and that no unwilling server will ever be probed. During the validation itself, the only resource that can be hit is a specific static file or robots.txt, which has very little capacity to overwhelm a web server, especially if negative responses are cached for a considerable time and validation is done from a few nodes and propagated across the network.

Thoughts/ideas welcome.

Regards,
Gil Bahat, DevOps Engineer, Magisto Ltd.
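For concreteness, here is a minimal sketch in Python of how the DNS and web-server checks in steps 2 and 3 could be automated. It assumes the third-party dnspython and requests packages; the token value and the /atlas-verify.txt path are hypothetical placeholders, not part of any existing RIPE Atlas mechanism.

# Sketch of the DNS (step 2) and web-server (step 3) ownership checks.
# Assumes the third-party "dnspython" and "requests" packages; the token
# and the file path are hypothetical, not an existing RIPE Atlas API.
import dns.exception
import dns.resolver
import requests


def dns_txt_check(domain: str, token: str) -> bool:
    """Step 2: look for the unique token in a TXT record at the domain root."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except dns.exception.DNSException:
        return False
    for rdata in answers:
        txt = b"".join(rdata.strings).decode("utf-8", errors="replace")
        if token in txt:
            return True
    return False


def web_file_check(domain: str, token: str, path: str = "/atlas-verify.txt") -> bool:
    """Step 3: fetch one static file and check that it contains the token."""
    try:
        resp = requests.get(f"http://{domain}{path}", timeout=10)
    except requests.RequestException:
        return False
    return resp.status_code == 200 and token in resp.text


if __name__ == "__main__":
    domain, token = "example.com", "3f6c2a..."  # placeholder values
    print("DNS TXT check:", dns_txt_check(domain, token))
    print("Web file check:", web_file_check(domain, token))

A validation service along these lines would presumably run the checks from a small number of nodes, propagate the result, and cache negative outcomes, as the message above suggests.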
+1, sounds like a perfect solution.

Best regards,
Emil Stahl
On 2015/11/16 12:00, Emil Stahl Pedersen wrote:
+1, sounds like a perfect solution.
Sounds like it is open for abuse. Someone registers evil.whatever, passes all validation checks and can then serve any content he likes. No way to find out who set it up.
Hi,

Is there any place where we can find the full list of reasons for blocking HTTP measurements? The discussion above does not address the risks posed to the clients by HTTP measurements (monetary, regulatory/legal), which mandate a mechanism ensuring that only willing probe hosts take part. OTOH, I don't understand the abuse scenario from the host side (i.e. if I register evil.whatever and I'm hosting it, who can I damage aside from myself and my own hosting bill? That's unclear to me), as long as the probe's capabilities are limited.

Regards,
Gil Bahat, Director of Online Operations, Magisto Ltd.

On Mon, Nov 16, 2015 at 1:08 PM, Philip Homburg <philip.homburg@ripe.net> wrote:
Sounds like it is open for abuse. Someone registers evil.whatever, passes all validation checks and can then serve any content he likes.
No way to find out who set it up.
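One way to read "as long as the probe's capabilities are limited" is that a target-side HTTP measurement would only ever be a small, tightly bounded fetch that reports metadata rather than content. The sketch below uses only the Python standard library; the 16 KiB body cap, 5-second timeout and result fields are illustrative assumptions, not anything RIPE Atlas actually specifies.

# Sketch of a deliberately limited HTTP measurement: it records only the
# status code, timing and size, and never stores the response body.
# The caps and the result fields are illustrative values only.
import time
import urllib.request

MAX_BODY_BYTES = 16 * 1024   # read at most this much, then stop
TIMEOUT_SECONDS = 5


def limited_http_measurement(url: str) -> dict:
    """Fetch one URL with hard caps and return metadata only, never the body."""
    start = time.monotonic()
    req = urllib.request.Request(url, method="GET")
    # Non-2xx answers raise urllib.error.HTTPError; a real probe would record
    # those as results too, but that is left out of this sketch for brevity.
    with urllib.request.urlopen(req, timeout=TIMEOUT_SECONDS) as resp:
        body = resp.read(MAX_BODY_BYTES)   # measured for size, then discarded
        status = resp.status
    elapsed = time.monotonic() - start
    return {
        "status": status,
        "bytes_read": len(body),
        "truncated": len(body) == MAX_BODY_BYTES,
        "elapsed_ms": round(elapsed * 1000, 1),
    }


if __name__ == "__main__":
    print(limited_http_measurement("http://example.com/"))

Under such constraints, a validated target that later turns malicious can waste little more than its own bandwidth, which is the point Gil raises above; it does not answer Philip's concern about finding out who set the domain up.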
participants (3)
- Emil Stahl Pedersen
- Gil Bahat
- Philip Homburg