[squid-users] decreased requests per second with big file size
Amos Jeffries
squid3 at treenet.co.nz
Tue Oct 13 07:55:58 UTC 2015
On 12/10/2015 6:51 p.m., Ambadas H wrote:
> Hi Amos,
>
> Thanks for responding
>
> *"You would be better off taking the first use of any domain by a client,*
>
> *then ignoring other requests for it until there is some long period*
> *between two of them. The opposite of what session helpers do."*
>
> Could you please elaborate a little on the above logic.
That is about as clear as I can explain it, sorry. Look at what the
session helpers do to determine whether two requests are part of the
same session or not.
You need to start with that, *then* figure out how to split each sequence
of requests now grouped into a "session" down into whatever grouping you
define "page" to be.
>
> My understanding, if I'm not wrong, is to take the domain/host of the first
> client GET request & not count subsequent GET requests that match it.
>
> In this case there is a possibility of multiple unique domains/hosts for a
> single page (e.g. other-domain ads, analytics, etc.)?
Yes. There is simply no concept of "page" in HTTP.
It is a hard problem even to figure out with any accuracy which requests
are coming from the same client.
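
Purely as an illustration of why: a common heuristic is to key on the
source IP plus the User-Agent header, and both of those inputs are
assumptions that break easily (NAT pools many clients behind one
address; one machine may run several browsers):

import hashlib

def client_key(src_ip, user_agent):
    """Approximate 'same client' identity: a hash of source IP and
    User-Agent. NAT merges distinct clients into one key, and a client
    that switches browsers splits into several, so it is only a guess."""
    raw = ("%s|%s" % (src_ip, user_agent)).encode("utf-8")
    return hashlib.sha1(raw).hexdigest()

Substituting client_key(...) for the bare client_ip in the earlier
sketch at least separates browsers that share an address, but nothing
in HTTP makes the grouping reliable.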
Amos