[squid-users] decreased requests per second with big file size
Ambadas H
ambadas.tdna at gmail.com
Wed Oct 14 06:34:46 UTC 2015
Hi Amos,
Got it. Will go through the session helpers & figure out how to do it.
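For the record, here is the rough, untested sketch I have in mind: an
external ACL helper that answers OK only for the first request a client
makes to a domain, or after the domain has been idle for a while (the
opposite of the session helpers, as you described). The helper name,
file path, and idle threshold below are just my placeholders:

#!/usr/bin/env python3
# page_start.py - flag the *first* request for a domain by a client,
# then ignore repeats until the domain has been idle for a while.
# Hypothetical squid.conf wiring (ttl=0 so results are not cached):
#   external_acl_type page_start ttl=0 %SRC %DST /usr/local/bin/page_start.py
#   acl new_page external page_start
import sys
import time

IDLE_GAP = 30     # seconds of inactivity before a domain counts as "new" again
last_seen = {}    # (client_ip, domain) -> time of most recent request

for line in sys.stdin:
    try:
        client, domain = line.split()[:2]
    except ValueError:
        print("ERR", flush=True)   # malformed input line
        continue
    now = time.time()
    key = (client, domain)
    previous = last_seen.get(key)
    last_seen[key] = now
    if previous is None or now - previous > IDLE_GAP:
        # First use of this domain by this client, or a long quiet
        # period has passed: count it as the start of a new "page".
        print("OK", flush=True)
    else:
        print("ERR", flush=True)   # same "page" still in progress

ERR here just means "not a new page", so the acl would be used for
logging/counting rather than for blocking anything.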
Thanks for the help :)
Ambadas
On Tue, Oct 13, 2015 at 1:25 PM, Amos Jeffries <squid3 at treenet.co.nz> wrote:
> On 12/10/2015 6:51 p.m., Ambadas H wrote:
> > Hi Amos,
> >
> > Thanks for responding
> >
> > *"You would be better off taking the first use of any domain by a
> client,*
> >
> > *then ignoring other requests for it until there is some long period*
> > *between two of them. The opposite of what session helpers do."*
> >
> > Could you please elaborate a little on the above logic.
>
> That is about as clear as I can explain it, sorry. Look at what the
> session helpers do to determine whether two requests are part of the
> same session or not.
>
> You need to start with that, *then* figure out how to split each
> sequence of requests now grouped into a "session" down into whatever
> grouping you define a "page" to be.
>
>
> >
> > My understanding, if I'm not wrong, is to take the domain/host of the
> > first client GET request and not count subsequent GET requests that
> > match it.
> >
> > In this case, isn't there a possibility of multiple unique
> > domains/hosts for a single page (e.g. ads from other domains,
> > analytics, etc.)?
>
> Yes. There is simply no concept of "page" in HTTP.
>
> It is a hard problem even to figure out with any accuracy which
> requests are coming from the same client.
>
> Amos
>
>