[squid-users] Squid Memory Leak with certain FTP requests?

Amos Jeffries squid3 at treenet.co.nz
Wed Feb 11 20:05:04 UTC 2015


On 12/02/2015 4:14 a.m., Silamael wrote:
> On 02/11/2015 02:11 PM, Yuri Voinov wrote:
>>
>> 11.02.15 19:07, Silamael wrote:
>>> On 02/11/2015 12:51 PM, Silamael wrote:
>>>> On 02/11/2015 11:10 AM, Yuri Voinov wrote:
>>>>> Squid first saves the object in memory, then swaps it out to the
>>>>> cache, as usual. This is not a memory leak but normal cache
>>>>> behaviour, as documented.
>>>>>
>>>>> You can play around with range_offset_limit and quick_abort_min
>>>>> parameters.
>>>>>
>>>>> Or try not caching this FTP traffic with an ACL.
>>>>>
>>>>> Usually, when a memory leak is suspected, it turns out to be an
>>>>> OS issue, not Squid.
>>>> Hello Yuri,
>>>>
>>>> Thanks for your quick reply.
>>>> The ACL you suggested will probably solve the problem.
>>> Just got the info that the customer has already disabled caching.
>>> That no longer sounds like "normal behaviour" to me.
>> As Amos said, maybe - maybe not.
> 
> Amos said something? Got no mail from him.
> 

Must have been earlier threads; this is my first reply in this thread.
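For reference, Yuri's earlier suggestion maps onto squid.conf directives roughly like the fragment below. This is a sketch only; the values are illustrative examples, not tuning recommendations.

```
# Illustrative squid.conf fragment for the directives mentioned in the
# thread. Values are examples, not tuning advice.

# Do not fetch beyond the byte range the client asked for (avoids Squid
# downloading whole objects to satisfy range requests).
range_offset_limit 0

# If a client aborts, only finish the server-side transfer when less
# than 16 KB remains; otherwise abort the fetch too.
quick_abort_min 16 KB

# Alternatively, skip caching FTP responses entirely via an ACL.
acl ftp proto FTP
cache deny ftp
```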

>> Some FTP files are pointless to cache.
> 
> Sure, maybe some FTP files should not be cached.
> 
>>
>> If it is needed just once, why cache it?
> 
> I do not want to cache anything. And I think 'cache deny all' does that.

Correct.
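A minimal sketch of that configuration, for completeness:

```
# Disable caching of every response; Squid acts as a pure proxy.
cache deny all
```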

> Nevertheless, even with no caching at all, the Squid process
> constantly needs more memory, and squidclient reports that the memory
> is used for the 2K buffers.
> 
> Just try it: run Squid with the default configuration plus
> 'cache deny all', then do some wget ftp://server/path/ requests.
> Each request will increase the 2K buffer count.
> 

Thank you. Can you drop that into a bug report please so we don't lose
track of it?
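To gather the numbers for such a bug report, one could poll `squidclient mgr:mem` between requests and diff the "2K Buffer" pool line. A minimal sketch follows; note that the column layout of mgr:mem output varies between Squid versions, so the parsing below assumes a simplified whitespace-separated format, and the choice of the second numeric field as the in-use count is an assumption, not documented behaviour.

```python
import re

def buffer_count(mgr_mem_output: str, pool: str = "2K Buffer") -> int:
    """Extract the in-use object count for a pool from mgr:mem output.

    Assumes the pool line starts with the pool name followed by
    whitespace-separated numeric columns, with the in-use count as the
    second number; field positions vary between Squid versions.
    """
    for line in mgr_mem_output.splitlines():
        if line.startswith(pool):
            numbers = re.findall(r"\d+", line[len(pool):])
            if len(numbers) >= 2:
                return int(numbers[1])
    raise ValueError(f"pool {pool!r} not found")

# Hypothetical before/after snapshots around one FTP request:
before = "2K Buffer \t 2048 \t 120 \t 240"
after = "2K Buffer \t 2048 \t 121 \t 242"
growth = buffer_count(after) - buffer_count(before)
print(growth)  # prints 1 if the pool grew by one buffer for the request
```

Attaching a few such before/after snapshots to the bug report would show the growth per request directly.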

Cheers
Amos
