[squid-users] Significant memory leak with version 5.x (not with 4.17)

Praveen Ponakanti pponakanti at roblox.com
Fri Jan 7 21:16:19 UTC 2022


Thanks Alex & Amos. Will try the patch in the next week or two.

On Fri, Jan 7, 2022 at 8:54 AM Alex Rousskov
<rousskov at measurement-factory.com> wrote:

> On 1/7/22 12:12 AM, Praveen Ponakanti wrote:
>
> > Is there a build with the fix, or do you have some recommended steps
> > to manually pull the source, patch the fix and then recompile?
>
> Yes, applying the patch to official Squid sources and then bootstrapping
> and building from patched sources is what folks using patches usually
> have to do. Roughly speaking:
>
>     cd squid-x.y.z/
>     patch -p1 < ...
>     ./bootstrap.sh
>     ./configure
>     make
>     make check
>     sudo make install
>
> The above steps usually require installing a few build-related tools
> (and passing custom ./configure options like --prefix and
> --with-openssl). A capable
> sysadmin should be able to get this done in most cases. It is possible
> to avoid the bootstrapping step if the patch does not modify
> bootstrapping-sensitive files and you can get a bootstrapped version of
> the sources from http://www.squid-cache.org/Versions/
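For concreteness, the full sequence might look like the sketch below. The package names, tarball version, patch file name, and configure options are all assumptions for illustration; adjust them to your distribution and deployment:

```shell
# Illustrative build-from-patched-source sequence (Debian/Ubuntu package
# names; the patch file name and configure options are assumptions).
sudo apt-get install build-essential autoconf automake libtool libssl-dev
tar xzf squid-5.3.tar.gz
cd squid-5.3/
patch -p1 < bug5132.patch
./bootstrap.sh                  # skip if the tarball is already bootstrapped
./configure --prefix=/usr/local/squid --with-openssl
make -j"$(nproc)"
make check
sudo make install
```

The `make check` step runs Squid's test suite before installation; skipping it saves time but forfeits an early warning if the patch broke something.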
>
> I am not aware of anybody providing ready-to-use builds that include the
> proposed bug 5132 fix.
>
>
> HTH,
>
> Alex.
>
>
> > On Thu, Jan 6, 2022 at 8:16 PM Alex Rousskov wrote:
> >
> >     On 1/6/22 2:50 AM, Praveen Ponakanti wrote:
> >     > Hi Alex/Amos,
> >     >
> >     > Do you still need memory logs from version 5.3 after stopping
> >     > traffic through the squid?
> >
> >     I cannot answer for Amos, who asked for those logs, but you may
> >     want to try a fix posted at
> >     https://bugs.squid-cache.org/show_bug.cgi?id=5132#c27
> >
> >
> >     HTH,
> >
> >     Alex.
> >
> >
> >     > We disabled traffic to the 5.3 squid about 6 hours ago and have
> >     > not seen any memory being freed up since. This node has used
> >     > ~50G more memory than the 4.17 squid taking similar traffic over
> >     > the last 3+ weeks. I am collecting hourly memory logs on 5.3
> >     > after stopping traffic. Let me know and I can attach the log
> >     > tomorrow morning.
> >     >
> >     > Thanks
> >     > Praveen
> >     >
> >     > On Mon, Dec 27, 2021 at 4:58 PM Praveen Ponakanti
> >     > <pponakanti at roblox.com> wrote:
> >     >
> >     >     I can't make any changes to our prod squids this week. I
> >     >     have a squid instance (v5.3) in a test env but could not
> >     >     reproduce the leak by starting & stopping traffic with a
> >     >     bulk HTTP request generator (wrk). I was able to send 175k
> >     >     rps @ 20k concurrent sessions (each doing a GET on a 1KB
> >     >     object) through the 30-worker squid. This initially caused
> >     >     a 3G increase in memory usage, which then flattened out
> >     >     after stopping the requests. If I restart the bulk
> >     >     requests, the memory usage only goes up ~0.5GB and then
> >     >     drops back down. Live traffic is probably exercising a
> >     >     different code path within squid's memory pools.
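For reference, a wrk invocation approximating the test described above might look like the following sketch. The host, port, object path, and thread count are assumptions; note that wrk has no native forward-proxy support, so this assumes the object is reachable directly through a squid accelerator or intercept port:

```shell
# Hypothetical load-generator invocation approximating the test above:
# 20k concurrent connections across 40 threads for 5 minutes, each
# repeatedly GETting a 1KB object. Host/port/path are placeholders.
wrk -t40 -c20000 -d300s --latency http://squid-test-host:3128/1kb-object
```

The achievable request rate depends heavily on the generator host; 175k rps at 20k connections typically requires raised file-descriptor limits (`ulimit -n`) on both ends.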
> >     >
> >     >     On Mon, Dec 27, 2021 at 2:26 AM Lukáš Loučanský
> >     >     <loucansky.lukas at kjj.cz> wrote:
> >     >
> >     >         After one day of running without clients my squid
> >     >         memory is stable
> >     >
> >     >         29345 proxy  20  0  171348 122360  14732 S  0.0  0.7  0:25.96 (squid-1) --kid squid-1 -YC -f /etc/squid5/squid.conf
> >     >         29343 root   20  0  133712  79264   9284 S  0.0  0.5  0:00.00 /usr/sbin/squid -YC -f /etc/squid5/squid.conf
> >     >
> >     >         Storage Mem size: 3944 KB
> >     >         Storage Mem capacity: 0.2% used, 99.8% free
> >     >         Maximum Resident Size: 489440 KB
> >     >         Page faults with physical i/o: 0
> >     >         Memory accounted for:
> >     >         Total accounted: 15741 KB
> >     >         memPoolAlloc calls: 1061495
> >     >         memPoolFree calls: 1071691
> >     >         Total allocated: 15741 kB
> >     >
> >     >         So this does not seem to be the problem... L
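The accounting figures above come from Squid's cache manager. One way to capture such memory-pool reports on a regular schedule, as requested later in the thread, is a simple loop around squidclient (the host and port below are assumptions; adjust to your instance):

```shell
# Sketch: capture an hourly memory-pools report from the cache manager.
# squidclient ships with Squid; mgr:mem is the memory utilization report.
while true; do
    squidclient -h 127.0.0.1 -p 3128 mgr:mem > "mem-$(date +%Y%m%d-%H%M).log"
    sleep 3600
done
```

Comparing successive dumps shows which pools grow without shrinking, which is exactly the signal useful for isolating a leak.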
> >     >
> >     >         On 26.12.2021 at 10:02, Lukáš Loučanský wrote:
> >     >>         OK - it seems my squid quacked on low memory again
> >     >>         today -
> >     >>
> >     >>         Dec 26 00:04:25 gw (squid-1): FATAL: Too many queued store_id requests; see on-persistent-overload.#012    current master transaction: master4629331
> >     >>         Dec 26 00:04:28 gw squid[15485]: Squid Parent: squid-1 process 15487 exited with status 1
> >     >>         Dec 26 00:04:28 gw squid[15485]: Squid Parent: (squid-1) process 28375 started
> >     >>
> >     >>         2021/12/26 00:01:20 kid1| helperOpenServers: Starting 5/64 'storeid_file_rewrite' processes
> >     >>         2021/12/26 00:01:20 kid1| ipcCreate: fork: (12) Cannot allocate memory
> >     >>         2021/12/26 00:01:20 kid1| WARNING: Cannot run '/lib/squid5/storeid_file_rewrite' process.
> >     >>         2021/12/26 00:01:20 kid1| ipcCreate: fork: (12) Cannot allocate memory
> >     >>
> >     >>         I'm going to reroute my clients (which are on their
> >     >>         days off anyway) to direct connections and run it
> >     >>         "dry", on its own. But I'm not able to test it before
> >     >>         lack-of-memory issues occur, because my clients are
> >     >>         offline. So I'll watch squid's own memory consumption.
> >     >>         It's all I can do right now; my squid already
> >     >>         restarted and its memory has been freed, so I think
> >     >>         just now I have no power to fill it up again :-]
> >     >>
> >     >>         L
> >     >>
> >     >>         On 26.12.2021 at 7:41, Amos Jeffries wrote:
> >     >>>
> >     >>>         If possible, can one of you run a Squid to get this
> >     >>>         behaviour, then stop new clients connecting to it
> >     >>>         before lack-of-memory issues occur, and see if the
> >     >>>         memory usage disappears or reduces after a 24-48hr
> >     >>>         wait.
> >     >>>
> >     >>>         A series of regular mempools report dumps from across
> >     >>>         the test may help Alex, or whoever works on the bug,
> >     >>>         further rule out which cache- and client-related
> >     >>>         things are releasing properly.
> >     >>>
> >     >>>
> >     >>>         Amos
> >     >>>
> >     >>>         _______________________________________________
> >     >>>         squid-users mailing list
> >     >>>         squid-users at lists.squid-cache.org
> >     >>>         http://lists.squid-cache.org/listinfo/squid-users
> >     >>
> >     >
> >     >
> >     >
> >     >
> >     >
> >
> >     >
> >     >
> >     >
> >
>
>

