[Mageia-discuss] How to prune urpmi-proxy cache?

Morgan Leijström fri at tribun.eu
Mon Mar 12 10:59:37 CET 2012

On Monday, 12 March 2012 at 07.49.42, Maarten Vanraes wrote:
> On Monday, 12 March 2012 at 02:07:46, Morgan Leijström wrote:
> > Cauldron updates make the urpmi-proxy cache grow pretty big.
> > 
> > Any idea how to automatically or semi-automatically prune
> > 
> >  /var/cache/urpmi-proxy  ?
> > 
> > Something along the lines of
> > "delete all but the latest versions of the same package"
> > 
> > 
> > Maybe there is some tool used for a similar task that could be used/hacked?
> > 
> > 
> > Another idea is to compare contents of urpmi-proxy cache against mirror
> > and delete files not present on mirror.  (especially thinking of
> > cauldron release)
> > 
> > 
> > ( About urpmi-proxy
> > https://wiki.mageia.org/en/Urpmi-proxy
> > https://forums.mageia.org/en/viewtopic.php?f=8&t=1770
> > 
> >  )
> While this feature is not present yet, it is in the TODO file of
> urpmi-proxy:

Nice :)

> I've been thinking along these lines:
> 1) delete older versions of the same file in the same directories

That is the best.

( I wonder if it should be possible to keep old versions, but if any client 
needs one it will simply be downloaded again, and I guess that will not happen 
very often.   A skip list seems overkill.  )

> 2) give a configurable max size and delete the older files if the size is
> reached

Yes. Should the size limit apply per folder, to the whole urpmi-proxy cache, or 
both?

> 3) give a configurable time limit and delete all files older than this one
> (in a cronjob)

That should be settable per folder, e.g. I want to give mga1 two years (to 
facilitate new installs or reinstalls) but cauldron only a couple of months.

How would these be combined?

Idea: allow any content until the max size is reached, and then prioritise:

If size of folder is too large, then
    remove old versions until size is under limit
    if no more old versions, then
        remove eldest files until size is under limit
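
Just my rough sketch of that loop in shell (not urpmi-proxy code; it assumes 
GNU coreutils, RPM file names like name-version-release.arch.rpm with no 
whitespace, and the directory and limit are placeholder arguments):

```shell
#!/bin/sh
# Sketch of the per-folder pruning loop (assumptions: GNU coreutils,
# RPM file names like name-version-release.arch.rpm, no whitespace).
prune_folder() {  # prune_folder DIR LIMIT_KB
    dir=$1 limit_kb=$2

    used_kb() { du -sk "$dir" | cut -f1; }

    # Pass 1: walk the files in version order; whenever the next file is a
    # newer version of the same package, the previous one is an old version
    # and may be removed (sort -V puts the newest version last).
    prev="" prevbase=""
    for f in $(ls "$dir"/*.rpm 2>/dev/null | sort -V); do
        base=${f%-*-*}          # strip "-version-release.arch.rpm"
        if [ "$base" = "$prevbase" ] && [ "$(used_kb)" -gt "$limit_kb" ]; then
            rm -f "$prev"       # older version of the same package
        fi
        prev=$f prevbase=$base
    done

    # Pass 2: if still over the limit, remove the oldest files first.
    for f in $(ls -tr "$dir"/*.rpm 2>/dev/null); do
        [ "$(used_kb)" -le "$limit_kb" ] && return 0
        rm -f "$f"
    done
}
```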

Easiest to apply per folder only.

It will be complicated to apply globally; maybe an "outer control loop" could 
adjust the per-folder max size down when the total limit is reached.

So if the OP suddenly realises he needs more space for other things, he can 
just adjust down the urpmi-proxy max size and trigger the cleaning; he will get 
the space easily, and the prioritising is taken care of.

... i will stop fantasising now...
> that being said, there are a couple of things you can do:
> A) do a find with a time limit and remove the older files
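
For A, something like this sketch might do (the paths and day counts are just 
examples, and -delete needs GNU find); wrapping it in a function lets each 
folder get its own retention, like the mga1/cauldron split above:

```shell
# Sketch of option A: time-based pruning, one retention period per folder.
# Paths and day counts are examples, not urpmi-proxy defaults; GNU find.
prune_older_than() {  # prune_older_than DIR DAYS
    find "$1" -name '*.rpm' -mtime "+$2" -delete
}

# e.g. from a daily cronjob: keep mga1 ~2 years, cauldron ~2 months
# prune_older_than /var/cache/urpmi-proxy/mga1 730
# prune_older_than /var/cache/urpmi-proxy/cauldron 60
```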

> B) rsync with a mirror (with --delete option)

It must not download packages that are not already present, thus:

> C) rsync with a mirror (but find some option to only delete the removed
> files and nothing else)

> for A) i likely can make a quick shell script to do that, but i think
> rsyncing with a mirror could be the quickest way...
> which would you prefer?

C - with some check so it does not remove all files if the mirror has some 
problem or is offline, making it look completely empty...
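
A sketch of how C could look (the mirror URL and paths are placeholders): the 
--existing/--ignore-existing combination makes rsync transfer nothing, so 
--delete only removes files the mirror has dropped, and two safety checks 
guard against an empty-looking mirror:

```shell
# Sketch of option C with safety checks (mirror URL and paths are placeholders).
# --existing + --ignore-existing stop rsync transferring anything, so --delete
# only removes local files the mirror no longer carries.
prune_against_mirror() {  # prune_against_mirror MIRROR/ CACHE/
    # Refuse to prune if the mirror listing looks empty, so an offline or
    # broken mirror cannot wipe the whole cache.
    [ "$(rsync --list-only "$1" 2>/dev/null | wc -l)" -ge 2 ] \
        || { echo "mirror looks empty, not pruning" >&2; return 1; }
    # --max-delete is a second safety valve against mass deletion.
    rsync -r --existing --ignore-existing --delete --max-delete=500 "$1" "$2"
}

# e.g.: prune_against_mirror rsync://mirror.example.org/mageia/distrib/cauldron/ \
#                            /var/cache/urpmi-proxy/cauldron/
```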

For the immediate need I simply opened the folders in a GUI file browser, 
sorted by size, and manually deleted the old versions of the 20 largest 
packages in each folder; that saved a gigabyte.

So I am good for now.
Just thinking about the future for me and other users.

Morgan Leijström

More information about the Mageia-discuss mailing list