Has anyone ever heard of a package manager that 'scans' other machines within its subnet, or within specified subnets, for updates before using the official repositories? There would either have to be some service advertisement protocol like mDNS, or each machine would literally have to scan for a designated port number listening for these requests on all machines. Once they locate each other, the idea would be: one machine downloads 500MB of updates from the repo, and from there on, every other machine (with the same distro and arch) just pulls from the faster local machine, rather than using up Internet bandwidth.
Any suggestions (other than a dedicated local repository mirror)?
Billy Crook wrote:
Has anyone ever heard of a package manager that 'scans' other machines within its subnet, or within specified subnets, for updates before using the official repositories? There would either have to be some service advertisement protocol like mDNS, or each machine would literally have to scan for a designated port number listening for these requests on all machines. Once they locate each other, the idea would be: one machine downloads 500MB of updates from the repo, and from there on, every other machine (with the same distro and arch) just pulls from the faster local machine, rather than using up Internet bandwidth.
Any suggestions (other than a dedicated local repository mirror)?
You should of course just have a local Debian repository (partial credit given for Ubuntu)! :)
One option short of a full local repo would be to configure your systems to update via a web proxy. Then just take a system with some spare HDD space, set up squid, and configure how much disk space you want to dedicate as a local cache of the packages you're actually using. This method is centralized rather than distributed, but there's not much to set up (just the one proxy server, and telling the update clients about your proxy).
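For example (an untested sketch; the cache size, subnet, and hostname "proxybox" are placeholders you'd adjust for your own network), the server side is a few squid.conf directives plus one apt setting on each client:

  # /etc/squid/squid.conf (relevant excerpt)
  http_port 3128
  # ~10GB on-disk cache -- size it for a distro's worth of .debs
  cache_dir ufs /var/spool/squid 10000 16 256
  # package files are big; don't refuse to cache them
  maximum_object_size 512 MB
  # only let the local subnet use the proxy
  acl localnet src 192.168.1.0/24
  http_access allow localnet
  http_access deny all

  # /etc/apt/apt.conf.d/01proxy on each client
  Acquire::http::Proxy "http://proxybox:3128/";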
You can even get fancy and use transparent proxying to make the entire process invisible to the clients. Then you'd only have to set up the proxy server and some firewall redirect rules.
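Something like this on the gateway should do it (again a sketch, assuming squid runs on the gateway itself and is configured for interception, e.g. "http_port 3128 transparent" on squid 2.6, and that eth0 is the LAN-facing interface):

  # redirect outbound HTTP from the LAN to the local squid
  iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128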
NOTE: This works quite well for apt on Debian. I haven't used RH other than via RHN (and even that was a few years ago), so I'm not sure how well it would work for current RH or Fedora releases. As long as the actual file transfers are done over HTTP, however, the caching-proxy idea should work pretty well.
-- Charles Steinkuehler charles@steinkuehler.net
"Billy Crook" billycrook@gmail.com writes:
Has anyone ever heard of a package manager that 'scans' other machines within its subnet, or within specified subnets, for updates before using the official repositories? There would either have to be some service advertisement protocol like mDNS, or each machine would literally have to scan for a designated port number listening for these requests on all machines. Once they locate each other, the idea would be: one machine downloads 500MB of updates from the repo, and from there on, every other machine (with the same distro and arch) just pulls from the faster local machine, rather than using up Internet bandwidth.
Any suggestions (other than a dedicated local repository mirror)?
I haven't heard of anything like this before, but a problem I see is that you'd have to authenticate which of the servers are allowed to contain updates. Otherwise someone could put a box up with rogue packages and get them pushed out (granted, package signing probably mitigates this somewhat).
Can I ask what the advantage of this is over a local repository?
Good point. The easiest way to secure it would be for the service to trust the other machines based on their root password. If they don't match, don't trust; if they do, then they're either controlled by the same person or at least one of the admins is a moron. I was also assuming you would only trust packages signed by your distro, in which case, even if someone broke into your house and put a machine on your network, its rogue packages would easily be detected and ignored.
Local repositories have to be set up and maintained by people. The package manager is 'just there'. I'm surprised the main distros haven't come up with a clever way like this to save on their bandwidth bills.
On 9/20/07, Kyle Sexton ks@mocker.org wrote:
"Billy Crook" billycrook@gmail.com writes:
Has anyone ever heard of a package manager that 'scans' other machines within its subnet, or within specified subnets, for updates before using the official repositories? There would either have to be some service advertisement protocol like mDNS, or each machine would literally have to scan for a designated port number listening for these requests on all machines. Once they locate each other, the idea would be: one machine downloads 500MB of updates from the repo, and from there on, every other machine (with the same distro and arch) just pulls from the faster local machine, rather than using up Internet bandwidth.
Any suggestions (other than a dedicated local repository mirror)?
I haven't heard of anything like this before, but a problem I see is that you'd have to authenticate which of the servers are allowed to contain updates. Otherwise someone could put a box up with rogue packages and get them pushed out (granted, package signing probably mitigates this somewhat).
Can I ask what the advantage of this is over a local repository?
-- Kyle Sexton
Billy Crook wrote:
Good point. The easiest way to secure it would be for the service to trust the other machines based on their root password. If they don't match, don't trust; if they do, then they're either controlled by the same person or at least one of the admins is a moron. I was also assuming you would only trust packages signed by your distro, in which case, even if someone broke into your house and put a machine on your network, its rogue packages would easily be detected and ignored.
As long as the repository is properly secured against man-in-the-middle attacks, you should be safe with the proxy approach I mentioned, or with any other sort of distributed download/storage. Exactly *HOW* the file gets onto the system shouldn't matter to the verification tools.
And if the repository/packaging tools aren't secure against MitM attacks, they're not really secure anyway (unless you know and trust every link between you and the repository).
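(As a concrete illustration: on Debian you can check the archive signature on a Release file by hand, and the check only sees bytes on disk, not how they got there. Assuming apt has already fetched the etch lists and the archive key is in your gpg keyring, something like:

  $ cd /var/lib/apt/lists
  $ gpg --verify ftp.debian.org_debian_dists_etch_Release.gpg ftp.debian.org_debian_dists_etch_Release

The exact filenames will differ depending on your mirror.)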
Local repositories have to be set up and maintained by people. The package manager is 'just there'. I'm surprised the main distros haven't come up with a clever way like this to save on their bandwidth bills.
Indeed. And using a transparent proxy approach, it shouldn't be hard to make a pre-configured proxy system that would require minimal setup on the server side (how big and where would you like the repository cache), and little or no setup on the client end (could require pointing to the 'local' repository or maybe even auto-discover).
This seems easy enough that someone should throw together a Debian package for it. Oh wait...why not look to see if someone else has done this already?
$ apt-cache search apt cache
alevt - X11 Teletext/Videotext browser
approx - caching proxy server for Debian archive files
apt-cacher - caching proxy system for Debian package and source files
apt-file - APT package searching utility -- command-line interface
apt-move - Maintain Debian packages in a package pool
apt-proxy - Debian archive proxy and partial mirror builder
apt-rdepends - Recursively lists package dependencies
bmagic - C++ template library for efficient platform independent bitsets
gpsbabel - GPS file conversion plus transfer to/from GPS units
kio-apt - an apt-cache ioslave for KDE
libapt-pkg-perl - Perl interface to libapt-pkg
sg3-utils - Utilities for working with generic SCSI devices
wajig - simplified Debian package management front end
Looks like approx, apt-cacher, and apt-proxy all do what you're looking for, with the caveat that files are stored on one machine, and not distributed across all client systems.
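For instance, approx takes about two lines of setup; "cachebox" below is a placeholder hostname, and 9999 is approx's default port:

  # /etc/approx/approx.conf on the cache box: map a name to a real mirror
  debian http://ftp.debian.org/debian

  # /etc/apt/sources.list on each client: point at the cache box instead
  deb http://cachebox:9999/debian etch main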
-- Charles Steinkuehler charles@steinkuehler.net
So I'm guessing the answer is: No, nobody has heard of a package manager that does this on its own.
On 9/20/07, Charles Steinkuehler charles@steinkuehler.net wrote:
Billy Crook wrote:
Good point. The easiest way to secure it would be for the service to trust the other machines based on their root password. If they don't match, don't trust; if they do, then they're either controlled by the same person or at least one of the admins is a moron. I was also assuming you would only trust packages signed by your distro, in which case, even if someone broke into your house and put a machine on your network, its rogue packages would easily be detected and ignored.
As long as the repository is properly secured against man-in-the-middle attacks, you should be safe with the proxy approach I mentioned, or with any other sort of distributed download/storage. Exactly *HOW* the file gets onto the system shouldn't matter to the verification tools.
And if the repository/packaging tools aren't secure against MitM attacks, they're not really secure anyway (unless you know and trust every link between you and the repository).
Local repositories have to be set up and maintained by people. The package manager is 'just there'. I'm surprised the main distros haven't come up with a clever way like this to save on their bandwidth bills.
Indeed. And using a transparent proxy approach, it shouldn't be hard to make a pre-configured proxy system that would require minimal setup on the server side (how big and where would you like the repository cache), and little or no setup on the client end (could require pointing to the 'local' repository or maybe even auto-discover).
This seems easy enough that someone should throw together a Debian package for it. Oh wait...why not look to see if someone else has done this already?
$ apt-cache search apt cache
alevt - X11 Teletext/Videotext browser
approx - caching proxy server for Debian archive files
apt-cacher - caching proxy system for Debian package and source files
apt-file - APT package searching utility -- command-line interface
apt-move - Maintain Debian packages in a package pool
apt-proxy - Debian archive proxy and partial mirror builder
apt-rdepends - Recursively lists package dependencies
bmagic - C++ template library for efficient platform independent bitsets
gpsbabel - GPS file conversion plus transfer to/from GPS units
kio-apt - an apt-cache ioslave for KDE
libapt-pkg-perl - Perl interface to libapt-pkg
sg3-utils - Utilities for working with generic SCSI devices
wajig - simplified Debian package management front end
Looks like approx, apt-cacher, and apt-proxy all do what you're looking for, with the caveat that files are stored on one machine, and not distributed across all client systems.
-- Charles Steinkuehler charles@steinkuehler.net
I recall hearing of some apt-torrent-type things, but I don't know if they're very complete.
But here's the deal: rather than write a whole new package manager, apt neatly separates out its transport mechanisms. This way, you can write a handler for whatever transport you desire (in this case, the local network) and let someone else handle the hard work of debugging the rest of it. apt-cacher does this, for example, as do the other tools that Charles mentioned.
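(To sketch what that division looks like: the URI scheme in a sources.list line selects a matching transport executable under /usr/lib/apt/methods, so a new transport is "just" another program dropped in there. The method binaries speak a simple text protocol with apt over stdin/stdout.

  $ ls /usr/lib/apt/methods
  cdrom copy file ftp gpgv gzip http rsh ssh ...

  # a hypothetical line like this would invoke /usr/lib/apt/methods/lannet,
  # where "lannet" is a made-up scheme for a LAN-discovery transport:
  deb lannet://anypeer/debian etch main
)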
If you want "discoverability" then it might make sense to start with apt-cacher and come up with a way to publish the apt-cacher service over mDNS, plus a hook to scan mDNS for it. But from a security standpoint, it seems like more hassle than just setting up an apt-proxy and adding a single line to sources.list, so I imagine this is why nobody's put much effort into it.
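(If someone did want to build the discovery piece, a rough sketch with Avahi: drop a service file on the cache box and have clients browse for it. The service type "_apt_proxy._tcp" here is made up for illustration, not a registered type, and 9999 assumes the approx default port:

  <?xml version="1.0" standalone='no'?>
  <!DOCTYPE service-group SYSTEM "avahi-service.dtd">
  <!-- /etc/avahi/services/apt-cache.service on the cache box -->
  <service-group>
    <name replace-wildcards="yes">APT cache on %h</name>
    <service>
      <type>_apt_proxy._tcp</type>
      <port>9999</port>
    </service>
  </service-group>

  # on a client, find it with:
  $ avahi-browse -rt _apt_proxy._tcp

A client-side hook would then rewrite sources.list or set a proxy based on what that returns.)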
Justin Dugger
On 9/20/07, Jonathan Hutchins hutchins@tarcanfel.org wrote:
On Thursday 20 September 2007 03:39:19 pm Billy Crook wrote:
So I'm guessing the answer is: No, nobody has heard of a package manager that does this on its own.
Yeah, don't you love the responses that say "I know absolutely nothing about this, here is my expert opinion on it!"?
Well that's not entirely true.
Apt can do what you want if you're willing to do a little work. It's called apt pinning. You can add your local repository, make your own Debian release, make that the preferred release, and then allow fallback to the external release type. It's a bit of work; you have to build your own packages to do this. A rough sketch of the preferences file follows below.
Apt pinning isn't necessary if your local packages are not in any of the servers you're using. I do this with some multiverse packages. Since I only use certain ones in multiverse, I just maintain my own packages and put them in my own repository. Everything else comes from the Debian repos.
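For the pinning half, the preferences file on each client would look roughly like this ("mylocal" is a placeholder for whatever Origin your local repo's Release file advertises):

  # /etc/apt/preferences
  Package: *
  Pin: release o=mylocal
  Pin-Priority: 900

  Package: *
  Pin: release o=Debian
  Pin-Priority: 500

Roughly speaking, priorities above 500 win over the default, so local packages get preferred and anything you don't carry falls through to the Debian mirrors.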
--- Jonathan Hutchins hutchins@tarcanfel.org wrote:
On Thursday 20 September 2007 03:39:19 pm Billy Crook wrote:
So I'm guessing the answer is: No, nobody has heard of a package manager that does this on its own.
Yeah, don't you love the responses that say "I know absolutely nothing about this, here is my expert opinion on it!"?