Question about shared libs

For discussions about programming, programming questions/advice, and projects that don't really have anything to do with Puppy.
matiasbatero
Posts: 60
Joined: Fri 12 Oct 2012, 01:27
Location: Mar del Plata, Argentina

Question about shared libs

#1 Post by matiasbatero »

Hi people,

I don't like how Linux handles software. Shared libraries are good because all libs are available to every application (lightweight packages, etc.), but for me the biggest problem is that packages are always linked against a specific version of those libs. This keeps the whole system synchronized to one line of versions ("all old" or "all new"), so any software update triggers massive downloads for a pile of internal dependencies.

1) There is no way to install different versions of the same software, because one lib is shared by everything.

2) There is no way for a somewhat old distro (e.g. Debian Squeeze) to run a more recent version of a specific program. (Backports/pinning are bad options.)

3) There is no way to have a system where some parts are old and others newer.

I don't like that Linux depends on being online to perform installations, and on constant upgrades, etc.

Rolling release isn't a solution either, because one month without installing anything turns into hundreds of MB to download later.

Changing distro versions isn't a solution. Debian Squeeze -> Wheezy -> Sid, etc. It's always the same: upgrading the whole system.

I want to know whether it's possible to run any package without dependency requirements. I want to have a solid core system, with all the other applications portable.

If this is possible, I want to automate it by writing some software.
I think that may be difficult without recompiling, but I'd like to hear your thoughts.


Regards.

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

#2 Post by technosaurus »

A statically built (no shared libraries), lightweight, 64-bit Linux distro is here:
http://dslr.dimakrasner.com/

What you describe is bad packaging or bad library versioning, not a problem with shared libraries themselves.

Simple rules for a stable repo:
1. Build all new software against the originally distributed libraries unless major patching is necessary.

For example, if a newer glibc is available in the repository and its dev packages are installed when you build a new package, then installing that package forces a glibc update too... the one exception is when a security fix requires breaking the ABI (rare).

2. Insist that libraries use proper major/minor/patch versioning, so that old binaries continue to work with new libraries.

So a package built against gtk+-2.6.0 _should_ still work with gtk+-2.24.24... but wait, it doesn't: gtk+-2.8 added cairo as a dependency, which was fine for a few minor versions because nothing linked to cairo directly (it was all wrapped in GTK calls). Somewhere along the line, though, people started linking directly to cairo for speed improvements that are effective if and only if that specific cairo version is installed (otherwise it's much slower).

Summary: `lazy' binding is better than `now' binding for any dynamic system; otherwise, to keep that speed benefit, you end up in the situation you describe, where one update requires updating everything that links against it. Summary of the summary: it's usually better to be lazy... or, even better, use static linking and forget about libraries.
Check out my [url=https://github.com/technosaurus]github repositories[/url]. I may eventually get around to updating my [url=http://bashismal.blogspot.com]blogspot[/url].

Iguleder
Posts: 2026
Joined: Tue 11 Aug 2009, 09:36
Location: Israel, somewhere in the beautiful desert

#3 Post by Iguleder »

You can use chroot: install Debian Squeeze in a directory, Wheezy in another and so on. Then, you can bind mount the /tmp of each version to /tmp and run applications with chroot. It's a hacky solution you can use to install an incompatible version of a package, but there's no clean way to make it transparent to regular users.
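The bind-mount-and-chroot recipe above can be sketched as a small wrapper script. Everything here is hypothetical (the Squeeze tree under /srv/squeeze is an example path), and actually running it requires root:

```shell
# Hypothetical wrapper: run one program inside a Squeeze tree installed
# under /srv/squeeze. Needs root to actually execute; here we only
# generate the script and syntax-check it.
cat > /tmp/run-in-squeeze.sh <<'EOF'
#!/bin/sh
set -e
root=/srv/squeeze
# Share the host's /tmp (and the usual virtual filesystems) with the chroot.
mount --bind /tmp  "$root/tmp"
mount --bind /proc "$root/proc"
mount --bind /dev  "$root/dev"
chroot "$root" "$@"
# Tear the bind mounts down again when the program exits.
umount "$root/dev" "$root/proc" "$root/tmp"
EOF
chmod +x /tmp/run-in-squeeze.sh
sh -n /tmp/run-in-squeeze.sh && echo "syntax OK"
```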

If you want a "perfect" solution, static linking is the only way to go, but forcing things to link statically can be quite complicated: some libraries use dynamically loadable modules or plugins and do not support static linking (e.g. old versions of glibc).
[url=http://dimakrasner.com/]My homepage[/url]
[url=https://github.com/dimkr]My GitHub profile[/url]

Packetteer
Posts: 73
Joined: Sat 12 May 2012, 19:33
Location: Long Island Ny

#4 Post by Packetteer »

Hi All
I am relatively new to Linux, so if my question/solution is way off the wall, please let me know why.

How about this: when you install software, it gets installed to a new directory of its own, and all the libraries that the software needs are installed to the same directory.

When the executable runs, it looks in the directory it was started from to find the libraries it needs.

When the software is upgraded, any libraries that need upgrading are upgraded at the same time.

In other words, all application software is independent.

Yes, this will take up more disk space, but nowadays, with hard drives getting bigger and bigger, that is in my opinion no longer a concern.

Best Regards
John

matiasbatero
Posts: 60
Joined: Fri 12 Oct 2012, 01:27
Location: Mar del Plata, Argentina

#5 Post by matiasbatero »

Iguleder wrote:You can use chroot: install Debian Squeeze in a directory, Wheezy in another and so on. Then, you can bind mount the /tmp of each version to /tmp and run applications with chroot. It's a hacky solution you can use to install an incompatible version of a package, but there's no clean way to make it transparent to regular users.

If you want a "perfect" solution, static linking is the only way to go, but forcing things to link statically can be quite complicated: some libraries use dynamically loadable modules or plugins and do not support static linking (e.g. old versions of glibc).
Yes, chrooting between different distro versions lets you run different versions of the same app, but it's the same trouble. It can work on stable-branch distros, like Squeeze/Wheezy for example; with Squeeze/Sid it gains you nothing. It's a good solution in some cases (for solving one particular situation, and no more), but the flexibility is very poor. Debian allows the pinning technique, adjusting the priority of each repo, but it's dangerous and doesn't work with all packages.

Static linking might technically be the cleaner, more formal solution, but it is very tedious and requires work and time.

I think making portable applications is better: encapsulating each app in its own environment.

matiasbatero
Posts: 60
Joined: Fri 12 Oct 2012, 01:27
Location: Mar del Plata, Argentina

#6 Post by matiasbatero »

Packetteer wrote:Hi All
I am relatively new to Linux, so if my question/solution is way off the wall, please let me know why.

How about this: when you install software, it gets installed to a new directory of its own, and all the libraries that the software needs are installed to the same directory.

When the executable runs, it looks in the directory it was started from to find the libraries it needs.

When the software is upgraded, any libraries that need upgrading are upgraded at the same time.

In other words, all application software is independent.

Yes, this will take up more disk space, but nowadays, with hard drives getting bigger and bigger, that is in my opinion no longer a concern.

Best Regards
John
Yes, I was thinking along those lines. This is the "portable" approach.
I'm testing some Linux utilities that create packages containing all the data an app uses (libs, images, etc.) plus the required environment.

Some of those utilities are:
1) CDE (Code, Data, Environment)
2) CARE (Comprehensive Archiver for Reproducible Execution)
3) PRoot (like chroot, but in user space)

These tools are great: they monitor the app at runtime and make a copy of all the data it depends on.

Then, running PRoot on the result, the application runs fine.

Of course, you are right: this solution requires additional MBs. But it can be optimized by making a compressed image and doing some manual work.

I like how it works, and best of all, the applications can run on any Linux distro.
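When a bundle is nothing more than one binary plus its private shared libraries, the replay side doesn't even need PRoot: a plain launcher script can point the dynamic linker at the bundled libs. This is a different, simpler technique than the CDE/CARE capture, and every path below is hypothetical:

```shell
# Hypothetical launcher for a relocatable bundle laid out as:
#   /tmp/bundle/run.sh   /tmp/bundle/bin/app   /tmp/bundle/lib/*.so
mkdir -p /tmp/bundle
cat > /tmp/bundle/run.sh <<'EOF'
#!/bin/sh
# Find the bundle's own directory, wherever it was moved to.
here=$(CDPATH= cd -- "$(dirname -- "$0")" && pwd)
# Prefer the bundle's private libraries over the system ones.
LD_LIBRARY_PATH="$here/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
exec "$here/bin/app" "$@"
EOF
chmod +x /tmp/bundle/run.sh
sh -n /tmp/bundle/run.sh && echo "launcher syntax OK"
```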

matiasbatero
Posts: 60
Joined: Fri 12 Oct 2012, 01:27
Location: Mar del Plata, Argentina

#7 Post by matiasbatero »

technosaurus wrote:A statically built (no shared libraries), lightweight, 64-bit Linux distro is here:
http://dslr.dimakrasner.com/

What you describe is bad packaging or bad library versioning, not a problem with shared libraries themselves.

Simple rules for a stable repo:
1. Build all new software against the originally distributed libraries unless major patching is necessary.

For example, if a newer glibc is available in the repository and its dev packages are installed when you build a new package, then installing that package forces a glibc update too... the one exception is when a security fix requires breaking the ABI (rare).

2. Insist that libraries use proper major/minor/patch versioning, so that old binaries continue to work with new libraries.

So a package built against gtk+-2.6.0 _should_ still work with gtk+-2.24.24... but wait, it doesn't: gtk+-2.8 added cairo as a dependency, which was fine for a few minor versions because nothing linked to cairo directly (it was all wrapped in GTK calls). Somewhere along the line, though, people started linking directly to cairo for speed improvements that are effective if and only if that specific cairo version is installed (otherwise it's much slower).

Summary: `lazy' binding is better than `now' binding for any dynamic system; otherwise, to keep that speed benefit, you end up in the situation you describe, where one update requires updating everything that links against it. Summary of the summary: it's usually better to be lazy... or, even better, use static linking and forget about libraries.
Thanks for your explanation.

I'm reading about DSLR.

Yes, you are right, it is bad packaging / library versioning.
But I think that even with the best packaging available, the current system has little flexibility; it's very centralized by nature.

Unix directories also make things difficult. GoboLinux takes a different approach to this lack of flexibility: it uses the filesystem itself, plus a symbolic-linking scheme, as its own package manager (like a mask over the traditional hierarchy).

http://www.gobolinux.org/index.php?page=at_a_glance

For example, you can view your installed packages by doing:
cd /Programs; ls

If you want to view all the data of package X:
find PackageX

If you want to remove a package:
rm -r PackageX

Simple, elegant, and it allows working with any version of a package.
The only con is that everything is compiled from recipes, like Gentoo or Arch+AUR.
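The core of the GoboLinux idea (versioned program trees plus a symlink index) can be imitated in a few lines. A toy sketch; the paths under /tmp are made up for the demo:

```shell
# Toy version of the GoboLinux layout: each version lives in its own tree,
# and a symlink "index" decides which one is current.
set -e
mkdir -p /tmp/Programs/Foo/1.0/bin /tmp/Programs/Foo/2.0/bin /tmp/System/Links
printf '#!/bin/sh\necho foo 1.0\n' > /tmp/Programs/Foo/1.0/bin/foo
printf '#!/bin/sh\necho foo 2.0\n' > /tmp/Programs/Foo/2.0/bin/foo
chmod +x /tmp/Programs/Foo/1.0/bin/foo /tmp/Programs/Foo/2.0/bin/foo

ln -sf /tmp/Programs/Foo/2.0/bin/foo /tmp/System/Links/foo  # 2.0 is "current"
/tmp/System/Links/foo        # prints: foo 2.0

ln -sf /tmp/Programs/Foo/1.0/bin/foo /tmp/System/Links/foo  # instant rollback
/tmp/System/Links/foo        # prints: foo 1.0

rm -rf /tmp/Programs/Foo     # uninstalling = deleting one directory
```

Both versions coexist on disk, and switching or removing them never touches any other package, which is the flexibility the post is after.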
