Software Development: The Linux Problem

Warning: Senseless rant ahead.
Since I develop cross-platform software like Irrlicht, irrKlang and commercial games, I have had constant problems on only one single system: Linux. Writing applications for this operating system can be a pain in the *** sometimes. Windows and MacOS don't have these problems at all, or at least only in a very reduced form. Maybe this is the reason why Linux is so slow to be accepted by the average user. The problem is that if you write software which runs on the Linux system you are using, you cannot be sure that it will work with another Linux distribution, or even on the same distribution but a different version. Different gcc compiler versions will produce code which may have bugs, _other_ bugs, or interfaces which are not binary compatible; different kernel versions cause side effects you never could have thought of; some libraries will behave totally differently on other hardware or if they are simply another version; and some features suddenly stop being supported or stop working. In short: backward and forward compatibility hell. Fortunately, I have the subjective feeling that the situation on Linux has improved during the last few years. The current workaround for this mess is the open source community: everybody is able to patch everything. And maybe this is also part of the cause of the problem. So maybe Linux wouldn't have that problem if it weren't that open. Just a thought.

34 comments, already:

The problem is not that it is open; the problem is the many distributions and the way they work. Every distribution uses its “own” kernel, its “own” compiler and libs. As long as this does not change, those problems will stay…
D.N.Perfors () - 06 12 07 - 09:28

Ok, senseless rant, but anyway:
When compiling with recent MSVCs you will likely get binaries that don’t work on older systems, simply because the DLLs don’t exist there. So you have to deliver all DLLs with your app. But doing the same will also get any Linux app to work on any system. The thing is that you cannot expect life-time binary compatibility of all apps and libs on your system in any combination.
Moreover, the dynamic libs on Linux also have working version management integrated; developers simply have to use the correct linker commands to use it properly. And of course distributions have to deliver all library versions (but that is usually possible, and has even gotten definitely better over the last years).
And pointing out gcc bugs is really hilarious compared to the ISO C++ support of MSVC. In case you use the Intel compiler for your development I’d understand that you feel way better, but that’s also a question of costs. But when talking about MSVC I’d say that it’s way easier to get a large project working, especially when lots of external libs are involved. The libc usage in MSVC is really rotten. Who needs ten different, incompatible, and mutually exclusive libc versions?!
When accessing some kernel features directly you will likely see some incompatibilities with older or forthcoming versions. However, just because MS forces you to use some fixed (and partially decades-old) interfaces need not make anything better. On Linux you can gain access to every device and property of the kernel in a usually pretty uniform way. On Windows systems you get the things MS offers you; otherwise it gets nasty. If you need it you have to face it…
hybrid - 06 12 07 - 09:51
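The version management hybrid refers to is the ELF soname mechanism. A minimal sketch of how it works, assuming gcc and binutils are installed (the library name `libfoo` and the function are invented for illustration):

```shell
# Write a tiny library source (libfoo is a made-up example).
cat > foo.c <<'EOF'
int foo_answer(void) { return 42; }
EOF

# -Wl,-soname embeds "libfoo.so.1" into the binary, so programs linked
# against it keep loading the same ABI series even when files like
# libfoo.so.1.0.1, libfoo.so.1.0.2, ... are installed later.
gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0.0 foo.c

# Distributions then provide the symlink chain the dynamic linker expects:
# dev name (libfoo.so) -> soname (libfoo.so.1) -> real file.
ln -sf libfoo.so.1.0.0 libfoo.so.1
ln -sf libfoo.so.1 libfoo.so

# Show the soname that was embedded:
readelf -d libfoo.so.1.0.0 | grep SONAME
```

An ABI-breaking release would then ship as `libfoo.so.2.x.x` with soname `libfoo.so.2`, letting both major versions coexist on the same system.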

I want cookies… this is soo confusing hybrid :-p.
leo () (link) - 06 12 07 - 10:30

linux = sucks. end of story
blah - 06 12 07 - 13:12

No way, you just can’t handle it. But that might be intended :-P
hybrid - 06 12 07 - 13:26

> So maybe Linux wouldn’t have that problem if it wouldn’t be that open. Just a thought.
?!? It’s the first time I have read something like this. What does it mean? Are you going to release a “semi-open-source licence” (GPLv4 maybe)?
stef_ - 06 12 07 - 13:44

They call it the “freedom” of open source. Especially with Linux, everybody can hack this a bit, hack that a bit, compile the kernel with this but not that…
They call it a “free” country. Everybody can have a gun and…
bull () - 06 12 07 - 13:58

BTW, the same concept applies to 3D libraries: have you asked software companies if they prefer the Unreal engine, the Doom 3 engine or the open source Irrlicht?
stef_ - 06 12 07 - 14:04

The Unreal and Doom 3 engines aren’t just libraries. There are map editors, shader editors, AI editors, blah blah blah. And the difference is they put in lots of PAID manpower and money to make them into those million-dollar engines.
Virion (link) - 06 12 07 - 14:30

For me Linux is kinda good, but it still needs a lot of improvements. I think it isn’t wrong to choose one distro and make your games for that particular distro only, don’t you think? Choose a stable one with a large user base, and you should be fine.
Virion (link) - 06 12 07 - 14:37

> the difference is they put in lots of PAID man power and money to make it into those million-dollar engines.
Yes :) Indeed, as I said: Irrlicht has been developed with the help of the open source model, so it doesn’t make sense to complain about this approach; otherwise you are complaining about yourself.
stef_ - 06 12 07 - 15:47

most of the people who hate Linux would be surprised how many of the consumer electronics and embedded systems they use run on Linux or Unix-based systems ;)

and everyone who has read the biography of Linus Torvalds knows that he didn’t develop Linux for other people. he did it “just for fun”. so if you have a problem with Linux, then you are the problem and not the other way around ;)

use Linux or let it be … but don’t flame it ;)
jens - 06 12 07 - 19:30

Hehe, this post is like poking a wasp nest with a stick ;)

I have to say I know where you’re coming from. It’s just a case of variables – even on Windows and OSX you can have problems with many things, different hardware configurations and software versions; it’s just that, being closed, there are fewer of them. With Linux, since there are many distros and everything is open, you get more variations. There’s nothing inherently wrong with that, it’s just that variations make a software developer’s life harder – simple as that.

Embedded devices don’t have this problem as much (whether they’re Linux or not) because things are more fixed. That’s why console developers have fewer deployment headaches these days (although the advent of patchable firmware and multiple console versions has started to erode that).

I say it’s nothing to do with the software itself (Linux or otherwise); it’s just about the variables. The more you lock down, the simpler it is. Proprietary systems can thus be a little easier to deploy for, for the same reason embedded systems are.
steve () (link) - 06 12 07 - 20:06

This problem has been acknowledged by a lot of developers, even open source hackers themselves. The idea was that even if an ISV wants to develop for GNU/Linux, it will be almost impossible for them because of the problems that you described (multiple distributions, non-standard filesystem paths, different libraries, different compiler versions and so on). And it was recognized that if GNU/Linux was to become a success in the smaller desktop-applications market, then it would need to address this. There have been a few attempts; the LSB is the biggest one. But in my opinion, little progress has been made. Why that is, is a matter of personal opinion, and so not necessarily valid.

The general work-around is to pick the top 3 distributions and go with those. So for rpm systems, I’d go with Red Hat and SUSE, and for deb systems, I’d go with Debian. With these 3 (and their derivatives), you would have covered a lot of the GNU/Linux market out there. In general, these distributions strive for a minimum compatibility between releases and will generally help you in achieving that.

The binary compatibility problem you talked about is solely the fault of the compiler. As a rule of thumb, major versions should not break binary compatibility, but gcc is a beast of its own.

I’ll be happy to help if needed.
Alaa Salman () (link) - 06 12 07 - 21:01

Oh, forgot to mention something.
When Microsoft was still developing Windows 95, they had a major target: compatibility with older software. There was a lot of code in there (that we would now call hacks) just to ensure compatibility with some software applications.

In fact, the story goes that there was this big basket with 100s of applications, where each programmer picked a few and made sure that they ran on Windows 95.

Skip forward a few years, and Microsoft has a majority market share. In fact, it also dictates hardware features. You’re into graphics, so you might know about the story of DirectX and the graphics card manufacturers. And to speak of a recent example, Windows Vista was released with recommended specs matching a system that was less than a year old. In fact, my custom-built beefy system that was bleeding edge less than two years ago now hardly meets the recommended hardware specs for Windows Vista. That isn’t a problem anymore, because I am now a full-time GNU/Linux user and developer.

So you see, no single distribution can command such power, and only recently have hardware manufacturers decided to play nice, after Greg KH and the rest of the kernel hackers announced the device drivers project.

Fragmentation is definitely a big part of the problem. And perhaps lots of diversity sometimes means too much diversity. But it’s the freedom that counts, and as always, your contribution to making things better (even if just by constructive critique) is essential.
Alaa Salman () (link) - 06 12 07 - 22:18

I feel your pain, however much I love Linux. The problem is there are a lot of different distros, some of them very specialized (I use Scientific Linux, for example). I think the success of some open source projects comes exactly from the fact that any user of any system can produce binaries for his favorite distro and make them available to the community. The question then is having a “critical mass” of users for said Linux flavor and a group of enthusiastic developers willing to package binaries for the rest of the community.
So I would say the probability (and usefulness) of having irrKlang or Irrlicht binaries for Scientific Linux is relatively low, but the same is not true for Ubuntu, for example. There are a lot of Linux applications whose distribution and packaging is not done by their developers but is instead maintained by the community (or the different communities), and Irrlicht/irrKlang seem to me like good candidates for that as well. Hell, there are OGRE-related libraries in the official Ubuntu repositories!
Just stabilize the API and give it some time.
On my side, I’ll spread the Gospel of Irrlicht!
Quantum_Leap - 07 12 07 - 00:07

take a look at the Linux client of Regnum Online to see what happens if you try to support as many distros as possible:

they recompiled every used library into their private tree!
terefang - 07 12 07 - 00:08

That’s why you only develop for one linux distribution: Ubuntu.
jesus smith - 07 12 07 - 01:19

yeah, I hope that Ubuntu will take the majority of the people who want to use commercial apps. (open source can be recompiled anyway)
Halan - 07 12 07 - 04:03

It depends what kind of software you are developing on Linux. I have been happily using Ubuntu Linux for the past year, ever since Windows XP crashed on me. I find that if I am planning to distribute apps which use popular APIs such as wxWidgets, then everything is just fine. On the other hand, in the case of irrKlang, I can imagine the problems trying to deal with the different sound APIs and drivers used by different Linux distros. This problem is not caused by the open source community, but by
1. different versions of the Linux kernel
2. different desktop environments (GNOME or KDE)
3. different preferred packaging methods (rpm, deb, yum, etc.)
I haven’t encountered any differences in compilation so far when using gcc on different Linux distributions. The actual executables seem to run just fine using the standard C++ library. I have had some troubles using dynamic linking, but this seems to be more of a library-specific problem than a generic Linux OS one.
I would also like to point out that the entire purpose of GNU Makefiles is so that end-users can use executables/libraries tailored for their Linux distribution. What is easier than typing “make install” from the terminal window? Even this can be circumvented by using popular package formats such as deb or rpm.
Finally, Windows may be somewhat easier for the developer to distribute programs on, but is it easier for the end-user? I have frequently encountered errors on Windows stating “XXX.dll can not be found. Please contact the product vendor to fix this problem” or “Memory initialising failed at 02563×00.” I have never found this to be a problem on Linux, largely because packages specify which dependencies the program requires. As I see it, you have to put program development on Linux into perspective. Windows users may find the Linux packaging system rather mystifying, while Linux users may likewise find Windows’ single central API-based core extremely limiting. IMO, it is in the end a personal choice, influenced perhaps by one’s abilities and ease in learning new technologies. ;-)
3ddev () - 07 12 07 - 07:07

Simpler than typing ‘make install’ is typing ‘apt-get install irrlicht’. Although it does not seem much easier, it will make maintenance of the tool much simpler. Irrlicht already has an RPM repository at Packman, so a Debian/Ubuntu package shouldn’t be too hard to do. And at least at that point, installation and use of Irrlicht will become much simpler on Linux than it is on Windows: no DLL hell (due to package management), automatic installation of missing libs and apps, the possibility of clean removal of all parts, ...
hybrid - 07 12 07 - 11:17

In the Windows world you have the same incompatibility problems.
Different Windows versions: 98, 98 SE, XP SP1, XP SP2, Vista, etc.

The difference is that you don’t expect your program to work on all versions. You program for the most used version. On Linux it is the same. You don’t expect the program to work on all distributions; you should target and test on a few of them.

(Sorry for my bad English)
Cloud_tdh - 07 12 07 - 12:30

I had a somewhat similar experience. Developing on Linux is great – I hate it whenever I have to work on Windows. But developing for Linux is hard. I can blame some problems on myself, because I’ve not got much experience with that while I have lots of experience on Windows, so I made a lot of beginner mistakes. But the Linux distributions do have some problems too. The incompatible changes in the C++ ABI were certainly one of the worst things and make it close to impossible to deliver binaries for older systems. And just using the libraries which are installed is very risky, because it means that you always have to follow what the current distribution du jour is doing. Linking statically mostly works, in case you can avoid all LGPL libs or open-source your stuff. Linking dynamically and just distributing the dynamic libs with your application helps, but is harder to do on Linux than on Windows (I had to use some evil tricks to manage that for some libs). Don’t even get me started on package managers.

Btw… H-Craft does no longer install on newest Ubuntu versions – so maybe that’s why I’m also somewhat pissed off right now ;)
CuteAlien - 07 12 07 - 18:31

And to those who say that it’s the same on Windows: that’s not true. I heard MS left its path somewhat with Vista, but up to that point they were very good when it came to backward compatibility. Stuff programmed for 98 usually runs on all the other versions. And the MSDN documentation mentions for each function on which versions it will/won’t run.
CuteAlien - 07 12 07 - 18:38

I understand niko perfectly; just try to work out the SDL “No available video devices” problem. AGHHH, I wanna die
trunks14 - 08 12 07 - 05:25

‘oops’ seems like someone forgot to ‘make install’ at the end LOL
trunks14 - 08 12 07 - 05:59

I want to unsay that, linux is great _
trunks14 - 08 12 07 - 06:00


some Microsoft developers of Windows admitted this year that Windows became a big mess over the past years, because they always had to maintain backward compatibility.

read here:
jens - 08 12 07 - 17:22

And despite that, they do care about backward compatibility. That should tell you something :-). The problem in open source is rather that no one wants to do that. And unfortunately that’s true from the kernel to the libs up to application software. Backward compatibility is probably just not sexy enough to work on, and who cares about other developers or users anyway…

One of the worst examples of this mentality shows up when you use some Debian derivative. Then you will probably know the famous dialogs for updating software on it. Something like “do you want to use a) the new version of the ini files, b) the old ini files, c) to look at the difference between those ini files, which will tell you nothing, or d) a bullet in your head”. Just go with the default until your system starts crashing someday, and after days of googling you will probably find some post calling you an idiot for not using a current ini. I think even just dropping this dialog from apt and telling application programmers that they have to take care of config versioning when starting their applications (which is rather easy) would be better. But I really think that simply no one cares.

And don’t get me wrong. I use & love Linux, but those are the rough corners which still make me swear & rant regularly. Windows has its own problems, but backward compatibility is a good thing for everyone except the developers who have to code it. And if that is not possible, at least a working transition path should be offered. But even that is missing often enough.
CuteAlien - 09 12 07 - 10:29

The easiest way to distribute a binary on Linux would be to include the shared libraries that you compiled with your program. Have a lib folder, put the libraries that you compiled in there, and have a script to run the game that adds that folder to the user’s library path temporarily.

I’ve been using Linux for years now; I see all of the comments like this one and wonder why everyone else has problems and I don’t. I have had 2 or 3 Linux installations give me trouble, but most were simply: pop in the CD, maybe install a driver for the video card through the package manager, and hand the computer back to the person. I’ve never had a problem using pre-compiled binaries on Gentoo Linux, PCLinuxOS, Mint Linux or Ubuntu Linux (and I doubt that they all had the same library versions installed).
Charles Joseph Christie II () - 13 12 07 - 18:47
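The bundled-libraries approach Charles describes usually boils down to a small wrapper script that prepends the shipped `lib/` directory to `LD_LIBRARY_PATH` before launching the real binary. A self-contained sketch (all names – `MyGame`, `mygame` – are invented; the stub binary just prints the path so the effect is visible):

```shell
# Lay out the shipped directory tree: bin/ for the binary, lib/ for
# the bundled .so files that should win over the system's versions.
mkdir -p MyGame/bin MyGame/lib

# Stand-in for the real game binary: it just reports the library
# search path the dynamic linker would be given.
cat > MyGame/bin/mygame <<'EOF'
#!/bin/sh
echo "mygame sees LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
EOF
chmod +x MyGame/bin/mygame

# The wrapper the user actually runs. It resolves its own location, so
# the game works from wherever the tarball was unpacked, and prepends
# the bundled lib/ dir while keeping any pre-existing path entries.
cat > MyGame/run.sh <<'EOF'
#!/bin/sh
HERE="$(cd "$(dirname "$0")" && pwd)"
LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
    exec "$HERE/bin/mygame" "$@"
EOF
chmod +x MyGame/run.sh

./MyGame/run.sh
```

Because the wrapper only changes the environment of the child process, nothing system-wide is touched, which is what makes this deployment style distro-agnostic.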

Wow I am not the only one ;-)

Linux needs more standards.

And the question which sometimes jumps into my head: why the hell does no one want to work with anyone else? And why the hell is there no initiative, organization or something like that which creates standards?

It should be no problem to create a standard API (with updates of the API in mind) which only specifies an “ISO” of the function and class calls. The rest could be left to the programmer.
Q-efx (link) - 14 12 07 - 14:02

Charles: Did you need sound, and did you use C++? Those are, in my experience, two of the harder problems. Though I made another error by trying to use a comfortable Windows-like installer instead of just offering a tar.gz. That’s the cause of my newest problems :(
CuteAlien - 15 12 07 - 16:12

If I am right, a good tool was InstallJammer on SourceForge ;) you could use Tcl/Tk also ;)
Q-efx - 16 12 07 - 19:45

InstallJammer was the tool I used, and it no longer seems to work on newer Ubuntu versions. I still haven’t found the time to check what goes wrong.
CuteAlien - 17 12 07 - 16:30
