I'm using Qt to develop a cross-platform app which must not depend on libraries pre-installed with the OS. I didn't have any problems on Windows: all I needed to do was copy the .dll files into the executable's directory. For the various Linux-based systems I'm using a script which does the following steps:

1. runs qmake
2. appends -Wl,-rpath,./ to the LFLAGS directive of the generated Makefile, so that the dependency search also looks in the current directory
3. runs ldd -v -r and copies the executable together with its dependencies into another folder
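Roughly, the script looks like this (just a sketch: "myapp" and the deploy directory are placeholders, and the sed pattern assumes a qmake-generated Makefile):

```sh
#!/bin/sh
# Sketch of the deployment script; "myapp" and "deploy" are placeholders.

qmake myapp.pro

# Append -Wl,-rpath,./ to the LFLAGS line of the generated Makefile so the
# binary also searches its own directory for shared libraries at run time
sed -i 's|^LFLAGS *=.*|& -Wl,-rpath,./|' Makefile

make

# Check for unresolved symbols and list the dependencies
ldd -v -r myapp

# Copy the executable together with every resolved shared library
mkdir -p deploy
cp myapp deploy/
ldd myapp | awk '$2 == "=>" && $3 ~ /^\// { print $3 }' | sort -u \
    | xargs -I{} cp -v {} deploy/
```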
Running the application works fine on a fresh install (without Qt) of the same OS (and the same version) that the app was built on, but running it on another OS or another version fails with errors.
Not bundling the dependencies isn't a viable option, as the application may be used on machines without internet access and/or a pre-installed Qt. And I can't use a static build either, because QWebKit is required and it doesn't support static builds.
Different combinations of the system on which I build the app and the system on which I'm trying to run it give different errors:
relocation error: ./libcrypto.so.7: symbol __libc_enable_secure, version GLIBC_2.0 not defined in file ld-linux.so.2 with link time reference
relocation error: ./libc.so.6: symbol __libc_enable_secure, version GLIBC_PRIVATE not defined in file ld-linux.so.2 with link time reference
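For what it's worth, the mismatch can be inspected by comparing what the bundled libraries expect against what the host's dynamic loader provides (the binary and library names below are just the ones from my bundle, and the loader path varies per distribution):

```sh
# Which dynamic loader is the executable hard-wired to use?
readelf -l ./myapp | grep interpreter

# Which symbol versions does the bundled libc expect from that loader?
objdump -T ./libc.so.6 | grep __libc_enable_secure

# Which glibc does the host's loader actually come from?
/lib/ld-linux.so.2 --version
```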
Any ideas?
I don't think this is viable; there are too many problems with bundling your own copy of every library. Commercial vendors normally maintain a short list of explicitly supported platforms (distribution + version + architecture combinations).
For each platform, use the system Qt if one is available and appropriate. For example, on RHEL6/CentOS6 the system Qt is 4.6.2, which may be too old for your needs, so you may want to link against your own Qt built on CentOS6 instead.
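For instance, to see which Qt a target distribution ships (package and binary names vary between distributions, so treat these as examples):

```sh
# On RPM-based systems such as RHEL6/CentOS6
rpm -q qt
qmake-qt4 -query QT_VERSION

# Where the Qt development package and its pkg-config files are installed
pkg-config --modversion QtCore
```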
The way it should be done is to have a continuous integration system in place, so that you have the target distributions in virtual machines and an automated script compiles and tests the app on all of them. This doesn't have to be very complex: at a minimum you need a cron script running in each of the VMs that builds the app, runs the tests, and emails/uploads the results, plus a configuration/script to start all the VMs.
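A minimal sketch of such a per-VM cron job, assuming a Git checkout and a central results host (every name, path and hostname here is a placeholder):

```sh
#!/bin/sh
# Minimal nightly build-and-test job for one target VM; every path,
# hostname and the "myapp" name is a placeholder.
# Install with:  crontab -e  ->  0 2 * * * $HOME/nightly.sh

WORKDIR="$HOME/builds/myapp"
LOG="$WORKDIR/nightly-$(hostname)-$(date +%Y%m%d).log"
STATUS=ok

# Update the checkout, build with the Qt chosen for this platform and run
# the project's test target (if it defines one); log everything.
{
    cd "$WORKDIR/src" &&
    git pull &&
    qmake &&
    make -j2 &&
    make check
} >"$LOG" 2>&1 || STATUS=failed

# Report to one central machine so a single place collects results from all VMs
scp "$LOG" "builder@ci.example.com:/srv/ci/results/$(hostname)-$STATUS.log"
```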