We have a directory structure on a Linux server that holds downloaded software along with *.URL files created on Windows by dragging links. Each URL file contains the web address the software was downloaded from.
We need a tool that creates an "index" of this software as simple HTML pages with links,
a) sorted alphabetically (use the name of the URL file as the "software name"; the link target is the address from the URL file), all on one page.
b) as multiple HTML pages, mirroring the directory hierarchy, each containing links as described in a). These HTML pages must provide additional navigation (to the "parent" page, to the "top" page, etc.).
Directories that contain no URL file can be ignored.
A sample directory hierarchy is attached (without the actual software in it). Note that the software files themselves do not matter; all you have to do is gather the *.URL files recursively and build the index.
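For reference, a minimal sketch of requirement a) in bash is shown below. It assumes the *.URL files are standard Windows Internet Shortcuts containing a `URL=` line (the usual format created by dragging a link); the function name `build_index` and the output file name are our own choices, not part of the spec.

```shell
#!/bin/bash
# Sketch for option (a): one alphabetical index page built from all *.URL files.
# Assumes each .URL file holds a Windows Internet Shortcut with a "URL=..." line.

build_index() {
  local root="$1" out="$2"
  {
    echo '<html><head><title>Software index</title></head><body><ul>'
    # Find every .URL file (case-insensitive) below the root directory.
    find "$root" -type f -iname '*.url' | while IFS= read -r f; do
      # Software name = file name without the .URL extension.
      name=$(basename "$f"); name="${name%.*}"
      # Link target = the URL= line; tr strips the CR of Windows line endings.
      url=$(grep -i '^URL=' "$f" | head -n 1 | cut -d= -f2- | tr -d '\r')
      printf '%s\t%s\n' "$name" "$url"
    # Sort alphabetically by name (case-insensitive), then emit the list items.
    done | sort -f | while IFS=$'\t' read -r name url; do
      printf '  <li><a href="%s">%s</a></li>\n' "$url" "$name"
    done
    echo '</ul></body></html>'
  } > "$out"
}
```

Usage would be something like `build_index /srv/software index.html`. Requirement b) extends the same parsing loop, writing one page per directory with "parent"/"top" links instead of a single flat list.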
The software must run on Fedora Core 3, should be portable to other Linux systems, and must be a bash shell script, Perl, or PHP.
This is just a tool for internal use, so we do not have a big budget. You may re-use the software as you like; we only ask for unlimited usage rights and the right to modify the code, should the need arise.
10 freelancers are bidding an average of $51 for this job
I can write a script which will recurse through the directories, creating this index file in each directory, and you can set up a cron job to run the script as often as you wish. I've had a lot of experience with Perl (and man…
As far as I can see, you need a company with experience that can guarantee a professional business relationship and high-quality results. We are such a company, with more than 3 years in web development, design and prog…