I have a directory on a Linux server, ~/www/new/radio/audio/, and this directory is full of MP3 files like:
ABCD\ 643\ 5600\ [url removed, login to view]
ABCD\ 642\ 5600\ [url removed, login to view]
ABCD\ 641\ 5600\ [url removed, login to view]
ABCD\ 640\ 5600\ [url removed, login to view]
ABCD\ 639\ 5600\ [url removed, login to view]
ABCD\ 638\ 5600\ [url removed, login to view]
ABCD\ 637\ 5600\ [url removed, login to view]
The ABCD text stays the same; the 643 becomes 644, and so on. As you can see, there is a new podcast every Sunday. I currently download these manually over FTP from the command line/shell by typing the following commands:
cd "ABCD Weekly mp3 Files"
prompt (turn prompt off)
mget *5600?09-2?[url removed, login to view]
Usually during the week (Monday/Tuesday/Wednesday) the MP3 file for the next Sunday becomes available. I don't like to download "duplicates", though, so I would like a solution where the script checks for the next available Sunday (or next two Sundays) without re-downloading an older Sunday MP3 that is still on the FTP server. (I can set up a cron job for the script you make, etc.)
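The duplicate-avoidance requirement above can be met with wget's no-clobber mode, which skips any file that already exists locally. This is only a sketch: the host, credentials, and the exact remote glob are placeholders, since the full filenames were removed from the listing.

```shell
#!/usr/bin/env bash
# Sketch of an automated weekly fetch, assuming GNU wget with FTP support.
# The host/credentials and the remote glob below are placeholders; the real
# pattern depends on the full filenames on the server.
set -euo pipefail

# True (exit 0) only when we do not already hold this file locally.
needs_download() {
    local file="$1"
    [[ ! -e "$file" ]]
}

fetch_new_episodes() {
    local host="$1" user="$2" pass="$3"
    local local_dir="$HOME/www/new/radio/audio"
    cd "$local_dir"
    # -nc (--no-clobber) tells wget to skip any file already on disk, so a
    # Sunday episode still sitting on the FTP server is never fetched twice.
    wget -nc --user="$user" --password="$pass" \
        "ftp://$host/ABCD%20Weekly%20mp3%20Files/ABCD*5600*.mp3"
}

# Invoke only when run with arguments, e.g. from a cron job:
#   fetch_podcasts.sh ftp.example.com myuser mypass
if [[ $# -eq 3 ]]; then
    fetch_new_episodes "$1" "$2" "$3"
fi
```

A weekly cron entry (e.g. `0 3 * * 1-3` for early Monday through Wednesday mornings) would then pick up each new Sunday file as soon as it appears, while `-nc` keeps older episodes from being downloaded again.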
I will post a SEPARATE new project for automatically creating the new WordPress post that links to these MP3 files...
I will provide a script that performs as you require. I will use functions so you can reuse the code I provide for other tasks if needed.
20 freelancers are bidding an average of $28 for this job
Hi, I can develop a Perl script to automate the MP3 file download. You can set up a cron job for the script later.
If the server has the rsync/ssh protocol, I can give you an rsync solution to download the MP3s and exclude duplicates.
Hello there. From your description, I would suggest keeping a log file of which files we have downloaded (or just checking whether they exist on the hard disk, if possible), so we can compare with the files on the server…
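The log-file idea in this bid can be sketched in a few lines of bash: record every fetched filename once and skip anything already recorded. The log path is an assumption; any writable location works.

```shell
#!/usr/bin/env bash
# Minimal sketch of a download log: remember each filename we have fetched
# and skip it on later runs. LOG location is an assumption.
set -euo pipefail

LOG="${LOG:-$HOME/.podcast_downloaded.log}"

already_downloaded() {
    # Exact, whole-line match against the log file.
    [[ -f "$LOG" ]] && grep -qxF "$1" "$LOG"
}

record_download() {
    echo "$1" >> "$LOG"
}
```

Compared with checking the files on disk, a log survives moving or deleting old episodes locally, so a purged MP3 is still never re-downloaded.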
Hello there, I had this situation not long ago, so I know the solution to this problem. Thanks, Aji.
Hi, I can write the script for you quickly and nicely, and it will avoid downloading duplicate files. Please contact me. Sincerely, Jason
This sort of thing is pretty much built into Linux shells with the scp command. When you say "the other 643 would become 644", are you just saying what the directory structure looks like, or that you want to rename the…
Hi Jvirt, this is Toby. I just saw that you want someone for a script. Most of the time I do this kind of work in Python, but I can also do Perl and Linux shell if necessary. :)
I professionally write bash scripts and would have no problem automating the task you're requesting. I feel that bash would be the most appropriate scripting language for this task.
Hi, how about using rsync to sync the MP3 files that you want? rsync can tell which file is newer and which is older, so you shouldn't get duplicate files. Don't forget to set it up with cron…
Hey, what you are asking for sounds pretty simple; I can finish it in half a day or less. Thanks
Hello! This is a very easy task. It can easily be done with wget (a shell/bash script); it isn't even necessary to know the name or directory structure. Let me know if you are interested. I will first create a script, then you…
Hi, jvirt. I have a strong background in data processing and string manipulation using Perl, Python, bash, and scripting in general. I can start as early as possible, depending on your approval and acceptance. In relation…
I have ability and experience in Linux administration, managing Linux-based hosting servers. This can be seen on my personal website: [login to view URL] [login to view URL]
Hi. Recently I worked on an Asterisk IVR project, where I created a script to periodically check for a new MP3 file on the FTP server, so this work is very easy for me.
I have done many similar projects. In a program I wrote recently, I wanted to download a series of consecutive PDF files from an HTTP server. I made a script using wget that automatically downloaded the files i…