1. The script must scrape the site for all links
2. Create a folder called sitemaps inside public_html and upload the files there
3. Create a root [url removed, login to view]
4. Create and upload the sitemaps to the sitemaps folder, limiting each XML file to 4,000 links, so it must create [url removed, login to view], [url removed, login to view], etc.
5. The script must be reusable, so the URLs, maximum links per file, etc. must be stored in variables at the top of the script
6. If the script runs again and a folder or file called [url removed, login to view] already exists, it must delete it and rebuild the structure
There will most likely be around 250,000 links.
The intended structure is attached as an image.
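The splitting and rebuild logic described in the requirements above can be sketched roughly as follows. This is a minimal illustration, not the deliverable: the crawling step (requirement 1) is omitted, and the site URL, folder name, and function names are hypothetical placeholders, since the real URLs were removed from the posting.

```python
import os
import shutil
from xml.etree import ElementTree as ET

# Reusable settings at the top of the script, per requirement 5.
SITE_URL = "https://example.com"   # placeholder; the real URL was removed from the posting
SITEMAP_DIR = "sitemaps"           # folder created under public_html (requirement 2)
MAX_LINKS_PER_FILE = 4000          # link cap per XML file (requirement 4)


def write_sitemaps(links, out_dir=SITEMAP_DIR, max_links=MAX_LINKS_PER_FILE):
    """Split links into sitemap1.xml, sitemap2.xml, ... and return the filenames."""
    # Requirement 6: if the folder already exists, delete it and rebuild.
    if os.path.isdir(out_dir):
        shutil.rmtree(out_dir)
    os.makedirs(out_dir)
    filenames = []
    for i in range(0, len(links), max_links):
        urlset = ET.Element("urlset",
                            xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for link in links[i:i + max_links]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = link
        name = f"sitemap{i // max_links + 1}.xml"
        ET.ElementTree(urlset).write(os.path.join(out_dir, name),
                                     encoding="utf-8", xml_declaration=True)
        filenames.append(name)
    return filenames


def write_index(filenames, out_path="sitemap_index.xml",
                base=SITE_URL, folder=SITEMAP_DIR):
    """Write the root sitemap index (requirement 3) pointing at each sitemap file."""
    index = ET.Element("sitemapindex",
                       xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for name in filenames:
        ET.SubElement(ET.SubElement(index, "sitemap"),
                      "loc").text = f"{base}/{folder}/{name}"
    ET.ElementTree(index).write(out_path, encoding="utf-8", xml_declaration=True)
```

With roughly 250,000 links and a 4,000-link cap, this would produce about 63 sitemap files plus the root index, well within the sitemap protocol's 50,000-URL-per-file limit.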