1. The script must scrape the site for all links.
2. Create a folder called sitemaps and upload it within public_html.
3. Make a root [url removed, login to view]
4. Generate and upload the sitemaps to the sitemaps folder, limiting the links to 4000 per XML file, so it must create [url removed, login to view], [url removed, login to view], etc.
5. The script must be reusable, so the URLs, max links per file, etc. must be stored in variables at the top of the script.
6. If the script runs again and a folder or file called [url removed, login to view] already exists, it must delete it and rebuild the structure from scratch.
There are likely to be around 250k links.
Structure attached in image
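As a rough illustration of requirements 2–6, here is a minimal Python sketch of the chunking and rebuild logic. The crawling step (requirement 1) is omitted and the function simply takes the collected link list; the base URL, output path, and `sitemapN.xml` naming are assumptions, since the actual file and folder names were removed from this posting.

```python
import shutil
from pathlib import Path

# --- Configuration (reusable: edit these per site, requirement 5) ---
BASE_URL = "https://example.com"              # assumed placeholder site
SITEMAP_DIR = Path("public_html/sitemaps")    # assumed output location
MAX_LINKS_PER_FILE = 4000                     # requirement 4

XML_HEADER = '<?xml version="1.0" encoding="UTF-8"?>'


def chunk_links(links, size=MAX_LINKS_PER_FILE):
    """Split the full link list into groups of at most `size` URLs."""
    return [links[i:i + size] for i in range(0, len(links), size)]


def write_sitemaps(links, out_dir=SITEMAP_DIR):
    """Delete any previous sitemaps folder, then rebuild it from scratch."""
    if out_dir.exists():
        shutil.rmtree(out_dir)  # requirement 6: wipe and restructure on rerun
    out_dir.mkdir(parents=True)

    files = []
    for n, group in enumerate(chunk_links(links), start=1):
        path = out_dir / f"sitemap{n}.xml"  # hypothetical naming scheme
        entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in group)
        path.write_text(
            f'{XML_HEADER}\n'
            f'<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n"
        )
        files.append(path)
    return files
```

With ~250k links and a 4000-link cap this would produce about 63 numbered files; a root sitemap index file pointing at each of them would then be written separately (requirement 3).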