![sitesucker and wordpress sites](https://www.addictivetips.com/app/uploads/2011/09/limit.jpg)
--restrict-file-names=windows restricts the file names for local files to those which can be used in Windows. --convert-links makes sure that the downloaded files reference the local copies rather than the originals. --html-extension appends .html to the local filename when saving a file if its MIME type is text/html.
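For a concrete sense of what --restrict-file-names=windows guards against: wget escapes the bytes that Windows forbids in filenames (such as \, |, :, ?, ", *, < and >). A rough shell illustration of the kind of rewriting involved follows; it is not wget's exact behaviour, which percent-escapes the characters as %XX codes rather than replacing them:

```shell
# Rough stand-in for the sanitising that --restrict-file-names=windows
# performs: swap the characters Windows forbids in filenames for
# underscores. (wget itself percent-escapes them as %XX instead.)
sanitise() {
    printf '%s\n' "$1" | tr '\\|:?"*<>' '________'
}

sanitise 'page?id=3'   # a query-string URL saved as a local file name
```

Without this restriction, a page fetched from a URL like `page?id=3` would be saved under a name that Windows cannot create.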
Sitesucker and WordPress sites download
We recently needed to grab an existing site, managed through Concrete5, before the client's hosting was disabled, but Sitesucker kept coming up blank. The answer lay in one of the handiest tools in Linux: wget. Whilst this is something that we regularly use for transferring large files between servers, with a few parameters it can be used to grab a complete copy of a website into a folder. The trick is in using the right parameters, and to save you reading the manual, here's how: `wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains`

Let's look at those parameters individually. --recursive tells wget to follow links downwards through the directory structure; if no depth is specified, it defaults to five directories downwards. --no-clobber saves your bandwidth by not downloading a file if it would overwrite an existing file, thereby reducing the need to download multiple instances of the same file. --page-requisites tells wget to download all the files that are necessary to ensure the correct display of the page, which means it will pick up all the CSS, JS and so on.
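The command above can be wrapped in a small script for reuse. In this sketch, `example.com` is a placeholder for the site you need to mirror, and `--no-parent` is an extra flag, not part of the command above, that stops wget climbing above the starting directory:

```shell
#!/bin/sh
# Build the wget invocation for mirroring a site into the current folder.
# "example.com" is a placeholder domain; --no-parent is an addition here
# (not in the command above) that keeps wget below the start URL.
mirror_site() {
    domain="$1"
    set -- wget --recursive --no-clobber --page-requisites \
        --html-extension --convert-links --restrict-file-names=windows \
        --domains "$domain" --no-parent "http://$domain/"
    # Print the command instead of running it, so it can be reviewed
    # first; drop the `echo` to actually execute the download.
    echo "$@"
}

mirror_site example.com
```

Printing rather than executing makes it easy to sanity-check the flags before committing to a potentially long crawl.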
![sitesucker and wordpress sites](https://cdn.mos.cms.futurecdn.net/9DY9u73uMDaPVZZQRznLde-1200-80.jpg)
On occasion we find ourselves needing to download an entire copy of an existing website. With the recent move of Animalcare to WordPress, for example, it was the easiest way to retrieve the large quantity of product documentation on their site. Whilst there are a number of utilities for both Windows and Mac that can do it (we are particular fans of Sitesucker, at a modest fee of around $5), on occasion even that struggles.