I've been asked to copy a person's entire website and put it on a CD for them so they can show it in situations where there's no internet connection. Does anyone know how this is possible?
Look for a utility named "wget". You can probably find it via Google. It's usually a Linux/Unix command-line utility, but there are a few Windows-based versions of it. You give it a URL, and it follows all the links, grabbing all the pages and images and storing them in a directory on your drive.
A few cautions: this will only work cleanly if the site uses "relative links". In other words, if the site is http://www.example.com/ and it has a page named contactus.html, the link should be written as <a href="contactus.html">Contact us</a>. If they had used the full URL in the link instead, such as <a href="http://www.example.com/contactus.html">Contact us</a>, then the CD version won't work, because clicking on that link will try to browse the 'net for the page rather than looking on the CD! The same applies to image tags as well. So you may be stuck editing lots of links in the code for them too (although wget's --convert-links option can rewrite many absolute links automatically when it mirrors a site, so it's worth trying that first).
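A quick way to check for that problem is to search the copied files for absolute links. A sketch using grep, run from the top of the downloaded copy (www.example.com is a placeholder for the real domain):

```shell
# List every place in the local copy that still points at the live site.
# Each match is a link that would send the browser out to the internet
# instead of to the CD.  Substitute the site's real domain.
grep -rn 'http://www.example.com' .
```

Anything that turns up here either needs to be made relative by hand or handled by wget's link conversion.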
wget is a great tool, and it can traverse a website from the root directory down to a maximum level you specify. There are many other options - see http://www.gnu.org/software/wget/wget.html for more info. You'll find a link there for the documentation as well.
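For this kind of job, a typical invocation looks something like the following (the URL is a placeholder; the flag names are from GNU wget, so check your version's documentation):

```shell
# Mirror a site for offline/CD browsing.
# --mirror          : recursive download, suitable for making a full copy
# --convert-links   : rewrite links in the saved pages so they work offline
# --page-requisites : also fetch images, CSS, and other files each page needs
# --no-parent       : don't wander up above the starting directory
wget --mirror --convert-links --page-requisites --no-parent http://www.example.com/
```

The result is a directory tree you can burn straight to the CD.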
Another (maybe weak) solution is to have your friend visit the site in IE, then add it to Favorites and check the "Make available offline" box. I can't recall whether that spiders the entire site or not, but it's worth looking into.
Just make sure you grab the little files too, like cascading style sheets (the ones with .css extensions).
Former user
wrote on 9/3/2005, 5:59 AM
Just log on to the person's web space via FTP using their master username/password and copy everything from the public_html folder (or whatever their root folder is called).
That should do it.
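If you'd rather not fiddle with a separate FTP client, wget can mirror over FTP too. A sketch, where the host, username, password, and folder name are all placeholders for the account's real details:

```shell
# Pull down the whole web root over FTP into a local directory tree.
# Caution: the password ends up in your shell history this way, so
# clear it afterwards or use a .netrc file instead.
wget --mirror ftp://username:password@ftp.example.com/public_html/
```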
The only things you might have problems with are any Perl (or other language) scripts that run on the server side. Those won't execute from a CD in a browser at all; they need a web server and the proper script engine behind them, and you might need to be running Unix/Linux to get some of them to work. For a CD, you'd want to save the pages as the server renders them rather than the scripts themselves.
Actually, the whole concept sounds suspicious to me.
If someone has a website, then they have access to the files. Unless there's a huge amount of content, most websites and their associated graphics files add up to a hundred MB at most. You just copy the files to a CD and optionally add an autorun that opens the index.htm file.
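For the autorun part, you can drop an autorun.inf file in the root of the CD next to index.htm. A minimal sketch (the shellexecute key is honored on Windows 2000/ME and later; older versions simply ignore it, and the user can still open index.htm by hand):

```
[autorun]
shellexecute=index.htm
```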
Flags go up here when a client is asking for something they could do themselves for a lot less money.
It's quite possible, in fact often likely, that the client didn't produce the website and has no access to it other than browsing it. It's also possible and maybe likely that the client doesn't have the skill and doesn't want to learn how to do this. That's what folks like us get paid for.
I recorded a concert last weekend and produced 26 CDs on the first order. Now the client wants another 50 or so. I'm charging $5 each for a plain CD-R, no label, in a plain thinline case. I suppose he could very easily do this himself on his own computer for a lot less money. The thing is, he doesn't want to be bothered and he'd rather pay me to do it. A couple of years ago the same client had me videotape a play he had written and eventually ordered 150 DVDs, which I once again burned on plain DVD+Rs with no label or insert. He happily paid me $6 each even though he could have burned them himself for about 50 cents each.
For a lot of clients, they would rather just pay someone else to "make it happen".
Offline Explorer is excellent for this. It lets you grab everything or just the file types you want. Definitely a nice little prog. For server-generated pages (ASP, PHP, etc.), it saves the rendered output as plain HTML pages just the same.