I don't have FTP access, obviously. I'm looking for a Mac OS X application that will simply suck down a website and its links/linked files to a directory.
wget is what you need. I would think it comes with OS X, as it's a fairly standard *nix utility. Get to a command prompt and type 'man wget'. On GNU wget you use the -r flag. I don't know if the flag will be the same on OS X or not, though. Sometimes they are, sometimes they aren't.
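Something like this is the basic idea (the URL is just a placeholder, swap in the site you're after):

    # recursively download the site and everything it links to
    wget -r http://www.example.com/
    # or limit the depth and don't climb above the starting directory
    wget -r -l 2 -np http://www.example.com/somedir/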
Bah. Turns out OS X only comes with cURL, which is similar, but I don't think it provides the functionality you need. Instructions to install wget on your machine are about halfway down the page here. I don't have a Mac handy to test it on, so hopefully those instructions are correct. Looks pretty straightforward.
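I haven't tried it on a Mac, but building it from source is the usual GNU routine, roughly like this (the version number is just an example, use whatever's current on the GNU site):

    # fetch the source with the cURL that ships with OS X
    curl -O http://ftp.gnu.org/gnu/wget/wget-1.9.tar.gz
    tar xzf wget-1.9.tar.gz
    cd wget-1.9
    ./configure && make
    sudo make install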
Not unless I've missed something on the man page and in my testing. cURL will pull the HTML and whatnot from the URL specified. It doesn't grab images, .js files, .css, follow links and create the directory structure for those links and .js files, etc. That is what wget -r does. Obviously one could manually do all that or write a script to do it, but that seems like a bit of a waste when there are truly automated ways to do it already.
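To put it another way (placeholder URL again):

    # cURL fetches exactly the one URL you give it
    curl -O http://www.example.com/index.html
    # wget can recurse and also pull page requisites (images, .css, .js)
    wget -r -p http://www.example.com/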
If I've missed something and you've got the flags to make cURL grab all the content, though, post them in here. It could come in handy.
Does wget grab PHP pages as well? Such as this forum, with all the ?do=newreply&treadcount=20 silliness?
In theory, yes, but there seems to be something more to it that I'm not immediately grasping.
For example I can pull my entire website and get everything. While it's Perl CGI rather than PHP, it functions in the same way... the main script has the same name and then there are just parameters after the name to tell it what to show. I can also do it on a phpbb that I run. When I do it on http://forums.penny-arcade.com/index.php, though, I just get the index.php and nothing else. I'm guessing it's something authentication and/or cookie related.
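If it is cookie related, GNU wget can send a cookie file along with its requests; something like this might do it, assuming you can get your logged-in session into a Netscape-format cookies.txt first:

    # reuse an existing cookie file while spidering the forum
    wget -r --load-cookies cookies.txt http://forums.penny-arcade.com/index.php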
powerss, I realized that you might want to use the -k or -m options with wget, depending on what you need to do. -k will pull the site down but modify all of the links so that they work locally. -m turns on the mirroring options (recursion, timestamping, infinite depth) so that you can just stick the copy up on a server as a mirror; it doesn't rewrite the links.
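In other words (placeholder URL once more):

    # -k (--convert-links): rewrite links so the copy browses locally
    wget -r -k http://www.example.com/
    # -m (--mirror): recursion plus timestamping, good for keeping a server-side mirror fresh
    wget -m http://www.example.com/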
Here's what I get in the OSX Terminal:
You mean a Windows version? Yeah, a Windows port of GNU wget is here: http://gnuwin32.sourceforge.net/packages/wget.htm
(obviously, this type of thing typically comes preinstalled in most Linux distros)
CUZ THERE'S SOMETHING IN THE MIDDLE AND IT'S GIVING ME A RASH