Unix-admin-fu Q's for the shared host/free shells business

scrottie on 2007-10-29T21:25:29

Let's say I go the colo route and put slowass.net somewhere that costs money. A bit of background first, then the technical questions.

I don't want to be in a position where I have a bad month and can't pay my hosting bill, which would happen often, and AdWords pays paltry amounts of dough. Tens of thousands of impressions were bringing in around a dollar -- nowhere near worth the annoyance to readers of the site. If I set up a free shells/shared host deal, there's the potential for donations, which couldn't be any sadder than AdWords. silenceisdefeat.org has an online sign-up. You can send them postal mail and get an account for the cost of the return stamp, but most people choose to PayPal a dollar during the online sign-up instead and get the account instantly. And of course, the little form suggests you donate $5, $10, or $20 instead. I gave them $10, just on a lark, but probably wouldn't have if I'd known they didn't permit CGI. They don't allow outside executables at all, either.

So, my question to the community is: how could one go about securing a machine in a similar but more permissive arrangement?

Firewall rules could keep outgoing connections from contacting mail servers. Are there just too many things that would have to be blocked? Would I have to block all outgoing traffic, as free shells often do? Most free shells prohibit IRC relays and allow IRC access only to a few small, designated IRC networks. Assuming I've collected a dollar from PayPal, and therefore could presumably cooperate with the police on matters of abuse, would it be imprudent to allow these users access to IRC in general?

I'm not versed in quotas -- how can CPU quotas be enforced? Duke University (no, I wasn't a student) ran some AFS+Athena stuff, and, though I'm fuzzy on this, I believe they were able to cap CPU usage per user in a far more useful way than ulimit's -t argument. Is there a way to cap what percentage of the CPU a process uses rather than just how many CPU seconds it takes up? How are runaway processes normally handled if not through accounting -- just something that parses the output of top and kills as needed?

What else would be needed to give people access to CGIs? What else needs to be done to set up shared hosting in general, other than setting people's default umasks intelligently and running CGIs as the user?

Would it be out of the question to allow people to run arbitrary binaries (i.e., run without exec cookies)? And to give people access to gcc? What exactly would go wrong? It seems like if you're going to give people access to Perl, you might as well give them access to gcc.

What subtle aspects of shared hosting am I missing here? Some half-baked sketches of what I'm imagining are below.
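
For the mail-server piece, what I have in mind is something like the following -- a rough sketch assuming Linux iptables with its owner match, and assuming the box's own MTA runs as a dedicated "mail" user. That user gets to make outbound SMTP connections; everyone else's direct-to-port-25 traffic gets rejected:

    #!/usr/bin/perl
    # Sketch: only the system MTA (running as user "mail") may make outbound
    # SMTP connections; direct-to-port-25 traffic from anyone else is rejected.
    use strict;
    use warnings;

    my @rules = (
        # accept outbound SMTP from the MTA's uid
        [qw(-A OUTPUT -p tcp --dport 25 -m owner --uid-owner mail -j ACCEPT)],
        # reject outbound SMTP from everybody else
        [qw(-A OUTPUT -p tcp --dport 25 -j REJECT)],
    );

    for my $rule (@rules) {
        system('iptables', @$rule) == 0 or die "iptables @$rule failed: $?";
    }

Users would then have to relay through the local MTA, which can rate-limit and log, rather than talking to the world's port 25 directly.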
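
As for capping what share of the CPU a process gets (rather than total CPU seconds), I don't know of a clean kernel knob for it, so the fallback I'm picturing is exactly the parse-and-kill approach -- except parsing ps, which is friendlier to scripts than top. A sketch, with the 80% threshold and the exempt users invented for illustration:

    #!/usr/bin/perl
    # Sketch: terminate user processes hogging more than some share of the CPU.
    # The threshold and exempt-user list below are made up for illustration.
    use strict;
    use warnings;

    my $max_pcpu = 80;
    my %exempt   = map { $_ => 1 } qw(root mail www-data);

    # pid, owner, %CPU, and command name; trailing '=' suppresses headers
    open my $ps, '-|', 'ps', '-eo', 'pid=,user=,pcpu=,comm='
        or die "can't run ps: $!";

    while (<$ps>) {
        my ($pid, $user, $pcpu, $comm) = split ' ';
        next if $exempt{$user};
        next if $pcpu <= $max_pcpu;
        warn "killing $comm ($pid) owned by $user at $pcpu% CPU\n";
        kill 'TERM', $pid;
    }
    close $ps;

Caveat: ps reports %CPU averaged over the life of the process, so a real version would want to sample /proc a couple of seconds apart, and probably renice or SIGSTOP before resorting to SIGTERM.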
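
On the CGI front, the piece I can picture is a suexec-style wrapper that switches to the owning user and clamps resource limits before exec'ing the script. A sketch of just the limits-and-exec part, assuming the BSD::Resource module from CPAN and with the numbers pulled out of thin air:

    #!/usr/bin/perl
    # Sketch: clamp resources before handing control to a user's CGI script.
    # Assumes a suexec-style parent has already switched to the user's uid/gid.
    use strict;
    use warnings;
    use BSD::Resource qw(setrlimit RLIMIT_CPU RLIMIT_AS RLIMIT_NPROC);

    setrlimit(RLIMIT_CPU, 30, 30)                or die "RLIMIT_CPU: $!";   # 30 CPU seconds
    setrlimit(RLIMIT_AS, 64 * 2**20, 64 * 2**20) or die "RLIMIT_AS: $!";    # 64 MB address space
    setrlimit(RLIMIT_NPROC, 32, 32)              or die "RLIMIT_NPROC: $!"; # 32 processes per user

    # hand off to the actual CGI script
    exec $ARGV[0] or die "exec $ARGV[0]: $!";

Apache's suexec (or cgiwrap) handles the uid/gid switch and the paranoid path checks; the limits above are the part that keeps one runaway CGI from taking the whole box down with it.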

Thanks, -scott