This is a quick note to myself, but perhaps others tortured with JavaScript will appreciate it. Modern browsers let you make HTTP requests from JavaScript via the XMLHttpRequest object. How you get hold of that object is somewhat browser-dependent.
Once you have that object, your Mozilla browser may restrict which URLs you can fetch. In particular, you can only open() URLs on the same host as the page making the request. So a page on geocities.yahoo.com isn't going to open "http://google.com"; instead open() throws an exception, which is reported in the JS console.
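Roughly speaking, the browser compares the origin of the requesting page against the origin of the target URL before letting open() proceed. The helper below is a hypothetical sketch of that comparison, not Mozilla's actual implementation; the real same-origin policy compares scheme, host, and port.

```javascript
// Hypothetical sketch of the same-origin check a browser applies before
// letting XMLHttpRequest open() a URL. Real browsers compare the
// scheme, host, and port of the page against those of the target URL.
function originOf(url) {
  // crude parse: "scheme://host[:port]/..." -> "scheme://host:port"
  var m = url.match(/^(https?):\/\/([^\/:]+)(?::(\d+))?/);
  if (!m) return null;
  var port = m[3] || (m[1] == 'https' ? '443' : '80');
  return m[1] + '://' + m[2] + ':' + port;
}

function sameOrigin(pageUrl, targetUrl) {
  var a = originOf(pageUrl);
  var b = originOf(targetUrl);
  return a != null && a == b;
}
```

So sameOrigin("http://geocities.yahoo.com/page.html", "http://google.com/") is false, which is why that open() call fails with an exception rather than a network error.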
Code hounds will enjoy this mess, which reports the content of a page as a JS alert. The getURL() function expects to be passed a URL and uses that crazy JS conditional-compilation voodoo that makes IE happy.
function getURL(u) {
  // from http://jibbering.com/2002/4/httprequest.html
  var xmlhttp = false;
  /*@cc_on @*/
  /*@if (@_jscript_version >= 5)
  // JScript gives us Conditional compilation, we can cope with old IE versions.
  // and security blocked creation of the objects.
  try {
    xmlhttp = new ActiveXObject("Msxml2.XMLHTTP");
  } catch (e) {
    try {
      xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    } catch (e) {
      xmlhttp = false;
    }
  }
  @end @*/
  if (!xmlhttp && typeof XMLHttpRequest != 'undefined') {
    xmlhttp = new XMLHttpRequest();
  }
  alert("Opening " + u);
  try {
    xmlhttp.open("GET", u, true);
  } catch (e) {
    alert("Oops " + e);
    return false;
  }
  // prepare the callback (weird to do this after open)
  xmlhttp.onreadystatechange = function() {
    // page load done
    if (xmlhttp.readyState == 4) {
      alert("Got: " + xmlhttp.responseText);
    }
  };
  xmlhttp.send(null);
}