I started to think about using neko web technology, but since I have shared hosting, it was not obvious how this could be done. Currently I’m with site5.com, which is very cheap for running multiple websites, but since it is shared hosting, you don’t get to install anything. However, it does have a few features that made getting a neko site up and running quite possible.
The key features are:
- Shell access – not really required if you can copy files to the site (e.g. via ftp), but very useful for debugging and getting things going. The shell I have is “jailshell”, which I think prevents directory listings outside your home directory, but otherwise is pretty functional (based on bash).
- gcc access – again, not really required once things work, but as you will see, pretty much required if things go wrong. Also good if you want to compile a C++ target!
- CGI access. Since we can’t modify the apache installation, the only way we can get our code to “run” is via an external process – this is what CGI is for. I will talk a bit about “fast-cgi” later (once I get it going).
First thing is to check you have CGI access. When I first set up the site, I had nothing but an empty “cgi-bin” directory. To test this, create a file “test.cgi” in there containing:
    #!/bin/sh
    echo "Content-type: text/plain"
    echo
    echo "Hello from CGI!"
    set
Now to enter this code, I used an old-school remote ssh shell (using putty) and vi. You may choose to ftp it up using filezilla or similar. You will also need to add executable permission (“chmod a+x test.cgi” from the shell – not sure how to do this via ftp). You can then test it with yoursite.com/cgi-bin/test.cgi. With any luck, you should see the expected greeting, plus the “set” command should dump all the environment variables available to your application.
If you get a “500 – server error” at this stage, it must be fixed. The error is spectacularly unhelpful – I’m not sure where to find the additional error info. Start by trying to run the file from the command line, i.e. type “~/www/cgi-bin/test.cgi” (assuming this is where your script is located). You should see the output, or perhaps a better error message. Also check for “execute” permission for “all”, as the apache server will run this script with limited privileges. Finally, make sure you have output a “Content-type” line followed by a blank line.
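For example, assuming the script lives at ~/www/cgi-bin/test.cgi (adjust the path to match your host), the shell checks look something like this:

    # give everyone (including the apache user) permission to run it
    chmod a+x ~/www/cgi-bin/test.cgi
    # run it by hand - you should see the Content-type line, a blank line and the greeting
    ~/www/cgi-bin/test.cgi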
Ok, now we have cgi working! Next step is neko – and haxe too since I will be doing some compiling on the server to help with testing. Haxe is not strictly required if you are deploying pre-compiled solutions.
The hard way
As I said before, you can’t “install” anything on the shared host (no package managers), so it all has to go in your home directory. First thing I did was to download the linux binary distro from nekovm.org. This is easy with the magic “wget” shell command. With your desktop browser, go to the download page and find the link – right click and “copy” the link address. Then go to the shell (putty) window and paste the link in, so you get something like “wget http://nekovm.org/_media/neko-1.8.1-linux.tar.gz?id=download” – hey presto, a gzipped tar file (it may have a funny name – that’s ok, use “tab” to tab-complete the filename and save typing). Make a suitable directory and “tar xvzf” the file to extract the neko files. Now go to the directory and try to run neko (i.e. “./neko”).
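Putting those steps together, something like the following should do it – this assumes the archive unpacks into a “neko-1.8.1-linux” directory under your home directory, so adjust to taste:

    cd ~
    wget "http://nekovm.org/_media/neko-1.8.1-linux.tar.gz?id=download"
    # rename the oddly-named download if necessary, then unpack it
    mv "neko-1.8.1-linux.tar.gz?id=download" neko-1.8.1-linux.tar.gz
    tar xvzf neko-1.8.1-linux.tar.gz
    cd neko-1.8.1-linux
    ./neko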
You will probably get an error like “libneko.so not found”. But it’s right there, wtf? So you need to set your LD_LIBRARY_PATH (“export LD_LIBRARY_PATH=~/dir/neko-1.8.1-linux”).
Ok, now you get libgc.so.1 is missing, which indeed it is. The easiest way I found to fix this was to use wget to download the source from “http://www.hpl.hp.com/personal/Hans_Boehm/gc/gc_source/gc.tar.gz”, unpack it, “./configure” it and “make” it. You end up with the required libgc.so.1 file in the “.libs” directory, which I then copied to sit next to the neko executable. And now “./neko” works – apparently. I will save you the suspense – you also need to do the same thing with “libpcre” from “http://sourceforge.net/projects/pcre/files/pcre/7.9/pcre-7.9.tar.gz/download” – I used the 7.9 version, not sure if 8.0 works. This is required for haxelib later.
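The build steps for both libraries are much the same – roughly the following, assuming neko was unpacked into ~/neko-1.8.1-linux (the unpacked directory names will vary with the exact versions):

    # boehm gc
    wget http://www.hpl.hp.com/personal/Hans_Boehm/gc/gc_source/gc.tar.gz
    tar xvzf gc.tar.gz
    cd gc-*                        # directory name depends on the gc version
    ./configure && make
    cp .libs/libgc.so.1 ~/neko-1.8.1-linux/
    cd ~

    # pcre (needed for haxelib later)
    wget http://sourceforge.net/projects/pcre/files/pcre/7.9/pcre-7.9.tar.gz/download
    mv download pcre-7.9.tar.gz    # rename it if wget saved it under a generic name
    tar xvzf pcre-7.9.tar.gz
    cd pcre-7.9
    ./configure && make
    cp .libs/libpcre.so* ~/neko-1.8.1-linux/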
See, I told you that compiler access would come in handy.
Ok, neko done, time for haxe. Again the installer is not much use, so I downloaded the binaries from “http://haxe.org/file/haxe-2.04-linux.tar.gz”, however when I went to run this I found the “tls” library required a GCC 2.4 runtime, which I did not have and could not upgrade. So – you guessed it, linux fans – compile from source. One small hump to get over first: haxe requires ocaml to compile. Of course, ocaml is not installed, but if you are still with me at this stage you know the answer – compile from source. So “wget” it, and here is the trick – make an ocaml directory in your home directory (or somewhere under it), extract the source and use “./configure -prefix your_ocaml_dir” – this provides the “install” directory, since ocaml can’t be used without installing it. Then the make is 3-phase: “make world opt install”, and now you should have an ocaml install. You will need to put this in your executable path before you can think about compiling haxe.
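The ocaml build went roughly like this – the 3.11.2 version number is just an example, so substitute whatever source tarball you actually downloaded:

    # assuming you have already wget-ed an ocaml source tarball into your home directory
    mkdir ~/ocaml
    tar xvzf ocaml-3.11.2.tar.gz
    cd ocaml-3.11.2
    ./configure -prefix ~/ocaml
    make world opt install
    # make the new ocaml visible before building haxe
    export PATH=$PATH:~/ocaml/bin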
The online doco suggests that you download and run “install.ml”. I tried this, but the cvs timed out. So I ran it on my windows box (which already had ocaml installed!), tarred up the result and ftp-ed it over to my site. Painful, but it worked. One thing to note is that this uses the cvs “head” – does anyone know where to get the 2.0.4 source tar-ball? Once I had the source, I commented out the “download” call in install.ml and ran it with “ocaml”. And haxe was built. The haxe distro has a “tools” directory under it, and you can build “haxelib” there if you have neko set up correctly.
Getting the paths right is a bit tricky, so I decided to simplify things. I made a directory “haxeneko” in my home directory and “cp -r”-ed the files from the neko distro (including the new gc and pcre libraries) into this new directory. I also copied the built “bin/haxe” executable in there, and haxelib too (once it was built). Finally, I copied (“-r”) the “haxe/std” files from the haxe distro into this directory too. Now I have everything required in the one spot – and you can too!
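In shell terms it was something like the following – the source locations are whatever you used above (here I am assuming ~/neko-1.8.1-linux for neko and ~/haxe for the haxe source tree):

    mkdir ~/haxeneko
    cp -r ~/neko-1.8.1-linux/* ~/haxeneko     # neko, libneko.so, plus the new libgc and libpcre
    cp ~/haxe/bin/haxe ~/haxeneko             # the freshly built compiler
    cp ~/haxe/bin/haxelib ~/haxeneko          # once you have built it
    cp -r ~/haxe/std ~/haxeneko/std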
The easy way
I have saved you the pain, and you can simply download the files from haxeneko-1.0.tgz. So you should be able to “wget” this, untar it and be almost ready. You may run into problems if there is some incompatible library somewhere – in which case, back to the hard way for you!
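So the easy way is roughly this – the URL below is a placeholder, use the actual link to haxeneko-1.0.tgz from this post:

    cd ~
    wget http://yourmirror.example.com/haxeneko-1.0.tgz
    tar xvzf haxeneko-1.0.tgz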
Finally, we need to set up the paths. Because my hosting provides the “bash” shell, this setup goes in ~/.bashrc. The required “install” is:
    export HAXENEKO=~/haxeneko
    export LD_LIBRARY_PATH=$HAXENEKO
    export PATH=$PATH:$HAXENEKO
    export NEKOPATH=$PATH:$HAXENEKO
    export HAXE_LIBRARY_PATH=$HAXENEKO/std
You may need to log in again for this to take effect (or you could paste it directly into your command-line), but now you should be ready to compile some code!
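For instance, to pick up the new settings in your current shell without logging out and back in:

    source ~/.bashrc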
Start by creating your site-code in a directory that is not under your www (public_html) folder – I have called mine “site”. And here is a simple example haxe file:
    class Site {

        public function out(inString:String) {
            neko.Lib.print(inString);
        }

        public function new() {
            out("Content-type: text/plain\n\n");
            out("Hello World!\n");
            out("Page : " + neko.Sys.getEnv("REQUEST_URI") + "\n");
        }

        static public function main() {
            return new Site();
        }
    }
which can now be compiled with “haxe -main Site -neko Site.n”, and tested with “neko Site.n” to give:
    Content-type: text/plain

    Hello World!
    Page : null
Alright – I think you can see where I’m going here, but we are not quite there yet. The problem is that the setup variables in the .bashrc file are not used by the apache server. Apparently you can use “SetEnv” in a .htaccess file to get this to work, but I could not get it to work (maybe the module was not enabled). But all is not lost. You can simply use a script to launch neko. Back in the cgi-bin directory, you can replace the “test.cgi” script with a “Site.cgi” script containing:
    #!/bin/sh
    export HAXENEKO=~/haxeneko
    export LD_LIBRARY_PATH=$HAXENEKO
    export PATH=$PATH:$HAXENEKO
    export NEKOPATH=$HAXENEKO
    cd ../../site
    neko Site.n
Now point your browser at http://yoursite.com/cgi-bin/Site.cgi, and you should see the glorious neko output:
    Hello World!
    Page : /cgi-bin/Site.cgi
Now, creating a bunch of cgi files is painful, and you do not want users to see this kind of implementation detail, so we use one more trick – the almighty “mod_rewrite”.
In your base “public_html” (www) directory, create a file called “.htaccess”, and add the following lines:
    RewriteEngine on
    RewriteRule \.(css|jpe?g|gif|png)$ - [L]
    RewriteRule ^(.*)?$ cgi-bin/Site.cgi [L]
This leaves the css and image files in the www directory, but redirects all other URLs to your neko script, where they show up in the REQUEST_URI environment variable. So now if you use the URL “http://yoursite.com/some_dir/file.html?param=abc&other=xyz”, you get the output:
    Hello World!
    Page : /some_dir/file.html?param=abc&other=xyz
Now the world (wide web) is your oyster – you can parse the URL any way you like, and generate any output you like.
This certainly gets you up and running with neko on a shared-hosting web server. One problem is that two processes are created for every request. I have done some initial work with the “fast-cgi” interface, and I think I should be able to get this going, in which case there should be a big boost in efficiency.
There should also be no reason why you could not compile the site to a C++ native executable. However, this may reduce your ability to use the neko “.n” template system.
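I have not tried that on this host, but as a sketch it would be something along these lines (assuming haxelib is working there, so the hxcpp backend can be installed):

    haxelib install hxcpp          # the c++ backend
    haxe -main Site -cpp cpp_out   # generates and compiles c++ into the cpp_out directory
    ./cpp_out/Site                 # the resulting native executable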
Works 🙂 .. many thanks for this tutorial!
I noticed a little bug in your bash scripting. It’s nothing major and shouldn’t break anything, but it is wrong. Here’s the offending bit:
export PATH=$PATH:$HAXENEKO
export NEKOPATH=$PATH:$HAXENEKO
The first command works fine and appends HAXENEKO to the PATH, just as you expect. It’s the second command that’s borked. When $PATH is expanded there, it’s the new definition (from the previous command) that’s used, so NEKOPATH ends up with two copies of HAXENEKO, rather than the one I assume you intended.
As I said, it’s pretty minor and unlikely to break anything. However, similar snafus can cause hard-to-track-down bugs, so I figured I’d mention it.
Yes – looks like a bit of redundancy there. Must be some cut & paste gone wrong.
Hi, I have the 500 server error,
I don’t know how to use putty or vi,
could I upload scripts instead?
I’m quite new to this
OK, I used the cron-job interface in cpanel to run the test.cgi and it worked, but it didn’t work from the browser. I also set the permissions to allow execution.
Will it work from Neko even though the browser can’t run it?
ok, never mind – my browser was giving me the cached page instead of updating it. It works now.
I can’t seem to do the ‘export’ command –
it doesn’t work with cron, and I put it in the bashrc file but I still get the .so not found error
ok, the export command works when it is in the cgi script,
also, the cron-job command ~/www/cgi-bin/Site.cgi
says: “hello world page: null”
I also pasted the rewrite rules into the htaccess file
but the browser gets a 500 error for every domain I try to navigate to
Hi keith,
Not too sure about using cron to start your fcgi server – some shared hosting sites do not like you running jobs full-time in the background.
If you are having a hard time editing files on your site, you can edit them locally (e.g. using wordpad) and use “filezilla” to drag them onto your host (assuming you have ftp access).
Hugh
Hi
I’m not trying to do fcgi yet.
When I run Site.cgi on the command line it works, but when I navigate to http://www.turnbased.in/cgi-bin/Site.cgi it gives a 500 error.
I also get the same error when I invoke the rewrite rules.
thanks for helping
Hi,
It is important to output the “Content-type …” line, or else you will get a 500 error.
Also, your control panel may have a “Stats & Web Logs” section, where you can view the “Error Logs”. This may have some useful information in it.
Hugh
ok, it’s solved – the command line could see the Site.n file, but when accessed through the browser it didn’t look in the same directory.
So I just copied the .n file to the haxeneko folder and the cgi-bin folder and it worked!
I will do some experiments later to see which folder worked.
thanks for all the help and info
basically the command line looks at the root directory, and the cgi script in a browser looks in the cgi-bin folder
have you tried to do any remoting?
I tried the haxechat program:
http://lists.motion-twin.com/pipermail/haxe/2010-January/032987.html
I also tried another example program:
http://lists.motion-twin.com/pipermail/haxe/2010-January/032985.html
I’d like to know more about the remoting problems, too.
Do you know of any way with htaccess to disable someone from using your domain to point to their own website on the same server? Ex: they use YOURDOMAIN.com to promote their PHISHING WEBSITE.COM by using this simple URL to send users: YOURDOMAIN.COM/~phishing/file.html
Any help would be greatly appreciated. Thanks