Anti-nuisance lawsuit warning: The purpose of these notes is to remind me, Zoegond, of stuff or to help me work stuff out. They may contain mistakes.
Quick
- ($a, $b....) = unpack("A2A7...", $packed)
- push(@array, LIST)
Friday, April 27, 2012
wordpress.com traps
Don't try editing CSS on a wordpress.com blog. It will let you edit and preview it, but you can't save it unless you pay for an upgrade. That's a cheap trick in my opinion.
Friday, April 13, 2012
Ranking window functions
The differences are:
rank() gives two equal firsts a rank of 1, and the next rank given is 3 (mathematician's rank).
dense_rank() does likewise except that the next rank given is 2 (sportsperson's rank).
row_number() gives every row a different rank, even if the values being sorted on are equal.
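A quick Python sketch of the three behaviours (illustrative only, not SQL; assumes the scores are already sorted descending):

```python
# How the three SQL window functions would rank the same column of scores.
scores = [50, 50, 40, 30, 30, 20]  # sorted descending

def rank(values):
    # rank(): ties share a rank; the next rank skips (mathematician's rank)
    return [1 + sum(1 for w in values if w > v) for v in values]

def dense_rank(values):
    # dense_rank(): ties share a rank; the next rank doesn't skip
    distinct = sorted(set(values), reverse=True)
    return [1 + distinct.index(v) for v in values]

def row_number(values):
    # row_number(): every row gets a distinct number, ties broken arbitrarily
    return list(range(1, len(values) + 1))

print(rank(scores))        # [1, 1, 3, 4, 4, 6]
print(dense_rank(scores))  # [1, 1, 2, 3, 3, 4]
print(row_number(scores))  # [1, 2, 3, 4, 5, 6]
```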
Thursday, April 5, 2012
polipo forbidden regex
polipo expects POSIX regular expressions in its forbiddenFile.
POSIX regex syntax has no lookaheads, nor any of the other forms that start with (?. Sadly this means you can't use it to forbid 'all except /regex/', i.e. to allow certain URLs and forbid all others.
Good general regex syntax comparison chart here: http://www.greenend.org.uk/rjk/tech/regexp.html
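To make the limitation concrete, here's a Python sketch. Python's re module is PCRE-flavoured, so it has the negative lookahead that POSIX lacks; the example.com allow-list is made up for illustration:

```python
import re

# PCRE-style: forbid everything EXCEPT example.com, via a negative
# lookahead at the start. POSIX syntax cannot express this at all.
forbid = re.compile(r'^(?!https?://([a-z0-9.-]+\.)?example\.com/)')

print(bool(forbid.match('http://evil.test/page')))  # True  - forbidden
print(bool(forbid.match('http://example.com/ok')))  # False - allowed

# Without lookaheads the workaround is to write the positive pattern and
# invert the result in the application - which polipo's forbiddenFile
# gives you no way to do.
allow = re.compile(r'^https?://([a-z0-9.-]+\.)?example\.com/')

def posix_style_forbid(url):
    return not allow.match(url)
```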
netcat
The difference between 'listen' and 'listen harder' is that nc -l will exit once any connection that has been made to it closes. nc -L will continue to listen after that, accepting other connections.
If you do
nc server port < file
the sending nc won't close after the file has been sent. It will only close if you give it a timeout:
nc -w 3 server port < file
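The one-shot vs keep-listening distinction, sketched with Python sockets (a loose analogy, not what nc actually does internally; the -w timeout's analogue on the client side would be a socket timeout):

```python
import socket, threading

def serve(keep_listening):
    # Listen on an ephemeral localhost port, like nc -l / nc -L.
    srv = socket.socket()
    srv.bind(('127.0.0.1', 0))
    srv.listen()
    port = srv.getsockname()[1]

    def loop():
        while True:
            conn, _ = srv.accept()
            conn.close()
            if not keep_listening:   # nc -l: exit after the first connection
                srv.close()
                break
                                     # nc -L: loop round and accept again
    threading.Thread(target=loop, daemon=True).start()
    return port

port = serve(keep_listening=True)    # nc -L behaviour
for _ in range(3):                   # repeated connections all succeed
    socket.create_connection(('127.0.0.1', port), timeout=2).close()
```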
Wednesday, April 4, 2012
Tunnelling to polipo
This is the usual situation: I'm tunnelling from 127.0.0.2:anyport at home, through an SSH server on gateway.dugeenswork.co.uk, to remotepc.dugeenswork.co.uk:desiredport, where something running on remotepc (like an SSH server or Remote Desktop) is listening on desiredport.
What I can't do is get this to work when Polipo is running on remotepc and listening on 8123. It'll happily serve requests coming from the session on remotepc, but it refuses all connections coming through the tunnel.
Oddly, when I set netcat to run on remotepc and listen on 8123, with
nc -L -p 8123
it happily communicated through the tunnel. Using the -v option showed me that requests coming through that way showed as coming from gateway's IP address, which was educational as up to now I thought they showed as coming from localhost.
So I thought putting gateway's IP address in allowedClients would fix this - it didn't.
Btw polipo won't read a config file (other than the default in /etc) unless you tell it to with -c.
While I'm on, the polipo local web server doesn't work through the tunnel either. It makes Firefox go into a spin loop allocating memory. nc reported '127.0.0.2: inverse host lookup failed: h_errno 11004: NO_DATA' when tried on this one.
Aha, I've worked out where I was going wrong. It isn't enough to set allowedClients; you also have to set proxyAddress (not proxyPort) to 0.0.0.0, so that polipo binds on all interfaces and accepts external connections - like this one from the gateway.
This makes the proxy happily deal with requests from the tunnel, and the local web server works too.
Not really happy about that IP-based setup though, I might try the chain tunnelling method with a SSH server on remotepc, so that polipo thinks the requests are coming from remotepc (ie localhost hopefully).
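For the record, the working setup boils down to a config along these lines (a sketch only - option names as I understand polipo's manual, and the gateway IP is a placeholder to fill in):

```
# Bind on all interfaces, not just 127.0.0.1
proxyAddress = "0.0.0.0"
proxyPort = 8123
# localhost plus the address the tunnelled requests arrive from
allowedClients = 127.0.0.1, the-gateway-ip
```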
Tuesday, April 3, 2012
Reindexing dokuwikistick
Suppose you want to take the pages from a DokuWiki wiki and view them using DokuwikiStick.
Firstly, the default admin password for dokuwiki on a stick appears to be 'admin'. It says that in the documentation but I didn't read it.
Secondly, if all you do is copy the files from the pages directory of the original wiki into the DWS one, you can access them only by typing their names as part of the DokuWiki URL, e.g. 'http://localhost:8800/thepage'.
If you want to use them with the search function in the usual way, you have to rebuild the index. In theory you can do this with the searchindex plugin - in practice it doesn't work and you probably won't even be able to download it.
So it has to be done manually by running bin/indexer.php (not to be confused with lib/exe/indexer.php).
DWS does not come with a command-line version of PHP (it'd be nice if you could get MicroApache to run this script), so if you haven't got PHP you'll have to download it from php.net. At least it doesn't require installation: just unzip the downloaded file into a directory of your choice and run the script with it:
the-php-directory\php the-dokuwiki-path\bin\indexer.php
It reassuringly reports the page files as it indexes them, and when it's finished your DWS will finally recognise the existence of the pages.
DWS is a handy app to have but they really do need to sort out this reindexing thing, it shouldn't take hours of poking round the net and a 14M PHP download to get it done.