Thursday, October 7, 2010

Sandboxing Skype

I am in the unfortunate position of having read Silver Needle in the Skype up to PDF page 21, where a successful heap overflow attack makes Skype drop to a shell. After reading that, I decided that I either have to stop using Skype or sandbox it. For about a year now I have been doing the latter, with some bumps in the road as new versions of Skype needed more files and more sandboxing. Still, it has worked for me most of the time, and it is time to share the gory details with the world.

The sandbox script can be found at, where you can also edit it.

You need to have a PulseAudio server running and listening for TCP traffic so that sound can break out of the sandbox. Verify that you have
load-module module-native-protocol-tcp
in your PulseAudio config. Please review the variable settings (DIR, XAUTHORITY, PULSECOOKIE) to ensure they fit your setup. The sandbox is created for the user invoking the script; this user must have sudo capabilities to call it. Please note that I have not reviewed the script with respect to security, so do not expose it to untrusted users.
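If the module line is missing, you can add it to PulseAudio's default.pa yourself. The auth-ip-acl restriction shown below is my own suggestion, not something the sandbox script requires; it limits TCP access to the local machine:

```
load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1
```

After editing the file, restart PulseAudio and check that the module is loaded with `pactl list short modules | grep native-protocol-tcp`.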

Happy sandboxing.

(Thanks to geheimdienst@#haskell-blah for pointing me to

Monday, March 8, 2010

Using Wave to collaborate on LaTeX documents

This blog post is a copy of a much nicer PDF version.

I found that Wave can be used as a collaborative editor for LaTeX documents by using a modified export bot and a simple shell script. The exporty2 bot (living at allows you to access a wave using access tokens instead of cookie-based logins, making it easier to fetch those waves using curl. The exporty2 bot also allows you to export only the text of a wave using the "raw" template.

The raw URL given by the exporty2 bot is pasted into the shell script below. It serves as the master wave, as it is the document that is latex-ed. It can additionally list other resources using the special %FETCHURL tag. These resources are pairs of a local name and a URL. curl fetches these resources and stores them under their local names before the shell script runs the usual combination of latex/bibtex/latex. I usually put those %FETCHURL tags below \documentclass.


#!/bin/sh
TITLE=document    # base name for the master wave (my placeholder; pick your own)
DELIMITER=','     # stands in for spaces so %FETCHURL lines survive word splitting

curl -o $TITLE.tex "!w%252BlRomqcgHA&accesstoken=c64e13d340009da7&template=raw"

# Each %FETCHURL line has the form "%FETCHURL <local name> <URL>".
FETCHURL=$(grep %FETCHURL $TITLE.tex | sed -e "s/ /$DELIMITER/g")
for i in $FETCHURL
do
    FILENAME=$(echo $i | awk -F$DELIMITER '{ print $2 }')
    URL=$(echo $i | awk -F$DELIMITER '{ print $3 }')
    curl -o "$FILENAME" "$URL"
done

latex $TITLE.tex
bibtex $TITLE
latex $TITLE.tex
dvipdf $TITLE.dvi
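To see how the grep/sed/awk pipeline splits a %FETCHURL line into a local name and a URL, here is a self-contained demonstration; the file name and URL are made up for illustration:

```shell
#!/bin/sh
# Demonstrate the field splitting used by the script on one made-up %FETCHURL line.
DELIMITER=','
line='%FETCHURL refs.bib http://example.org/refs.bib'

# Replace spaces so the whole line survives shell word splitting,
# then pull out the local name (field 2) and the URL (field 3).
i=$(echo "$line" | sed -e "s/ /$DELIMITER/g")
FILENAME=$(echo "$i" | awk -F"$DELIMITER" '{ print $2 }')
URL=$(echo "$i" | awk -F"$DELIMITER" '{ print $3 }')
echo "$FILENAME <- $URL"
# prints: refs.bib <- http://example.org/refs.bib
```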

The nice thing about the exporty bot is that it ignores all blips except the root blip. This means you can add replies to the LaTeX document and have inline discussions that are automatically filtered out of the export. The exporty bot also ignores the first line of the wave, so you can name the wave properly.

The PDF version of this blog post contains examples of how to use external graphics and bibliography waves. You can also get the LaTeX source of the PDF.

Sunday, February 28, 2010

A working wave export bot

The original Wave exporty bot doesn't work for me. Wave uses or as user identifiers, while App Engine, the thing that powers bots, gives the bot the regular Google account identifiers (their e-mail addresses). If you don't happen to use with -- for which the exporty bot contains an ugly hack -- exporty won't work for you: the participant check fails and exporty denies access.

That is why I rewrote exporty to rely on access tokens rather than user IDs. It is pretty simple: when added to a wave, the bot writes the export link into the wave as the original bot does, but also appends accessToken=<a randomly generated string> to it. Everyone observing the link in the wave, usually the participants, can click on it and get the export without being bothered by the Google account sign-in. The participant check is replaced by an access token check. I found this version to be much more useful, as the URL provided by this bot can be used in scripts more easily.
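The post does not show how the token is generated; as a sketch (not the bot's actual code, which runs on App Engine), a random string of the same shape as the one in the export URL can be produced like this:

```shell
#!/bin/sh
# Generate a 16-hex-character random access token, similar in shape to the
# accesstoken seen in the export URL above.
TOKEN=$(head -c 8 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "$TOKEN"
```

Checking such a token is then a plain string comparison against the value stored with the wave, with no user identities involved.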

I also added a feature that lets you export your wave raw, without any HTML or XML annotations. Try it; it hosts its own source:

Thursday, July 30, 2009

Static Analysis of Imperative Languages

Last year, I got the assignment to work through The Calculus of Computation by Bradley and Manna. The result is, in my humble opinion, a neat introduction to static reasoning about imperative languages. If you are new to the topic but know a bit about predicate logic, it might be a short and hopefully pleasant read for you.

Heuristics Methods for Inductive Invariant Generation in Pi explains the basics, such as weakest precondition and strongest postcondition, why it is important to have inductive assertions for loops, why they are hard to find (hard as in undecidably hard), and further what heuristics we can apply in simple standard cases to generate inductive assertions automatically.
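As a tiny illustration of the weakest-precondition calculus covered there (these are the standard textbook definitions, not an excerpt from my document):

```latex
% Weakest precondition of an assignment: substitute e for x in P.
%   wp(x := e, P) = P[e/x]
% Example: to guarantee x > 0 after x := x + 1, we need x + 1 > 0
% beforehand, i.e. x >= 0 over the integers.
\[
  \mathrm{wp}(x := e,\; P) \;=\; P[e/x]
\]
\[
  \mathrm{wp}(x := x+1,\; x > 0) \;=\; (x+1 > 0) \;\equiv\; (x \ge 0)
\]
```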

If you enjoy it (or hate it), leave me a comment. I have finally found the option that notifies me when new comments come in, so now I have a chance to reply.

Wednesday, March 25, 2009

Patching for broken SD card readers

I had the pleasure of buying a broken SD card reader. It reports 2 GB cards as 1 GB cards. Here is a fix that lets you force capacities onto the SCSI disk subsystem of the Linux kernel: scsi-capacity-setter.patch. The corresponding user space program is here: capacity.c. You might want to use it in conjunction with some udev rules.
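For illustration, such a udev rule could look like the sketch below. Everything in it is an assumption on my part: the match keys depend on your particular reader, and the command line depends on how capacity.c is built and invoked, which is not spelled out here.

```
# /etc/udev/rules.d/90-sd-capacity.rules (hypothetical sketch)
# Match the broken reader's block device and force the real capacity;
# adjust ID_MODEL and the tool's arguments to your setup.
KERNEL=="sd?", SUBSYSTEM=="block", ENV{ID_MODEL}=="SD_Reader", \
  RUN+="/usr/local/bin/capacity /dev/%k 2048"
```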

Friday, January 9, 2009

Liskell standalone

Some time has passed since I last blogged about Liskell. It is not dead, nor have I changed my mind that Haskell needs a proper meta-programming facility, not to mention a better syntax.

Liskell was a branch of GHC once. Now it sits on top of the GHC API, or rather sneaks behind its back: it creates its own API, as the original one is not suitable for the stunts I'm interested in. If Liskell sticks with GHC as its soil, I will definitely send patches upstream to refine the GHC API in the areas where it needs more flexibility for Liskell. For the moment, however, my main target was to get something out that compiles with a stable version of GHC.

You can grab it with the usual
darcs get

This version has been tested with GHC 6.10.1 and should install like this:
./Setup.lhs configure
./Setup.lhs build
./Setup.lhs install
cd LskPrelude
make install-inplace
Optionally you can run make tests in the testsuite subdirectory. Thanks to for darcs hosting!

Monday, December 8, 2008

LUKS on-disk-format revision 1.1.1

Today, I published a new minor revision of the LUKS on-disk-format specification. It contains clarifications with respect to IV/tweak reference points. Thanks to Michael Gorven for the suggestion.

It is available at the new home of cryptsetup/LUKS at Google Code.