A (personal) history of map making Pt. 2: bunnies and big guns

This is part 2 of an ongoing series.

Title screen for Cartooners

Before Stunts (if I recall correctly), I had spent a lot of time on a video game whose very purpose was creating not circuits or maps, but animated movies: Cartooners. The game offered you a handful of backgrounds (from a park to the surface of the Moon), some props and a few animal characters, all pictured above. Each character had a handful of animations, such as walking or jumping, and with some clever “editing” you could make a character dance or kick a ball. See for instance this movie I found on YouTube.

Another “game” I remember from my 486 era was the aptly titled 3D Construction Kit II. This was more of a meta-game, as its purpose was to create 3D video games – first-person adventures. It gave you a set of basic building blocks, but also a gallery of predefined objects, from chairs to houses. You could tie actions to different objects: push this button and a door opens, shoot that coconut and it falls to the ground. Pretty cool stuff, but it required some skill to pull these things off.

Doom level editor

The next in this series wasn’t so nice: the ground-breaking (and bone-breaking) Doom. Sure, there were Wolfenstein 3D, Ultima Underworld and many more before it, but Doom was the one that pushed the new genre to new heights and caught the eyes of gamers (and game developers, as soon there was a flood of so-called “Doom clones”). In contrast with the games I’ve talked about so far, Doom didn’t include a level editor, but due to its popularity people everywhere hacked the hell out of it, giving rise to a list of tools for tweaking or creating the WAD files that contained the game data. Beyond map editors (I can’t remember which one I used, but DCK sounds familiar), there were a lot of total conversions that transformed Doom into a completely different game. Even today there are fan-made games that rely on modified versions of the original Doom engine, such as this cute Sonic platform game.

I wasn’t very competent with FPSs back then, so I took shelter in a map editor and built corridor after corridor, trying (and failing) to get my friends to play my creations. Like 3D Construction Kit, Doom required real skills to produce beautiful – let alone playable – levels. Crap abounded in the sea of user-created levels (not mine, I stayed in my pond 😉), but I guess many good game designers first cut their teeth trying to balance open spaces and hordes of blood-thirsty monsters.


Filed under Computers, Personal

Developing dynamic modules for ns-2

Before it gets lost in the tubes, here’s the documentation for the dynamic modules functionality of ns-2: http://telecom.dei.unipd.it/ns/miracle/ns_dynamic_libraries/.

Using dynamic modules you no longer have to tie your new modules right into the ns-2 source; you can maintain and build them separately. It used to be a patch, but as of version 2.33 it seems to be merged into the main distribution. What follows is a small summary of what you need to do in order to develop dynamic modules with autotools. Although you can build dynamic modules using cmake, plain Makefiles, or even plainer gcc, the documentation is written for autotools, as some nifty macros are provided to ease the job. Since I’ve never developed using autotools (not the only thing I’ve been putting off learning), this seems like a good opportunity.

Say we are developing a module called foo; replace foo with the name of your module in what follows. First, create a src directory for your .cc and .h files, and an m4 directory containing the provided nsallinone.m4 file. The link in the documentation is broken, but you can get this file from a module included in the ns-allinone distribution, under dei80211mr-1.1.4/m4. Create an executable autogen.sh file in the root with the following contents:

#!/bin/sh

aclocal -I m4 --force && libtoolize --force && automake --foreign --add-missing && autoconf

Create a configure.ac file in the root with the following contents:

AC_INIT(foo,1.0)
AM_INIT_AUTOMAKE
AC_PROG_CXX
AC_PROG_MAKE_SET

AC_DISABLE_STATIC
AC_LIBTOOL_WIN32_DLL
AC_PROG_LIBTOOL

AC_PATH_NS_ALLINONE

AC_DEFINE(CPP_NAMESPACE,std)

AC_CONFIG_FILES([
Makefile
src/Makefile
m4/Makefile
])

AC_OUTPUT

Now create a Makefile.am file in the root:

SUBDIRS = src m4
EXTRA_DIST = autogen.sh
ACLOCAL_AMFLAGS = -I m4
DISTCHECK_CONFIGURE_FLAGS = @NS_ALLINONE_DISTCHECK_CONFIGURE_FLAGS@

Create an m4/Makefile.am file for good measure:

EXTRA_DIST = nsallinone.m4

Create a src/Makefile.am file, which will contain the meat of the configuration (remember to use your module name instead of foo):

lib_LTLIBRARIES = libfoo.la

libfoo_la_SOURCES = foo.cc foo.h initlib.cc
libfoo_la_CPPFLAGS = @NS_CPPFLAGS@
libfoo_la_LDFLAGS = @NS_LDFLAGS@
libfoo_la_LIBADD = @NS_LIBADD@

nodist_libfoo_la_SOURCES = embeddedtcl.cc
BUILT_SOURCES = embeddedtcl.cc
CLEANFILES = embeddedtcl.cc

TCL_FILES = foo-init.tcl

embeddedtcl.cc: Makefile $(TCL_FILES)
	cat $(TCL_FILES) | @TCL2CPP@ FooTclCode > embeddedtcl.cc

EXTRA_DIST = $(TCL_FILES)

Here you must set the source files (libfoo_la_SOURCES), compiler and linker flags (libfoo_la_CPPFLAGS, libfoo_la_LDFLAGS) and any additional libraries you need to link (libfoo_la_LIBADD). Both initlib.cc and foo-init.tcl are special files with a specific purpose. initlib.cc contains an initialization function called Foo_Init that will be called when your module is loaded:

#include <tclcl.h>

extern EmbeddedTcl FooTclCode;

extern "C" int Foo_Init() {
    FooTclCode.load();
    return 0;
}

In addition, you can include additional initialization code, written in Tcl, in the foo-init.tcl file. If you don’t, you must take out the lines regarding FooTclCode in the initlib.cc file and the lines regarding embeddedtcl.cc in the src/Makefile.am file. The original documentation contains more details about how all this works.

If everything is OK, calling autogen.sh in the root should generate all the autotools magic, including the familiar configure executable. Now, in order to build and install the module, enter the following commands in the root:

$> ./configure --with-ns-allinone=path/to/ns-allinone-2.34
$> make
$> sudo make install
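
Once installed, the module can be loaded from a simulation script with Tcl’s built-in load command, which is what ends up calling the Foo_Init function above (the exact library path depends on where make install put it):

```tcl
# in your ns-2 Tcl simulation script, before using any foo classes
load libfoo.so
```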

And that’s pretty much it. It took me a few minutes to adapt an existing module to be built dynamically without any previous knowledge of autotools, it’s that easy. Again, the official documentation is much more detailed and gives some insight into how their autotools macros work, but I just wanted to reproduce the essentials here, in case the documentation gets lost again 😉


Filed under Computers

Windicators: a solution in search of a problem?

Ubuntu is the most popular Linux distribution [citation needed], but that doesn’t stop them from trying new things and pushing the boundaries of what upstream provides. They created their own flavor of notifications, they added a dual panel applet for logging out, controlling IM status and tweeting, they implemented (part of) the indicator specification from KDE (which was rejected in Gnome), and so on. The next version is going to be packed with new features, such as more application indicators (like a “Sound & Music” combined indicator), a new launcher for the netbook edition, etc. Furthermore, they are keeping Gnome 2.x as their desktop environment of choice for their customizations, leaving Gnome 3.0 aside for now. They seem to have a vision of what they want their operating system to be, and can’t wait for upstream to adhere to (or even approve of) that vision. This has its pros and cons, which I won’t discuss here.

However, some of these changes often seem to be pushed for change’s sake. Today I saw some mockups of one of the most touted new features to be included in Ubuntu 10.10: windicators, i.e., indicators that sit on top of the window decoration. Now, the idea of letting windows paint their own decorations and do all sorts of stuff there seems somewhat troubling, and could lead to inconsistent, or downright bloated and ugly, user interfaces. Some people argue that these monstrosities will be killed by natural selection. I gave Ubuntu the benefit of the doubt, until I could see what they were pushing these windicators for. These mockups crushed my hopes to bits.

Windicator mockup for tagging photos in F-Spot

A windicator for tagging photos in F-Spot. A windicator for zooming in Gimp. Really Ubuntu, really?

I would love for Ubuntu to prove me wrong and show that windicators can lead to good UI design and usability, but for now I remain more skeptical than before.


Filed under Computers

Freeze not, Ubuntu

No animal is safe from Ubuntu freezes

This one’s a quickie. I’ve been experiencing serious interface freezes in Ubuntu Lucid under high disk IO loads. This was probably more noticeable because I’ve switched from a modest Pentium D machine to a quadrilicious i7-powered one (at work), and the sweet highs and sour lows are easier to spot now.

Anyway, it seems that the disk scheduler is to blame here. But fear not, for there is a simple solution to your grievances. Changing the scheduler from CFQ (Completely Fair Queuing) to Deadline seems to improve the overall responsiveness of the interface, according to some totally non-scientific tests. To try this scheduler without changing anything permanently, use this command:

echo "deadline" | sudo tee /sys/block/sda/queue/scheduler

Change sda to any of the devices (drives) on which you want to test the new scheduler. Try this configuration for a while, and if you’re happy with it you can make it your default disk scheduler by changing your GRUB configuration. How? The answer lies in the blog post where I found this little gem; go ahead and pay your respects. I told you this one was going to be short, and it’s already three paragraphs long…
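
Just for reference, the usual way on Lucid (assuming GRUB 2) is to append elevator=deadline to the kernel command line in /etc/default/grub; the surrounding options below are only illustrative, keep whatever you already have:

```
# /etc/default/grub — add elevator=deadline to the existing options
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash elevator=deadline"
```

Then run sudo update-grub and reboot for the change to stick.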


Filed under Computers

Racing circuits and bloody arenas: a (personal) history of map making

When I was a kid, at the hospital they once asked me what I wanted to be when I grew up. “Mequínico y mecánico,” I answered, which could be translated as “mechonic and mechanic” (typo intended). You could see that I was already on the path to being an engineer of sorts. I loved playing with Lego bricks, building houses, tearing them apart and then rebuilding them again. I also enjoyed building racing circuits with my TCR set, trying to come up with a new track that pushed the limits of the pieces I had available. I wanted to design toys and, since I loved video games, I spent a lot of time designing video games: the story, the settings, the levels, the enemies and so on. Everything just on paper, of course.

So it’s no surprise that one of the things I used to enjoy most about video games was building new levels for them. This is the first of a series of blog posts on the games where I spent more time in the level editor than actually playing.

Stunts (not to be confused with Stunt Car Racer or Stunt Driver) is a racing simulator from the 16-bit era of computers (1990, according to Wikipedia). It featured several famous sports and racing cars, including an Audi Quattro and a handful of Ferraris and Porsches, most of which I knew nothing about, with the exception of the Chevrolet Corvette, which was the car that Face from The A-Team (Fénix in the Spanish version) drove. It was a terrible car in the game, if I recall correctly, by the way. The game had stunning 3D graphics, which to me were a technical breakthrough that rivaled the chants of the monks of La Abadía del Crimen coming from my PC speakers. Stunts tracks were filled with elements such as jumps and loops, and the game was much more focused on performing, well, stunts without crashing than on pure speed. Here’s a video to give you a feeling of what I’m talking about.

Track editor from Stunts

But to me, the one thing that set it apart from other racing games was its track editor (pictured above). Tracks were built using an array of preexisting tiles, including different kinds of curves, elevated terrain and decorative elements. Tracks could fork into several paths that intertwined through the rest of the circuit in the most obtuse ways. You could build anything you had seen in the tracks that came with the game, and more. And, of course, you could lay out the stunt-oriented elements in new and exciting ways. Jump over a building? Check. Jump over a boat on a river? Check. Jump and land on an elevated track that ended in a loop that led into a corkscrew-like road? Check! Why spend your time beating a time trial by a fraction of a second when you could devise evil, trap-ridden tracks that would take their deadly toll in the first turns? Of course, creating an awfully hard circuit was really easy (damn you, blockages and icy roads!), and balance was key if you wanted to create a durable challenge.

From what I gather from the Wikipedia entry on Stunts, there is still a community built around the game, and several user-made tools such as new track editors, 20 years after its launch. Just look at this beauty (track taken from the Unskilled Stunts League and generated using a web track renderer):

Kashiwa track from the Unskilled Stunts League

Yes, the playable track is only a fraction of what can be seen, but oh my. Back in the day I could only share my creations with my brother or my cousin, when they were within hand’s reach. Oh, the tubes…


Filed under Computers, Personal

Going Git

Until now I’ve been pretty content with Subversion. Content, not particularly happy, but not annoyed either. It’s pretty much the standard version control system, so you can find help more easily and expect other people with whom you want to collaborate to know SVN as well, even at a pretty shallow level.

But recently I’ve been tempted to change sides and try Git, the version control system from Linus Torvalds. Some friends of mine are happy Git users or are starting to work their way through it. Since for some personal projects I don’t need to collaborate with other people, and thus don’t need that “safety” that SVN brings, I will try to migrate some things to Git and see how it feels.

Some of the things I’m looking forward to are easy branching and painless merging, which should lead to more bite-sized commits and more experimental branches without fear of merging. The one thing I fear the most is sharing a repository across machines. Even for personal projects, SVN allows me to have a single point with all the history and revisions that I can check out and update from anywhere. Will I be able to do the same with Git?
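
From what I’ve read so far, one common answer (a sketch with illustrative paths, not gospel) is a bare repository: a history-only copy that plays the role the SVN server used to play, which every machine can clone from and push to:

```shell
# start from a clean scratch directory (illustrative path)
rm -rf /tmp/git-demo && mkdir /tmp/git-demo && cd /tmp/git-demo

# a bare repository holds only history, no working files;
# this is the copy you would keep on an always-reachable machine
git init --bare central.git

# machine A: clone the central repository, work, push back
git clone central.git work
cd work
git config user.name "Example" && git config user.email "example@example.com"
echo "hello" > notes.txt
git add notes.txt && git commit -m "first commit"
git push origin HEAD

# machine B: a fresh clone gets the full history
cd .. && git clone central.git elsewhere
```

Over SSH the clone URL would simply be something like user@host:path/to/central.git, and every clone carries the complete history locally.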

Here are a few resources that I’ll be taking a look at:


Filed under Computers

If PulseAudio is a solution in search of a problem…

…I had one such problem. First of all, I don’t mean to criticize PulseAudio. There is plenty of hatred (and praise) for PulseAudio on the web, but I just don’t know enough to have a strong opinion on this.

One criticism I’ve read around the tubes is that PulseAudio is a solution in search of a problem. Well, if that’s the case, the other night I had a “problem” which turned out to be moot thanks to PulseAudio. I wanted to record some guitar tracks from my USB-enabled multi-effects pedal. In the past, on Ubuntu, this had been more miss than hit. This time it was different. The latest Ubuntu release comes with a PulseAudio-powered sound preferences panel. Under the “Input” tab I was able to select my GT-10 (which appeared as a GT-10, not as some sort of obscure USB device) and I was ready to go. It works like magic. There was also the option to use the GT-10 as an output, which is really nice too.

So the next time I come across a PulseAudio developer, in the best Gnome/Ubuntu tradition, I will hug him/her.


Filed under Computers, Personal