Thursday, January 23, 2014

Roguelike Rant

I remember when the term "roguelike" got me excited for a game, but lately it's everywhere in indie games, and usually seems to be a marketing term for "I just didn't want to create levels myself".

Rogue was a fantastic game. But one reason the randomness was so awesome is that I only owned three games at the time. I now have over 300 games in my Steam library and haven't even played half of them, so what's my motivation to keep playing random levels in the latest "roguelike" offering? In some games I'd fail and then try again to overcome an obstacle, but now the obstacle is gone when the level is regenerated and I have to start from square one. In Rogue there was an objective to aim for at the end of the dungeon, but in many new games it's just one random level after another. And when Rogue was written multiplayer games were rare; nowadays, if your game doesn't have multiplayer I can only play it when no one else is around, which is practically never.

That's not to say I don't enjoy some of the newer games I've played, but they don't hold my attention very long. And none of them are better just because they have "procedurally generated levels"; in fact most of them are only good enough that I want to beat them once and move on with my life, which unfortunately is not practical with purely random levels.

Rogue is one of my all-time favorite games. If you're going to use it as inspiration, you need to build on the concept, not use it as an excuse.

As an example, take Diablo. It's roguelike in many ways, but it built on the concept to make an interesting new take on the genre. They built a full, satisfying game, then used randomization to make it more replayable. When you hit an obstacle and fail you can return to try it again, because even though the world is random it's also persistent for the session. They also brought in roguelike concepts like permanent death, but made it optional, because sometimes replaying the first three levels over and over gets tiresome. And multiplayer, which is the reason I still replay the original Diablo from time to time today. By leveraging some of the best elements from Rogue and then integrating more modern features like multiplayer, Diablo did something so compelling it spawned a whole new genre of "action RPG".

What are the new "roguelike" games bringing to the table? Too often I think the only thing they're spawning is another level that looks just like the one before.

Friday, February 3, 2012

MySQL vs NULL... Fight!

I'm sure there's a perfectly reasonable explanation for all of this, but if there is I have no idea what it is.

If you're using MySQL, you're better off not using NULL values at all. Save yourself the confusion.

The documentation often points out that newcomers can be confused because an empty string is not the same as NULL: you can set a NULLable field equal to "" and it still isn't NULL. To test for NULL you have to use IS NULL or IS NOT NULL. Also, math doesn't work with NULL: 1+NULL = NULL. Okay, once you understand what NULL is, a non-value, that makes sense.
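
A quick way to see this for yourself at the mysql prompt (just throwaway SELECTs, no table needed):

SELECT '' IS NULL;    -- 0: an empty string is not NULL
SELECT NULL IS NULL;  -- 1
SELECT 1 + NULL;      -- NULL: math with NULL gives NULL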

What they don't tell you is that NULL is a sneaky bitch. If you test for ANY other condition on that field, rows where it is NULL won't be returned.

Some examples to show what I'm talking about:

SELECT * FROM table;
You get all rows, of course.

SELECT * FROM table WHERE field IS NULL;
You get all rows where the field is set to NULL, naturally.

SELECT * FROM table WHERE field = 1;
Obviously you get the rows where field is equal to 1.

SELECT * FROM table WHERE field != 1;
Of course you get all rows where field does not equal 1... or do you? No! You don't. You get all rows where field does not equal 1 AND where the field IS NOT NULL.

And it doesn't matter how you check values.

SELECT * FROM table WHERE field NOT IN (1, 3, 7);
You'd expect every row where field isn't 1, 3, or 7, but once again the NULL rows are silently left out. Lovers of logic sob into their pillows at night, while NULL gets lost in the dark and forbidden streets of your data, probably meeting undesirables and becoming addicted to drugs, perhaps even prostituting for its next fix. It's true. No field value should be forced into a situation like that.

If you want to return results including NULLs, you have to add it to the query:
SELECT * FROM table WHERE field <> 1 OR field IS NULL;
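
If you want to see the whole thing in action, here's a minimal demonstration (the table and column names are just examples):

CREATE TABLE t (field INT NULL);
INSERT INTO t (field) VALUES (1), (2), (NULL);

SELECT * FROM t WHERE field != 1;                  -- only the row with 2
SELECT * FROM t WHERE field != 1 OR field IS NULL; -- the rows with 2 and NULL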

This is true of MySQL 5.0 and 5.1; I don't know if other databases are the same way.

Perhaps this logic is meant to be implied when the MySQL documentation states that "In SQL, the NULL value is never true in comparison to any other value, even NULL." 1 = NULL is not true, 1 != NULL is not true, and even NULL = NULL is not true (each comparison evaluates to NULL rather than to a boolean). Personally I think such a significant effect on query results should have been noted much more clearly than that. They note that you have to check for NULL using different syntax, but they don't say that if you're checking for anything else you also have to give NULL special treatment if you want to avoid needing Rogaine later.

The moral of this story? Set your fields to not allow NULL. You'll be annoyed that you're forced to represent a non-value with some placeholder like 0, but at least you'll never wonder why your database is acting like a smart ass.
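
In practice that just means defining the column as NOT NULL with whatever placeholder fits your data, something along these lines (a sketch; t and field are example names):

CREATE TABLE t (
  field INT NOT NULL DEFAULT 0
);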

Sunday, July 31, 2011

More about find

Recently I posted an article about using find and sed together to do easy search-and-replace hierarchically within a directory tree. Today Ars Technica posted a more advanced tutorial explaining much more about how to use find with other utilities to accomplish nifty things. If you found my short little tutorial useful, you would probably get a lot from what they had to say. Check it out!

Friday, July 15, 2011

Using GNU Screen as your default shell

The title is a non sequitur, but we'll get to that later.

GNU Screen is an incredibly useful tool when you're working remotely over ssh on a linux server.  From their website:
Screen is a full-screen window manager that multiplexes a physical terminal between several processes, typically interactive shells.
Clear as mud?  Basically, it's a text-based window manager.  You can create multiple displays that you can switch between with a hotkey.  It also has clipboard functions and other features you're used to having in a graphical windowing interface, which are useful in a pure text environment, but for most people their graphical tools work just fine.  But Screen has another feature that makes it almost indispensable for working remotely.

What makes Screen especially useful is that you can "detach" from your display and "reattach" from somewhere else, and what's more, it automatically detaches if the connection drops and Screen keeps running.

Picture these two scenarios, which you have likely faced before:

1. You have some task that will take quite some time, like copying a terabyte of data to a backup drive on a remote server, but if the connection sits idle too long it will drop.

2. You are working on a server from one machine, and need to move to another computer and pick up where you left off. For instance, working on the server console directly and then moving to your own computer, or vice versa.

With Screen these scenarios are easy: if you just launch Screen first, you can detach your session and pick it up from anywhere.


Quick and dirty Screen tutorial

Launch Screen:
$ screen

Screen creates a new shell for you. Then use CTRL-A+D to detach.  That means press CTRL-A, then press D (not all three keys at once); all of Screen's hotkeys use this format.  Screen exits, but your session is still running in the background.  Then launch Screen again with -r:
$ screen -r

-r means "reattach"; it looks for a detached session and tries to reconnect to it.

If you have a long task to perform, or just have a flaky internet connection, you can launch Screen at the start to keep your session intact if your connection drops. If Screen loses you, it simply detaches so you can reattach it later.
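
For example, the long-copy scenario above might look something like this (the session name "backup" and the copy command are just placeholders; any long-running command works the same way):

$ screen -S backup           # start a Screen session named "backup"
$ cp -a /data /mnt/backup    # kick off the big copy inside Screen
  (press CTRL-A, then D to detach; the copy keeps running)
$ screen -ls                 # later, from any connection: list your sessions
$ screen -r backup           # reattach and check on the copy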


But, there is one major problem:

The point where you realize you should use Screen is far too often the moment after you start a long task that can't be stopped. Screen is no help at all if you don't launch it, and if you only launch it when you know you'll need it, how are you supposed to remember to do it?

Wouldn't it be nice if Screen were your primary shell?  Then you'd never have to remember to launch it again.  We'll just change /etc/passwd to launch screen when we log in...  No!  Don't do that.  Screen is not a shell; it just launches your shell.  Screen reads your default shell from the passwd file and tries to launch it, and if your default shell is Screen... see the problem? I think it's smart enough to throw an error instead of launching itself again, but still, if you change your passwd file that way and log out you will not be able to log in again.

So instead, we can have it run automatically.

The easiest way to do that is to add it to our startup script.  But don't add it to the end of your .bashrc; that file is executed every time the shell launches, so Screen would end up trying to launch itself in a loop when it starts bash.  A better place is .bash_profile (or .profile, depending on your system), which is only read by login shells.

There are some other things to consider.  Suppose, somehow, Screen can't launch?  It could happen, and then you'd be stuck.  Screen also has many, many more options than what we've discussed, and it's possible a configuration change somewhere could make the reattach command create a new session for whatever reason.  To avoid any potential issues, I've created the following script to launch a primary Screen session and always reconnect to the same one whenever I log in.

This is appended to the end of .bash_profile:
### Launch Screen automatically ###
echo
echo "Starting screen in 2 seconds, press Q to cancel."
# Wait up to 2 seconds (two 1-second reads) for a single keypress.
for i in 2 1 ; do
  read -n 1 -t 1 -s a && break
done
# Lowercase the keypress so both "q" and "Q" cancel.
a=$(printf '%s' "$a" | tr '[A-Z]' '[a-z]')
if [ "$a" != "q" ]; then
  # Attach to the session named "main", creating it if it doesn't exist;
  # -D -R also detaches it from anywhere else it's attached first.
  screen -D -R main
  # When Screen detaches or exits, close the connection.
  logout
  exit
fi
echo "Canceled"

When I login I get the message, "Starting screen in 2 seconds, press Q to cancel."  If I hit any key besides Q it skips to the end and launches screen, restoring my old session.  That way I don't have to wait two seconds every time I try to login.  If I hit Q it drops me to the regular bash prompt, my default shell, in case anything happens that would cause Screen to fail to launch.

To log out I don't type "exit" like I normally would; instead I hit CTRL-A+D to detach from Screen, which leaves my shell running for later, then the script above continues and logs out of my primary shell to close the connection.

The drawback is that I have to use Screen's scrollback instead of my mouse wheel, which was annoying at first, but I've gotten used to it.

This little script has saved my butt so many times, plus the convenience of returning to exactly where I left off is really helpful. Hopefully it can change your life like it did mine!

Wednesday, July 13, 2011

Search-and-replace in multiple files with vim

Last time we talked about doing a search-and-replace on multiple files using find and sed. That technique is very useful, but sometimes it's more convenient to do it right from within vim.

Vim is my personal favorite text editor. I use it for almost everything, especially as my primary development environment. It did take some time to learn the commands to get around, and it can do so much that I'm still learning things every day, but once I got used to it I found it difficult and tedious to use anything else. The rest of this post assumes you know basically how to get around in vim.

Let's say you wanted to change everywhere in a file that says "Joe" to say "Frank" instead. With regex in vim that's easy:
:%s/Joe/Frank/g

The % says to change every line, and the /g says to change every occurrence on the line. Easy peasy. But what if you want to change several files?

Vim has a lot of powerful scripting you can do; I've barely scratched the surface myself. For what we're doing here we don't need anything complicated, it can all be done on one line (which means you can remember it and use it anywhere).

Here is the whole command:
:args **/* | argdo %s/Joe/Frank/ge | update

I think most of that is pretty clear, but I'll explain what each part does.

args creates the argument list. In this case it gets a list of all files; if you only wanted PHP files you would use **/*.php.
argdo tells vim to execute the next statement for every argument passed to it. In this case it's a substitute, but it can be any vim command.
%s/Joe/Frank/ge
You'll notice we've added an e to the end of the flags; that suppresses errors like "No Match" when a file doesn't contain our string.
update writes the changes, but unlike the write command it only writes the file if something actually changed.

Vim opens a buffer for every file it changes, so you can then step through them to verify the change if you want. (:bn and :bp, next and previous buffer, :bw to close a buffer)

That's it! Simple eh?


Here is one way I used it at work. I had to convert a PHP site that used the template engine FastTemplate to make it use Smarty instead. I wrote an adapter class to make our code work with Smarty, but the template syntax is different also. Both packages have you embed tags for dynamic content in your templates, but FastTemplate tags look like {CONTENT} while Smarty tags look like {$CONTENT}. That's the main difference and it's something I can easily change with regex, but I had close to 100 template files I had to change, which would have taken forever doing it one at a time even with regex.

First I made sure I was in the template directory:
:cd templates

(Okay, that wasn't really what I did first, but I wish it had been :) ... it was what I did first after I recovered...)

Then the command I used looked something like this:
:args **/*.tpl | argdo :%s/{\([A-Z_0-9]\+\)}/{$\1}/ge | update

Find all the .tpl files, then replace any curly-bracket tags containing caps, underscores, or numbers with the Smarty equivalent. I still had to find any templates that had javascript and add the {literal} tags and other things that Smarty requires, but the bulk of the work was done. Since vim left the buffers open I just stepped through each one to make the remaining changes.


So there you go! This is a good example of why I started this blog. I thought for sure someone on the internet would have posted a way to easily convert all their FastTemplate templates to Smarty but I couldn't find anything on Google. I hope this helps someone who runs into the same issue, or one like it.

Monday, July 11, 2011

Search-and-replace on multiple files with sed

The first and most important thing developers are is lazy.  I hate rewriting code I know I've written before, unless I'm optimizing to make it better the second time, but usually most code works just fine and I'd rather optimize something more interesting anyway.

One thing I do quite often is take code I wrote for one client, search through and change the names or nomenclature and resell it to another client.  It's an easy way to demo a product with their name on it to make a sale, since it takes less than an hour to change the names and integrate some new images.  To do this I have to do a search-and-replace on all files in a directory tree.  Here are a few ways I do that:

From the Linux command-line, you can use find to get all the files to be changed, and sed to make the changes.  For example:

$ find ./ -type f -exec sed -i 's/string1/string2/g' {} \;

Let's break it down:
Find all the items in the current directory and subdirectories that are regular files.
On each file, execute the following command:
sed -i 's/string1/string2/g' {} ;
-i means operate on the file in place instead of echoing the result to stdout.
The expression enclosed in single quotes is the command sed executes.  If you're unfamiliar with the syntax, the one used here basically says "substitute string2 for string1, and do it globally: replace every occurrence on each line instead of stopping at the first match."  Using more complicated regex you can do more complicated matching.
{} is replaced with each filename from find.

So you can simply replace string1 with the text you want to replace, string2 with what you want it changed to, and let 'er rip.  Be careful to do it in the right directory or you'll be sorry; I've occasionally run it by accident in the parent directory and changed more than I planned...


But I don't want to change everything!

You may not always want to change every file, though.  Suppose you are using a source control tool like Subversion, and want to do a search-and-replace like this on all the files in your working copy.  The problem is that Subversion uses hidden directories called .svn to store its own files, including those it uses to keep track of when files have changed.  If you use the above command and change the original files as well as Subversion's historical record, Subversion won't know anything has changed!  You won't be able to check in your changes or revert, and you'll end up having to export to another directory, delete your working copy and check it out again, then copy your exported copy back in, which is way too much work and likely to cause you more grief than it's worth.  Instead, find gives you the ability to exclude some files from the search so you don't have to change them all:

$ find ./  -path '*/.svn' -prune -o -type f -exec sed -i 's/string1/string2/g' {} \;

We've added a bit there in the middle:
-path '*/.svn' -prune -o
That says to find every directory path ending in .svn and prune it out; do not descend into it for more files.  The -o means OR: do what is before it OR what is after it, but not both.  So if the prune occurs, find knows not to try to match files; if the prune does not occur, it proceeds normally as above.

Obviously this command line can be used for CVS directories or others just by changing the '*/.svn' path it is pruning to what you need it to be.
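
For example, a git working copy would use the same pattern with only the path changed (a sketch of the same idea):

$ find ./ -path '*/.git' -prune -o -type f -exec sed -i 's/string1/string2/g' {} \;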

So there you go, this has saved me more hours than almost any other trick I use.  "But my favorite IDE can do this from a menu option" you might say.  Well that's fine, but with this you can do it over ssh, from the command line, in any GNU/Linux environment that has find and sed (which is basically all of them).  In my opinion anything you can do from the shell is better, even if you wrap it in a GUI later.

It's a little more complicated if you have an old version of sed that doesn't support the -i option (that would be more than six years old, now), but in that case you're probably used to doing things the hard way, anyway.
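
If you are stuck with a sed that old, one workaround is to write each result to a temporary file and move it back into place, something like this (an untested sketch of the same idea):

$ find ./ -type f -exec sh -c 'sed "s/string1/string2/g" "$1" > "$1.tmp" && mv "$1.tmp" "$1"' sh {} \;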

Dungeonquest Review

I recently bought Dungeonquest from Fantasy Flight Games to play with the kids.

It's a fun little romp through a random dungeon. 1 to 4 players try to get in, collect as much treasure as they can, and get out without dying and before the dragon awakens.

I've always loved the random dungeon idea. Warhammer Quest was probably my favorite dungeon-delving board game, but it's been out of print for many years, so I'm always looking for something new to replace it. Dungeonquest is not that game, but I only expected it to be a simpler and quicker version of the same concept, and in that respect it is.

It's a very lethal game; most players don't survive at all. I had to explain that to my kids before we started so they didn't get sad and frustrated when they died. In the last game we played I fell in a pit trap and was stuck there for half the game, one of my sons died a grisly death from a skeleton in the fourth room, but the other made it all the way into the dragon's chamber. He almost escaped the dungeon, except I convinced him to search the last room, which had unbelievably horrible consequences. He looked so beaten that I said, "well, you were going to leave until I convinced you to search, so let's just say you left..." And then he lit up and we all did a victory dance.

There are several decks of cards you draw from in different situations, which makes the setup take longer than I'd like, but presumably gives the game more variety since they can have any number of cards. It helps to store them all stacked in order so you can easily lay them out again where they need to be.

Dungeonquest has one major failing though, and that is its combat system. It's a clever use of cards where you go back and forth laying down numbers, similar to the classic game War; when you win, your cards go in front of your opponent as "wounds". The problem is that the turns move very quickly until a fight starts, and then this combat minigame begins and everything grinds to a halt. It can take as long to resolve two or three fights as it does to play the rest of the game altogether. It's a novel system that I could see being fun in another game, but it's way too complicated for this type of game. You will probably want to find a combat variant that moves a lot more quickly. I haven't tried any yet, but I have seen several available. Fantasy Flight publishes a few of their own, and there are more on boardgamegeek.com.

The age recommendation on the box is 13+, but I played it with my 9-year-old and he loved it. My 7-year-old had a hard time grasping the rules, so I'd say 9+ is an appropriate age limit.

I'm glad I bought it, mainly because it's simple and fun with the kids. Games take 90 minutes or less which is a good length for their attention spans.  
It's a nice break from Talisman 3rd Edition, the other game we usually play together.  Overall, I think I'd rate it 3 of 5.


Fantasy Flight Games: Dungeonquest